Linear Regression
Fundamental supervised learning algorithm modeling the relationship between a dependent variable and one or more independent variables with a linear equation. Foundational for predictive modeling and statistics.
Implementation
class LinearRegression {
  private weights: number[] = [];
  private bias: number = 0;

  fit(X: number[][], y: number[], learningRate: number = 0.01, epochs: number = 1000): void {
    const m = X.length;
    const n = X[0].length;
    this.weights = new Array(n).fill(0);

    // Gradient descent
    for (let epoch = 0; epoch < epochs; epoch++) {
      const predictions = X.map(x => this.predictOne(x));

      // Update weights
      for (let j = 0; j < n; j++) {
        let gradient = 0;
        for (let i = 0; i < m; i++) {
          gradient += (predictions[i] - y[i]) * X[i][j];
        }
        this.weights[j] -= (learningRate / m) * gradient;
      }

      // Update bias
      const biasGradient = predictions.reduce((sum, pred, i) => sum + (pred - y[i]), 0);
      this.bias -= (learningRate / m) * biasGradient;
    }
  }

  private predictOne(x: number[]): number {
    return x.reduce((sum, val, i) => sum + val * this.weights[i], this.bias);
  }

  predict(X: number[][]): number[] {
    return X.map(x => this.predictOne(x));
  }

  score(X: number[][], y: number[]): number {
    const predictions = this.predict(X);
    const yMean = y.reduce((a, b) => a + b) / y.length;

    const ssRes = predictions.reduce((sum, pred, i) => sum + (y[i] - pred) ** 2, 0);
    const ssTot = y.reduce((sum, val) => sum + (val - yMean) ** 2, 0);

    return 1 - (ssRes / ssTot); // R² score
  }
}
Deep Dive
Theoretical Foundation
Fits the line y = β₀ + β₁x₁ + ... + βₙxₙ by minimizing the sum of squared residuals. Coefficients can be found in closed form with Ordinary Least Squares (OLS) via the normal equation β = (XᵀX)⁻¹Xᵀy, or iteratively with gradient descent as in the implementation above. Assumes a linear relationship, independence of errors, homoscedasticity, and normality of residuals.
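For a single predictor the normal equation collapses to a closed form (slope = cov(x, y) / var(x), intercept = ȳ − slope·x̄) that is easy to check by hand. A minimal sketch; the helper `olsFit` is illustrative and not part of the class above:

```typescript
// Closed-form OLS for one feature: the normal equation β = (XᵀX)⁻¹Xᵀy
// specialized to a single predictor plus an intercept.
function olsFit(x: number[], y: number[]): { slope: number; intercept: number } {
  const n = x.length;
  const xMean = x.reduce((a, b) => a + b, 0) / n;
  const yMean = y.reduce((a, b) => a + b, 0) / n;
  let cov = 0;   // Σ (xᵢ − x̄)(yᵢ − ȳ)
  let varX = 0;  // Σ (xᵢ − x̄)²
  for (let i = 0; i < n; i++) {
    cov += (x[i] - xMean) * (y[i] - yMean);
    varX += (x[i] - xMean) ** 2;
  }
  const slope = cov / varX;
  return { slope, intercept: yMean - slope * xMean };
}

// Data generated from y = 2x + 1, so OLS recovers it exactly:
const { slope, intercept } = olsFit([1, 2, 3, 4], [3, 5, 7, 9]);
// slope = 2, intercept = 1
```

Unlike gradient descent, this needs no learning rate or epochs, but inverting XᵀX becomes costly and numerically fragile for many features, which is why iterative solvers are common.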
Complexity
Time: O(nm) per epoch (m samples, n features); O(nm × epochs) for full training. Prediction is O(nm) for m inputs.
Space: O(n + m) for the weight vector plus the per-epoch predictions array.
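To make the per-epoch cost concrete, here is a stripped-down one-feature version of the same training loop; `gdFit` is a hypothetical standalone helper, not part of the class above:

```typescript
// Minimal gradient-descent fit for y ≈ w·x + b on one feature.
function gdFit(x: number[], y: number[], lr = 0.05, epochs = 2000): { w: number; b: number } {
  const m = x.length;
  let w = 0, b = 0;
  for (let e = 0; e < epochs; e++) {  // O(epochs) outer iterations...
    let gw = 0, gb = 0;
    for (let i = 0; i < m; i++) {     // ...each scanning all m samples once
      const err = w * x[i] + b - y[i];
      gw += err * x[i];
      gb += err;
    }
    w -= (lr / m) * gw;
    b -= (lr / m) * gb;
  }
  return { w, b };
}

// Data from y = 2x + 1; w and b converge close to 2 and 1.
const { w, b } = gdFit([0, 1, 2, 3], [1, 3, 5, 7]);
```

With n features the inner scan does n multiply-adds per sample, giving the O(nm) per-epoch cost stated above.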
Applications
Industry Use
House price prediction
Stock market analysis
Sales forecasting
Medical diagnosis (risk factors)
Marketing campaign effectiveness
Economic modeling
Quality control in manufacturing
Related Algorithms
K-Nearest Neighbors (KNN)
Simple, instance-based learning algorithm that classifies new data points based on k closest training examples. Non-parametric, lazy learning method used for classification and regression.
Logistic Regression
Binary classification algorithm using the sigmoid function to model the probability of class membership. Despite its name, it performs classification, not regression. Foundation for neural networks.
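The sigmoid step this describes can be sketched in a few lines; the helpers below are illustrative and assume the weights have already been learned:

```typescript
// Sigmoid squashes any real-valued linear score into a probability in (0, 1).
const sigmoid = (z: number): number => 1 / (1 + Math.exp(-z));

// Same linear score as linear regression, then sigmoid on top.
function predictProba(weights: number[], bias: number, x: number[]): number {
  const z = x.reduce((sum, v, i) => sum + v * weights[i], bias);
  return sigmoid(z);
}

// Classify as 1 when the probability exceeds 0.5 (equivalently, when z > 0):
const p = predictProba([1.2, -0.7], -0.5, [2, 1]); // z = 2.4 − 0.7 − 0.5 = 1.2, p ≈ 0.77
```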
Decision Tree
Tree-based model making decisions through sequence of if-else questions. Splits data based on feature values to create hierarchical structure. Interpretable and handles non-linear relationships.
K-Means Clustering
Unsupervised learning algorithm partitioning n observations into k clusters. Each observation belongs to cluster with nearest mean. Widely used for data segmentation and pattern discovery.