ML

Bias Variance Tradeoff

TL;DR

|  | Reason | Example | Effect | Model's complexity ⬆️ | Model's complexity ⬇️ |
| --- | --- | --- | --- | --- | --- |
| Bias | wrong assumptions | assuming a quadratic relationship is linear | underfitting | ⬇️ | ⬆️ |
| Variance | excessive sensitivity to small variations in the training data | high-degree polynomial model | overfitting | ⬆️ | ⬇️ |
| Irreducible error | noisy data |  |  |  |  |

Explanation

A model's generalization error can be expressed as the sum of three very different errors: bias, variance, and irreducible error.
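The tradeoff can be seen empirically. The sketch below (illustrative only; the data, seed, and polynomial degrees are assumptions, not from the notes) fits polynomials of increasing degree to noisy quadratic data: a linear model underfits (high bias), while a high-degree polynomial overfits (high variance), matching the table above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from a quadratic ground truth: y = x^2 + noise
x_train = rng.uniform(-3, 3, 30)
y_train = x_train**2 + rng.normal(0, 1, 30)
x_test = rng.uniform(-3, 3, 30)
y_test = x_test**2 + rng.normal(0, 1, 30)

def mse(degree):
    # Fit a polynomial of the given degree on the training set,
    # then measure mean squared error on train and test data.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for degree in (1, 2, 15):
    train_err, test_err = mse(degree)
    print(f"degree={degree:2d}  train MSE={train_err:6.2f}  test MSE={test_err:6.2f}")
```

Degree 1 shows high error on both sets (underfitting); degree 2 matches the true model; degree 15 drives training error down while test error rises (overfitting).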

Linear Regression

Linear Regression Model

A linear model makes a prediction $\hat{y}_i$ by simply computing a weighted sum of the input features $\boldsymbol{x}_i$, plus a constant $w_0$ called the bias term. For a single instance:

$$ \hat{y}_i = f \left( \boldsymbol{x}_i \right) = w_0 + \sum_{j=1}^{D} w_{j} x_{i, j} $$
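The weighted sum above is just a dot product plus the bias term. A minimal sketch (the weights and instance values here are made up for illustration):

```python
import numpy as np

def predict(x, w0, w):
    # yhat = w0 + sum_j w_j * x_j  (weighted sum plus bias term)
    return w0 + np.dot(w, x)

# Illustrative parameters for D = 2 features (assumed values)
w0 = 1.0
w = np.array([2.0, -0.5])
x = np.array([3.0, 4.0])  # a single instance x_i
print(predict(x, w0, w))  # 1 + 2*3 - 0.5*4 = 5.0
```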

Objective Function

Objective function overview