#### You May Also Enjoy

## Multivariable Gradient Descent

Multivariable gradient descent is similar to single-variable gradient descent, except that we replace the derivative $f'(x)$ with the gradient $\nabla f(\vec x)$. **Read more...**
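The update rule can be sketched in a few lines. This is a minimal illustration, not code from the post: the function $f(x, y) = (x-1)^2 + (y+2)^2$, the learning rate, and the iteration count are all assumptions chosen to make the example converge.

```python
import numpy as np

# Hypothetical example: minimize f(x, y) = (x - 1)^2 + (y + 2)^2,
# whose gradient is (2(x - 1), 2(y + 2)) and whose minimum is at (1, -2).
def grad_f(v):
    x, y = v
    return np.array([2 * (x - 1), 2 * (y + 2)])

v = np.array([0.0, 0.0])   # initial guess
alpha = 0.1                # learning rate (assumed)
for _ in range(100):
    v = v - alpha * grad_f(v)   # update rule: v <- v - alpha * grad f(v)

print(v)   # approaches the minimizer (1, -2)
```

The loop is identical to the single-variable case; the only change is that the scalar derivative is replaced by a gradient vector and the subtraction happens componentwise.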

## Single-Variable Gradient Descent

Another method for fitting models to data, called **gradient descent**, involves minimizing the error between the model and the data by using the gradient vector from multivariable calculus. The idea is that we construct a function (like the RSS) that represents the error between the model and the data set and then minimize that function using the gradient descent procedure. **Read more...**
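As a concrete sketch of the idea (the data set and model $y = ax$ here are assumptions, not the post's example): build the RSS as a function of the single parameter $a$, then repeatedly step $a$ against the derivative of the RSS.

```python
# Minimal sketch: fit y = a*x to a small data set by minimizing the
# RSS(a) = sum (y - a*x)^2 with single-variable gradient descent on a.
xs = [1.0, 2.0, 3.0]
ys = [2.1, 3.9, 6.0]   # roughly follows y = 2x (assumed data)

def rss_derivative(a):
    # d/da of sum (y - a*x)^2  =  sum -2*x*(y - a*x)
    return sum(-2 * x * (y - a * x) for x, y in zip(xs, ys))

a = 0.0       # initial guess
alpha = 0.01  # learning rate (assumed)
for _ in range(1000):
    a -= alpha * rss_derivative(a)

print(a)   # converges to the least-squares slope, about 1.99
```

The converged value agrees with the closed-form least-squares slope $\sum x_i y_i / \sum x_i^2$, which is a useful sanity check on the procedure.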

## Overfitting, Underfitting, Cross-Validation, and the Bias-Variance Tradeoff

We have previously described a model as “accurate” when it appears to match closely with points in the data set. However, there are issues with this definition that we will need to remedy. In this section, we will expose these issues and develop a more nuanced understanding of model accuracy by way of a concrete example. **Read more...**
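The core issue can be sketched numerically. This toy example is an assumption (not the post's own): a degree-9 polynomial matches all ten training points essentially perfectly, yet a plain line generalizes better just outside the data, so "matches the data set closely" is not the same thing as "accurate."

```python
import numpy as np

# Data follows y = 2x plus small alternating +/-0.1 perturbations (assumed).
x = np.linspace(0, 1, 10)
y = 2 * x + 0.1 * (-1.0) ** np.arange(10)

line = np.polyfit(x, y, 1)     # underfits the perturbations, captures the trend
wiggle = np.polyfit(x, y, 9)   # interpolates every point, noise included

def rss(coeffs, xs, ys):
    return float(np.sum((np.polyval(coeffs, xs) - ys) ** 2))

print(rss(line, x, y), rss(wiggle, x, y))   # on training data, wiggle "wins"

# But at x = 1.1, just past the data, the true value is 2.2, and the
# degree-9 fit misses it far more badly than the line does.
print(np.polyval(line, 1.1), np.polyval(wiggle, 1.1))
```

The high-degree model's low training error is precisely the symptom of overfitting: it has memorized the perturbations instead of the trend.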

## Power, Exponential, and Logistic Regression via the Pseudoinverse

Previously, we learned that we can use the pseudoinverse to fit any regression model that can be expressed as a linear combination of functions. Unfortunately, there are a handful of useful models that *cannot* be expressed as a linear combination of functions. Here, we will explore three of these models in particular. **Read more...**
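One common workaround, sketched here with assumed data: an exponential model $y = a e^{bx}$ is not a linear combination of functions of $x$, but taking logarithms gives $\ln y = \ln a + bx$, which *is* linear in the parameters $(\ln a, b)$ and can therefore be fit with the pseudoinverse.

```python
import numpy as np

# Exact exponential data with a = 3, b = 0.5 (assumed for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 3.0 * np.exp(0.5 * x)

# Design matrix for the linearized model ln(y) = ln(a) + b*x.
X = np.column_stack([np.ones_like(x), x])

# Least-squares solve via the pseudoinverse, then undo the log transform.
ln_a, b = np.linalg.pinv(X) @ np.log(y)
a = np.exp(ln_a)

print(a, b)   # recovers a = 3, b = 0.5
```

The same transform-then-pseudoinverse pattern is what makes models like these tractable even though they are not linear combinations of functions in their original form.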
