Master Linear Regression with NumPy: Step-by-Step Guide to Building and Optimizing Your First Model!

Mubarak Mohamed

Posted on July 11, 2024

Linear regression is a simple yet powerful method in machine learning used to model the relationship between a dependent variable (target) and one or more independent variables (predictors). In this article, we will implement simple linear regression using NumPy, the fundamental library for scientific computing in Python. We will cover the equations necessary for this implementation: the model, the cost function, the gradient, and gradient descent.

1. Linear Regression Model
The linear regression model can be represented by the following equation:

y = Xθ

where:

  • y is the vector of predicted values.
  • X is the matrix of predictors (augmented with a column of ones so that θ includes the intercept).
  • θ is the vector of parameters (coefficients).
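
To make the shapes concrete, here is a small illustrative sketch (toy values, not the data used later): with three examples and one feature plus a bias column, X is 3×2 and θ is 2×1.

import numpy as np

X_toy = np.array([[1.0, 1.0],
                  [2.0, 1.0],
                  [3.0, 1.0]])  # one feature plus a bias column of ones
theta_toy = np.array([[3.0],    # slope
                      [4.0]])   # intercept
print(X_toy @ theta_toy)        # predictions: 7, 10, 13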

2. Cost Function
The cost function in linear regression is typically the mean squared error: it measures the average squared difference between the values predicted by the model and the actual values. Here it is halved, which simplifies the gradient:

J(θ) = 1/(2m) · Σ (Xθ − y)²

where the sum runs over the m training examples.
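
For instance, with a single example (m = 1) where the model predicts 7 and the target is 9, the cost is (7 − 9)² / 2 = 2.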

3. Gradient
The gradient of the cost function with respect to the parameters θ is needed to minimize the cost function by gradient descent. It is computed as:

∇J(θ) = 1/m · Xᵀ(Xθ − y)
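
This follows from writing the cost in matrix form, J(θ) = 1/(2m) · (Xθ − y)ᵀ(Xθ − y), and differentiating with respect to θ; the factor of 2 produced by the square cancels the 1/2 in the cost.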

4. Gradient Descent
Gradient descent is an iterative optimization method used to minimize the cost function. The parameter update equation is:

θ := θ − α · ∇J(θ)

where α is the learning rate.

5. Implementation with NumPy
Import the libraries

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

Generate example data

np.random.seed(42)
m = 100                                # number of training examples
x = 2 * np.random.rand(m, 1)           # feature values in [0, 2)
y = 4 + 3 * x + np.random.randn(m, 1)  # true intercept 4, slope 3, plus Gaussian noise
plt.figure(figsize=(12, 8))
plt.scatter(x, y)

[Scatter plot of the generated data]

Add a bias term (column of 1s) to X

X = np.hstack((x, np.ones(x.shape)))  # feature column first, then the bias column of ones
X.shape

(100, 2)
Initialize the parameters θ to random values (with the bias column second in X, theta[0] is the slope and theta[1] the intercept)

theta = np.random.randn(2, 1)

Model function

def model(X, theta):
    return X.dot(theta)

plt.scatter(x[:, 0], y)
plt.scatter(x[:, 0], model(X, theta), c='r')  # untrained model's predictions in red

[Scatter plot of the data with the untrained model's predictions in red]
Cost function

def cost_function(X, y, theta):
    m = len(y)
    return 1/(2*m) * np.sum((model(X, theta) - y)**2)

cost_function(X, y, theta)

16.069293038191518

This is the cost of the randomly initialized θ; gradient descent should drive it down.

Gradient and gradient descent

def grad(X, y, theta):
    m = len(y)
    return 1/m * X.T.dot(model(X, theta) - y)  # (1/m) * X^T (Xθ − y)

def gradient_descent(X, y, theta, learning_rate, n_iter):
    cost_history = np.zeros(n_iter)  # track the cost at each iteration
    for i in range(n_iter):
        theta = theta - learning_rate * grad(X, y, theta)  # θ := θ − α ∇J(θ)
        cost_history[i] = cost_function(X, y, theta)

    return theta, cost_history

Run gradient descent:

n_iterations = 1000
learning_rate = 0.01

final_theta, cost_history = gradient_descent(X, y, theta, learning_rate, n_iterations)

final_theta

array([[2.79981142],
       [4.18146098]])
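
As a sanity check (an addition beyond the original walkthrough), the closed-form least-squares solution should land near the same values; a minimal sketch using NumPy's built-in solver:

theta_closed, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta_closed)  # expected to be close to final_theta above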

predict = model(X, final_theta)

plt.scatter(x[:, 0], y)
plt.scatter(x[:, 0], predict, c='r')

[Scatter plot of the data with the fitted model's predictions in red]

Learning curve

plt.plot(range(n_iterations), cost_history)

[Learning curve: the cost decreasing over the iterations]
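
To put a number on the quality of the fit (an extra step not in the original post), one can compute the coefficient of determination R² from the predictions above; a minimal sketch:

ss_res = ((y - predict)**2).sum()   # residual sum of squares
ss_tot = ((y - y.mean())**2).sum()  # total sum of squares
r2 = 1 - ss_res / ss_tot
print(r2)  # the closer to 1, the better the fit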

This walkthrough demonstrates how the fundamental concepts of linear regression, namely the model, the cost function, the gradient, and gradient descent, can be implemented using NumPy. This basic understanding is essential for advancing to more complex machine learning models.
Feel free to explore further by adjusting hyperparameters, adding more features, or trying other optimization techniques to improve your linear regression model.
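
For instance, to see the effect of the learning rate (a quick illustrative experiment reusing the functions defined above, with arbitrarily chosen values):

for lr in (0.001, 0.01, 0.1):
    _, history = gradient_descent(X, y, theta, lr, n_iterations)
    plt.plot(range(n_iterations), history, label=f"learning rate = {lr}")
plt.legend()
plt.show()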
