

9. Linear Regression

Difficulty: medium
Company: Google
Level: manager

Implement linear regression from scratch using NumPy, trained with batch gradient descent to minimize mean squared error. Provide a fit method that learns the weights and bias, and a predict method that outputs continuous values.

Requirements

Implement a class LinearRegressionGD:


Rules:

  • Optimize Mean Squared Error (MSE) using batch gradient descent.
  • Parameters: weights vector w and bias b; do not add an intercept column to X.
  • __init__ must set defaults and initialize params (e.g., self.w = np.array([]), self.b = 0.0).
  • fit learns w and b; predict returns X @ w + b.
  • Use NumPy only with vectorized operations.

Default hyperparameters:

  • learning_rate = 0.1
  • n_iters = 1000
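Putting the rules and defaults above together, the expected class shape looks like the sketch below (the fit body is left as an exercise; only the required interface and initial state are shown):

```python
import numpy as np

class LinearRegressionGD:
    def __init__(self, learning_rate=0.1, n_iters=1000):
        # Defaults required by the problem statement
        self.learning_rate = learning_rate
        self.n_iters = n_iters
        self.w = np.array([])  # weight vector, learned in fit
        self.b = 0.0           # separate scalar bias (no intercept column)

    def fit(self, X, y):
        # Learn self.w and self.b via vectorized batch gradient descent on MSE
        ...

    def predict(self, X):
        # Continuous predictions: X @ w + b, no intercept column added to X
        return np.asarray(X, dtype=float) @ self.w + self.b
```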

Example

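The original example code and output did not survive extraction; the values below are illustrative, using a compact reference implementation consistent with the rules and hints:

```python
import numpy as np

class LinearRegressionGD:
    def __init__(self, learning_rate=0.1, n_iters=1000):
        self.learning_rate = learning_rate
        self.n_iters = n_iters
        self.w = np.array([])
        self.b = 0.0

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float).reshape(-1)
        n = X.shape[0]
        self.w = np.zeros(X.shape[1])
        self.b = 0.0
        for _ in range(self.n_iters):
            error = X @ self.w + self.b - y      # (n,) residuals, all samples
            dw = (X.T @ error) / n               # (n_features,) batch gradient
            db = error.sum() / n                 # scalar gradient for the bias
            self.w -= self.learning_rate * dw
            self.b -= self.learning_rate * db
        return self

    def predict(self, X):
        return np.asarray(X, dtype=float) @ self.w + self.b

model = LinearRegressionGD()
model.fit([[1], [2], [3], [4]], [3, 5, 7, 9])  # data generated by y = 2x + 1
preds = model.predict([[5], [6]])
print(preds.tolist())  # approximately [11.0, 13.0] after convergence
```

Per the Output Signature below, the grader expects a plain list, hence the `.tolist()` conversion.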
Input Signature

| Argument | Type |
| --- | --- |
| X_test | list |
| X_train | list |
| y_train | list |

Output Signature

| Return Name | Type |
| --- | --- |
| value | list |

Constraints

  • NumPy only; no sklearn/statsmodels
  • Vectorized batch GD; no per-sample loops
  • No intercept column; use separate bias b

Hint 1

Start by converting X to a 2D np.ndarray and y to a 1D array (y.reshape(-1)), then initialize w with zeros of length n_features and b = 0.0.
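A sketch of that setup step, using made-up list inputs shaped like the signature above:

```python
import numpy as np

# Illustrative inputs matching the list-based signature
X_train = [[1.0], [2.0], [3.0]]
y_train = [[2.0], [4.0], [6.0]]

X = np.asarray(X_train, dtype=float)              # 2D: (n_samples, n_features)
y = np.asarray(y_train, dtype=float).reshape(-1)  # 1D: (n_samples,)

n_samples, n_features = X.shape
w = np.zeros(n_features)  # one weight per feature, initialized to zero
b = 0.0                   # scalar bias
```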

Hint 2

Use a fully vectorized forward pass each iteration: y_pred = X @ w + b and error = y_pred - y. Batch gradient descent means compute gradients using all samples each step (no loops over rows).

Hint 3

For MSE, the gradients are dw = (1/n) * (X.T @ error) and db = (1/n) * error.sum(). Update with w -= lr * dw, b -= lr * db, and ensure predict(X) returns X @ w + b without adding an intercept column.
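These gradient formulas can be sanity-checked numerically. Note the (1/n) form corresponds to minimizing ½·MSE; the dropped factor of 2 from differentiating the square only rescales the effective learning rate. The check below compares the hint's analytic gradients against central finite differences of that loss:

```python
import numpy as np

def half_mse(X, y, w, b):
    # Loss whose exact gradients are dw = (1/n) X.T @ error, db = (1/n) sum(error)
    error = X @ w + b - y
    return 0.5 * (error ** 2).mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = rng.normal(size=50)
w = rng.normal(size=3)
b = 0.5
n = X.shape[0]

# Analytic gradients exactly as written in the hint
error = X @ w + b - y
dw = (X.T @ error) / n
db = error.sum() / n

# Central finite differences along each coordinate
eps = 1e-6
dw_num = np.array([
    (half_mse(X, y, w + eps * e, b) - half_mse(X, y, w - eps * e, b)) / (2 * eps)
    for e in np.eye(3)
])
db_num = (half_mse(X, y, w, b + eps) - half_mse(X, y, w, b - eps)) / (2 * eps)

print(np.allclose(dw, dw_num, atol=1e-5), np.allclose(db, db_num, atol=1e-5))
```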

Roles: Data Scientist, ML Engineer, AI Engineer, Quantitative Analyst
Companies: Google, Amazon
Levels: manager, staff, senior
Tags: oop, python class, numpy, gradient descent, linear regression