Implement a function that computes the Mean Squared Error (MSE) between two NumPy arrays: ground-truth values (y_true) and predicted values (y_pred).
The MSE is defined as the average of the squared differences between corresponding elements:

MSE = (1/n) · Σᵢ (y_trueᵢ − y_predᵢ)²
Your function should return the MSE as a Python float.
| Argument | Type |
|---|---|
| y_pred | np.ndarray |
| y_true | np.ndarray |
| Return Name | Type |
|---|---|
| value | float |
- Inputs are NumPy arrays of `dtype=float`.
- The return value must be a Python `float`, not a NumPy scalar.
- Compute the squared error efficiently with vectorization: `(y_true - y_pred) ** 2`, then take `np.mean(...)`.
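A minimal sketch of one possible solution, following the vectorized approach described above (the function name `mse` is an assumption, since the problem does not specify one):

```python
import numpy as np

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Compute the Mean Squared Error between two arrays."""
    # Vectorized elementwise squared differences, then the mean.
    # float(...) converts the NumPy scalar returned by np.mean
    # into a plain Python float, as the problem requires.
    return float(np.mean((y_true - y_pred) ** 2))
```

Wrapping the result in `float(...)` matters: `np.mean` returns a `np.float64`, which behaves like a float in most contexts but fails strict `type(...) is float` checks.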