
226. Ensemble averaging

easy
General
senior

Implement ensemble averaging to combine predictions from multiple ML models into one final prediction. You’ll take a list of per-model predictions and return the averaged prediction, which is a simple but common baseline for improving stability.

Ensemble averaging is defined as:

$\hat{y} = \frac{1}{M}\sum_{m=1}^{M}\hat{y}^{(m)}$

Requirements

Implement the function

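The function body was not preserved on this page; a minimal sketch, assuming the name `ensemble_average` (the required name is not shown, but the argument and return types follow the signature tables below):

```python
import numpy as np

# Hypothetical function name; argument/return types taken from the
# Input/Output Signature sections of this problem.
def ensemble_average(predictions: np.ndarray) -> np.ndarray:
    """Average predictions of M models, shape (M, n), over the models axis."""
    predictions = np.asarray(predictions, dtype=float)
    # Element-wise mean across models (axis=0), one value per item
    return predictions.mean(axis=0)
```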

Rules:

  • Compute the element-wise mean across models (average over the first dimension).
  • Return a NumPy array of floats with the same length as a single model’s prediction list.
  • Do not use any prebuilt ensembling utilities (e.g., from scikit-learn).
  • Use NumPy for the numeric computation.
  • Keep it as a single Python function.

Example

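The original example values were not preserved here; a minimal illustrative call, assuming the `ensemble_average` name sketched above:

```python
import numpy as np

def ensemble_average(predictions: np.ndarray) -> np.ndarray:
    predictions = np.asarray(predictions, dtype=float)
    return predictions.mean(axis=0)

# Three models, each predicting for two items: shape (3, 2)
predictions = np.array([
    [0.2, 0.8],
    [0.4, 0.6],
    [0.6, 0.4],
])
print(ensemble_average(predictions))  # [0.4 0.6]
```

The output has length 2, matching a single model's prediction list.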
Input Signature

  • predictions — np.ndarray

Output Signature

  • value — np.ndarray

Constraints

  • Use NumPy; no sklearn ensembling utilities.
  • Average across the models dimension (axis=0).
  • Return np.ndarray[float].

Hint 1

Ensure predictions is a 2D NumPy array of floats.

Hint 2

Use np.mean(..., axis=0) to average across models (rows) for each item (column).
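For instance, with a (3, 2) array holding three models' predictions for two items:

```python
import numpy as np

preds = np.array([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])  # shape (models=3, items=2)
print(np.mean(preds, axis=0))  # [3. 4.] — one averaged value per item
```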

Hint 3

Return the result as a NumPy array; ensure output length equals one model’s list.

Roles
ML Engineer
AI Engineer
Companies
General
Levels
senior
entry
Tags
numpy-mean
ensemble-averaging
vectorization
aggregation