
178. Softmax

Difficulty: easy

Implement the softmax function, a common output-layer activation in deep learning that converts raw scores (logits) into a probability distribution. Softmax is defined as:

\text{softmax}(z_i) = \frac{e^{z_i}}{\sum_{j=1}^{n} e^{z_j}}
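
For instance, taking illustrative logits z = (0, \ln 2) (values chosen here for clarity, not from the original problem):

\text{softmax}(0, \ln 2) = \left( \frac{e^{0}}{e^{0} + e^{\ln 2}}, \frac{e^{\ln 2}}{e^{0} + e^{\ln 2}} \right) = \left( \frac{1}{3}, \frac{2}{3} \right)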

Requirements

Implement the function


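A minimal sketch of the expected signature, assuming the function is named softmax (inferred from the title) and takes the logits argument from the Input Signature below:

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    # TODO: cast to float, shift by max(logits), exponentiate, normalize.
    ...
```
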
Rules:

  • Ensure the input logits is converted to a float array.
  • Use a numerically stable approach by subtracting max(logits) before exponentiating.
  • Return the result as a NumPy array.
  • Do not use any prebuilt softmax utilities (e.g., scipy.special.softmax).
  • Keep it as a single function with no helper classes.

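One way to satisfy all five rules, as a minimal sketch (function name assumed as above):

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    # Rule 1: coerce to a float array so integer logits behave correctly.
    arr = np.asarray(logits, dtype=float)
    # Rule 2: subtract the max logit so every exponent is <= 0; this avoids
    # overflow in np.exp and leaves the probabilities unchanged.
    shifted = arr - np.max(arr)
    exp_vals = np.exp(shifted)
    # Normalize so the outputs sum to 1; returns a NumPy array (rule 3),
    # using no prebuilt softmax utility (rule 4) and no helpers (rule 5).
    return exp_vals / np.sum(exp_vals)
```
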
Example

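A hypothetical invocation with illustrative inputs (not necessarily the original example values):

```python
>>> import numpy as np
>>> softmax(np.array([2.0, 1.0, 0.1]))
array([0.65900114, 0.24243297, 0.09856589])
```
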
Input Signature

  Argument: logits
  Type: np.ndarray

Output Signature

  Return Name: value
  Type: np.ndarray

Constraints

  • Use NumPy.
  • Subtract the max logit before calling np.exp.
  • Return the result as an np.ndarray.

Hint 1

Ensure inputs are NumPy arrays so you can apply vectorized operations like np.max, np.exp, and np.sum.

Hint 2

For numerical stability, shift the logits before exponentiating: shifted = arr - np.max(arr) (this doesn’t change the final probabilities).
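
Spelled out, the shift is safe because the constant factor cancels in the normalization; with c = \max_j z_j:

\frac{e^{z_i - c}}{\sum_{j=1}^{n} e^{z_j - c}} = \frac{e^{-c}\, e^{z_i}}{e^{-c} \sum_{j=1}^{n} e^{z_j}} = \frac{e^{z_i}}{\sum_{j=1}^{n} e^{z_j}}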

Hint 3

Compute exp_vals = np.exp(shifted), then normalize: probs = exp_vals / np.sum(exp_vals).
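
Putting the three hints together, a quick check with large logits (illustrative values chosen here) shows why the shift matters: the naive version overflows while the shifted one stays finite.

```python
import numpy as np

logits = np.array([1000.0, 1001.0, 1002.0])  # illustrative large logits

# Naive softmax: np.exp(1000.0) overflows to inf, so inf/inf gives nan.
naive = np.exp(logits) / np.sum(np.exp(logits))

# Stable softmax: shifting by the max keeps every exponent <= 0.
shifted = logits - np.max(logits)
stable = np.exp(shifted) / np.sum(np.exp(shifted))

print(naive)   # [nan nan nan] (with overflow RuntimeWarnings)
print(stable)  # [0.09003057 0.24472847 0.66524096]
```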

Roles
ML Engineer
AI Engineer
Companies
General
Levels
senior
entry
Tags
softmax
numerical-stability
numpy
probability-normalization