Implement the softmax function, a common output layer in deep learning that converts raw scores (logits) into a probability distribution. Softmax is defined as:

$$\mathrm{softmax}(x)_i = \frac{e^{x_i}}{\sum_j e^{x_j}}$$
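To make the definition concrete, here is a small worked example (values rounded to three decimals):

$$\mathrm{softmax}([1, 2, 3]) = \frac{[e^{1},\, e^{2},\, e^{3}]}{e^{1} + e^{2} + e^{3}} \approx [0.090,\, 0.245,\, 0.665]$$

The outputs are positive and sum to 1, as a probability distribution must.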
Implement the function `softmax(logits)`.
Rules:
- `logits` is a float array.
- Subtract `max(logits)` before exponentiating, for numerical stability.
- Do not use a built-in implementation (e.g., `scipy.special.softmax`).

Input:

| Argument | Type |
|---|---|
| logits | np.ndarray |

Output:

| Return Name | Type |
|---|---|
| value | np.ndarray |
Hints:
- Use NumPy.
- Subtract the max logit before `np.exp` (see the overflow sketch below).
- Return an `np.ndarray`.
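To see why the max subtraction matters, here is a quick sketch (assuming float64 NumPy arrays): exponentiating large logits directly overflows, while the shifted version produces the correct probabilities.

```python
import numpy as np

logits = np.array([1000.0, 1001.0, 1002.0])

# Naive: np.exp(1000.0) overflows float64 to inf, so inf / inf yields nan.
naive = np.exp(logits) / np.sum(np.exp(logits))   # [nan, nan, nan] plus overflow warnings

# Stable: subtracting the max keeps every exponent <= 0, so nothing overflows.
shifted = logits - np.max(logits)                 # [-2., -1., 0.]
stable = np.exp(shifted) / np.sum(np.exp(shifted))
print(stable)                                     # [0.09003057 0.24472847 0.66524096]
```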
Ensure inputs are NumPy arrays so you can apply vectorized operations like np.max, np.exp, and np.sum.
For numerical stability, shift the logits before exponentiating: shifted = arr - np.max(arr). This doesn’t change the final probabilities, since the shift multiplies both the numerator and the denominator by the same factor e^{-max}, which cancels.
Compute exp_vals = np.exp(shifted), then normalize: probs = exp_vals / np.sum(exp_vals).
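Putting those steps together, here is a minimal sketch of the full function, assuming a 1-D input as in the spec above (the signature follows the Input/Output tables):

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Convert raw scores (logits) into a probability distribution."""
    arr = np.asarray(logits, dtype=float)  # accept lists too; work in float
    shifted = arr - np.max(arr)            # stability shift; doesn't change the result
    exp_vals = np.exp(shifted)
    probs = exp_vals / np.sum(exp_vals)    # normalize so the entries sum to 1
    return probs

print(softmax(np.array([1.0, 2.0, 3.0])))  # [0.09003057 0.24472847 0.66524096]
```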