

11. Activation Function: ReLU

easy
General
senior

Implement the ReLU activation function, a common nonlinearity in deep learning that keeps positive values and zeroes out negative values.
ReLU is defined elementwise as:

\text{ReLU}(x) = \max(0, x)

Requirements

Implement the function; a minimal sketch appears after the rules below.

Rules:

  • Compute ReLU for each element in x.
  • Do not use any deep learning libraries (e.g., PyTorch, TensorFlow).
  • Prefer vectorized NumPy operations over Python loops when possible.
  • Keep the function pure: no printing and no in-place mutation of the input.
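
A minimal sketch, assuming the stub is named relu (the page does not show the exact function name) and that plain NumPy is available:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Elementwise ReLU: max(0, x) for every entry of x."""
    # np.maximum broadcasts the scalar 0.0 across the whole array,
    # so no Python loop is needed and the input is not mutated.
    return np.maximum(0.0, x)
```

np.maximum returns a new array, so the input is left untouched and the function stays pure.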

Example

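For illustration, using assumed input values and the relu sketch above:

```python
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
y = relu(x)
# y == [0.0, 0.0, 0.0, 1.5, 3.0]  (negatives and zero map to 0, positives pass through)
```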
Input Signature

  Argument    Type
  x           np.ndarray

Output Signature

  Return Name    Type
  value          np.ndarray

Constraints

  • Return a NumPy array.
  • Use NumPy vectorization; avoid deep learning libraries.
  • No printing; no in-place mutation.

Hint 1

ReLU works elementwise: each output is max(0, x_i).

Hint 2

If you use NumPy, np.maximum(0.0, arr) applies ReLU to every element at once.
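
A quick check of the hint (assuming import numpy as np):

```python
import numpy as np

np.maximum(0.0, np.array([-1.0, 2.0]))  # array([0., 2.])
```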

Roles
ML Engineer
AI Engineer
Companies
General
Levels
senior
entry
Tags
relu
numpy-vectorization
activation-function
elementwise-operation