
94. Perplexity

easy
General
senior

Compute the perplexity of a simple NLP language model, a score that measures how surprised the model is by a sequence of tokens. You'll be given the model's token-level probabilities for the true next token at each position and should return a single perplexity score.

Requirements

Implement a function that takes the array of true-token probabilities `next_token_probs` and returns the perplexity as a float.
Rules:

  • Use the definition $\mathrm{PPL} = \exp\left(-\frac{1}{N}\sum_{t=1}^{N}\log(p_t)\right)$, where $p_t$ is the probability assigned to the true token.
  • Use natural log (np.log) and natural exponential (np.exp).
  • Return a single float (not a list).
  • Don’t use any prebuilt NLP/LM evaluation utilities.
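The rules above translate directly into a few lines of NumPy. A minimal sketch, assuming the grader expects a function named `compute_perplexity` (the exact required name isn't shown on this page):

```python
import numpy as np

def compute_perplexity(next_token_probs: np.ndarray) -> float:
    # Perplexity is the exponential of the mean negative
    # log-probability assigned to the true tokens.
    return float(np.exp(-np.mean(np.log(next_token_probs))))
```

Note the explicit `float(...)` cast: `np.exp` on a NumPy scalar returns `np.float64`, and the problem asks for a plain Python float, not a list or array.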

Example

An illustrative input (probabilities 0.5, 0.25, 0.125 give an average negative log-probability of $\ln 4$, so the perplexity is exactly 4):

```python
next_token_probs = np.array([0.5, 0.25, 0.125])
```

Output:

```python
4.0
```

Input Signature

| Argument | Type |
| --- | --- |
| next_token_probs | np.ndarray |

Output Signature

| Return Name | Type |
| --- | --- |
| value | float |

Constraints

  • Use natural log/exp (np.log, np.exp).

  • Return a single float, not a list.

  • No prebuilt NLP/LM evaluation utilities.

Hint 1

Perplexity is the exponential of the average negative log probability: compute mean(-log(p_t)) over the array, then exp it.

Hint 2

Use np.mean on np.log(next_token_probs).

Hint 3

Watch edge cases: each p_t must lie in (0, 1], since log(0) is undefined (it yields -inf and makes the perplexity infinite).
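In practice a probability can underflow to exactly 0, which makes `np.log` return `-inf` and the perplexity `inf`. One common guard is clipping the probabilities into (0, 1] before taking logs. A sketch; the helper name and the `eps` value are assumptions, not part of the problem statement:

```python
import numpy as np

def safe_perplexity(next_token_probs: np.ndarray, eps: float = 1e-12) -> float:
    # Clip into [eps, 1.0] so log is always finite; values above 1
    # would be invalid probabilities anyway.
    p = np.clip(next_token_probs, eps, 1.0)
    return float(np.exp(-np.mean(np.log(p))))
```

With the clip in place, a stray zero probability produces a very large but finite perplexity instead of `inf`.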

Roles
ML Engineer
AI Engineer
Data Scientist
Quantitative Analyst
Companies
General
Levels
senior
entry
Tags
perplexity
log-likelihood
numerical-stability
language-model-evaluation