

77. Precision And Recall

easy
Apple
senior

Using the appstore_transaction_risk DataFrame, what are the precision and recall when classifying transactions as positive if risk_score > 0.80, with true_label = 1 as the positive class? Return precision_score and recall_score rounded to 2 decimal places.

appstore_transaction_risk

Column Name       Type
transaction_id    int64
risk_score        float64
true_label        int64

Expected Output Schema

Column Name       Type
precision_score   float64
recall_score      float64

Constraints

  • Use DataFrame columns: risk_score, true_label.

  • Threshold strictly greater than 0.80.

  • Round precision/recall to 2 decimals.

Hint 1

Create a binary prediction column: pred = (risk_score > 0.80).astype(int).

Hint 2

Compute confusion counts with boolean masks: TP, FP, FN using pred vs true_label.

Hint 3

Use precision = TP/(TP+FP) and recall = TP/(TP+FN); handle zero denominators; round to 2 decimals.
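Putting the three hints together, here is a minimal sketch in pandas. The sample rows below are hypothetical placeholders; in the actual exercise, appstore_transaction_risk is provided by the platform.

```python
import pandas as pd

# Hypothetical sample data -- the real appstore_transaction_risk is supplied by the grader.
appstore_transaction_risk = pd.DataFrame({
    "transaction_id": [1, 2, 3, 4, 5],
    "risk_score":     [0.95, 0.60, 0.85, 0.40, 0.81],
    "true_label":     [1, 0, 0, 1, 1],
})

df = appstore_transaction_risk

# Hint 1: binary prediction from the strict threshold.
pred = (df["risk_score"] > 0.80).astype(int)

# Hint 2: confusion counts via boolean masks.
tp = int(((pred == 1) & (df["true_label"] == 1)).sum())
fp = int(((pred == 1) & (df["true_label"] == 0)).sum())
fn = int(((pred == 0) & (df["true_label"] == 1)).sum())

# Hint 3: metrics with zero-denominator guards, rounded to 2 decimals.
precision = round(tp / (tp + fp), 2) if (tp + fp) > 0 else 0.0
recall = round(tp / (tp + fn), 2) if (tp + fn) > 0 else 0.0

result = pd.DataFrame({"precision_score": [precision], "recall_score": [recall]})
print(result)
```

On the sample data above, three transactions exceed the 0.80 threshold (two true positives, one false positive) and one positive is missed, so both metrics come out to 0.67.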

Roles
Data Scientist
Data Analyst
Data Engineer
Companies
Apple
Levels
senior
entry
Tags
pandas
classification-metrics
precision-recall
thresholding
confusion-matrix