Using the `appstore_transaction_risk` DataFrame, what are the precision and recall when a transaction is classified as positive if its `risk_score` is greater than 0.80, treating `true_label = 1` as the positive class? Return `precision_score` and `recall_score`, each rounded to 2 decimal places.
Input `appstore_transaction_risk` schema:

| Column Name | Type |
|---|---|
| transaction_id | int64 |
| risk_score | float64 |
| true_label | int64 |

Expected output schema:

| Column Name | Type |
|---|---|
| precision_score | float64 |
| recall_score | float64 |
Constraints:

- Use the DataFrame columns `risk_score` and `true_label`.
- Classify as positive only when `risk_score` is strictly greater than 0.80.
- Round precision and recall to 2 decimal places.

Suggested approach (see the sketch after this list):

1. Create a binary prediction column: `pred = (risk_score > 0.80).astype(int)`.
2. Compute the confusion counts TP, FP, and FN with boolean masks comparing `pred` against `true_label`.
3. Compute precision = TP / (TP + FP) and recall = TP / (TP + FN), handling zero denominators, and round both to 2 decimals.
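Below is a minimal pandas sketch of the three steps above. The sample rows are hypothetical and purely for illustration; in the actual problem, `appstore_transaction_risk` is provided with the schema shown earlier.

```python
import pandas as pd

# Hypothetical sample data for illustration only; the real problem
# supplies appstore_transaction_risk with this schema.
appstore_transaction_risk = pd.DataFrame({
    "transaction_id": [1, 2, 3, 4, 5],
    "risk_score": [0.95, 0.40, 0.81, 0.80, 0.60],
    "true_label": [1, 0, 0, 1, 1],
})

df = appstore_transaction_risk

# Step 1: binary predictions. Strictly greater than 0.80,
# so a risk_score of exactly 0.80 is classified as negative.
pred = (df["risk_score"] > 0.80).astype(int)

# Step 2: confusion counts via boolean masks against true_label.
tp = int(((pred == 1) & (df["true_label"] == 1)).sum())
fp = int(((pred == 1) & (df["true_label"] == 0)).sum())
fn = int(((pred == 0) & (df["true_label"] == 1)).sum())

# Step 3: precision and recall with zero-denominator guards,
# rounded to 2 decimal places.
precision = round(tp / (tp + fp), 2) if (tp + fp) > 0 else 0.0
recall = round(tp / (tp + fn), 2) if (tp + fn) > 0 else 0.0

result = pd.DataFrame({
    "precision_score": [precision],
    "recall_score": [recall],
})
print(result)
```

Returning 0.0 when a denominator is zero mirrors scikit-learn's default behavior for `precision_score` and `recall_score` (which return 0 and emit a warning in that case); if the grader expects a different convention, such as `NaN`, adjust the guard accordingly.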