AI / ML & Autonomy
Adjust detector sensitivity and observe how true positives, false positives, true negatives, and false negatives change — and choose a threshold that fits the mission risk profile.
Student can explain the trade-off between sensitivity and false alarm rate, read a confusion matrix, and justify a sensitivity setting based on mission risk tolerance.
Open Confidence and False Alarms at `/twin/learn/activities/aiml_simple_fault_rules` — interactive sensitivity slider with confusion-matrix-style counts and operational notes (teaching-grade).
Draw a 2×2 confusion matrix; fill it in for two different threshold settings; calculate precision and recall for each.
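The precision and recall arithmetic in this exercise can be sketched in a few lines of Python (an illustrative helper, not part of the activity software itself):

```python
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Precision: of the cases flagged, how many were real faults.
    Recall: of the real faults, how many were flagged."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# The balanced setting shown in this activity (TP=12, FP=3, FN=3):
print(precision_recall(12, 3, 3))  # (0.8, 0.8)
```

Repeating this at a second threshold setting makes the trade-off concrete: raising sensitivity typically lifts recall while dragging precision down.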
Teaching-grade software activity slot — not a flight simulator or certified propagator.
Step 1 — Adjust sensitivity
Drag the slider to change how sensitive the anomaly detector is. Watch how TP, FP, TN, and FN change.
Balanced (Default)
Balanced threshold — reasonable trade-off for most classroom scenarios.
Confusion matrix — 35 test cases

- True Positives (TP): 12 (real faults detected correctly)
- False Positives (FP): 3 (normal cases flagged as faults)
- True Negatives (TN): 17 (normal cases correctly ignored)
- False Negatives (FN): 3 (real faults missed)
Operational note
3 false alarms and 3 missed faults. A reasonable starting point — both costs are low and the operator remains informed.
Self-check · Local only
Local-only. No submission, no grade. Answers revealed here only.
If you increase detector sensitivity, what typically happens to false positives and false negatives?
What is 'alarm fatigue' in an operations context?
A confusion matrix shows TP=14, FP=7, TN=13, FN=1. What is the precision of the classifier?
Evidence capture · Local only
Local-only. No submission, no backend, no grade. Copy or screenshot to share.
Expected outputs that learners should be able to show after the lab (a Phase 9 evidence engine preview is available).
Adjust the sensitivity slider across five levels; record TP, FP, TN, FN at each level; choose a setting and justify it for a stated mission risk profile.
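The slider sweep above can be mimicked in code. The sketch below uses hypothetical detector scores and labels (invented for illustration; not the activity's actual data) to show how the four counts shift as the threshold moves:

```python
# Hypothetical detector scores (0-1) for 10 test cases; True = real fault.
cases = [(0.95, True), (0.80, True), (0.70, False), (0.60, True),
         (0.55, False), (0.40, True), (0.35, False), (0.20, False),
         (0.15, False), (0.05, False)]

def confusion(threshold: float) -> tuple[int, int, int, int]:
    """Return (TP, FP, TN, FN) when alarms fire at score >= threshold."""
    tp = sum(1 for s, fault in cases if s >= threshold and fault)
    fp = sum(1 for s, fault in cases if s >= threshold and not fault)
    tn = sum(1 for s, fault in cases if s < threshold and not fault)
    fn = sum(1 for s, fault in cases if s < threshold and fault)
    return tp, fp, tn, fn

# Five "sensitivity levels": a low threshold is a sensitive detector.
for t in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"threshold={t}: TP, FP, TN, FN = {confusion(t)}")
```

At the most sensitive setting no faults are missed but false alarms pile up; at the least sensitive setting the alarms are clean but most faults slip through, which is the trade-off the recording task asks learners to document.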
Responses are not persisted in this preview unless a specific activity component adds storage later.
A detector has TP=14, FP=7, TN=13, FN=1. Calculate precision and recall, and state which setting would cause alarm fatigue and why.
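For instructors checking this evidence item, the arithmetic works out as follows (a quick verification sketch, not learner-facing material):

```python
# Counts from the prompt: TP=14, FP=7, TN=13, FN=1.
tp, fp, tn, fn = 14, 7, 13, 1
precision = tp / (tp + fp)   # 14 / 21 ≈ 0.667
recall = tp / (tp + fn)      # 14 / 15 ≈ 0.933
print(f"precision={precision:.3f} recall={recall:.3f}")
```

A precision of roughly two thirds means one alarm in three is false, which is the kind of rate that feeds alarm fatigue even though almost every real fault is caught.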
Ask: 'Would you rather miss one real fault, or get seven false alarms per day?' Use student answers to surface the mission-specific trade-off.
Suggested progression from the mission learning path. Links are limited to activity routes that exist, so none lead to missing pages.