
False Positives and False Negatives - Explanation and examples

How to distinguish evidence from reality.

 

What are false positives and false negatives?

False positives and negatives occur when the outcome of an experiment does not accurately reflect what happened in reality. Suppose you are going through airport security and, being the law-abiding citizen that you are, you haven’t brought any prohibited items such as a knife, gun, or your favourite flame-thrower. Yet when you walk through the scanner, the alarm goes off anyway. We’d call this result a false positive: the outcome of the test is positive (i.e. sirens blaring ‘this person is carrying dangerous items’)...but it is false (in reality you’ve done nothing wrong).

Now say you’re on your way back from holiday, but you’ve forgotten that you bought an aerosol of deodorant and have been keeping it in your hand luggage. You pass through security with no trouble, then when you’re freshening up on the plane you realise you still have it. The scanner must have produced a false negative result: the scanner never went off when you walked through it (a negative outcome) even though you did accidentally bring a contraband aerosol onto the plane (making the outcome of the test false).

More examples of false positives and false negatives

A mammogram is a test that identifies whether someone has breast cancer. A false positive result would incorrectly diagnose that a patient has breast cancer, while a false negative one would fail to detect a patient who does have it.

Suppose that 1% of women have breast cancer, and that the test correctly identifies women who have it 90% of the time (the true positive rate) but incorrectly flags healthy women as having it 8% of the time (the false positive rate). What's the likelihood that a woman with a positive mammogram result actually has breast cancer?

Seems like around 90%, right? This is not a trick question, but, worryingly, roughly two-thirds of doctors get it wrong when asked. The intuitive (and wrong) line of thinking is to start with the 90% true positive rate and then round down slightly "because the test is wrong sometimes". But in fact, the likelihood that it's cancer is way, way lower: around 10%!
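To see where that 10% comes from before walking through the numbers, here's a short sketch that plugs the article's three rates (1% prevalence, 90% true positive rate, 8% false positive rate) straight into Bayes' theorem:

```python
# Bayes' theorem applied to the mammogram question.
# Rates as stated in the article: 1% prevalence, 90% true positive
# rate (sensitivity), 8% false positive rate.
prevalence = 0.01
true_positive_rate = 0.90
false_positive_rate = 0.08

# P(positive) = P(positive | cancer) * P(cancer)
#             + P(positive | no cancer) * P(no cancer)
p_positive = (true_positive_rate * prevalence
              + false_positive_rate * (1 - prevalence))

# P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
p_cancer_given_positive = true_positive_rate * prevalence / p_positive

print(f"{p_cancer_given_positive:.1%}")  # → 10.2%
```

The key insight the formula captures: because healthy women vastly outnumber women with cancer, even a small false positive rate produces more false alarms than true detections.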

Let's say 1000 women are given a mammogram. 10 will have breast cancer (1% of 1000), but the test will only pick up on this 90% of the time, so 9 women will get a true positive result and 1 woman will have a false negative result.

Now there are 990 women left who do not have cancer; but since the test incorrectly identifies breast cancer 8% of the time, 79 women will have a false positive result (8% of 990, rounded down).

Of the 88 women whose tests came back positive, 79 are false positives!

So if you've got a positive reading, the likelihood that you really have breast cancer is only ~10% (9 / (9 + 79) × 100 ≈ 10.2%).
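The counting argument above can be sketched directly, using the same hypothetical cohort of 1000 women and the rates stated in the article:

```python
# The same calculation as the walkthrough above, done by counting
# a hypothetical cohort of 1000 women.
cohort = 1000
with_cancer = int(cohort * 0.01)        # 10 women have breast cancer
without_cancer = cohort - with_cancer   # 990 women do not

true_positives = with_cancer * 0.90      # 9 women correctly detected
false_negatives = with_cancer * 0.10     # 1 woman missed
false_positives = without_cancer * 0.08  # 79.2, i.e. ~79 women wrongly flagged

all_positives = true_positives + false_positives  # 88.2 positive results
share_real = true_positives / all_positives

print(f"Chance a positive result is real: {share_real:.1%}")
# → Chance a positive result is real: 10.2%
```

Counting a concrete cohort like this is often easier to follow than manipulating conditional probabilities, and it arrives at the same answer.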

