Saw this as a comment here in relation to false positives.
Bayesian conditional probability is being completely ignored. Let’s say a COVID-19 test is 90% accurate and 1 out of 100 people have COVID-19. You test positive for COVID-19; what’s the chance the test is correct? 100 − 1 = 99 people don’t have COVID-19, but 99 × 10% ≈ 10 will falsely test positive. The probability that you actually have COVID-19 is only 1/(1+10) ≈ 9%. ~90% of positive cases/deaths are FALSE!
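The arithmetic holds up, give or take rounding. Here’s a minimal sketch of it in Python, assuming “90% accurate” means both 90% sensitivity (positives among people who are sick) and 90% specificity (negatives among people who aren’t), which is how the comment treats it:

```python
# Bayes' rule applied to the comment's numbers.
prevalence = 0.01      # 1 in 100 people actually have COVID-19
sensitivity = 0.90     # assumed: P(test positive | have COVID)
specificity = 0.90     # assumed: P(test negative | don't have COVID)

false_positive_rate = 1 - specificity

# Total probability that a randomly chosen person tests positive
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive_rate

# Posterior: probability of actually having COVID given a positive test
p_covid_given_positive = prevalence * sensitivity / p_positive

print(f"P(have COVID | positive test) = {p_covid_given_positive:.1%}")
# Prints 8.3% -- close to the ~9% above, which rounds 9.9 false positives
# up to 10 and counts the one real case as a certain positive.
```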
This led to a comment that did much to clarify the point.
Seems highly counterintuitive.
But consider:
(a) if 100 people get tested, and only 1 has COVID, approx 10 or 11 will still test positive, with 9 or 10 positives being false;
(b) so if you’re one of the 10 or 11 positives, then (even ignoring for the present that all the positive tests might be false and the real COVID sufferer a false negative) there’s only a 9 or 10% chance you have COVID;
(c) so if “90% accurate” and “1 in 100 has it” are real stats, then the reasoning seems prima facie sound.
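If the counting in (a)–(c) still feels suspicious, a quick brute-force simulation tells the same story. This is only a sketch, again assuming 90% sensitivity and a 10% false-positive rate:

```python
import random

# Monte Carlo check of (a)-(c): simulate a large number of people being
# tested and look at what fraction of positive results are genuine.
random.seed(0)
n_people = 1_000_000
true_positives = 0
total_positives = 0

for _ in range(n_people):
    sick = random.random() < 0.01              # 1-in-100 prevalence
    if sick:
        positive = random.random() < 0.90      # 90% sensitivity (assumed)
    else:
        positive = random.random() < 0.10      # 10% false-positive rate (assumed)
    if positive:
        total_positives += 1
        if sick:
            true_positives += 1

print(f"Share of positives that are genuine: {true_positives / total_positives:.1%}")
# Comes out near 8%, matching the "only a 9 or 10% chance" in (b).
```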
Which then led to this comment.
Conditional probability is a thing. I watched the video and worked it out myself at the point where he said “pause the video and try to work it out”. I didn’t look up the formula either … did it from first principles.
This was the video.
However, even more to the point is this discussion of Bayes’ Theorem. Bayesian reasoning was debated for some 250 years and has only relatively recently become widely accepted as a legitimate means of assessing probability. If you get it, you get it; it is not all that intuitive at first, but when you really think it through, it should become obvious.
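For completeness, here is Bayes’ Theorem written out for this example, taking “+” as a positive test result and using the same assumed 90%/10% figures as above:

\[
P(\text{COVID} \mid +)
= \frac{P(+ \mid \text{COVID})\,P(\text{COVID})}
       {P(+ \mid \text{COVID})\,P(\text{COVID}) + P(+ \mid \text{no COVID})\,P(\text{no COVID})}
= \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.1 \times 0.99}
\approx 0.083
\]

That is the same ~8–9% figure as before: the prior (1 in 100) is so small that even a fairly accurate test cannot overcome it on a single positive result.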