Every so often I wander into a dark corner of the internet and find myself surrounded by creatures who speak to one another at length in a language I can hardly recognize. Except that’s not right–libertarians, skeptics, and transhumanists all speak perfectly fine English; I just can’t quite understand what would lead them to say the things they say, or to care about the things they care about. At least not without some effort. Though there was a time when I, too, spoke a dialect of nerd, and I can still sympathize with those who speak the lingo of open source, speculative fiction, and mathematics.
I found myself in such a dark wood this weekend; specifically, it was the dark wood of LessWrong, “a community blog devoted to refining the art of human rationality.” It’s essentially a bunch of traditional rationalists who have come to the realization that human minds are really terrible at being rational, and who are trying to develop strategies 1) to cope with that fact and 2) to bring on the Singularity in the form of Friendly AI. Their cult leader is a fellow named Eliezer Yudkowsky, who has a number of interesting, ultimately wrong thoughts about how to define intelligence. Strangely, he also writes a Harry Potter fanfiction in which Harry is raised to worship the scientific method. Surprisingly, it’s worth reading–in fact, the world-building is better than in J.K. Rowling’s version. I suspect Eliezer intended to write Harry as a Mary Sue, but somewhat missed his target due to his philosophy being in fact rather tragically pathetic to anyone with a decent sense of the postlapsarian condition.
G.K. Chesterton wrote that “Every heresy is a truth taught out of proportion.” Eliezer Yudkowsky overstates the importance of Bayes’ Theorem, but he’s right that it’s quite important, and he’s right that it’s difficult to make intuitive. I encourage everyone, even (especially?) those not mathematically inclined, to read his introduction to Bayesian probability theory. Otherwise, you will answer the following question wrongly. You won’t be alone–“studies show” 85% of people answer wrongly–but you’ll still be wrong.
Here’s a story problem about a situation that doctors often encounter:
1% of women at age forty who participate in routine screening have breast cancer. 80% of women with breast cancer will get positive mammographies. 9.6% of women without breast cancer will also get positive mammographies. A woman in this age group had a positive mammography in a routine screening. What is the probability that she actually has breast cancer?
Most people say a number between seventy and eighty percent, that is, three out of every four, but the correct answer is more like one out of every thirteen. If this surprises you, you will misinterpret the results of almost every scientific study you hear about.
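If you want to see where one-in-thirteen comes from, here is a minimal sketch of Bayes’ Theorem applied to the story problem above, using only the three numbers it gives (the variable names are my own, chosen for readability):

```python
# Bayes' theorem: P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)

p_cancer = 0.01              # prior: 1% of women in this group have breast cancer
p_pos_given_cancer = 0.80    # 80% of women with cancer get a positive mammography
p_pos_given_healthy = 0.096  # 9.6% of women without cancer also test positive

# Total probability of a positive mammography, cancer or not
p_positive = (p_pos_given_cancer * p_cancer
              + p_pos_given_healthy * (1 - p_cancer))

p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_positive

print(f"{p_cancer_given_pos:.3f}")  # prints 0.078 -- about 1 in 13
```

The intuition the arithmetic encodes: because cancer is rare to begin with, the false positives from the 99% of healthy women swamp the true positives from the 1% who are sick, so a positive result raises the probability only from 1% to about 7.8%.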