This post is inspired by thoughts from two actual researchers into the psychology of belief: Stuart Vyse, author of the engaging book Believing in Magic: The Psychology of Superstition, and Bruce Hood, who gave a talk at this year's Edinburgh Science Festival. I think everyone should learn something about the psychology of belief, as an inoculant against falling for common pitfalls in thinking. But that's a topic for another post.
I recently had a disquieting train of thought about the difference between superstitious and rational beliefs.
At their root, superstitions are simply over-eager assignments of cause and effect.
A famous baseball player (says Vyse) once had a great game after eating chicken. Thereafter, he always ate chicken before a game, in case not eating chicken caused him to have a bad game. This behaviour survived (one infers) countless games where he ate chicken but didn't win.
Then there are the pigeon experiments. Put a pigeon in a room, and it'll do its pigeon thing. Then start to introduce rewards (food pellets, for example) at random intervals, unaffected by the pigeon's behaviour. The pigeon will begin to obsessively repeat a single behaviour (say, turning left) that it happened to be doing when the first pellet appeared. Apparently, it thinks that doing that causes the reward to appear, even though pellets don't appear every time it performs the behaviour.
It seems that superstitious behaviour - generalizing cause and effect from a single accidental coincidence - is a natural consequence of the mental makeup of widely-different vertebrate species. And at its root is the slightly over-productive instinct that, if A is followed by B, then A caused B. It's a useful heuristic in evolutionary terms, but it's also a classic post-hoc logical fallacy.
On the other hand, in science, we try to distinguish real patterns from coincidences in a reliable way. We run trials. A team of scientists suspects that eating chicken causes winning, so they set out to test it. They run an experiment. In one part, athletes eat chicken before the game 500 times. In the other part, athletes eat something else before the game 500 times. The team can then see whether meals with chicken are, in fact, followed by wins (compared with how often chicken-free meals are followed by wins).
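The trial described above can be sketched as a small simulation. Everything here is a placeholder assumption, not data from any real study: I assume 500 games per condition and a coin-flip win rate that is the same whether or not chicken is eaten, so the two proportions should come out close, which is exactly what would undermine the superstition.

```python
import random

def compare_win_rates(chicken_outcomes, other_outcomes):
    """Return the win proportion under each condition and their difference.

    Each outcome list holds booleans (or 0/1): did the athlete win that game?
    """
    p_chicken = sum(chicken_outcomes) / len(chicken_outcomes)
    p_other = sum(other_outcomes) / len(other_outcomes)
    return p_chicken, p_other, p_chicken - p_other

# Hypothetical trial: 500 games per condition. The true win probability
# is 0.5 in both arms -- i.e. chicken has no effect on winning.
rng = random.Random(42)  # fixed seed so the sketch is repeatable
chicken = [rng.random() < 0.5 for _ in range(500)]
other = [rng.random() < 0.5 for _ in range(500)]

p_c, p_o, diff = compare_win_rates(chicken, other)
print(f"chicken: {p_c:.2f}, other: {p_o:.2f}, difference: {diff:+.2f}")
```

With enough games in each arm, a persistent gap between the two proportions would be evidence of a real effect, while a small gap is just the noise the pigeon and the athlete mistake for causation.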
But science is a lot of work, and everyday life is filled with situations where we need to decide causation. We can't always get rigorous evidence about whether chicken is reliably followed by winning.
So at what point does the not-quite-scientific belief turn into a superstitious belief? The pigeon or the athlete infers causation on the basis of a single event: that is superstition. The researcher infers causation on the basis of a thousand-subject scientific trial: that is not superstition. Where exactly is the crossover between science and superstition?
The troubling fact is, of course, that there is no crossover. There is no point at which superstition suddenly becomes science.
I probably have several beliefs about causation that lie much closer to the superstition end of that continuum than to the science end. They range from beliefs about people (first impressions - this person is shallow, that person is reliable) to ideas about how to get a computer to perform a rare task (like getting the headphones to work on my Linux machine at school).
This is disquieting indeed, since science and scientific thinking form a central part of my ethical outlook on life. Am I actually just as guilty as my superstitious ancestors of basing my beliefs on bad evidence?
Not completely. There is some reassurance to be had. It has to do with certainty, and I get to quote a Scottish philosopher at the end.
Facts that have been studied and established with scientific experiments deserve a high degree of certainty. (Never absolute certainty - that is reserved for instincts and delusions.) Facts that come from extensive experience but haven't been rigorously tested scientifically deserve confidence, but less than scientifically-tested facts. And so on, down to, say, me clicking a box in the control panel of my computer and finding that my headphones work again. It's only happened once, and I'm not prepared to undo it to see if it works again, but I will maintain the belief that, should the problem present itself again, the same solution will work again. This is a very tentative belief - it would take very little evidence to persuade me that it is false.
Is it still a belief then, if it's so tenuous? Of course it is. It will guide my actions should my headphones stop working again.
So here is the conclusion to this troubling line of thought: No matter who you are, you almost certainly hold some beliefs that are superstitious, in that they are based on the same sort of evidence as the athlete's chicken-equals-winning belief. This is unavoidable - not just biologically but practically in our busy lives.
But it's not a bad thing, so long as you (in Hume's words) "proportion [your] belief to the evidence."
Those people who suggest that you cannot live entirely on sound, empirical scientific beliefs are right. It would just be too much work.
Those who suggest that you have to commit yourself wholeheartedly to unsupported claims are wrong. There is nothing wrong with admitting you don't know. It doesn't preclude acting decisively, and it is a good inoculant against unjustified dogmatism and fundamentalism.
[edit 2007 November 1]
Just a day after writing the above, I got to the section in Richard Dawkins' book, Unweaving the Rainbow, where he discusses just this. He also reminded me of the source of the pigeon experiments - they were performed by the great psychologist B. F. Skinner. The boxes in which the pigeons underwent these experiments are called Skinner boxes.