Thursday, September 12, 2013
Imagine you are a guest on the classic game show Let's Make a Deal, and the host, Monty Hall, presents you with the following offer: In front of you are three doors, one of them concealing a car, the other two concealing goats (!). All you have to do to win the car is pick the right door. Once you make your choice, Monty opens one of the two remaining doors, revealing a goat. You are then asked if you want to switch to the last remaining door or stick with your original choice.
So, assuming you really do want the car, should you go for the last door, should you stick with your first choice, or doesn't it matter? If you are like most people, myself included, your intuitive first reaction is that it doesn't matter, since there are only two doors and the car is behind one of them. That means the odds are fifty-fifty either way, right?
But wait a minute! Before you made your choice you knew that no matter which door you picked, there was a 2/3 chance it concealed a goat. If you picked a goat, as you probably did, then Monty just eliminated the other goat, leaving only the door concealing the car. Thus, by switching to the last door you increase your chances of winning the car from 1/3 to 2/3. Let's say the car is behind door 3. If you choose to stick with your original choice, there are 3 possibilities:
- You pick door 1 and Monty opens door 2. You stick with door 1 and win a goat.
- You pick door 2 and Monty opens door 1. You stick with door 2 and win a goat.
- You pick door 3 and Monty opens door 1 or 2. You stick with door 3 and win the car.
Now let's look at what happens to the odds if you accept the offer to change your choice. Again there are 3 possibilities:
- You pick door 1 and Monty opens door 2. You switch to door 3 and win the car.
- You pick door 2 and Monty opens door 1. You switch to door 3 and win the car.
- You pick door 3 and Monty opens door 1 or 2. You switch and win one of the goats.
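If you don't trust the case-by-case argument, the two strategies can be compared with a quick Monte Carlo simulation. This is just a sketch in Python (the function name and trial count are my own choices, not anything from the show):

```python
import random

def play(switch: bool, trials: int = 100_000) -> float:
    """Simulate many rounds of the Monty Hall game and return the win rate."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)      # door hiding the car
        choice = random.randrange(3)   # contestant's first pick
        # Monty opens a door that is neither the contestant's pick nor the car.
        # (When the pick IS the car, which of the two goat doors he opens
        # makes no difference to the win rate.)
        opened = next(d for d in range(3) if d != choice and d != car)
        if switch:
            # Switch to the one remaining closed door.
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == car)
    return wins / trials

print(play(switch=False))  # close to 1/3
print(play(switch=True))   # close to 2/3
```

Running this, the sticking strategy hovers around 0.33 and the switching strategy around 0.67, matching the enumeration above.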
Most people's initial response to the Monty Hall problem (i.e. that the odds are fifty-fifty either way) is an ideal example of intuitive thinking. An answer comes to mind almost immediately, it doesn't take conscious effort or reasoning, and the answer seems obviously right at first sight. As we have just seen, however, the process is far from infallible, and, as we shall see later, when it goes wrong, it tends to do so in systematic ways.
The kind of thinking that helped us arrive at the correct answer is called analytical thinking and has (among other things) the following characteristics: It's comparatively slow, it takes conscious effort/reasoning, and the answer does not seem obvious at first sight. The process also tends to produce more reliable conclusions, but not invariably. It is also worth noting that the process is far too slow and inefficient to be our standard mode of thinking in everyday life.
Intuitive and analytical thinking are sometimes attributed to separate cognitive systems called system 1 and system 2 respectively. As long as we keep in mind that these are abstractions and don't refer to distinct parts of the brain, these are both useful shorthands. There's a lot more to it, but this will do for now. On this blog I will frequently return to the difference between what seems intuitively right to our system 1 and what we can rationally infer using our system 2.
Sunday, August 4, 2013
We're living in an age when information is more accessible than ever. On the other hand, it's becoming increasingly difficult to separate good from bad - reliable from unreliable - information. The mental tools needed to sort through this tsunami of conflicting claims don't come automatically with more information. The same technologies that have made it easier than ever to spread good information have also made it easier than ever to spread bad information. If we want to distinguish science from pseudoscience, facts from opinions, reality from superstition, we cannot trust others to do it for us. Unless we, ourselves, have the ability to critically evaluate information, there's a very good chance that those who shout loudest get heard.
The entries in this series are largely based on material from my old blog in Norwegian. In it we will explore the psychology of belief and self-deception and the various ways in which our own intuitions work against us when we try to arrive at accurate descriptions of reality. We will also learn to recognize the most common logical fallacies while making an effort to avoid the phenomenon of "fallacy-naming" as a substitute for engaging with the actual substance of an argument. In the section labeled Meta-skepticism* we will study some of the pitfalls associated with critical thinking itself and how to avoid turning the tools of skepticism into just another means of rationalization. In the Science vs. pseudoscience category we will explore some of the common features of pseudoscience and how they differ from those of real science. I will occasionally also explore topics related to the scientific method(s) in general, as well as questions of epistemology, from the perspective of an interested layperson. Finally, in the category What's the harm? I will explore why the commonly heard advice to just mind our own business and "let people believe whatever they want" is not as good as it sounds.
* Due to some of the unfortunate "baggage" that has come to be associated with the word "skepticism" (or, perhaps more accurately, "movement skepticism"), I generally prefer to talk about "critical thinking"; however, I will occasionally use both expressions as if they were synonymous.