Rat Choice, Social Science and Random Linkage
I've spent the last couple of days reading a Critical Review symposium on Green and Shapiro's Pathologies of Rational Choice. (For anyone interested, and with access to the SSL, it's JC 585.CRI.) Some of it is way over my head, but much of it makes me wish I'd learned more from our Philosophy of Social Science classes - particularly when Kenneth Shepsle cited similar classes he'd taken as a grad student.
It's also amazing how many day-to-day things this ties into. The other day I'd just read about how few people act 'rationally' unless 'trained' to do so (e.g. by an economics course), and how consequently economists are often more selfish. I've heard similar claims before, and indeed I think my own three years studying economics (A level and Prelims) have affected my attitudes and behaviour. I also remember Rob telling me how in the 'Game Theory and Negotiation' class that was also an (optional) part of our research training, the group, comprising mainly political theorists, reached different bargaining equilibria governed by norms of fairness rather than pure self-interest. Anyway, Milan brings up something related here.
Now I've just read a post by Scott Adams on the confirmation bias - one of the pathologies with which Green and Shapiro charge rat choicers. His characterisation is rather crude, but makes the point.
More generously, confirmation bias is simply the tendency to seek out information that supports one's pet hypothesis, and to give such evidence greater weight, while dismissing whatever doesn't fit as 'flawed' or 'irrelevant', often on spurious grounds.
The fact is, though, that often both sides are guilty. Take Creationists vs Evolutionists. The Evolutionists point to fossils, and ask the Creationists to explain. Creationists can just tweak their theory (e.g. 'fossils are put there to test our faith'). This makes Creationism sound rather dodgy to many (non-Creationists), but it is consistent even if gerrymandered. On the other hand, if the real issue is belief in God, (moderate) Creationism is compatible with evolution, whereas secular Evolutionists can't explain what accounts for the start of the process, or the process itself. After surveying a good body of evidence, most people will tend to interpret it to support whatever they were already inclined to believe. This could simply be (in part) because the evidence is inconclusive.
What I find really puzzling is how this confirmation bias can be empirically demonstrated. Surely researchers must go out looking for examples of confirmation bias, but if their theory is true you'd think they might be guilty of it themselves (e.g. attaching too much significance to a few supporting cases). Or is the beauty of the theory that it thereby guarantees its own truth?
One of the existing comments on his piece also made me think:
1) Logic is stupid because you can make a valid (but not sound) argument for almost anything.
True, the paradox of entailment means that from 2+2=5 (as premise) one can validly conclude that the Pope is a Muslim, that the Genesis story is literal truth, and that the moon is made of cheese. I remember we all found this quite disturbing in first-year logic. What our tutor could probably have explained better is that validity is not all that matters in philosophical argument: a good argument must be sound - that is, valid and starting from true (or at least plausible) premises. One criticism often made of Nozick's Anarchy, State & Utopia is that he starts from such strongly libertarian premises that his conclusions are hardly surprising. Returning to the general theme of Green and Shapiro, however, the fact that some arguments are bad doesn't invalidate the ones that are good.
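(Incidentally, the paradox - also known as the principle of explosion, or ex falso quodlibet - is rigorous enough that a proof assistant will accept it. A minimal sketch in Lean 4, purely for illustration: given the false premise that 2 + 2 = 5, any proposition P whatsoever follows, which is exactly what makes the argument valid but unsound.)

```lean
-- Ex falso quodlibet: from a false arithmetic premise,
-- any proposition P can be validly derived.
example (P : Prop) (h : 2 + 2 = 5) : P :=
  -- `by decide` computes that 2 + 2 = 5 is false, i.e. proves ¬(2 + 2 = 5);
  -- `absurd` then derives P from the contradiction between h and that refutation.
  absurd h (by decide)
```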
2) In the real world the guy on the block with the biggest gun and the will to use it always wins. History, not logic, has proved this point time and time again.
This takes a rather cynical view of what it is to win an argument. Those of us in philosophy and/or social science would generally like to think we aim at the truth. (Of course, there are some who think it fashionable to deny that there is any such thing as truth - though it's not clear whether statements like 'there are no truths' are even meaningful, or simply self-refuting.) Someone with a big gun may oppress everyone into agreeing with him, but that doesn't make him right. Might is certainly not an epistemic reason for belief. Suppose the Church threatened anyone who didn't believe the Sun revolved around the Earth. Even if it could command universal assent, that would have no bearing on astronomical reality.