Rationalization

Jonah Lehrer (author of How We Decide and Proust Was a Neuroscientist) writes in Wired of possible inherent flaws in the human reasoning process. This is important. The mere act of trying to reason something out may lead to errors like confirmation bias.


Much of it stems from this paper (PDF) by Hugo Mercier and Dan Sperber. The abstract (the emphasis is Lehrer's):



Reasoning is generally seen as a means to improve knowledge and make better decisions. Much evidence, however, shows that reasoning often leads to epistemic distortions and poor decisions. This suggests rethinking the function of reasoning. Our hypothesis is that the function of reasoning is argumentative. It is to devise and evaluate arguments intended to persuade. Reasoning so conceived is adaptive given humans' exceptional dependence on communication and vulnerability to misinformation. A wide range of evidence in the psychology of reasoning and decision making can be reinterpreted and better explained in the light of this hypothesis. Poor performance in standard reasoning tasks is explained by the lack of argumentative context. When the same problems are placed in a proper argumentative setting, people turn out to be skilled arguers. Skilled arguers, however, are not after the truth but after arguments supporting their views. This explains the notorious confirmation bias. This bias is apparent not only when people are actually arguing but also when they are reasoning proactively with the perspective of having to defend their opinions. Reasoning so motivated can distort evaluations and attitudes and allow the persistence of erroneous beliefs. Proactively used reasoning also favors decisions that are easy to justify but not necessarily better. In all of these instances traditionally described as failures or flaws, reasoning does exactly what can be expected of an argumentative device: look for arguments that support a given conclusion, and favor conclusions in support of which arguments can be found.



I'm always trying to fight this instinct in my own thinking, but it's easier said than done. To have no personal attachment to your own ideas, to be purely rational and logical, is like trying to be purely altruistic. When someone disagrees with your idea, it takes intellectual courage to evaluate their argument at face value and reassess your own thoughts.


This probably explains why I distrust people who never change their minds, and why I try to surround myself with people who hold a broad spectrum of viewpoints. That is, I try to build checks and balances into the structure of my environment.


I suspect the reason few people past a certain age come up with revolutionary ideas is that people's ideas inevitably calcify. It's not just that they accumulate real-world empirical data supporting one view or another; it's that falling back on rules of thumb, shortcuts, and past patterns is simply less taxing than evaluating each idea at face value in its current context.


Massive context shifts are brutal.