I've just come across a very thoughtful post from Zac Astin, from the Southern Cross Bioethics Institute in Adelaide, Australia. It's a reaction to the "argumentative theory of reason" that I discussed here before going on vacation. As a philosopher, Astin finds the theory believable, but also troubling, as it makes him out to be (in his words) an "evolutionary anomaly." But what's most valuable in the post is his pointing out why we're lucky to have such "anomalies":
In my experience, people who have studied philosophy or related disciplines are often bewildered by the confidence with which others take hold of a belief and assert it as truth; or alternatively, we are bemused by the hidden logic with which people “argue to win” or shift and change their positions so as to score points without any apparent concern for the truth. On the contrary, a philosopher should – to borrow a phrase – work out their knowledge in fear and trembling.
Exactly. I've been meaning to say something about how, although we're all susceptible to "motivated reasoning," there are types of training that may make us less susceptible to at least some of the worst reasoning biases and errors. In journalism, for instance, you learn how to fact-check--or at least, you're supposed to learn it. And believe you me, if fact-checking exposes a falsehood, a good journalist will get it the hell out of the story.

The interesting question is why. One might argue that journalism imparts a "habit of truth," but I think just as important is the awareness that if you get something important factually wrong, there may be consequences. You may be called out for it, having to answer not only to the public but to your editors. Similarly, in philosophy, there are intellectual standards, enforced from the outside by one's peers. This is what leads you to construct your arguments painstakingly: you know who is watching.

A professional training, and the experiences that go along with it, can thus serve as a check on some reasoning biases--but there is no reason to think this is a complete defense. In an environment in which a journalist does not face dire consequences for getting it wrong--or in which that journalist can dismiss critics of his facts as merely "biased"--journalistic fact-checking may seriously suffer. Relatedly, in an environment in which media "balance" is the norm, your peers don't necessarily expect you to check facts so much as quote two different sides about what they are. Similarly, in an environment in which philosophers or thinkers are all supposed to adhere to some core views (at a Christian college, say, or an ideological think tank), you won't expect all ideas to be subjected to the same degree of withering doubt.
And in an environment in which "scientists" (or, before that, "natural philosophers") all believe the same thing--the geocentric theory, for instance--it will be very hard for someone to come along with new "anomalous" facts and initiate a paradigm shift.

Clearly, then, the environment, and one's intellectual peers, matter. This is why biased media channels and blog echo chambers can be so problematic: if the prevailing norms suggest it's okay to, say, swear and engage in personal attacks, you're setting up a situation in which traditional journalistic values (in this case, decorum) will quickly erode. Nobody's perfect when it comes to checking one's biases, then. But there's still such a thing as better and worse.