Great article in the latest Scientific American by Daniel T. Willingham, a professor of psychology at the University of Virginia and the author of “Why Don’t Students Like School?”. His article discusses why so many people choose not to believe what scientists say, and so is of direct interest to skeptics.
A friend of mine has long held that a vaccination his son received as an infant triggered his child’s autism. He clings to this belief despite a string of scientific studies that show no link between autism and vaccines. When the original paper on such a link was recently discredited as a fraud, my friend’s reaction was that it will now be more difficult to persuade people of the dangers of vaccination. He is not alone: nearly half of all Americans believe in the vaccine-autism link or are unsure about it.
The paradox goes deeper. My friend insists that he trusts scientists—and again, in this respect, he is like most Americans. In a 2008 survey by the National Science Foundation, more respondents expressed “a great deal” of confidence in science leaders than in leaders of any other institution except the military. On public policy issues, Americans believe that science leaders are more knowledgeable and impartial than leaders in other sectors of society, such as business or government. Why do people say that they trust scientists in general but part company with them on specific issues?
Many individuals blame the poor quality of science education in the U.S. If kids got more science in school, the thinking goes, they would learn to appreciate scientific opinion on vaccines, climate, evolution and other policy issues. But this is a misconception. Those who know more science have only a slightly greater propensity to trust scientists. The science behind many policy issues is highly specialized, and evaluating it requires deep knowledge—deeper than students are going to get in elementary and high school science classes. A more direct approach would be to educate people about why they are prone to accept inaccurate beliefs in the first place.
You can read the rest of the article here.
I’d like to suggest that there are additional factors to toss into the mix as well. Communication skills, for one: some scientists are rather prone to spouting reams of techno-babble that means a lot to SMEs (subject-matter experts) and others familiar with the conversation, but tends to baffle everyone else. However, I’m not suggesting that clarity alone would help all that much.
I do indeed ponder why smart people appear to embrace specific ideas and reject others for no logical reason. Willingham appears to understand this when he observes, “In reconciling our rational and irrational motives for belief, we have become good at kidding ourselves,” and he appears to suggest that we can tackle this by imparting an understanding of both the motivations for belief and of science as a method of knowing. But darn it, smart folks still retain a very high degree of irrationality, so I’m not convinced that better education is a solution either.
I may of course be wrong; perhaps better, clearer communication is the answer. What do you think? Why do smart folks embrace some irrational ideas as “truth”, and how do we tackle this?
Personally I have no easy answers, and am still pondering over it all.