Thursday, May 5, 2011

Why So Many People Choose Not to Believe What Scientists Say

A friend of mine has long held that a vaccination his son received as an infant triggered his child’s autism. He clings to this belief despite a string of scientific studies that show no link between autism and vaccines.

When the original paper claiming such a link was recently discredited as a fraud, my friend's reaction was that it would now be more difficult to persuade people of the dangers of vaccination. He is not alone: nearly half of all Americans believe in the vaccine-autism link or are unsure about it.

The paradox goes deeper. My friend insists that he trusts scientists—and again, in this respect, he is like most Americans.

In a 2008 survey by the National Science Foundation, more respondents expressed “a great deal” of confidence in science leaders than in leaders of any other institution except the military.

On public policy issues, Americans believe that science leaders are more knowledgeable and impartial than leaders in other sectors of society, such as business or government.

Why do people say that they trust scientists in general but part company with them on specific issues? Many people blame the poor quality of science education in the U.S. If kids got more science in school, the thinking goes, they would learn to appreciate scientific opinion on vaccines, climate, evolution and other policy issues.

But this is a misconception. Those who know more science have only a slightly greater propensity to trust scientists. The science behind many policy issues is highly specialized, and evaluating it requires deep knowledge—deeper than students can get in elementary and high school science classes.

A more direct approach would be to educate people about why they are prone to accept inaccurate beliefs in the first place.