David Baxter PhD

Late Founder
We Only Trust Experts If They Agree With Us
by Christie Nicholson, Scientific American
September 18, 2010

We only consider scientists to be experts when their argument is in line with our own previously held beliefs.

We think we trust experts. But a new study finds that what really influences our opinions, more than listening to any expert, is our own beliefs.

Researchers told study subjects about a scientific expert who accepted climate change as real. Subjects who thought that commerce can be environmentally damaging were ready to accept the scientist as an expert. But those who came into the study believing that economic activity could not hurt the environment were 70 percent less likely to accept that the scientist really was an expert.

Then the researchers flipped the situation. They told different subjects that the same hypothetical scientist, with the same accreditation, was skeptical of climate change. Now those who thought that economic activity cannot harm the environment accepted the expert, and the other group was 50 percent less likely to believe in his expertise. The study was published in the Journal of Risk Research.

The investigators found similar results for various other issues, from nuclear waste disposal to gun control. Said one of the authors, "People tend to keep a biased score of what experts believe, counting a scientist as an 'expert' only when that scientist agrees with the position they find culturally congenial."
 

David Baxter PhD

Late Founder
Science often disbelieved, study finds
by Karen Franklin, PhD
September 19, 2010

How many times have you found yourself in court, having to defend basic information that is virtually undisputed and noncontroversial among scientists? As it turns out, no matter how knowledgeable you are, or how great your credentials, judges or jurors may disbelieve the scientific evidence you are presenting if it does not match their social values.

That's no big surprise, given decades of social psychology research into cognitive dissonance. But a study funded by the National Science Foundation and scheduled for publication in the Journal of Risk Research sheds new light on why "scientific consensus" fails to persuade.

Study participants were much more likely to see a scientist with elite credentials as an "expert" on such culturally contested issues as global warming, gun control, and the risks of nuclear waste disposal if the expert's position matched the participant's own political leanings.

"These are all matters on which the National Academy of Sciences has issued 'expert consensus' reports," said lead author Dan Kahan, a law professor at Yale University. "Using the reports as a benchmark, no cultural group in our study was more likely than any other to be 'getting it right,' i.e., correctly identifying scientific consensus on these issues. They were all just as likely to report that 'most' scientists favor the position rejected by the National Academy of Sciences expert consensus report if the report reached a conclusion contrary to their own cultural predispositions."

The findings suggest that education alone will not increase people's willingness to accept scientific consensus as accurate, said co-author Donald Braman, a law professor at George Washington University. "To make sure people form unbiased perceptions of what scientists are discovering, it is necessary to use communication strategies that reduce the likelihood that citizens of diverse values will find scientific findings threatening to their cultural commitments."

Unfortunately, trends in public consumption of news may make this task increasingly difficult. Although people are spending at least as much time as ever on the news, they are less likely to read the daily newspaper and more likely to get their information from television and online sources including, most recently, their telephones, according to an informative new survey by the Pew Research Center for the People and the Press. This decreases our common knowledge base and makes it easier for ideologically slanted information sources to influence public opinion.

Indeed, the Pew researchers found ideology inextricably linked with people's choices of news sources. For example, here in the United States, Republicans, conservatives, and so-called "Tea Party" enthusiasts were much more likely than the general public to watch Fox News and listen to Rush Limbaugh. In contrast, the researchers found, supporters of gay rights make up large shares of regular readers of the New York Times and listeners to National Public Radio.

In an interesting analysis of the mainstreaming of extremism, alternative journalist Arun Gupta points out the ease with which political pundits for whom facts are irrelevant can indoctrinate the uninformed. A respondent committed to rational scientific inquiry becomes like a dog chasing its tail: In the time it takes to deconstruct one fraudulent news story, the pundits have concocted five more.

Top myths of popular psychology

For a great myth-busting tool, I recommend Scott Lilienfeld's latest, 50 Great Myths of Popular Psychology. Lilienfeld and co-authors Steven Jay Lynn, John Ruscio, and the late Barry Beyerstein provide dozens of examples of entrenched popular beliefs that have been debunked by high-quality research, many relevant to forensic practice. A few examples:
  • Human memory works like a tape recorder or video camera, and accurately records the events we have experienced
  • Abstinence is the only effective treatment for problem drinking
  • Criminal profiling helps solve crimes
Given the public's increasingly atomized sources of information, it behooves us to be knowledgeable about both ideological influences and common myths. What an expert witness might naively regard as established science may, after all, be subject to disbelief.

