Sun Nov 18 23:00:34 EST 2012

Misinformation and Its Correction

The following article concerning the widespread prevalence and persistence of misinformation was published in the SAGE journal Psychological Science in the Public Interest, December 2012, vol. 13, no. 3, pp. 106-131:

        Misinformation and Its Correction: Continued Influence and Successful Debiasing
        Stephan Lewandowsky,
        Ullrich K. H. Ecker,
        Colleen M. Seifert,
        Norbert Schwarz and
        John Cook

It's a bit lengthy but is worth reading in its entirety. To whet your appetite, these are the sub-headings:
    • The Societal Cost of Misinformation
    • Origins of Misinformation
    • From Individual Cognition to Debiasing Strategies
    • Assessing the Truth of a Statement: Recipients' Strategies
    • The Continued Influence Effect: Retractions Fail to Eliminate the Influence of Misinformation
    • Reducing the Impact of Misinformation
    • Corrections in the Face of Existing Belief Systems: Worldview and Skepticism
    • Debiasing in an Open Society
    • Future Directions
    • Concluding Remarks: Psychosocial, Ethical, and Practical Implications

After reading it, I emailed the first author the following comment:
Although what you wrote makes complete sense to me, I think an important consideration is missing. Namely, how the human brain functions depends on both "software" and "hardware", and it seems to me you consider only the former. To clarify what I mean, consider two extreme examples.

(1) Optical Illusions. I'm sure you are aware of the numerous examples where what you see does not match what is really there. Even after you make the measurement that proves you are seeing incorrectly, you still cannot "see the truth".

(2) Mental Illness. Someone with a severe intellectual disability may not be able to reason well enough to know what is true or false, and someone with schizophrenia often cannot tell what is real from what is not.

In both cases, how the brain functions prevents the person from knowing what is true, and (at least usually) no amount of reasoning will fix the perceptual problem. My hypothesis is that although these are extreme cases, there is a wide range of conditions, of varying degrees of impairment, that limit the brain's ability to perceive reality and know what is true. Not unlike a computer bug, the problem may be in software or hardware, and only sometimes can software compensate for a hardware problem.

Over the years I have thought about these issues in areas like economics, politics, and religion, where there are fewer facts, but the cognitive issues you describe seem to be the same as for misinformation. For example, after a religious cult's prediction about the end of the world fails to come true, "cognitive dissonance" causes members to become even more religious, and this seems similar to what you call the "backfire" effect.

I am less optimistic than you that any of these problems can be solved. Are you familiar with:
        Why Do Humans Reason? Arguments for an Argumentative Theory
        Hugo Mercier and Dan Sperber
        http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1698090
        http://edge.org/conversation/the-argumentative-theory

The claim is made that human reasoning evolved to "devise and evaluate arguments intended to persuade" and not to discover the truth. If that is the case, then it may be very hard to change how evolution has "wired" us.

The only response I received was:
Owing to the volume of traffic I apologize in advance if I cannot issue a personalized response to your message.

Posted by mjm | Permanent link