In a useful July 23, 2008 commentary on his blog, “The Brain Who Mistook a Joke for a Fact,” Princeton biology professor Sam Wang offers a very interesting take on the infamous New Yorker cover. First he notes just how dumb the various New Yorker editors are about how the human brain deals with information:
. . . the editor of The New Yorker, David Remnick, has pointed out that the magazine’s liberal leanings are well-known. He wonders: can’t people take a joke? The short answer is that we would, but our brains won’t let us. After our brains store a fact, the information does not rest. Instead, as a piece of information is recalled, it may be “written” down again as part of the process of strengthening it. Along the way, the fact is gradually separated from the context in which it was originally learned.
Of course, the New Yorker folks likely did not check with researchers or with the Black community before running the so-called satire. Wang then adds:
Most of the time this trick is useful. . . . But the same trick can lead people to forget whether a statement is even true. A false statement from a noncredible source that is at first not believed can gain credibility during the months it takes to reprocess memories from short-term to longer-term storage. As the source is forgotten, the message and its implications gain strength. So any intended satire in the magazine cover may eventually be forgotten, leaving people to recall vaguely that Barack Obama is somehow un-American.
False memories develop especially easily in situations like the New Yorker cover:
Daniel Gilbert and his colleagues have shown that if people aren’t given enough time to think, they tend to automatically accept a statement as being true. Visual information is processed particularly rapidly. And what’s more immediate than a caricature?
The emotional dimension of a presentation is also very important:
. . . ideas can spread by emotional selection, rather than by their factual merits, encouraging the persistence of falsehoods. Indeed, unscrupulous campaign strategists know that if their message is initially memorable, its impression will persist long after it is debunked.
Lesson one about framing is that repeating a false frame, even in the course of countering it, only reinforces it:
In covering the controversy . . . virtually every major TV journalist repeated the stereotyped charges against the candidate . . . before noting that the beliefs were false. . . . In television, which above all else is a visual medium, image can easily trump verbal content.
The old cliché works here: A picture is worth a thousand words. How, then, do we handle this type of negative presentation? Wang suggests this:
If journalists are to avoid adding to the public’s misinformation, they need to find other strategies, such as offering an equally competing, true storyline. . . . rather than repeating the false belief then denying that Obama is a Muslim, a less misleading approach would be to report on the candidate’s discovery of Christianity after a secular youth.
Offer the true storyline! Frame analysis would put this a bit more clearly: You do not accept the framing of your opponent’s negative story or other false presentation, and you do not repeat words and images from that inaccurate or negative frame; instead, you reframe the issue from another, accurate frame.