
What magic teaches us about misinformation


“The things right in front of us are often the hardest to see,” declares Apollo Robbins, the world’s most famous theatrical pickpocket. “The things you look at every day, that you’re blinded to.”

As he says these words, he’s standing on stage at a TED conference in 2013. He invites the audience to close their eyes, then to try to recall what he’s wearing. It’s not easy. We imagine that we would have filed all those details away, after a couple of minutes of looking at him speaking. And indeed we could have done. But we didn’t. When we open our eyes we see he’s wearing a dark waistcoat and jacket, a striped tie and a dark-purple shirt.

Robbins ambles into the audience, finding a volunteer — Joe — and leading him on stage. For the next three minutes, Robbins proceeds to bewilder Joe. He announces that he’s trying to steal Joe’s watch, but then asks Joe to check his pockets. In that instant of distraction, the watch is gone. It reappears a moment later on Robbins’s wrist.

Robbins’s larcenous skills are legendary — he once stole actress Jennifer Garner’s engagement ring, and the badges of Jimmy Carter’s Secret Service bodyguards. Poor Joe didn’t stand a chance.

But it is the final flourish of this talk that is most intriguing. After sending Joe back to the audience, Robbins asks everyone, this time keeping their eyes open, what he is wearing. He has been in plain view of a thousand people the whole time — quite literally in the spotlight. And yet somehow the shirt is now pale and checked, not plain and dark. The tie and waistcoat have gone.

As he says: often the hardest things to see are right in front of us.

It’s difficult for any of us not to be fascinated by Robbins’s skill and particularly by that final act of stagecraft. But for me, after more than a decade dabbling in the field of fact-checking and fighting misinformation, there was an important truth in the disappearance of the waistcoat: we pay less attention than we think.


Why do people — and by “people” I mean “you and me” — accept and spread misinformation? The two obvious explanations are both disheartening.

The first is that we are incapable of telling the difference between truth and lies. In this view, politicians and other opinion-formers are such skilled deceivers that we are helpless, or the issues are so complex that they defy understanding, or we lack basic numeracy and critical-thinking skills.

The second explanation is that we know the difference and we don’t care. In order to stick close to our political tribe, we reach the conclusions we want to reach.

There is truth in both these explanations. But is there a third account of how we think about the claims we see in the news and on social media — an account that, ironically, has received far too little attention? That account centres on attention itself: it suggests that we fail to distinguish truth from lies not because we can’t and not because we won’t, but because — as with Robbins’s waistcoat — we are simply not giving the matter our focus.

What makes the problem worse is our intuitive overconfidence that we will notice what matters, even if we don’t focus closely. If that account is right, the most insidious and underrated problem in our information ecosystem is that we do not give the right kind of attention to the right things at the right time. We are not paying enough attention to what holds our attention.

The art of stage magic allows us to approach this idea from an unusual angle: Gustav Kuhn’s recent book, Experiencing the Impossible, discusses the psychology of magic tricks. “All magic can be explained through misdirection alone,” writes Kuhn, a psychologist who runs the Magic Lab at Goldsmiths, University of London.

Such a strong claim is debatable, but what is beyond debate is that the control and manipulation of attention are central to stage magic. They are also central to understanding misinformation. The Venn diagram of misinformation, misdirection and magic has overlaps with which to conjure.


Consider the following headline, a false claim that circulated online in 2018:

“President Trump Readies Deportation of Melania After Huge Fight At White House”.

It was among 36 headlines shown to a thousand experimental participants in a study conducted by psychologists Gordon Pennycook, Ziv Epstein, Mohsen Mosleh and others, and published recently in the scientific journal Nature. Half of the headlines were true and half false, some favouring narratives from the political right and some from the left.

Some participants were asked which headlines they would consider sharing on social media. Others were asked instead which headlines were accurate, unbiased descriptions of real events.

Recall the two leading explanations of why people spread misinformation: first, that they aren’t capable of distinguishing between truth and lies; second, that for partisan reasons they don’t want to.

In this experiment, most people had no trouble distinguishing truth from lies: false headlines were usually spotted and the true headlines were very likely to be identified as such, even when they clashed with a participant’s political preconceptions.

A committed Democrat might savour the idea that Donald Trump was about to deport his own wife, but nevertheless both Republicans and Democrats had no trouble figuring out that the headline was implausible. When asked to sift truth from lies, participants did just that.

But when asked instead which headlines they would consider sharing, people suddenly seemed blind to the difference between truth and lies: they happily shared the headlines that fit with their political sympathies, with false headlines scarcely penalised relative to the truth.

Does this mean that people knowingly spread false information? No doubt some do, but Pennycook and his colleagues think this is not typical. When the participants were asked what they valued “when deciding whether to share a piece of content on social media”, the most popular answer was overwhelmingly clear. Not surprise, not political alignment, not humour. It was that whatever they shared, it should be true.

A puzzle, then: people share material based on political tribalism rather than truth, despite being able to distinguish truth from lies; yet people say that they value accuracy above all else when deciding whether to share. What explains the apparent contradiction?

“People are lazy,” says Pennycook, a psychologist at the University of Regina. “People don’t engage.”

Is it that simple? We don’t like to think of ourselves as lazy and disengaged. But Pennycook’s research yields clues pointing in that direction.

“If you force people to give intuitive responses, responses that they aren’t permitted to really think that much about, it makes people worse at recognising false content,” explains Pennycook.

A study he published jointly with David Rand of MIT, titled “Lazy, not biased”, found that the ability to pick out fake news headlines from real ones was correlated with performance on a “cognitive reflection test”, which measures people’s tendency to stop and think, suppressing a knee-jerk response.

This suggests that we share fake news not because of malice or ineptitude, but because of impulse and inattention. It is not so different from Robbins’s disappearing waistcoat and tie. Can people spot the difference between a man in formal attire and one with an untucked, open-necked shirt? Of course we can, just as we can spot the difference between real news and fake news. But only if we pay attention.


In their 1999 book Magic in Theory, Peter Lamont and Richard Wiseman explore the links between psychology and magic. Wiseman, a professor of psychology at the University of Hertfordshire, warns against drawing too close a link between stage magic and the everyday misdirection we experience in the media and social media ecosystem. For him, a truly effective stage illusion requires a combination of specialised methods. One cannot simply rely on a broad psychological tendency.

Wiseman is fascinated by “change blindness”, the phenomenon behind our tendency to overlook the disappearance of Robbins’s waistcoat and tie. (Wiseman pioneered his own version of the stunt.) But change blindness only goes so far; not everyone will overlook the change.

“If you are a magician and half the audience notices what you’re up to,” says Wiseman, “then you’re having a bad day.”

Yet if your aim is to get people to remember a political talking point, or to share a video on social media, then you merely need to fool some of the people some of the time. Crude misdirection can work, and the approach has a name in political communications: the “dead cat strategy”. If a dinner party conversation turns awkward, simply toss a dead cat on to the table. People will be outraged but you will succeed in changing the subject. Trump had an unrivalled gift for producing a dead cat whenever he wanted to.

As Boris Johnson faces damaging accusations of accepting large undeclared donations to pay for a lavish refurbishment of his Downing Street flat, one cannot help but wonder about recent leaks claiming that the prime minister had said “let the bodies pile high in their thousands”. Another dead cat on the dinner table?

Magicians have a rather more pleasing approach. Lamont and Wiseman note in Magic in Theory that “a moment of strong natural misdirection occurs when a dove is produced and is allowed to fly upwards. All eyes naturally follow the flight of the dove.”

“At that point,” one magician told Lamont and Wiseman, “you can do anything you want.”

Dead cat or white dove, either attracts our attention. And when we are focused on the distraction, the real tricks can begin.


Watching Robbins at work, one is struck by his shamelessness: he announces that he is a pickpocket, and then proceeds to invade the personal space of his chosen victim, fiddling with lapels, touching shoulders and wrists, and patting pockets. It’s clear that he’s up to something, but wherever you look, the larceny is occurring somewhere else.

Among those who study misinformation, these tactics have a parallel: the “firehose of falsehood”. The firehose strategy is simple: barrage ordinary citizens with a stream of lies, inducing a state of learnt helplessness where people shrug and assume nothing is true. The lies don’t need to make sense. What matters is the volume — enough to overwhelm the capabilities of fact-checkers, enough to consume the oxygen of the news cycle. People know you’re lying, but there are so many eye-catching lies that it feels pointless to try to sift for the truth.

The firehose of falsehood was perfected by 21st-century Russian propagandists, but also seemed to characterise the behaviour of the Trump administration, which would lie about anything, no matter how inconsequential or easily disproved — from the size of the crowd at Trump’s inauguration (underwhelming, but who cares?) to whether he won the popular vote in 2016 (no, although in the US electoral system the answer is irrelevant) to whether the 2020 election apparatus in Georgia was run by Democrats (anyone can verify that the secretary of state Brad Raffensperger is a lifelong Republican).

I cannot help but be reminded of Robbins. He isn’t trying to escape suspicion: instead, he overwhelms your senses with so many questionable pokes and pinches that you simply cannot see the moment he lifts your watch and straps it on his own wrist.


The silent half of Penn and Teller is not so silent when it comes to the theory of magic. In a piece for Smithsonian magazine, Teller explained the power of letting people leap to their own false conclusions. For example, the early 20th-century magician David P Abbott used to make a golden ball float around his parlour for guests, using an unseen thread to support the ball. The real magic came when Abbott would wander off to fix drinks, leaving the ball behind. Guests would scurry across to examine it, and discover to their astonishment that the ball was much heavier than it looked. The real trick was not only to plausibly disguise the thread; it was to swap the lightweight ball for the hefty duplicate.

“When a magician lets you notice something on your own,” writes Teller, “his lie becomes impenetrable.”

I have often seen the same tendency in the way we interpret information and misinformation based on data definitions that seem intuitive but aren’t. We observe a statistical trend or a policy pledge, and then we leap to conclusions that turn out to be quite mistaken. Why? Because the trend or the pledge is based on an underlying definition we had misunderstood. In my book, How To Make The World Add Up, I call this ill-fated leap “premature enumeration”.

For example, early in 2020, the UK’s home secretary, Priti Patel, defended her plans to restrict “unskilled immigration” by saying that instead UK employers would be able to recruit “economically inactive” UK residents. That all sounds rather progressive, until you realise that “economically inactive” is a definition that includes students and people who are chronically sick — and “unskilled immigration” typically means “paid less than £25,600 a year”, a category that happens to include early-career radiographers, physiotherapists and paramedics.

We approve of reducing unskilled immigration and employing economically inactive people, as long as we never realise that means banning the immigration of medics and hoping students will step up to do the job instead.

Or, to quote Teller, “Nothing fools you better than the lie you tell yourself.”


There is hope. Where our actions are based on reflex, a nudge towards making an active choice can make a difference. Alice Pailhès studies the psychology of stage magic with Kuhn at Goldsmiths. One of her experiments examines a “positional force”, in which the magician lays four cards in a line on the table and invites the subject to pick a card. It’s well-known that people tend to gravitate to the third card if right-handed, and the second card if left-handed, plausibly because these are simply the most convenient options.

In the experiment, Pailhès sometimes says, “Push a card towards me”, and sometimes, “Choose a card and then push it towards me”, more explicitly framing it as a decision. That subtle distinction makes a big difference. The first instruction induces 60 per cent of people to pick the expected target out of the four cards. The second instruction, with the faintest hint of encouragement to actively decide, causes the forcing technique to collapse: only 36 per cent of people choose the target card.

Could a similarly subtle reframing work to combat misinformation?

Consider the subjects studied by the team including Pennycook, Epstein and Mosleh. Remember that those subjects displayed a puzzling contradiction: they were well able to distinguish fake news from true headlines, they said that they valued truth above everything else when considering what to share, and yet they were nearly as likely to share lies as true claims.

It does not take much to change this. In one follow-up study, the researchers primed people’s attention by asking them to rate the truth of a headline. After this priming question, people were substantially less likely to share false headlines than a control group shown the same headlines. People care about the truth, and they can discern the difference between truth and lies — but they do not always think about the truth. Invite them to focus on truth, just for a moment, and they start living up to their professed beliefs. They start paying attention to what is true.

Just as with Pailhès’s subtle prompt to make an active choice of card, when Pennycook and colleagues subtly prompted people to focus on truth, the difference was stark.

That difference is observable not just in a survey about hypothetical behaviour but in the wilds of social media. Pennycook’s team sent a direct message to more than 5,000 Twitter users who had recently shared information from highly partisan websites.

The message simply showed people a non-political headline and asked for their opinion as to whether the headline was true or not. This message primed people to think about accuracy.

Different users received the message on different days, but in the 24 hours after receiving the message, users were more likely to share headlines from serious news sources such as The New York Times and less likely to share content from Breitbart and The Daily Caller. They were also more likely to practise what is sometimes called “engaged sharing” — adding comments rather than simply pressing a button to retweet.

A lone prompt to ponder whether a single headline was true then influenced what people shared all day. It is a striking demonstration that sometimes what we need is not more facts, more numeracy and less partisanship, desirable though all that might be. Sometimes what we need is to pay more attention to the truth.


Paying attention is not so hard, but we first need to realise that there is a problem. And the overarching lesson of the psychology of misdirection is this: we are blind to our own blindness. The psychologists Lars Hall and Petter Johansson of Lund University and their research team collaborated with professional magicians to devise an intriguing experiment. Hall and Johansson would repeatedly show research subjects a pair of portrait photographs and ask them which of the two faces they found more attractive. They would then hand over the photograph and ask the subjects to explain why. Participants gazed again at the photographs and had no trouble justifying their choices:

“I like his smile.”

“I’m a photographer, I like the way she’s lit.”

“I don’t know, looks a little bit like a Hobbit.”

All plausible reasons, but Johansson would often use sleight of hand to swap the photographs. Experimental subjects did not detect the swapping method, and rarely noticed that they were now gazing at the very face they had rejected just seconds previously. The justifications they used were indistinguishable from those they used when no swap had taken place.

Hall and Johansson repeated the trick with policy questions: they quizzed people about their voting intentions and asked them to place crosses indicating their positions on 12 different policy questions from “strongly opposed” to “strongly in favour”. Using an elegant bit of trickery, the researchers flipped the responses, showing people the exact reverse of their choices. While occasionally nonplussed, respondents usually did not notice that anything was amiss and produced plausible justifications for whatever they had “chosen”. (This technique turns out to be quite effective at shifting voting intentions, too.)

It is a remarkable finding: we will argue fluently in favour of a policy position that we did not hold simply because a conjuring trick has persuaded us that we did hold it. And as with Robbins’s waistcoat, the surprise is not just that we do not notice. It is that we are so certain that we would.

We retweet misinformation because we don’t think for long enough to see that it is misinformation. We obsess over bold lies, not realising that their entire purpose is to obsess us. We see one thing and assume it is another, even though we are only deceiving ourselves. We will argue in favour of policies that we opposed seconds ago, as long as we can be distracted long enough to flip our political identities in a mirror.

And behind all this is the grand meta-error: we have no intuitive sense that our minds work like this. We fondly imagine ourselves to be sharper, more attentive and more consistent than we truly are. Our own brains conspire in the illusion, filling the vast blind spots with plausible images.

It all seems relentlessly depressing, but there is plenty of hope in this account of why we fall for misinformation. It turns out that we can tell the difference between truth and lies, and that our political opinions are less stubbornly tribal than one might think. But we need to slow down and pay attention.

If Teller decides to slip a lemon under a cup during a cups and balls routine, or Robbins decides to remove your watch, you don’t have much of a chance. These professional performers are too skilled; their methods are too well-honed.

But if you decide to think carefully about the headlines, or the data visualisations that adorn news websites, or the eye-catching statistics that circulate on social media, you may be surprised: statistics aren’t actually stage magic. Many of them are telling us important truths about the world, and those that are lies are usually lies that we can spot without too much trouble. Pay attention; get some context; ask questions; stop and think.

Misinformation doesn’t thrive because we can’t spot the tricks. It thrives because, all too often, we don’t try. We don’t try, because we are confident that we already did.

Tim Harford’s book ‘How To Make The World Add Up’ is published in paperback this week



