
Why Does Misinformation Spread? Human Behaviour Plays A Big Part


There are many answers to this question. Much of the time, however, misinformation spreads and sticks for a simple reason: It gives the believer something that they wanted to believe in the first place.

Why do people believe information [or misinformation] that is untrue?


Dannagal Young, professor of communication and political science, University of Delaware


People put feelings over facts

“The reason that misinformation is such a problem is because we demand it,” says Dannagal Young, a professor of communication and political science at the University of Delaware. Her research includes the effects of political media and the psychology of misinformation.

Misinformation is a false belief that’s shared, often but not always with the intent to deceive. “It’s not random lies,” says Young. “The falsehoods that people believe in satisfy certain needs on the part of the individual. They either make [them] feel smart or safe or confident or appreciated or in the know.”

Misinformation succeeds, in layman’s terms, because it makes people feel good. People believe stories that reinforce or reward the way that they see the world. They share stories that boost their ego or make them feel like part of a team.

But as misinformation plays a growing role in U.S. civic life, its negative effects can be seen in everything from QAnon conspiracy theories to anti-mask and anti-vaccine protesters resisting effective COVID-19 public health measures. While the beliefs these adherents cling to range from misinterpretations of data to outright falsehoods, they inspire real, and often dangerous, action, and the people who believe them can be difficult to disabuse.


Gordon Pennycook, assistant professor of behavioural science, University of Regina’s Hill/Levene School of Business

Online, misinformation and entertainment overlap

“The problem with trying to convince people about what’s true is you’re constrained by the truth. In most cases the truth is just not as interesting as what’s made up,” says Gordon Pennycook, an assistant professor of behavioural science at the University of Regina’s Hill/Levene School of Business.

Pennycook’s work largely focuses on understanding reasoning and decision-making. In recent research on misinformation and fake news, he found that social media creates an environment that is harmful to critical thinking. When people were more reflective, they were less likely to believe false headlines, even when those headlines aligned with their political or religious ideology. But when given less time to consider the content, people rated more false headlines as true.

The problem, according to Pennycook, is that people don’t come to social media to think critically — they come for entertainment. “If you think about the choice people make about whether to share something, it’s just a different thing [than] whether they’re critically assessing it,” he says. “If the task was to figure out what’s true, that’s a different sort of task than to figure out what people will like.”


Sinan Aral, professor, MIT Sloan School of Management, and author of “The Hype Machine”

Plus, when it comes to sharing material online, the truth may not be as appealing as fiction. In fact, a 2018 study out of MIT conducted by professor Sinan Aral found that, on Twitter, false news is 70% more likely to be re-shared than true stories. Aral’s research suggests that misinformation gives people a thrill because the ideas feel more novel.

“Especially in an attention economy” (that is, the kind of environment social media creates, where a seemingly endless stream of content holds and commodifies our attention), “the first step is getting people to watch or engage with what you want,” says Pennycook. “If you can just make stuff up, that’s a lot easier.”

For the believer, embracing a story may also reinforce a sense of status or feed existing fears. Spreading that misinformation, says Pennycook, has its own rewards. It confers social status that comes with the appearance of knowledge.

“Often people share things and they don’t even seem to consider whether it’s true or false,” says Pennycook. “They’re thinking about, is this important? Are people going to like this? Or is this going to make me look good?”

People connect information to identity

Not every falsehood shared online takes off. Misinformation, when it succeeds, is grounded in something relevant to the people it reaches.


According to Young, this is why so much misinformation targets politics and political identities.

“We are in a moment, a political and cultural moment that makes it even more likely that we’re demanding misinformation. And all of these demands have to do with identity, with who we are,” she says. “And there are complex reasons why this is true, [but] our political identities have become these encompassing identities.”

And, says Young, once members of a community begin to accept a story, it takes on a life of its own. As the story spreads, readers want to believe because it reinforces their sense of identity and membership in that community.

This tendency to believe information that confirms one’s existing world views is called “confirmation bias.” People are not only quicker to believe information that aligns with their beliefs, but they seek it out and find fault with information that does not. Pretty soon, it can seem like contrary evidence doesn’t even exist.

“The goal,” Young adds, “becomes not about being right, but feeling right.”

There is almost a paradox at the heart of misinformation. People come to believe in the untruths they read and share online, but they don’t start out intending to do so.

A reader might start out curious about reported side effects of the coronavirus vaccines, and soon enough find themselves surrounded by people who claim the vaccines are unsafe. As the CDC itself found in its report on health misinformation, “many people who share misinformation aren’t trying to misinform. Instead, they may be raising a concern, making sense of conflicting information, or seeking answers to honest questions.”

 


Original article on the Boston Globe