Fake News Finds Fertile Ground Online

Fake news — a term seldom heard just a year ago — has come to dominate the global political news cycle since being widely blamed for Hillary Clinton’s surprise loss in the US presidential election. Stories based on false or misleading information found a particularly welcoming home online, where they were shared widely by real users and a virtual army of automated bots.

Senator Mark Warner (D-VA), vice chair of the Senate Intelligence Committee, has claimed that the Russian government hired as many as 1,000 people to create fake news stories to be circulated online. Researchers from the Project on Computational Propaganda at Oxford University found that in key battleground states the proportion of misleading news stories shared online was remarkably high, reaching parity with accurate stories in Michigan, where then-candidate Donald Trump eked out a victory over Clinton by less than half of a percentage point. Since then, false or misleading news stories have rattled the French election earlier this spring, upset the primary process in the Kenyan election, and even been cited as a cause of violence in South Sudan.

Two weeks after the 2016 presidential election, speaking at a news conference in Germany, President Obama pinned the rise of fake news on its ability to camouflage itself. “In an age where there’s so much active misinformation and it’s packaged very well and it looks the same when you see it on a Facebook page or you turn on your television… if everything seems to be the same and no distinctions are made, then we won’t know what to protect. We won’t know what to fight for,” he said.

New research shows, however, that part of the power of fake news online stems not from its appearance, but from a sense of the presence of others that social media sites can create — and the impact of that sense on our psychology.

“We found consistently that people tend to fact-check less when they feel the presence of other people,” Gita Johar, the Meyer Feldberg Professor of Business at Columbia Business School and an author of the study, explains. Johar, along with Columbia Business School doctoral students Youjung Jun and Rachel Meng, examined individuals’ propensity to fact-check information through a series of eight experiments in which participants were presented with headlines on simulated news sites and Facebook feeds and asked to flag those needing fact checking.

What they found was striking: participants flagged roughly 35 percent fewer statements for review on a simulated news site when they believed themselves to be working in a group setting than when they were presented the same statements in isolation. What’s more, simply displaying false or ambiguous information as part of a Facebook feed caused people to behave as if they were in a group setting, erasing any difference in their likelihood to flag headlines for fact-checking.

This effect occurs independently of individuals’ tendency to believe information that aligns with their personal ideology — the tendency of political liberals to affirm statements that conform to a left-wing worldview, for example — another effect which contributes to the spread of false partisan information.

One theory as to why this happens is that people feel safety in numbers. “Animals hide out in herds and feel safer in herds, and similarly we feel safer in a crowd, and it manifests in lower fact-checking,” Johar says. Individuals may also choose to fact-check less in group situations because they assume someone else is looking out for false data — a version of the “bystander effect” — but the researchers point out that there’s nothing in their data to directly indicate this. In one experiment fact-checking fell even when participants were told that they would be working on the task and incentivized independently, indicating that the mere perception of the presence of others reduces individual vigilance.

Diminishing the impact of fake news has become a priority for social media companies in the wake of the 2016 election, but many are grappling with the implications — and costs — of becoming arbiters of truth on their platforms. “We take misinformation seriously,” Facebook CEO Mark Zuckerberg wrote in a Facebook post on November 19, 2016. “The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.”

Since then, the social networking platform has made changes to try to slow the spread of fake news by allowing users to flag potentially fake stories — similar to the process Johar used in her study — and linking to articles by trusted third parties, like the Associated Press or Snopes, directly below contested links, although this hasn’t always been effective.

Still, Johar’s research indicates that one of the best ways for a service like Facebook to fight fake news is to encourage users to be more skeptical of what they encounter on the platform. “What Facebook is doing by giving people the ability to flag information could increase people’s sense of guard,” Johar says.

Indeed, when Johar and her fellow researchers told people that their flagging of potentially false articles would be visible to the group, fact-checking rose to 18 percent, from 11 percent among those who believed their flagging would be anonymous. Moreover, the researchers found that simply reminding people of their past and present duties, responsibilities, and obligations outside of the immediate task caused them to double their rate of fact-checking in group settings. Ultimately, "making people aware that there is information out there that could be false, that by itself is going to make you more vigilant," Johar says, and inducing a greater sense of vigilance in users encountering news claims may be an effective way to combat the impact of fake news.

“When we made people more accountable for their actions, they fact-checked as much on the social settings as the traditional settings,” Johar says. “It indicates that people feel it’s the right thing to do.”

Professor Johar is the faculty director of the Design Your Innovation Blueprint program at Columbia Business School Executive Education.


Ivy Exec is proud to announce its partnership with Columbia Business School, bringing to its platform an insightful collection of thought leadership pieces for the modern strategist in finance, leadership, and more.

About the Author
Columbia Business School

For over 100 years, Columbia Business School has been perfecting the art of blending scholarly research and practical approaches, solidifying its place at the forefront of innovation. Subscribe to Columbia Business School's Executive Education thought leadership or explore the School's offerings.
