Scientists Gear Up for a Battle Against Fake News: Faye Flam
Published March 27, 2017, 2:00 p.m., by Faye Flam
(Bloomberg View) —
People aren’t getting dumber, despite what a prolific writer of fake news told the Washington Post last fall, but something funny is going on with American media. There’s been an apparent surge in fabricated stories, while the president has accused the New York Times and other traditional journalism outlets of producing “fake news.” With facts seemingly up for grabs, scientists are starting to see evidence that both ends of the political spectrum have splintered off into alternative realities.
But it’s not just a matter of social media isolating conservatives and liberals in echo chambers. Instead, researchers who study how people share news via Facebook and Twitter say concerted efforts to misinform the public are becoming a threat. New forms of social media help deceivers reach a far larger audience than they could find using traditional outlets. So behavioral and computer scientists are searching for solutions.
Part of the problem dates back to our evolution as social animals, they say. “We have an innate tendency to copy popular behaviors,” said Filippo Menczer, a professor at the Center for Complex Networks and Systems Research at Indiana University, and one of several speakers at a recent two-day seminar on combating fake news.
That tendency can get people to notice and repeat not just fake news, but fake news from fake people — software creations called bots. Bots, which automatically post messages to social media, get their strength in numbers, making it look like thousands of people are tweeting or retweeting something. Menczer, who has a background in both behavioral and computer science, has studied the way bots can create the illusion that a person or idea is popular. He and his colleagues estimate that between 9 percent and 15 percent of active Twitter users are bots.
The phenomenon he described reminded me of experiments with animals that engage in a behavior biologists call “mate copying.” In certain bird species, for example, females prefer males who are already getting attention from other females. Such species are prime targets for manipulation with fake birds. In an experiment on a bird called a black grouse, scientists surrounded otherwise unremarkable males with decoy females, after which real females mobbed the popular-looking males like groupies. (The males were also fooled, in that they immediately tried to mate with the decoys.)
In studying how this works with Twitter users, Menczer and his colleagues created a program to distinguish bots from people. What he learned was that ideas promoted by bots can hit the popularity jackpot if they get retweeted by a well-connected or prominent human. Such people often get a lot of bots vying for their attention for just that reason, Menczer said. Shortly after the November election, he said, Donald Trump was inundated with bots telling him that 3 million illegal aliens voted for his opponent. Trump later tweeted this same claim. A human source has been connected to the rumor, but the bots could have made it look as if it had the backing of hundreds more people.
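The column doesn’t describe how Menczer’s program tells bots from people; his group’s actual tool is a trained machine-learning model drawing on many signals. Purely as illustration, here is a minimal Python sketch of the general idea, scoring an account on a few behavioral features. The features, thresholds, and weights are invented for this sketch, not his method.

```python
# Hypothetical illustration only: a toy, rule-based bot score. Menczer's
# real classifier is a trained model over many more signals; the features,
# thresholds, and weights below are invented for this sketch.

from dataclasses import dataclass

@dataclass
class Account:
    tweets_per_day: float    # average posting rate
    followers: int
    following: int
    account_age_days: int
    retweet_fraction: float  # share of posts that are retweets

def bot_score(a: Account) -> float:
    """Return a rough 0-to-1 bot-likelihood score from simple heuristics."""
    score = 0.0
    if a.tweets_per_day > 50:       # inhumanly high posting rate
        score += 0.35
    if a.following > 0 and a.followers / a.following < 0.1:
        score += 0.25               # follows many accounts, followed by few
    if a.account_age_days < 30:     # very new account
        score += 0.20
    if a.retweet_fraction > 0.9:    # almost never posts original content
        score += 0.20
    return min(score, 1.0)

if __name__ == "__main__":
    suspect = Account(tweets_per_day=120, followers=40, following=900,
                      account_age_days=12, retweet_fraction=0.95)
    print(f"bot score: {bot_score(suspect):.2f}")  # prints 1.00 here
```

A real detector would replace these hand-picked rules with a model trained on labeled accounts, but the intuition is the same: bots betray themselves through behavior at scale.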
Others mapping the social-media landscape see different patterns of deception on the right and left. Yochai Benkler, co-director of the Berkman Klein Center for Internet and Society at Harvard, has seen political asymmetry using an open-source system called Media Cloud, which follows how stories circulate on social media. Mapping the flow of more than a million stories, he found that people who share left-leaning partisan news also tend to share news from the New York Times, Wall Street Journal, CNN and other sources with traditions of accountability. Those who shared items from right-leaning sites such as Breitbart were much less likely to circulate stories from such mainstream news sources.
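Media Cloud’s actual pipeline isn’t detailed here. As a hypothetical miniature of the kind of analysis Benkler describes, the sketch below builds a “co-sharing” network in Python, linking two outlets whenever the same user shares stories from both, then looks for clusters. The users, outlets, and share sets are invented for illustration.

```python
# Hypothetical miniature of a co-sharing analysis; Media Cloud operates
# at the scale of millions of stories with far richer data. The users
# and share sets below are invented.

from collections import defaultdict
from itertools import combinations

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# user -> outlets that user shared stories from (toy data)
shares = {
    "user_a": {"nytimes.com", "wsj.com", "cnn.com"},
    "user_b": {"nytimes.com", "cnn.com"},
    "user_c": {"breitbart.com", "dailycaller.com"},
    "user_d": {"breitbart.com", "dailycaller.com", "foxnews.com"},
}

# Weight each edge by how many users shared stories from both outlets.
pair_counts = defaultdict(int)
for outlets in shares.values():
    for a, b in combinations(sorted(outlets), 2):
        pair_counts[(a, b)] += 1

G = nx.Graph()
for (a, b), w in pair_counts.items():
    G.add_edge(a, b, weight=w)

# Clusters in this graph are groups of outlets with overlapping audiences,
# the kind of left/right asymmetry described above.
for community in greedy_modularity_communities(G, weight="weight"):
    print(sorted(community))
```

On real data, the interesting finding is which clusters emerge and whether they bridge to accountable mainstream sources, which is the asymmetry Benkler reports.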
In a piece Benkler co-authored in the Columbia Journalism Review, he said his data revealed a pattern of deception among many right-leaning sites. “Rather than ‘fake news’ in the sense of wholly fabricated falsities,” he and his co-authors wrote, “many of the most-shared stories can more accurately be understood as disinformation: the purposeful construction of true or partly true bits of information into a message that is, at its core, misleading.”
In an ironic twist of fate, Indiana’s Menczer became the subject of just such a hodgepodge of true and false statements. He’d already received some media attention in the Wall Street Journal and other publications for his work on the way ideas, or “memes,” spread through social media. None of the mainstream stories suggested he was up to anything sinister. But then, in 2014, the Washington Free Beacon published a story headlined “Feds Creating Database to Track ‘Hate Speech’ on Twitter.”
The problem was that there was no database, and nobody had tried to define either hate speech or misinformation.
The true part was that Menczer got some money from the National Science Foundation. And in one of his grant applications, he had proposed that the results might be useful for people trying to study the spread of misinformation or hate speech.
The misleading story got echoed through conservative sites such as Breitbart, which suggested Menczer’s project was a plot to suppress free speech. It then reached Fox News’s Megyn Kelly, who devoted part of her show “The Kelly File” to the issue. The segment offered no additional facts, but it did ramp up the innuendo and outrage. How did an academic who studies the flow of information on social media morph into a stooge for “bureaucrats” engaged in policing free speech? Kelly never said.
These days you don’t have to be paranoid to worry about government spying, but Twitter is a public forum, and there’s no reason to keep scientists from studying how it influences public opinion. Science magazine and the Columbia Journalism Review eventually debunked the story. Still, if anyone might see a silver lining in being part of a media attack, it would be someone who studies the media.
Menczer said that, unfortunately, having to defend himself and colleagues against disinformation wasn’t particularly enlightening — not enough to be worth the time it took away from his research on the way disinformation spreads. From his perspective, it was about as helpful as a flu researcher contracting the flu. But the incident did provide a piece of anecdotal evidence for the contention that false accusations tend to stick to people even after they’re debunked.
Even other researchers sometimes consider the project “controversial,” Menczer said. In that way, the critics won, since they left the public with an indelible association between his work, politicized research, and the Obama administration censoring free speech. “Never mind that none of this was true,” he said.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Faye Flam is a Bloomberg View columnist. She was a staff writer for Science magazine and a columnist for the Philadelphia Inquirer, and she is the author of “The Score: How the Quest for Sex Has Shaped the Modern Man.”
To contact the author of this story: Faye Flam at fflam1@bloomberg.net
To contact the editor responsible for this story: Tracy Walsh at twalsh67@bloomberg.net
For more columns from Bloomberg View, visit Bloomberg View.
Copyright © 2017 Bloomberg L.P.