On Thursday afternoon, the House Intelligence Committee held a virtual hearing on the online spreading of misinformation, conspiracy theories and ‘infodemics.’ Among other things, the hearing examined the role of social media platforms in the proliferation and deceleration of false information and what private and public actors might be able to do to counter it.
But as more than one expert testifying at the hearing pointed out, we’re now less than three weeks away from the election, and misinformation campaigns from foreign and domestic actors have been simmering on social media platforms for a decade. As the sobering hearing adjourned, one was left to wonder: how likely is it that any meaningful change can be accomplished in time to salvage election integrity?
“Such a polluted online landscape unfortunately remains ripe for exploitation by foreign adversaries,” said Representative Adam Schiff (D-CA), chairman of the committee, during his opening statement.
“Americans are voting right now in the midst of a pandemic. It could take days or weeks to count all the votes after election day,” said Schiff. “And that period will be especially susceptible to conspiracy theories and misinformation. Especially if amplified by malign foreign actors which could potentially cast doubt on the legitimacy of the electoral outcome itself, or make us doubt the viability of our free and open democracy. That scenario goes to the heart of our national security, and evokes the oversight mandate of Congress on this committee.”
Domestic disinformation runs rampant as well, according to multiple witnesses testifying at the hearing, serving to do U.S. adversaries’ work for them and leave the country vulnerable to continued manipulation.
The virtual hearing lasted about 90 minutes, and no Republican representatives from the committee joined, signaling just how politicized the issue of misinformation and conspiracy theories is today. The topic of misinformation is a nuanced one. It’s not always clear who is responsible for controlling the spread of falsehoods and how we as a society can work together to attempt to reinvent social media in a way that promotes safety, equity and democracy.
Let’s break down some of the main points, problems and potential solutions that were discussed at the hearing.
Lack of regulation and responsibility among tech giants
Dr. Joan Donovan, research director of the Shorenstein Center on Media, Politics and Public Policy at Harvard Kennedy School, began her testimony with a statement published by Facebook in January of this year: “In the absence of regulation, Facebook and other companies are left to design their own policies. We have based ours on the idea that people should hear from those who wish to lead them, warts and all.”
Donovan and others noted that lack of regulation means that social media platforms design their policies based around addiction and monetization, as the algorithms that regularly push incendiary and viral content demonstrate.
“But what happens when political and media elites coordinate to blanket social media with falsehoods?” said Donovan. “In these cases, advertising isn’t necessary for spreading lies to millions, because all they need is the platform to work exactly as designed. Who then is responsible for explaining why a consumer was exposed to certain falsehoods? Who is responsible for making certain corrections when falsehoods are identified?”
The costs of misinformation at scale: Who cleans up after the mess?
Donovan said that her team had come up with four clear impact zones that need to deal with the damages caused by unmoderated, unregulated and unmanageable misinformation and conspiracy theories. The first is journalists, who constantly have to divert newsroom resources towards researching and combating misinformation, and who are accused of peddling fake news in the process.
With the pandemic still in force, public health and medical professionals have also had to fight medical misinformation.
“Doctors should not have to become online influencers in order to correct misinformation pushing miracle cures and bogus medical advice,” said Donovan.
Civil society at large also has to consistently step up to the plate, as racialized disinformation is used by both domestic and foreign adversaries to boost polarization on wedge issues. Finally, law enforcement personnel and first responders have to shoulder the burden of rumors when real-life violence ensues in response to calls to action on social media, as we saw with the Kenosha shooting.
Content moderation alone is not enough
And it’s certainly not enough to stem the tide of misinformation when companies aren’t enforcing their own policies with high profile accounts, according to Nina Jankowicz, Wilson Center disinformation fellow.
Jankowicz also pointed to the increased use of information laundering, a practice common among actors like Russia, in which authentic local voices and organizations are used to conceal the origin of a malign narrative and lend it legitimacy. This presents a challenge to successful content moderation.
So does conspiracy convergence, a sort of rat king of conspiracy theories, fueled by adherents of one conspiracy theory being introduced to and encouraged to spread others. And it’s often happening behind digital closed doors, like private Facebook groups, which makes it even harder to police.
QAnon, for example, owes much of its popularity to Facebook’s group recommendation algorithm. But as Graphika head of analysis Melanie Smith pointed out, QAnon is a highly adaptable beast that’s rooted in white nationalism but closely linked to the news cycle, which helps it reach new audiences.
While studying networks of QAnon supporters around the world, Smith found conversations about topics that are seemingly irrelevant.
“The fire at Notre Dame Cathedral, yoga, the Nike boycott, Brexit and alternative medicine, to name a few,” said Smith. “This means that these new audiences are likely to be exposed to conspiracy content through more benign topics that act as an unfortunate gateway.”
The shape-shifting nature of QAnon makes it incredibly difficult for social media platforms to track and moderate, said Smith.
Women and minorities are disproportionately targeted by disinformation campaigns
“We know that [Russia’s Internet Research Agency] disproportionately targeted black voters to suppress the black vote,” said Jankowicz, who also cited the Kremlin’s targeted campaigns against women to discourage them from engaging in public life, as well as the sharp increase in demeaning and false narratives against Sen. Kamala Harris ahead of election day.
“During the vice presidential debate we were tracking instances of sexualized and gendered disinformation against Senator Harris, and on [Parler and 4chan] those instances increased 631% and 1,078%,” she said. “These campaigns are meant to affect American women and minorities’ participation in the democratic process. Every American should categorically reject them.”
Misinformation leads to real world harm
Whether it’s targeted content that affects people’s participation in the democratic process, disinformation meant to threaten the integrity of our elections, or actual calls to violence, as we’ve seen with the plot to kidnap Michigan Gov. Gretchen Whitmer, misinformation and conspiracy theories spread on social media lead to real-world harm.
Social media companies like Facebook and Twitter are working to combat misinformation every day, but Alethea Group vice president Cindy Otis said they’re still lagging behind on figuring out what real-world harm is and what kind of content can lead to it.
“A diet of false, misleading, sensational content that encourages violence does lead to real world violence and also might lead to things like people not voting,” said Otis.
So is there hope for the future?
The Trump administration and others who seek to regulate big tech platforms have looked to repeal or amend Section 230, the law that protects social media platforms from liability for information published on their platforms. When questioned on the matter by Representative Peter Welch, Donovan advised against restructuring Section 230 without a plan for what comes next.
“Social media provides an enormous public good,” she said. “It’s the features that are becoming the problem, the ways in which information is sorted, the way people can pay to play. They can pay to push their information or ‘news’ across these platforms.”
So what do the experts suggest?
Donovan and Jankowicz both said they’d be in favor of empowering a new oversight agency that would at least ensure the rules and regulations platforms create for themselves are enforced. Donovan also pointed to harsher regulation of advertising on social media, as some bad actors are financially motivated by disinformation and offer it up as a service to political campaigns.
In 2019, Facebook generated 98% of its revenue, or $68.7 billion, from ads.
Otis pointed to more investment in nonprofit organizations that are doing the work for social media platforms by studying and researching the shape and effects of misinformation, as well as doing significant work in community education around digital literacy. She also asked that more data from these companies, specifically from Facebook, be made available to analysts. Every week or so, Facebook announces that it has removed harmful or false content and groups, but it doesn’t provide the specific names of those groups, nor does it make the harmful content available to researchers.
The witnesses and some representatives collectively called for social media platforms to reckon with their algorithms that cause addiction and fuel incendiary content. Each platform has shown that when it wants, it can remove and ban content, making many suspicious that they’re not doing so now out of concern for their bottom line.
“When our information ecosystem gets flooded with highly salient junk that keeps us scrolling and commenting and angrily reacting, civil discourse suffers,” said Jankowicz. “Our ability to compromise suffers. Our willingness to see humanity in one another suffers. Our democracy suffers. But Facebook profits.”