Just how polluted is the news environment during the 2020 presidential election? A new study, reported in Defense One, a news platform for the military industrial complex, concludes: Very.
The study, by Zignal Labs, is careful to distinguish disinformation (deliberate deceptions) from misinformation (erroneous statements).
These falsehoods were consumed by audiences across the country, but unevenly, with swing states especially targeted. The top three states for mentions of misinformation were the swing states of Pennsylvania, Michigan, and Florida. It is still far too early to tell, but this consumption of misinformation may help explain why pollsters' estimates about the election have been off.
The Internet giants (Facebook, Twitter, and YouTube) are implementing more aggressive policies to limit the spread of baseless claims, even as their business models incentivize that very spread.
Social media platforms remain a battlefield on which too much misinformation thrives and those who push it find too little resistance. Twitter, YouTube, and Facebook do not sufficiently coordinate their policies, allowing disinformation dealers to "forum shop" and take advantage of the policy seams. Newer platforms such as TikTok, meanwhile, have not built up the same anti-misinformation processes, and remain relative "Wild West" hubs for false claims of election fraud. (#riggedelection, for example, was garnering well over 1 million plays a day.)
The data indicated that, despite many warnings (and some overhyped stories), the role of Russian state actors in the 2020 U.S. election was not significant.
As Nina Burleigh reported here on DEEP STATES, Trump’s misinformation tactics had far more impact than Putin’s.
In 2016, Russia drove U.S. media narratives through hack-and-dump operations, and then shaped online discussion via thousands of bots and trolls. But 2020 election-related misinformation was mostly a domestic affair. Iran was accused of mounting a campaign in Florida and Russia was documented to have amplified QAnon, but their effect on the overall election appeared modest.
The Iran story, it turns out, was mostly unfounded. Iranian intelligence agencies and associated hackers made sure not to provoke the United States, relying instead on a strategy of "strategic patience." (See Maysam Baveresh's piece for DEEP STATES, "Iran Awaits U.S. Vote With Strategic Patience, Tactical Interference.")
The most important finding is that the social media companies are going to have to crack down on pages “urging physical protest and violence.”
The misinformation data points to a need for continued policy adjustment by the platform companies and vigilance by those in government, especially as this activity tips towards urging physical protest and violence.
It is going to be difficult to police this policy. There's nothing wrong with "physical protest" as long as it's peaceful. But does a Facebook page calling for a rally of gun-toting protesters to "stop the steal" qualify as violence? Not until the protesters engage in violence, at which point it will be too late.
The social media companies may not be able to control the misinformation that feeds their bottom lines.
Source: Misinformation 2020: What the Data Tells Us About Election-Related Falsehoods – Defense One