How bad IS disinfo? Meta forces a question.

How the next wave of technology is upending the global economy and its power structures
Jan 09, 2025
 
POLITICO Digital Future Daily

By Laurie Clarke

A circa-2021 Meta policy on pandemic misinformation. | Andrew Caballero-Reynolds/AFP via Getty Images

Meta’s decision to close down its US fact-checking operations marks the end of an eight-year era bookended by Trump presidencies.

And though it has caused a predictable wave of panic in liberal-leaning tech-accountability circles, the decision is also forcing experts to reconsider what the real risks of online mis- and disinformation are — and how best to handle them.

Political disinformation seemed like a new and serious disease of the social media age when it first hit the public radar almost a decade ago, a problem that could be fixed by algorithm tweaks and more rigorous fact-checking. It has turned out to be slipperier, subtler and harder to root out.

“The situation is more complex than some are willing to let on,” said Felix Simon, communication researcher and research fellow in AI and Digital News at the Reuters Institute for the Study of Journalism.

“While there is evidence that fact-checking works, the effects were often small and fleeting,” he said, “and least effective for those most likely to see and believe false or misleading information.”

Meta’s fact-checking operation has its roots in the panic over disinformation that exploded in the wake of the 2016 election. Much of the worry surrounded deliberate Russian efforts to monkeywrench Western politics by sowing discord online.

A vibrant cottage industry — dubbed “Big Disinfo” — sprang up to fight back. NGOs poured money into groups pledging to defend democracy against merchants of mistruth, while fact-checking operations promised to patrol the boundaries of reality.

At first, it seemed like a largely technical exercise. The mission statement of the Stanford Internet Observatory, one of the newly created bodies, highlighted “the negative impact of technology” and promised to study “the abuse of the internet.”

Not everyone was convinced of the scale of the threat, however.

Tech CEOs started out skeptical: In the days after the 2016 election, Facebook CEO Mark Zuckerberg said there was “a profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw fake news.”   

Under pressure, after congressional hearings and exposes, his company agreed to new policies and outside oversight.

Eight years later, after Trump’s decisive second victory, Zuckerberg’s earlier view is newly resonant.

In part, this is because researchers themselves say there isn’t convincing evidence for the idea that misinformation sways voters.

Research instead has shown that consumers of misinformation tended to be those who were highly motivated and already conspiratorially inclined, with most of us surprisingly resilient to far-fetched and unfounded notions.

This, combined with difficulties connecting exposure to misinfo to subsequent political beliefs or behaviors, has prompted a “revisionist view” in the field that “maybe [misinfo] isn’t the biggest danger we’re facing,” Matthew Baum, professor of global communications at Harvard University, told POLITICO.

For a recent article, I spoke to Baum and a number of other researchers, and found them surprisingly open about the idea that disinformation is not the bugbear that it seemed a decade ago.

“I’ve been thinking about this a lot lately … about how the frame of disinformation has failed us and what we can do differently,” said Alice Marwick, director of research at Data & Society, a nonprofit research institute.

Reece Peck, associate professor of journalism and political communication at the City University of New York, said, “The current emphasis on algorithms and tech moguls like Zuckerberg and Musk often obscures a key reality: the most effective online political content draws heavily on narrative techniques and performative styles pioneered by Limbaugh, Fox, and Drudge.”

That’s not to say that experts would cheer the end of fact-checking, which Zuckerberg himself acknowledged “means we’re going to catch less bad stuff.”

“While the Community Notes approach pioneered by X has shown promise, it is not a like-for-like replacement, nor is it intended to be,” said Rasmus Nielsen, professor in the Department of Communication at the University of Copenhagen.

So what’s next? Eight years into the grand online misinformation fighting experiment, some figures show content moderation may have increased polarization instead of building bridges. Trust in the media among Republicans hovered around 30% before 2016, but plummeted post-Trump. Last year it was 12%.

The Stanford Internet Observatory, which conducted high-profile work on election-related misinformation, closed after being targeted by lawsuits and subpoenas from congressional Republicans.

Marwick at Data & Society said researchers should move past the idea of countering binary “units of fact,” and instead look at how age-old false narratives – for example about immigrant criminality – are cynically wielded.

Peck offered as an example the podcaster Joe Rogan, who dismissed the COVID vaccine and whose claims that U.S. intelligence agencies helped orchestrate the Jan. 6, 2021 attack on the U.S. Capitol were dubbed “dumb and irresponsible” on the Poynter site.

Rogan’s endorsement of Trump this election cycle carried huge weight among his young male listenership.

But the idea that Rogan “is giving people bad science, and if we gave people good science, we could defeat him … that’s kind of misplacing where Joe Rogan gains his cultural authority — where the trust is between him and his audience,” Peck said.

Instead, he advocates going beyond the binary of true and false, and taking a more humanistic view of why certain messages resonate. “Communication scholars must reject the notion that we can overcome the dynamics of ‘culture war’ politics through fact-checking or platform reforms,” said Peck.

crypto's big risk

One of Washington’s top regulators is continuing his campaign for crypto rules as the incoming presidential administration is almost certain to give the industry more leeway.

Commodity Futures Trading Commission Chair Rostin Behnam spoke to POLITICO’s Declan Harty for Pro subscribers today, and said the regulatory void around crypto risks another debacle for investors like that created by the collapse of FTX — this time with greater peril for the broader economy.

Behnam warned the U.S. is “obviously in a very sensitive macroeconomic environment,” and that “if you do have a series of events from a macroeconomic perspective that might push asset prices down — and you start to see that correlation with crypto and bitcoin — then maybe that’s when you have people who are [leveraged] start to have to dump assets … and then that’s when you start to see it spiral.”

Behnam said that while crypto isn’t big enough to spark a wider financial crisis, a large enough move in the market “could be part of a larger shift in financial markets that could have an impact and accelerate a crisis or a change in market volatility.”

The interview came a day after Behnam publicly called on Congress to take action on crypto. He has announced he will leave the agency next month.

jim <3 meta

Rep. Jim Jordan (R-Ohio). | John Shinkle/POLITICO

Meta has an unlikely new friend: House Judiciary Chair Jim Jordan (R-Ohio).

POLITICO’s Hailey Fuchs spoke with Jordan Wednesday, who said the company’s new global affairs chief Joel Kaplan gave him a heads-up on Meta dropping its third-party fact-checking partnerships a day before Kaplan announced the moves on “Fox & Friends.”

“Remember that a few years ago, [the social media platforms] all kicked President [Donald] Trump off the platform. Now, they’re all going to Mar-a-Lago to visit with him because they know like this Trump administration is pro-First Amendment, and pro-free speech, and so it’s a dramatic change,” Jordan said. “We’re really appreciative of what Meta decided to do.”

As Judiciary chair, Jordan has antagonized the company for perceived bias against Republicans, at one point threatening to hold CEO Mark Zuckerberg in contempt. An anonymous Meta lobbyist told Hailey that Republicans’ reactions to the company’s recent policy changes have been positive, and that Meta’s efforts to police false information on the platform had become politically untenable.

post of the day

Doesn't seem like I'm the one with the problem here frankly

Stay in touch with the whole team: Derek Robertson (drobertson@politico.com); Mohar Chatterjee (mchatterjee@politico.com); Steve Heuser (sheuser@politico.com); Nate Robson (nrobson@politico.com); Daniella Cheslow (dcheslow@politico.com); and Christine Mui (cmui@politico.com).

If you’ve had this newsletter forwarded to you, you can sign up and read our mission statement at the links provided.

