TikTok’s future hangs in the balance as the U.S. Senate weighs whether to take up the forced-sale bill passed last week by the House of Representatives. Despite the frequent talk of the law as a ban — and it might function as one, if China pulls the plug rather than letting ByteDance sell the app — there are plenty of reasons to believe that TikTok will survive more or less as-is. The Senate might never take the bill up; a signed law could lose a constitutional challenge in court. In either case, the immensely popular video platform would continue to expand its role in global politics.

So even with the app under threat, it’s worth asking what a political future that includes TikTok will actually look like.

There’s less research on TikTok’s impact on politics than on its more established forebears like Facebook or Twitter. On Sunday, however, POLITICO’s European arm published a report with some hard numbers: Analyzing European Union parliamentarians’ followers and likes on the platform, it found that the far right and far left punch far above their electoral weight there. In other words, as on other social media platforms, extremism pays.

This largely comports with my own experience. For a full day in 2023, I experimented for POLITICO Magazine with getting my news exclusively from TikTok. I very quickly learned that the World Economic Forum’s Klaus Schwab was planning to confiscate my property and force-feed me insects, distracting me all the while by orchestrating the war in Ukraine. (I am not making these storylines up.)

So if your sole reason for being on TikTok is to do politics, you are probably doing some form of extreme politics — or at the very least imitating the extremists. (See lifelong centrist bellwether President Joe Biden’s TikTok account, which kicked off with a tongue-in-cheek nod to right-wing conspiracy theories and the extremist-aping “Dark Brandon” meme.)

That’s because the platform rewards novelty to an even greater extent than platforms like Facebook or Twitter — in part because its user base skews extremely young, in part because of the hyper-targeted, id-revealing nature of its algorithm. Sometimes “novelty” is a fun dance move that goes viral. Sometimes, however, it is the propaganda of Osama bin Laden, which went viral on the platform in the aftermath of the Oct. 7 Hamas attacks. Eye-catching content about everyday life is one thing when it’s a kitchen cleaning “hack”; it’s another when it’s a user standing in her kitchen morosely pondering the experience of “Trying to go back to life as normal after reading Osama bin Laden’s ‘Letter to America’ and realizing everything we learned about the Middle East, 9/11, and ‘terrorism’ was a lie,” as The New York Times reported in November.

Is this all a plot by Beijing? That’s what critics say, but the evidence of deliberate manipulation of TikTok’s algorithm is circumstantial at best. TikTok’s greater disruptive power may well come from tapping into and amplifying what’s latent in American culture, seizing on users’ behavior and giving them more of what they want, no matter which company owns the data. In our present media environment, marked by the absence of any real gatekeeping, the same appetite for novelty, titillation and outrage that made bin Laden go viral will inevitably feed back on itself.

“TikTok is my Google,” said one caller to the House of Representatives, protesting the forced sale after being prompted within the app. “I don’t even use Safari or Google anymore ... I learn on TikTok more than I learn in school.”

So if policymakers are serious about reducing the amount of misinformation on TikTok, and the current blunt-force approach doesn’t end up killing the app, what should they do?

One potential lever would be simply to talk to the company. Facebook and Twitter (in its former, pre-Musk incarnation) made major concessions to government requests to moderate false information around elections and the Covid-19 pandemic, setting an effective precedent for platforms to play nice during the nascent “techlash.” That window, however, isn’t open right now: Since October the Supreme Court has fielded no fewer than five cases challenging the constitutionality of government pushback on social media content — scaring off officials who might otherwise be tempted to do some direct jawboning. The most recent case against the administration might seem doomed to failure, but the court isn’t likely to rule for a few more months.

The cases also reflect a broader backlash against content moderation from the right and from many in the tech community, most notably in the form of Elon Musk gutting Twitter’s safety and moderation offices after his takeover — and publishing the emails of his former executives. Musk himself is no fan of the current move against TikTok, writing on X that it’s “not just about TikTok, it is about censorship and government control!” (emphasis his).

It’s not surprising that a futurist and born-again ideologue (not to mention major Chinese business partner) like Musk would welcome the more chaotic, id-driven future of media and politics that platforms like TikTok promise. Given both the way the app works and the increasingly laissez-faire attitude of the courts and tech companies, the future TikTok ushers in is likely to be even more extreme than its present.