The “content wars” might be back in Washington, and Big Tech is looking for a way out.

The new regulatory regime under incoming President Donald Trump appears set to take on Big Tech for its alleged silencing of conservative voices — and some technologists think the industry might be able to innovate its way out of the problem.

The innovation in question is a form of “middleware,” software that plugs into tech platforms and gives users the power (and responsibility) that platforms currently wield, letting them moderate and curate their own social media experiences.

A report published this morning opens a window into what middleware might do, and how it could affect the public argument about online content. “Shaping the Future of Social Media With Middleware,” from the right-leaning tech think tank the Foundation for American Innovation and the Stanford Cyber Policy Center, evaluates the current state of the technology and the potential policy tripwires around it.

“Middleware has the potential to provide greater choice around the content individual users see, and to address over-moderation concerns,” the authors write. “By decentralizing control and enhancing user autonomy, middleware may also help to reduce the potential for abuse of power by platforms, fostering a more just and equitable digital ecosystem.”

When conservatives like incoming Federal Communications Commission Chair Brendan Carr complain of what they call social media platforms’ selective or unfair speech enforcement, tantamount in their view to censorship, they’re really complaining about moderation decisions. Tech companies all have policies for deleting or limiting the reach of certain posts deemed harmful, and an argument inevitably erupts around how they choose to use that power. If users, empowered by middleware tools, had more control over their own digital experience, online speech could become a less potent political issue.

But do people actually want this responsibility? Are the existing tools powerful and flexible enough to meaningfully change the online experience? And how broadly could middleware work? (Because it requires developers to have access to platforms’ APIs, it’s now mostly available on decentralized platforms like BlueSky and Mastodon, and on Meta’s Twitter competitor, Threads.)

DFD spoke today with Renée DiResta, a digital speech expert and professor at Georgetown University’s McCourt School of Public Policy who co-edited the report. DiResta described how these new kinds of software could provide a way forward from a status quo that leaves everyone unhappy.

An edited and condensed version of the conversation follows:

How could middleware tools change the social media experience?

Let’s say you want a very different experience on your “for you” feed, and you don’t like what the platform is showing you. Right now you’re usually given just two choices: a reverse-chronological feed or whatever the platform decides to algorithmically curate for you. Often it’s going to show you something sensational, to keep you there. What middleware envisions is a world in which you have agency over both of those things, so you can say, “Today I don’t want to see news, I just want to see sports content,” or “Today I just want to see posts from people who don’t post very frequently.” It offers you the opportunity to change your experience on the platform in a way that benefits you.

Why don’t we have this already?

Platforms are not incentivized to offer it. There are some concerns about privacy.
One of the most public social media scandals was Cambridge Analytica, where a third-party product scraped the entire site.

But one of the reasons we wrote this paper now is that there are newer, decentralized, protocol-based social media experiences that are, perhaps a little surprisingly, gaining widespread adoption. And let’s be totally honest: it’s not because people are in love with decentralization, or even realize what a protocol-based platform implies — they just feel that the vibes are better in these emerging spaces. You have people who are simply looking for an alternative space to spend their time. Some of these platforms, like BlueSky, are prioritizing giving users this kind of control over their feeds and moderation.

I’m not saying this is going to lead to a wholesale shift in user behavior, where all of a sudden people realize, “Hey, the defaults aren’t great and we can do better.” But I do think some people will, and that it will create a different set of experiences, ones that let people see what is actually possible on social media and how much of the current paradigm is simply what we’ve become accustomed to as users over time.

What are some of the policy concerns surrounding middleware?

In order to have a market that supports these third-party tools, developers have to understand where they sit in the legal and policy realm. That is where we try, in the paper, to delve into specifics about how different regulatory models will shape these markets as they emerge.

It seems like a world where the onus is more on users and developers, and less on platforms making high-profile speech and moderation calls, would be pretty appealing to platforms that have become political footballs.

Yes, absolutely. The platforms don’t want to be moderators. There’s an amazing paper Kate Klonick wrote called “The New Governors.” One of the points she makes, which is still 100 percent true today, is that platforms have to moderate because of business incentives. Expressed and revealed preferences show, time and time again, that users do not enjoy platforms that are unmoderated free-for-alls.

And then, depending on which party is in charge and applying the greatest pressure, you see them shifting moderation policies and curation dynamics, as with Facebook’s “trending topics” controversy. They got rid of the entire feature rather than fight about curation and human involvement, or just allow the bullshit machine to continue operating uninterrupted. That’s where you see the effects of political pressure on attempts to play referee.

Every party in every country tries to exert its will in some way, to maximize its own chances. Platforms don’t want to deal with this. So while they recognize the business need for some degree of moderation, they do not necessarily want to be in the weeds of these decisions. Giving users more control is a way to lay off some of that responsibility.
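To make the feed-curation idea DiResta describes a bit more concrete, here is a minimal, hypothetical sketch of a middleware-style filter in Python. It assumes posts have already been fetched through a platform’s API (on BlueSky, Mastodon or Threads, for example); the Post fields, the rule names and the curate function are illustrative inventions for this sketch, not the report’s design or any real platform’s schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Callable

# Hypothetical, simplified shape of a post pulled from a platform API;
# the fields here are illustrative, not any real platform's schema.
@dataclass
class Post:
    author: str
    text: str
    topic: str                   # e.g. "news", "sports", "personal"
    created_at: datetime
    author_posts_per_day: float  # rough measure of how often this account posts

# A "rule" is a predicate the user chooses; middleware applies it on the
# user's behalf instead of relying on the platform's default ranking.
Rule = Callable[[Post], bool]

def no_news_today(post: Post) -> bool:
    return post.topic != "news"

def quiet_posters_only(post: Post) -> bool:
    return post.author_posts_per_day <= 3

def curate(feed: list[Post], rules: list[Rule]) -> list[Post]:
    """Keep only posts that pass every user-chosen rule, newest first."""
    kept = [p for p in feed if all(rule(p) for rule in rules)]
    return sorted(kept, key=lambda p: p.created_at, reverse=True)

if __name__ == "__main__":
    now = datetime.now()
    feed = [
        Post("wire_service", "Breaking: ...", "news", now - timedelta(minutes=5), 40.0),
        Post("local_team", "Final score: 3-1", "sports", now - timedelta(hours=1), 2.0),
        Post("old_friend", "Back after a long break", "personal", now - timedelta(hours=3), 0.2),
    ]
    # "Today I don't want to see news, and I'd like to hear from people who post rarely."
    for post in curate(feed, [no_news_today, quiet_posters_only]):
        print(post.author, "-", post.text)
```

The point of the sketch is where the rules live: the user, not the platform, decides which predicates run, and the fallback ordering is plain reverse-chronological rather than engagement-driven curation.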