Hello, and welcome to this week's installment of the Future in Five Questions. This week I interviewed Will Rinehart, a fellow at the American Enterprise Institute and author of the Dispatch's Techne newsletter on technology and where it intersects with governance. Will discussed his surprise at how suddenly Silicon Valley swung toward the right after President-elect Donald Trump's election, why the federal regulatory code needs to be "debugged," and how politicians don't think enough about opportunity costs. An edited and condensed version of the conversation follows:

What's one underrated big idea?

Perhaps I have to say this as an economist, but opportunity cost remains an underrated concept. It's the value of the next best alternative that you didn't take. I like how economist David R. Henderson captured the idea by saying that if "you spend time and money going to a movie, you cannot spend that time at home reading a book, and you can't spend the money on something else." Crack open an introductory economics textbook and the notion of opportunity cost is usually covered near the beginning. It's fundamental to the science.

Thinking about opportunity costs attunes you to what could have been, the counterfactual. I wish counterfactual thinking were more present in politics and policy.
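As a concrete, if toy, illustration: here is a minimal Python sketch of the arithmetic behind opportunity cost, using Henderson's movie example with made-up dollar values. The options and numbers are illustrative assumptions only.

```python
# Toy illustration of opportunity cost. All values are invented;
# "value" here stands in for subjective benefit, in dollars.
options = {
    "go to a movie": 25,
    "read a book at home": 30,
    "work a freelance gig": 60,
}

choice = "go to a movie"

# The opportunity cost of a choice is the value of the best alternative forgone.
opportunity_cost = max(v for k, v in options.items() if k != choice)
print(f"Opportunity cost of '{choice}': ${opportunity_cost}")  # -> $60
```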
"We need to regulate AI" has become a common refrain among tech leaders, policy experts, and politicians. But focusing solely on this kind of legislative action breeds myopia. While Congress hasn't passed dedicated AI legislation, both the executive branch, through agency rulemaking and enforcement, and the judiciary, through court cases, are actively shaping how AI technologies can be developed and deployed. AI regulation is already unfolding through multiple channels, and I would like to see that process play out for a while. Instead, there is a push for new statutes, especially at the state level, which is likely to be a significant hindrance to commercial development.

What's a technology that you think is overhyped?

For the past decade, I've remained bearish on autonomous vehicles (AVs). The core challenge lies in codifying human knowledge into software. As I wrote recently, "Traversing an urban setting, following traffic signs, and adjusting to the erratic driving behaviors of other vehicles is difficult for people," but translating this tacit knowledge into reliable autonomous systems has proven even more daunting than initially thought. The complexity of human decision-making in unpredictable environments continues to be a fundamental barrier. And this is a general problem across automated systems, from AVs to AI: we need to properly estimate both the complexity of human intuition and our ability to replicate it through technology, because the two are often mismatched.

What book most shaped your conception of the future?

I got my start in tech policy working as an intern for Adam Thierer one summer, and he suggested that I read Aaron Wildavsky's "Searching for Safety" since I had an interest in information economics and risk. The book blew me away. On the very first page, he pointed out: "To those who see safety as a goal we know how to achieve, the task of decision making is simple: choose the safer path. There is no need to look further, since we already know what to do; no act designed to secure safety could have the opposite effect. Imagine instead, however, safety as largely an unknown for which society has to search. An act might help achieve that objective, or it might not. Often, we would not know which, safety or danger, had in fact been achieved until afterwards."

Wildavsky opened up the world of search costs to me, which is one of the major throughlines of my research. Right now, I'm working on a paper that applies these insights to AI safety regulation, where the challenge isn't just crafting rules but understanding how to search for effective governance in an environment of radical uncertainty.

What could the government be doing regarding technology that it isn't?

I think there should be a massive effort to read through the federal regulatory code and get rid of outdated regulations. We need to refactor the regulatory code. The Department of Health and Human Services ran a pilot project along these lines during the first Trump administration, and it should be expanded to other agencies. Turning law into code has long been a goal, and this would be a good first step. When regulations are translated into code, we can better understand their interactions, dependencies, and cumulative effects. Think of it like debugging software: until you can see how all the pieces work together, it's nearly impossible to identify conflicts or inefficiencies. We know regulatory compliance imposes significant costs on businesses and citizens, so having a computational model of our regulatory system would finally let us measure those impacts with precision and identify opportunities for streamlining.
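To give a flavor of what "regulation as code" might look like in practice, here is a hypothetical Python sketch that encodes rules as structured records and mechanically scans for direct conflicts, much as a linter flags contradictory settings. The agency name, rule IDs, fields, and conflict check are all invented for illustration and are not drawn from the HHS pilot or any real regulation.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    rule_id: str       # hypothetical citation, e.g. "HHS-101"
    applies_to: str    # the regulated activity
    mode: str          # "require" or "prohibit"
    subject: str       # what is required or prohibited

# Invented example rules; a real system would parse these from the CFR.
rules = [
    Rule("HHS-101", "telehealth", "require", "in-person exam"),
    Rule("HHS-204", "telehealth", "prohibit", "in-person exam"),
    Rule("HHS-310", "telehealth", "require", "patient consent"),
]

def find_conflicts(rules):
    """Flag pairs of rules that require and prohibit the same thing
    for the same activity: the regulatory analogue of a lint error."""
    conflicts = []
    for i, a in enumerate(rules):
        for b in rules[i + 1:]:
            if (a.applies_to == b.applies_to
                    and a.subject == b.subject
                    and a.mode != b.mode):
                conflicts.append((a.rule_id, b.rule_id))
    return conflicts

print(find_conflicts(rules))  # -> [('HHS-101', 'HHS-204')]
```

Once rules are machine-readable like this, the same representation could in principle support dependency mapping and cumulative cost estimates, which is the kind of analysis a computational model of the regulatory system would enable.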
What has surprised you the most this year?

Although we are only a couple of weeks into 2025, I was surprised by Meta CEO Mark Zuckerberg's pullback on DEI and fact-checking. Zuckerberg allegedly pinned the blame on Sheryl Sandberg during his visit to Mar-a-Lago, saying that she "encouraged employees' self-expression in the workplace." Some believe that Zuckerberg has bent the knee to President Trump, but it could be that Zuck was truly bending the knee before. Either that, or Zuck simply doesn't have strong preferences about DEI initiatives. However you cut it, this is, to me, a story about markets and voice. We've seen this pattern before: during COVID-19, Facebook faced pressure over content moderation, which eventually spawned a Supreme Court case. The company held the line on fact-checking then, only to end it now. That Meta changed course shows that the pendulum of tech governance continues to swing between different approaches to content and workplace policies, often driven less by ideology than by practical business considerations. At the end of the day, this latest shift suggests that the era of tech companies acting as cultural arbiters might be waning in favor of a more market-driven approach.