Why Federal Preemption of AI Regulation Would Be a Dangerous Mistake
The case for state sovereignty in the age of artificial intelligence
A troubling idea has been circulating through Washington’s corridors: federal regulation of AI that would preempt states from enacting their own rules. At first glance, this might sound reasonable—a unified national approach to emerging technology. But dig deeper, and the proposal reveals itself as both impractical and dangerous.
The Fatal Flaw of One-Size-Fits-All
America isn’t a monolith. What makes sense for New York City’s dense urban environment doesn’t translate to Casper, Wyoming’s wide-open spaces. Salt Lake City’s priorities aren’t Austin’s priorities. This geographic and cultural diversity isn’t a bug in our system—it’s a feature that’s served us for nearly 250 years.
Consider how we already handle contentious issues. Gabapentin isn’t a controlled substance at the federal level, yet several states regulate it as one. Marijuana remains a Schedule I controlled substance federally, while dozens of states have legalized it for adults over 21. The sky hasn’t fallen. Instead, we’ve created natural laboratories of democracy where different approaches can be tested and evaluated.
Why should AI be any different?
The Corruption Problem
Washington operates at the speed of lobbying dollars, not the speed of innovation. Corporate interests have deep pockets and patient timelines. They can afford to wait out legislative cycles, to hire armies of lobbyists, to shape regulations that serve shareholders rather than citizens.
Federal preemption would create a single point of failure—one Congress, one set of agencies, one rulebook that could be captured by the very companies it’s meant to regulate. States, by contrast, offer 50 different battlegrounds where ordinary people have more voice and where local concerns can’t be drowned out as easily by national corporate campaigns.
We need regulations that protect humans: citizens, parents, children, workers, and employers. Not just whoever has the President’s ear this week.
The Speed Mismatch
AI development moves at Silicon Valley speed. Federal regulation moves at DMV speed—and often in the wrong direction entirely. By the time Congress finishes debating the specifics of today’s AI capabilities, the technology will have evolved three generations beyond what’s being regulated.
States can move faster. They can experiment, fail, adjust, and try again. They can respond to local needs and local harms in real-time, not in congressional-hearing time.
The Laboratory of Democracy in Action
Imagine this scenario: one state bans most AI applications outright over concerns about job displacement and privacy. Another takes the opposite approach, imposing no restrictions at all in order to maximize innovation and economic growth. A third finds a middle path with targeted consumer protections.
What happens? Over time, citizens vote with their feet and their businesses vote with their investments. We observe real-world outcomes. We learn what works and what doesn’t. We adjust. This is how federalism is supposed to function—not as bureaucratic chaos, but as competitive governance that drives better solutions.
The United States is, in many ways, 50 different countries that chose to share a currency, a passport, and a Bill of Rights. We’re meant to be a federation, not a monolithic nation-state with one central bureaucracy hemorrhaging money while providing diminishing value. Our diversity is our strength. Under one flag, we can test different approaches and discover which ones actually succeed.
What the Federal Government Should Actually Do
Federal preemption is the wrong answer. But that doesn’t mean the federal government has no role. Here’s what Washington should focus on:
Protect Interstate Commerce Without Crushing Local Control
States should have the authority to regulate AI products and services consumed within their borders. This is basic consumer protection. However, states shouldn’t be able to regulate data centers within their boundaries when those centers serve consumers elsewhere, nor should they obstruct the legal transfer of data across state lines. We need clear rules that prevent a patchwork of conflicting technical requirements while preserving states’ ability to protect their own residents.
Safeguard Constitutional Rights in the Age of AI
The federal government’s primary job is protecting the Bill of Rights, and that responsibility doesn’t stop at AI. First Amendment protections must apply—AI systems shouldn’t be conscripted into censoring lawful speech. Second Amendment rights matter too, particularly given that AI is already being weaponized offensively. Americans have the right to use AI defensively to protect themselves against AI-enabled threats, which are inevitable. This isn’t science fiction; it’s simply ensuring our constitutional framework remains meaningful as technology evolves.
Create a Federal-State Partnership, Not a Federal Takeover
Establish an independent board composed of both specialists and representatives from each state. This board would direct federal resources toward two goals: protecting constitutional rights in the context of automation and AI, and helping states enforce consumer protections against threats the federal government is best positioned to handle, namely foreign bad actors and international criminals beyond any single state’s reach.
The Path Forward
Federal preemption of AI regulation isn’t just bad policy. It’s a rejection of the fundamental structure that’s made American governance resilient and adaptive for centuries. It concentrates power where corruption thrives, slows response time when speed is essential, and eliminates the competitive pressure that drives better solutions.
We don’t need Washington to save us from the messy reality of 50 different approaches. We need Washington to protect the framework that makes those 50 approaches possible—and then get out of the way.
The future of AI regulation shouldn’t be written in a single congressional bill. It should emerge from the democratic laboratories of 50 states, each finding solutions that work for their citizens, each learning from the others’ successes and failures.
That’s not chaos. That’s federalism. And it’s exactly what we need.
