The Car That Watches You
How quietly mandated AI in our vehicles is rewriting what it means to own the thing parked in your driveway
Let me start with a confession. I love cars. I love what they represent — the open road, the choice of when to leave and where to go, the most universal symbol of modern personal freedom we ever invented. So when I tell you that the car you buy in 2027 will, by federal mandate, be watching you, judging you, and (under the right conditions) refusing to drive you, I want you to understand: I am not anti-car, anti-safety, or anti-progress. I am pro-driver. And the driver — the actual human being behind the wheel — is the one this story is about.
You may have heard whispers about a “kill switch” in the 2021 Bipartisan Infrastructure Law. You may have also heard from Snopes and PolitiFact that the “kill switch” framing is misleading, that no such phrase appears in the statute, that no remote-off button is being installed by the federal government. Both things, oddly enough, are true. And both miss the larger point.
How can both be true? Let me explain.
What the law actually says (and why the words matter less than the architecture)
Tucked into Section 24220 of the Infrastructure Investment and Jobs Act — the HALT Drunk Driving Act, named for the Abbas family of five killed by a wrong-way drunk driver in 2019 — Congress instructed the National Highway Traffic Safety Administration (NHTSA) to require that every new car sold in America come equipped with technology that can do two things:
“Passively and accurately monitor the performance of a driver” to identify whether that driver “may be impaired,” and
“Prevent or limit motor vehicle operation if an impairment is detected.”
That’s it. Two clauses. No mention of “kill switches,” no mention of police access, no mention of remote disablement. The statute is, on its face, narrow. The defenders of the law — including MADD and Rep. Debbie Dingell — insist (and they’re not lying) that the rule contemplates only local judgment by the car itself, with “no one outside the car” able to operate it.
Here is where things get interesting.
What the law mandates is not a remote off button. What it mandates is every car in America containing the hardware and software capable of refusing to drive based on an algorithm’s assessment of you. That is a different thing. That is an architecture. And once an architecture exists in 290 million vehicles, the political, commercial, and regulatory gravity that pulls it toward expanded use is — let me be charitable — historically irresistible.
Why, you may ask? Because we have seen this movie before. Many, many times.
The technology doesn’t even work — and the regulator admits it
In its February 2026 Report to Congress, NHTSA itself conceded — in plain English — that the technology is not ready. The agency wrote that “no in-vehicle technologies in production” can passively measure blood-alcohol content reliably, and that even a 99.9% accuracy rate would still produce millions to tens of millions of false readings every year in the United States.
Let that sink in. The federal regulator tasked with writing this rule is on the record stating that the underlying technology, in its current form, would falsely strand sober drivers — possibly tens of millions of times annually. That’s not a marginal failure mode. That is the core of the system.
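The arithmetic behind that claim is easy to reproduce. Here is a back-of-the-envelope sketch; only the 99.9% accuracy figure comes from NHTSA's report, while the driver count and the assumption of one monitored ignition test per driver per day are deliberately conservative placeholders of my own:

```python
# Rough estimate of false "impaired" readings per year in the U.S.
# Only the 99.9% accuracy figure is NHTSA's; the rest are assumptions.

LICENSED_DRIVERS = 233_000_000   # assumed: approximate U.S. licensed-driver count
TESTS_PER_DAY = 1                # assumed: one monitored ignition test per driver per day
FALSE_POSITIVE_RATE = 1 - 0.999  # 99.9% accuracy -> 0.1% error per test

tests_per_year = LICENSED_DRIVERS * TESTS_PER_DAY * 365
false_positives = tests_per_year * FALSE_POSITIVE_RATE

print(f"Tests per year:         {tests_per_year:,.0f}")
print(f"False readings per year: {false_positives:,.0f}")
```

Even with these low-ball inputs, the sketch lands in the tens of millions of false readings annually, squarely inside the range NHTSA itself cites. Assume two or three trips a day and the number climbs well past a hundred million.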
NHTSA missed its statutory deadline of November 15, 2024, for issuing a final rule. The rulemaking received over 18,000 public comments. And in January 2026, the House rejected, 164–268, Rep. Thomas Massie’s amendment to defund the implementation. Massie’s standalone repeal bill, H.R. 1137 — the No Kill Switches in Cars Act — sits in committee. The fight is alive. But the technology is being built on the assumption that the mandate stands.
To be clear: I am not cheering for drunk drivers. Roughly 12,000 Americans die every year in alcohol-impaired crashes. The grief is real. The Abbas family’s loss is real. But the question is not whether drunk driving is bad. The question is whether the right response is to bolt a half-built, admittedly-unreliable AI judge into every private vehicle in the country. There are other tools — ignition interlocks for convicted offenders, better public transit, sobriety checkpoints with proper constitutional guardrails, automatic emergency braking that already works — that target the actual problem without conscripting 290 million sober drivers into a continuous biometric audit.
Meanwhile, in Europe, the future has already arrived
Just imagine waking up tomorrow and discovering that every new car sold in your country must — by law — watch your eyes, monitor your steering, record your every input, and reactivate its speed-limiting software every single time you start the engine, even if you switched it off the day before.
You wouldn’t have to imagine if you lived in Europe. You’d just be living it.
Since July 2024, the EU’s General Safety Regulation 2 — formally Regulation (EU) 2019/2144 — has required every newly registered car to come with Intelligent Speed Assistance (ISA), driver drowsiness and attention warning (DDAW), event data recorders, alcohol interlock installation facilitation, and emergency lane keeping. From July 2026, Advanced Driver Distraction Warning (ADDW) — camera-based monitoring of where your eyes are pointing — joins the list.
ISA reactivates every time you restart the car. The vehicle must store data on whether you used or overrode it. And the European Commission has explicitly stated that the regulation’s effectiveness will be re-assessed by December 31, 2025, with revisions to follow.
In America, by contrast, we are still pretending this is hypothetical. It isn’t. It is just slower.
The cars are already spying on you — and selling the data for the price of a gumball
Here is where the conversation usually stops being abstract.
In September 2023, the Mozilla Foundation reviewed 25 major car brands for their *Privacy Not Included guide. Every single brand — Ford, Toyota, Volkswagen, Tesla, Hyundai, all of them — failed. Mozilla called cars the worst product category they had ever reviewed. Their phrase, not mine: “privacy nightmares on wheels.”
Some highlights (or, more accurately, lowlights):
84% of brands share or sell driver data.
76% explicitly reserve the right to sell your personal information.
56% will share it with government or law enforcement on a “request” — not a court order, not a warrant. A request.
One brand reserves the right to collect data on your “sexual activity.” Another references “sex life.” Six brands say they may collect “genetic information.”
I will not claim to cover all the implications here. But take a breath and read that list again.
Then, in March 2024, Kashmir Hill of The New York Times broke the story that General Motors had been quietly funneling fine-grained driving data — every hard brake, every fast acceleration, every trip — from millions of OnStar-connected vehicles to data brokers LexisNexis Risk Solutions and Verisk, who packaged it into “driving scores” and sold those scores to insurers. People who had never been in an accident saw their premiums jump 21%. Hill discovered her own car had been auto-enrolled in the program without her informed consent.
How much did it cost to buy your soul, in raw dollar terms? Per a July 2024 letter from Senators Ron Wyden and Ed Markey to the FTC, Hyundai received about 61 cents per car; Honda got 26 cents.
Twenty-six cents. The price of a gumball.
Now, ask yourself one question: if this is what the auto industry does with the data it already has — sloppily, opportunistically, and without your meaningful consent — what do you imagine it will do with mandatory eye-tracking, breath chemistry, and head-pose telemetry?
What “kill switch” actually means in 2026
Here is the bait and switch I want to name explicitly.
When defenders of the HALT Act say “there is no kill switch,” they are technically correct about the statute. They are deeply, dangerously misleading about the reality of the modern car.
Because remote disablement isn’t coming. It’s already here. It’s just not federally mandated yet.
Tesla has, over the years, remotely revoked Autopilot from used vehicles, throttled battery range on used Model S cars, and — in April 2026 — wiped Full Self-Driving from roughly 10,000 cars worldwide whose owners were running unauthorized region-bypass dongles. No refunds. No appeal.
OnStar has, since 2009, allowed police (with the owner’s consent) to remotely slow a fleeing stolen vehicle to a stop, a feature GM markets as Stolen Vehicle Slowdown.
Subprime auto lenders have for over a decade installed GPS-enabled “starter interrupt” devices in roughly two million American cars, bricking them when payments are missed. A 19-year-old in North Carolina was once stranded at her workplace because her engine wouldn’t start. Wisconsin banned this. Most states haven’t.
Manufacturers’ over-the-air updates can already disable, throttle, or transform features on your car without your participation. You agreed to it in the terms of service you didn’t read.
Eric Peters — the libertarian transportation writer at EPautos — has been making this point for years: “buy here, pay here” subprime kill switches are the template the federal mandate is now scaling to every new car. Imagine, he writes, how that infrastructure could be used to enforce alleged unpaid taxes, parking tickets, or child support claims. Imagine an algorithm — not a judge, not a jury — making that call before your morning commute.
In short: when defenders insist there is “no kill switch,” what they mean is the federal government does not currently hold the kill switch. The kill switch exists. We have just outsourced who holds it. So far.
Now imagine what comes next
Let me indulge in a little speculation. Not paranoia — extrapolation. The kind a reasonable person does after watching how every prior surveillance technology has actually been used.
Just imagine a public health emergency — pandemic, riot, hurricane, terrorist alert. The governor declares a 9 p.m. curfew. Now that every new car phones home over 5G and contains a federally mandated “prevent or limit operation” mode, enforcing that curfew is no longer a matter of police on street corners. It is a single API call. Boom — your minivan will not start at 9:01.
Just imagine your insurance company paying for access (legally, of course — you “consented” in the terms of service) to the same eye-tracking and attention data the federal mandate produces. Drowsy steering at 7:14 a.m.? Premium up. Glanced at the radio twice in five minutes? Premium up. Drove past a known high-risk zip code? Premium up. The modern insurer’s dream. The modern driver’s nightmare.
Just imagine an AI flagging “suspicious driving patterns” — the same way automatic license plate readers already flag “suspicious” cars to police. Your car drove the same loop three times last Tuesday. Your car parked outside a protest. Your car visited a clinic. The data exists; the only question is who gets to query it, with what authorization. If history is any guide — and history is the only guide we have — the answer is: more people, with less authorization, every year.
Just imagine a CrowdStrike-style incident on wheels. On July 19, 2024, a single bad config file from a single security vendor crashed 8.5 million Windows computers, grounded thousands of flights, knocked out 911 services, and caused an estimated $5.4 billion in losses. Now substitute “8.5 million cars on the interstate at rush hour.” This is not science fiction; it is an engineering fact about what happens when fleet-wide software updates touch safety-critical systems. The 2015 Jeep hack — when security researchers Charlie Miller and Chris Valasek remotely killed a Jeep Cherokee on the highway with Andy Greenberg inside — forced Chrysler to recall 1.4 million vehicles. That was one car model with one vulnerability. Now multiply by every car sold after 2027.
Just imagine an authoritarian regime — yours, mine, anyone’s, ten administrations from now — inheriting this infrastructure. The rule I want every reader to internalize: whatever surveillance and control infrastructure today’s policymakers build, tomorrow’s policymakers will inherit. Every one of them. The system you build for the saint will be operated, eventually, by someone else. It always is.
Here is where things get interesting (and uncomfortable). China did not need to invent special technology to block 23 million people from buying plane and high-speed rail tickets via judgment-debtor blacklists. Its officials simply hooked existing transportation infrastructure to existing legal blacklists. We are about to mandate the hardware side of that equation in every American car. I do not say this because I think the United States is China. I say it because the engineering distance between “I built this for safety” and “they’re using it for control” is approximately zero. The hard part is building the infrastructure. The repurposing is trivial.
What the smart people are saying
I am not a lone voice. The chorus of warning here is striking precisely because it crosses every tribal line we have. From the libertarian right (Cato Institute, Reason, Competitive Enterprise Institute, National Motorists Association, Heritage Foundation) through the civil-libertarian center (ACLU, EFF, EPIC, Center for Democracy & Technology, Mozilla) to consumer and security voices (Consumer Watchdog, Bruce Schneier, Senators Wyden and Markey), people who agree on almost nothing else agree on this.
The ACLU, EFF, and EPIC filed joint comments urging NHTSA to follow three principles: data minimization, no off-vehicle export, and enforceable transparency. Current law contains none of these protections. Bruce Schneier — the cryptographer who has been writing about this for over a decade — has called connected-device cybersecurity providers part of “the homogeneous backbone of modern systems,” meaning a single bad day for one of them is a household-name disaster for all of us. That backbone is now being threaded through every new car in America and Europe.
Even the politicians fighting this are bipartisan. Massie’s January 2026 amendment was supported by 160 Republicans and four Democrats — Reps. Lou Correa of California, Val Hoyle of Oregon, Marcy Kaptur of Ohio, and Marie Gluesenkamp Perez of Washington. Rep. Harriet Hageman (R-WY) called the rule “an invasion of privacy on a greater scale than we are used to seeing.” Rep. Chip Roy (R-TX) called it “a direct threat to our Fourth Amendment rights.” Florida Gov. Ron DeSantis called the concept “something you’d expect in 1984.”
These are not fringe voices. These are voices warning, in a language Americans rarely use about each other anymore: we agree, regardless of party, that this is a line we should not cross.
What does this mean for you?
Here is where I try to turn the controlled passion into something actionable. So:
Know what you are buying. Before you sign the paperwork on a new car, read the privacy policy. I know — no one does. Do it anyway. Mozilla’s guide is the most accessible resource I have found. So is Thorin Klosowski’s EFF piece on figuring out what your specific car knows about you.
Ask before you connect. Every connected feature you enable — every app, every voice assistant, every “smart” feature — is a data spigot. Some are worth it. Most are not. Default to off.
Tell your representatives. Whatever you think of Rep. Massie, Rep. Dingell, or anyone else mentioned here, this is one of those rare issues where a single phone call to a single staffer actually moves the needle. The mandate is being implemented through executive rulemaking; congressional pressure works.
Support the watchdogs. EFF, ACLU, EPIC, Mozilla Foundation, Cato, CEI, Reason Foundation — pick the one that fits your worldview and write a check. They are the ones submitting the 18,000-comment dockets, filing the FOIA requests, drafting the principles regulators ought to follow.
Push for the right solutions to the real problem. Drunk driving deaths are real. Distracted driving deaths are real. The right responses — interlocks for convicted offenders, better transit, better street design, automatic emergency braking that already works — exist and work without conscripting every sober driver into a continuous biometric audit.
The lesson, as I see it
I will be the first to admit I may be wrong about some of this. Maybe NHTSA writes a final rule with the strongest privacy protections in the history of federal rulemaking. Maybe the technology improves to the point that the false-positive rate becomes negligible. Maybe Congress passes meaningful federal data privacy legislation — the kind that would make the GM/LexisNexis story impossible to repeat — before the mandate fully takes effect.
I would love to be wrong. I am not betting on it.
What I am betting on is this: every architecture of judgment we have ever built into a private device — the smartphone, the laptop, the smart TV, the connected appliance — has been steadily, predictably, and irreversibly expanded beyond its original purpose. Sometimes by good people for good reasons. Sometimes by bad people for bad reasons. Always by someone. The architecture is the destiny.
A car is not a phone. A car is not an appliance. A car is, for most Americans, the single largest, most personal, most freedom-defining piece of technology they will ever own. The decision to embed in every one of them an AI judge with the power to stop the engine is not a technical one. It is a constitutional one. It deserves a constitutional debate — out loud, in public, with all the trade-offs named.
Right now we are not having that debate. We are sliding into it through executive rulemaking, contested fact-checks, a missed deadline, a 164-vote House minority, and a procurement pipeline already building the hardware on the assumption that the mandate stands.
One can only hope we wake up before we strap the world’s most invasive panopticon onto the world’s most beloved symbol of personal freedom — and call it a safety feature.
My vote? Keep the road open. Keep the keys yours. Keep the watchman out of the car.
The HAIA Foundation advocates for human-aligned AI policy and the preservation of personal autonomy in an age of pervasive automation. Subscribe at substack.haia.foundation for more on the policies, products, and patterns shaping how we live with intelligent systems — and how to keep them on our side.