I was on a call when she sent the video: a Cybertruck rolling toward a concrete wall at highway speed while a driver fought the wheel. You can feel the seconds compress—the steering wheel doesn’t respond, the air tastes metallic. If you own a Tesla, that clip looks less like an outlier and more like a warning light blinking on the dashboard of a company.
I’ll cut to what matters: a Houston crash, a lawsuit that names Elon Musk, and a legal theory that blames both engineering choices and executive control. You should read this if you follow Tesla, product safety, or the legal ripple effects of automated driving.
The truck failed to turn at a Y-shaped overpass.
On August 18, 2025, Justine Saint Amour says her Cybertruck approached a split in the road in Houston with Full Self-Driving engaged. She alleges the vehicle did not follow the right-hand curve and instead drove straight toward a concrete barrier and the drop below. She claims she tried to take control back from the system but could not prevent the collision.
I’ve reviewed the sequence: purchase in February 2025, a few months of normal use, and then a moment where human and machine disagreed about what should happen next. That disagreement is now the basis for a lawsuit that seeks to pin responsibility on product engineering and corporate leadership.
Can Tesla be sued for an Autopilot crash?
Yes — courts are already hearing those cases. In one high-profile decision, a Florida jury awarded $243 million (≈€226 million) for a fatal 2019 crash linked to Tesla’s Autopilot, a verdict a judge recently upheld. Regulators have reacted: the California Department of Motor Vehicles pressed Tesla to stop using the Autopilot label in marketing, calling it misleading, and Tesla amended its wording to add “(Supervised)” to Full Self-Driving in 2024.
The complaint singles out marketing and product choices as causes.
The lawsuit argues Tesla overstated the capabilities of its systems and failed to adequately warn users. According to Saint Amour's filing, Tesla's manuals instruct drivers to keep their attention on the road, but she contends that the company's marketing and public statements encouraged a very different belief about how autonomous the system really was.
Tesla has repeatedly shifted its language: Electrek's reporting and Tesla's own owner's manual show the "(Supervised)" tag appended to Full Self-Driving, and that tug-of-war over wording is central to the claim that consumers were misled. Tesla's official position has long been that drivers must supervise the system; plaintiffs argue that its real-world messaging contradicted that caution.
Engineers reportedly recommended LiDAR; management allegedly said no.
Internal debates about sensors are part of the public record and public argument. The lawsuit claims Tesla engineers urged adoption of LiDAR, the laser-based sensing tech used by rivals such as Waymo, while company leadership kept the fleet camera-first.
Elon Musk publicly dismissed LiDAR as a “fool’s errand” on X, and the complaint quotes that dismissal as evidence that Musk’s preferences shaped design choices. The filing says Tesla relied on vision cameras instead of LiDAR to save cost and to pursue a particular autonomy strategy.
Can you sue Elon Musk personally?
The Texas suit does just that. It alleges Tesla is negligent for “hiring and retaining Elon Musk as CEO, and allowing him to participate in product design decisions,” and claims Musk overrode engineering concerns. Naming a CEO is uncommon but not unprecedented; plaintiffs are arguing that Musk’s public promises and product decisions created foreseeable risk.
A jury award and regulatory pressure are tightening the frame around Tesla.
Courts, state regulators, and watchdog reporting are converging. The $243 million verdict in Florida, the California DMV's action, and renewed scrutiny of Full Self-Driving leave Tesla facing a cascade of legal and public-relations problems at once.
Tesla is no longer just a carmaker in Silicon Valley mode; it is a litigated actor whose product claims are being tested under tort law. Plaintiffs cast Tesla's marketing as a carnival mirror that bends truth into glossy promises, and that alleged distortion weakens legal defenses built on informed consumer choice.
Lawyers are reworking the narrative from hardware failure to managerial fault.
Outside counsel are using leadership choices to expand liability. The theory is straightforward: if engineers warned about LiDAR and were overridden, and if the CEO publicly minimized a safer approach, then responsibility extends beyond factory engineers to executive decision-makers.
Musk's public persona and product proclamations, on X and in interviews, are now evidence in civil litigation. The lawsuit argues those statements shaped both consumer expectations and design priorities. In the plaintiff's telling, Musk's role at Tesla has been a steering wheel of iron, turning design choices single-handedly, and that centralization may be a legal vulnerability.
What the case could change for drivers and the industry.
If plaintiffs prevail, manufacturers might face stricter labeling rules, a push toward redundant sensor suites (including LiDAR), and new limits on how autonomy is marketed. Regulators such as the California DMV and reporting outlets like Reuters and Gizmodo will keep pushing for clearer consumer protections.
As a reader who might drive a vehicle with driver-assist tech, you should take away practical steps: keep firmware updated, treat any “autopilot” label with skepticism, and maintain active supervision behind the wheel. I follow cases like this because they shift the balance between innovation and safety.
Tesla did not immediately respond to Gizmodo's request for comment, and Musk has long defended the company's camera-first approach. The legal fight will test whether product failures can be traced up the chain to leadership and public messaging. The outcome could reshape how we trust driver-assist systems: do we let companies promise autonomy, or do we force clearer boundaries?
Will the courts hold a CEO accountable when a car steers itself wrong and a company’s public voice helped sell the dream?