The Arizona Tesla Incident

In late 2023, a tragic incident occurred in Arizona involving a Tesla reportedly operating in Full Self-Driving (FSD) mode, raising significant concerns about the safety and future of autonomous vehicles. The crash stands out among the 40,901 U.S. traffic fatalities recorded that year as reportedly the only one in which a pedestrian was struck by a Tesla using FSD, and it drew renewed attention around the launch of Tesla’s Robotaxi service in Austin. It prompts a reevaluation of how ready autonomous vehicle technology really is.
Accident Details

The accident took place in November 2023, when Johna Story, a 71-year-old grandmother, had stopped on the interstate to help victims of an earlier crash. A Tesla driven by Karl Stock veered unexpectedly, apparently because direct sunlight obscured the view of the road ahead. The car crashed into a parked Toyota 4Runner and then fatally struck Story. Stock later stated that events unfolded too quickly to avoid the collision, underscoring how little margin either a human driver or an automated system has in such a scenario.
FSD Under Scrutiny

While some reports indicate FSD was engaged at the time of the crash, the police report does not confirm it. That ambiguity sits alongside a broader issue: camera-based systems can struggle in the same conditions that impair human vision, in this case low, direct sunlight. The National Highway Traffic Safety Administration (NHTSA) has opened an investigation into such incidents, focusing on reduced-visibility conditions like sun glare and their effect on FSD performance.
Vision vs. Lidar and Radar

Tesla’s reliance on a vision-only system, rather than supplementing cameras with lidar or radar, is increasingly being questioned. Camera-based perception, much like human eyesight, degrades in fog, glare, or dust, conditions in which lidar can still range objects behind the visual clutter and might have flagged the stopped vehicles in a scenario like this one. Lidar is no guarantee, however: Cruise invested heavily in radar and lidar and was still shut down after its own crashes, a reminder that no sensor suite is a perfect solution.
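To make the sensing trade-off concrete, the sketch below is a minimal, purely illustrative Python example of the redundancy a second modality can provide. It does not reflect Tesla’s or Cruise’s actual software; every data structure, threshold, and function name in it is hypothetical. The point is only that a vision-only stack has a single point of failure when glare washes out the camera, while a fused stack can still act on a direct range measurement.

```python
# Illustrative toy example only: a cross-check between a camera detection and
# a lidar return along the same bearing. All names and thresholds are made up.

from dataclasses import dataclass
from typing import Optional


@dataclass
class CameraDetection:
    distance_m: Optional[float]  # None if the camera sees nothing
    confidence: float            # 0.0-1.0, drops sharply under sun glare


@dataclass
class LidarReturn:
    distance_m: Optional[float]  # None if there is no return on this bearing


def obstacle_ahead(cam: CameraDetection,
                   lidar: Optional[LidarReturn],
                   brake_range_m: float = 60.0,
                   min_cam_conf: float = 0.5) -> bool:
    """Return True if either sensor reports a credible obstacle in range.

    A vision-only stack must rely on `cam` alone; with lidar available, a
    low-confidence (e.g. glare-washed) camera frame can still be backed up
    by a direct range measurement.
    """
    cam_sees_it = (cam.distance_m is not None
                   and cam.confidence >= min_cam_conf
                   and cam.distance_m <= brake_range_m)
    lidar_sees_it = (lidar is not None
                     and lidar.distance_m is not None
                     and lidar.distance_m <= brake_range_m)
    return cam_sees_it or lidar_sees_it


if __name__ == "__main__":
    # Glare scenario: camera confidence collapses, lidar still returns a hit.
    glare_frame = CameraDetection(distance_m=45.0, confidence=0.2)
    lidar_hit = LidarReturn(distance_m=44.0)

    print("Vision only:", obstacle_ahead(glare_frame, None))       # False
    print("With lidar: ", obstacle_ahead(glare_frame, lidar_hit))  # True
```

In the glare scenario at the end, the vision-only check misses the obstacle while the fused check does not, which is the crux of the argument for lidar; the counterargument, as the Cruise example shows, is that extra sensors add cost and complexity without eliminating failures.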
Driving Experience

When the system works as intended, the drive feels seamless and high-tech. The car handles like a futuristic robot chauffeur, smoothly navigating traffic while the driver can almost forget about traditional driving tasks. Even so, it becomes clear that Tesla’s vision-only approach is both ambitious and controversial: compared with driver-assistance systems that add radar or lidar, it can feel less assured in challenging environments.
Implications for the Future

With Tesla working to enhance and expand its autonomous fleet, especially in the growing Robotaxi sector, these issues are critical. The promise of Level 5 autonomy, vehicles that can handle any driving condition without human intervention, is said to be on the horizon, but incidents like this one highlight how complex that journey remains. Tesla maintains that its systems already enhance safety and, on average, outperform human drivers; without transparent, independent verification, those claims remain contentious.
Tesla’s path is one of technological ambition, walking a fine line between innovation and reliability. The debate over the safest way forward, and whether autonomy should be built on vision alone or on vision combined with radar and lidar, continues to unfold, with incidents like the Arizona crash serving as a stark reminder of the stakes involved.