The fatal crash of a Tesla employee who was using Full Self-Driving Beta has been reported in detail for the first time, raising questions about responsibility in such accidents.
The Washington Post released a new report today on the crash, which happened back in 2022.
Hans von Ohain, a recruiter at Tesla, and his friend Erik Rossiter set out from outside Denver, Colorado, in von Ohain’s Tesla Model 3 to go golfing.
During the drive there, Rossiter says that von Ohain was using FSD Beta, Tesla’s driver-assist system that can control the vehicle but requires the driver to keep their hands on the steering wheel and be ready to take over at all times.
Rossiter said that FSD Beta swerved several times during the drive there and von Ohain had to take control.
They played 21 holes and drank alcohol during the day before driving back. Rossiter said von Ohain seemed composed and “by no means intoxicated” when getting into the car for the drive back.
The Washington Post described the crash:
Hours later, on the way home, the Tesla Model 3 barreled into a tree and exploded in flames, killing von Ohain, a Tesla employee and devoted fan of CEO Elon Musk. Rossiter, who survived the crash, told emergency responders that von Ohain was using an “auto-drive feature on the Tesla” that “just ran straight off the road,” according to a 911 dispatch recording obtained by The Washington Post. In a recent interview, Rossiter said he believes that von Ohain was using Full Self-Driving, which — if true — would make his death the first known fatality involving Tesla’s most advanced driver-assistance technology.
While Rossiter admittedly doesn’t have a great recollection of what happened, he did say he remembers getting out of the car, a big orange glow, and then trying to pull his friend, who was screaming, out of the burning vehicle. A fallen tree was blocking the driver’s door.
An autopsy of von Ohain found that he died with a blood alcohol level of 0.26, more than three times the legal limit.
Colorado State Patrol determined that intoxication was the main factor behind the accident, but it also investigated the possible role of Tesla’s Full Self-Driving Beta.
Von Ohain’s widow Nora Bass wants Tesla to take responsibility for her husband’s death:
“Regardless of how drunk Hans was, Musk has claimed that this car can drive itself and is essentially better than a human. We were sold a false sense of security.”
She hasn’t been able to find a lawyer to take the case because her husband was intoxicated.
Colorado State Patrol Sgt. Robert Madden, who led the investigation, found rolling tire marks at the site of the crash, which means that the motor was still sending power to the wheels at the time of impact.
There were also no skid marks found.
“Given the crash dynamics and how the vehicle drove off the road with no evidence of a sudden maneuver, that fits with the [driver-assistance] feature,” Madden said.
We don’t have access to the vehicle logs. The police were not able to recover them after the fire, and Tesla reportedly told the police that it didn’t receive the logs over the air. Therefore, Tesla couldn’t confirm whether any driver-assist features were active at the time of the crash.
That’s horrible. I can’t imagine trying to drag your screaming friend out of a burning car. I am sorry for von Ohain’s loved ones.
Based on the information we have here, it does seem like von Ohain was intoxicated and overconfident in FSD Beta. If the feature was indeed active, it failed badly, and he couldn’t take control in time to avoid the fatal crash.
They are both at fault. Von Ohain, rest in peace, had no excuse for getting behind the wheel intoxicated, and it sounds like Tesla’s FSD Beta failed him.
But if we dig a little bit deeper, it is an interesting situation.
To be honest, the fact that he was a Tesla employee makes this whole situation a lot more complicated. It means he should have known very well that drivers need to pay attention while using FSD Beta and be ready to take control at all times.
His intoxication might explain why he decided it was a good idea to use FSD Beta on winding mountain roads, or he might have been taking chances with FSD Beta even when sober, which is what his wife’s comment about a “false sense of security” points to.
This is definitely something where Tesla can improve: managing expectations when it comes to FSD Beta, which is not easy to do when you literally call it “Full Self-Driving.”