An independent investigation by The Wall Street Journal found that Tesla’s Autopilot driver assistance system can cause fatal crashes due to poor decisions made by its algorithms. According to Notebookcheck, the publication’s journalists recovered onboard computers with the FSD function from wrecked Teslas and, with the help of hackers, extracted the raw Autopilot data they contained.
Analysis of this data, together with crash footage and official reports, allowed the journalists to reconstruct 222 incidents involving Tesla vehicles. In 44 cases, cars on Autopilot swerved suddenly, and in 31 cases they failed to stop or yield. The latter scenario, according to the investigation, typically led to the most severe accidents.
One of the most striking examples is a collision with an overturned trailer: the Autopilot system failed to recognize it as an obstacle, and the Tesla crashed into it at full speed. In another documented case, the FSD system behaved erratically during a test drive, crossing a solid lane marking and nearly causing an accident at an intersection. The investigation also found that FSD poorly recognizes emergency vehicles’ warning signals, which has led to collisions.
The investigation identified problems with both Autopilot’s hardware and software, ranging from slow algorithm updates to insufficient camera calibration. It is unclear whether this evidence is enough to refute Elon Musk’s claim that Tesla’s system is still safer than a human driver.