Most active driver assistance systems are not designed to steer around unexpected obstacles, and Chinese automakers have advanced the furthest in this area. Recently, a video went viral online showing an accident in which a Tesla "decided" to collide with another car rather than run over a pedestrian who had suddenly fallen into its path. On Reddit, commenters debated who was actually controlling the electric car; not everyone is convinced it was the automation.

Image source: Reddit, 1heavyarms3

The incident occurred in the Romanian city of Brasov. A white Tesla Model Y was driving past a busy square separated from the roadway by widely spaced metal bollards. A few meters ahead of the car, a pedestrian suddenly fell onto the road after tripping on a gap in the paving stones and losing his balance. The white car braked sharply and swerved to the left, colliding with an Audi traveling in the oncoming lane. Judging by the footage, the pedestrian suffered only minor injuries from the fall, and the Tesla, rebounding off the oncoming car, clipped him slightly with its rear bumper.

Details of the incident published by the Romanian press suggest that the white Tesla was driven by a woman who was not relying on the automatic controls at that moment, and who therefore managed to steer the vehicle away from the pedestrian but could not avoid the head-on collision with the oncoming car. Insurance covered the material damage; eyewitnesses also report that the driver of the Audi that the Tesla struck was injured. The foreign tourist who fell onto the roadway was given medical assistance. Investigators are now reviewing the actions of the woman driving the Tesla; reportedly, by the time police arrived at the scene she was having a panic attack.

The incident attracted public attention because some sources began claiming that the decision to accept an imminent collision with the oncoming car in order to spare the pedestrian's life was made by Tesla's on-board automation. It is worth noting that the company tests the latest versions of its active driver assistance software only on the American market, so a European car could hardly have gained the ability to swerve around an obstacle on its own. That possibility cannot be ruled out entirely, though, since this could have been a vehicle originally intended for the United States. In most cases, Tesla vehicles in such a situation are programmed to perform emergency braking while holding a straight trajectory. In the US market, after complaints from regulators, Tesla was forced to accompany every mention of its FSD (Full Self-Driving) suite with a clarification that the system requires constant driver attention and allows hands to be taken off the steering wheel only briefly.

It should be said that some of Tesla's competitors have gone much further in certifying so-called Level 3 autonomy, which no longer requires the driver to constantly monitor the road or keep hands on the steering wheel. Mercedes-Benz has received road approval for such systems in the USA and Germany, although even there the driver remains involved in driving for most of the journey. Back in 2021, Honda was permitted to put a sedan with a Level 3 system on Japanese roads, but the car was offered exclusively for lease to corporate customers and therefore never saw widespread use. Returning to the incident with the Romanian Tesla, it is highly likely that this was not a case of the much-discussed dilemma of automation choosing which of two objects to hit when someone will inevitably be harmed. That choice was most likely made by a human.
