Fatal accident reveals Tesla Autopilot's inability to monitor the driver

Anonymous

Judging by the absence of anyone in the driver's seat, the car was under the control of its autopilot. In theory, the system tracks whether a person is behind the wheel and will not drive without one. In practice, a person can always fool the computer, and in the case described here, the computer's inability to verify the driver's presence led to tragic consequences. Despite this, according to the latest data, Tesla's Autopilot cuts the risk of an accident roughly ninefold. However, with one caveat.


According to the TV channel KHOU-TV, a Tesla Model S crashed near Houston. The car left the road at a sharp bend, hit a tree, and caught fire. The fire took a long time to extinguish, and afterwards the bodies of two occupants were found in the car, neither of them in the driver's seat. Judging by the dynamics of the crash, it is unlikely that either victim could have moved immediately after the impact. From this, the local authorities concluded that the car had been driving on Autopilot.

Normally, a Tesla will not drive if no one is in the driver's seat. To confirm the driver's presence, Autopilot requires the steering wheel to be touched periodically; some of the company's cars also monitor the driver with a camera. A number of owners, however, bypass these safeguards. To simulate hands on the wheel, they tape a half-empty drink can to it: on bumps the can shakes and jiggles the wheel slightly, deceiving the Autopilot. Others put a homemade dummy in the driver's seat and ride alongside; the camera cannot tell a mannequin in dark glasses from a person and allows the trip. Such tricks are obviously unsafe, but given well-known features of human nature, it is impossible to convince every car owner of this.

The incident underscores the main weakness of every autopilot on the planet: they run on algorithmic computers. Real cars are driven by people, whose way of thinking is not well understood but is clearly not algorithmic. To decide how to act, an autopilot compares its view of the road ahead against the behavior algorithms for that kind of road embedded in its software. If those algorithms misread the road conditions, the computer reacts incorrectly, as in this case, when it "did not see" the bend and flew off the road. It is not yet clear how this problem can be solved: there are still no computers capable of thinking outside algorithms and genuinely grasping the surrounding reality, rather than running through programmed options without understanding what they describe. Nor can such computers be invented today, since there is no clear picture of how consciousness works in people.

At the same time, it should be noted that Tesla's Autopilot has good overall accident statistics. According to the company's report for the first quarter of 2021, cars driven by Autopilot had one accident per 6.74 million kilometers. Teslas whose owners did not pay for Autopilot (Elon Musk calls the remaining safety features a "passive autopilot") showed one accident per roughly 3.2 million kilometers; such cars brake before an obstacle the human driver failed to notice, though only at the last moment. The company's cars with neither Autopilot engaged nor the "passive autopilot" showed one accident per 1.6 million kilometers.

On average in the United States, there is one accident per less than 0.8 million kilometers. This gave Musk grounds to loudly declare that driving on Autopilot is almost ten times safer than driving without it. That is not entirely correct. Yes, the accident rate of a Tesla on Autopilot is roughly nine times lower than that of the average American driver in the average American car, but the comparison is unfair. The statistics above show that the average Tesla without any Autopilot features gets into an accident half as often as the ordinary American car, because it has a very effective braking system and a number of other features that reduce the risk of a crash. An honest comparison can only be made between Teslas without Autopilot features and the same cars with Autopilot, and it shows that Autopilot today is more than four times safer than driving by hand.
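For readers who want to check the arithmetic, here is a minimal sketch that recomputes the ratios from the figures quoted above (kilometers per accident; the variable names are ours, not Tesla's):

```python
# Kilometers per accident, in millions of km, as quoted in the article
# from Tesla's Q1 2021 report and the US average.
km_autopilot = 6.74     # Tesla with Autopilot engaged
km_passive = 3.2        # Tesla with only "passive autopilot" features
km_plain_tesla = 1.6    # Tesla with neither engaged
km_us_average = 0.8     # average US car (the article says "less than" this)

# Musk's headline comparison: Autopilot vs. the average US car.
print(f"Autopilot vs. US average:  {km_autopilot / km_us_average:.1f}x")   # ~8.4x

# Why the headline figure is inflated: a plain Tesla already crashes
# half as often as the average American car.
print(f"Plain Tesla vs. US average: {km_plain_tesla / km_us_average:.1f}x") # 2.0x

# The fairer comparison the article argues for: the same car, with and
# without Autopilot.
print(f"Autopilot vs. plain Tesla:  {km_autopilot / km_plain_tesla:.1f}x")  # ~4.2x
```

The ratios of about 8.4, 2.0, and 4.2 match the article's "almost ten times", "half as often", and "more than four times" respectively.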


At the same time, however correct this thesis, deaths like the one described above will not disappear entirely for many years to come. The autopilot really does reduce the accident rate, because it does not make "human" mistakes. But it makes its own mistakes, far less often than people do, yet of a kind a human driver would make only very rarely.

Tesla's chief, Elon Musk, recently went so far as to say he is sure his cars will drive more safely than cars with a "live" driver this year. Judged by the overall accident rate, this, as we have seen, is already a fait accompli. In practice, though, it is doubtful that everyone will trust an autopilot that makes its own "inhuman" mistakes and flies off the road. Most likely, the authorities will be very reluctant to allow driving on full autopilot, even though it crashes less often than a "live" driver.
