We may be about to witness a real technological revolution: the moment when human-driven cars become part of history and self-driving vehicles take their place.
Make way for the self-driving car
Some autopilot functions are already available to drivers today. Consider, for example, the advanced driver-assistance system used in Tesla electric vehicles. It can keep the car in its lane on its own and change lanes into an adjacent one: the driver simply issues a command, and the system does the rest. This autopilot can also regulate the vehicle's speed. All of this is achieved by multiple devices working as a single unit. The car carries ultrasonic sensors that look for other vehicles, a front-facing radar that lets it "see" through rain or fog, and a front camera that reads road signs, while a precise GPS receiver helps monitor the operation of the system as a whole. Tesla Motors' autopilot has already earned some notoriety. In 2016, an electric car on autopilot crashed into a truck, and the same year a Tesla Model X crossover hit the wall of a house after allegedly accelerating spontaneously in a parking lot. According to Tesla representatives, however, the driver was to blame for the latter incident, since the car was under manual control at the time.
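Tesla's actual software is proprietary, but the core idea of the paragraph above, several sensors contributing to a single picture of the road, can be sketched in a few lines. Everything here (the `Detection` class, the confidence threshold, the sample readings) is invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    source: str        # "ultrasonic", "radar" or "camera" (illustrative labels)
    distance_m: float  # estimated distance to the object
    confidence: float  # how sure the sensor is, 0..1

def nearest_obstacle(detections):
    """Fuse per-sensor detections: discard low-confidence ones and
    return the closest remaining obstacle, or None if the road is clear."""
    confident = [d for d in detections if d.confidence >= 0.5]
    if not confident:
        return None
    return min(confident, key=lambda d: d.distance_m)

readings = [
    Detection("ultrasonic", 2.5, 0.9),   # short-range parking sensor
    Detection("radar", 40.0, 0.8),       # radar "sees" through rain or fog
    Detection("camera", 35.0, 0.3),      # camera blinded by glare: low confidence
]
print(nearest_obstacle(readings).source)   # → ultrasonic
```

The point of the sketch is the division of labor: each sensor is unreliable in some condition, so the fused result leans on whichever sensors are confident at the moment.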
However, this is only the first step. An advanced autopilot capable of performing almost all driving tasks is on its way, and many companies are working toward it. Perhaps the best known is Google's self-driving car project, which in December 2016 was spun off into a separate company, Waymo, a subsidiary of Alphabet. The concept was initially developed on the basis of cars such as the Toyota Prius, and for a time Google planned to produce its own vehicles: small two-seat electric cars whose speed (at least in the first samples) was limited to 40 km/h. That idea was later abandoned. Before introducing its own car, Google tested its self-driving technology on other vehicles, originally six Toyota Prius, three Lexus RX450h, and one Audi TT. By the summer of 2015, twenty self-driving Google cars were driving the streets of Mountain View, California, practicing maneuvering in traffic.
Work on autonomous vehicles is also in full swing at Elon Musk's Tesla. Interestingly, Russia aims to create a self-driving KamAZ truck by around 2020, which may appear on public roads by 2025.
What interests specialists most is when the mass adoption of self-driving vehicles will begin. Few doubt that such cars will reach the market, but their widespread use is hotly debated. Some say self-driving cars will replace conventional ones by 2020; others argue it will happen somewhat later. Keep in mind that infrastructure varies enormously between countries: what works for London or New York may be completely unworkable in the capital of a developing country. Not long ago Travis Kalanick, then head of the online taxi service Uber, offered his prediction: self-driving cars will be widely deployed across the planet within twenty years. Moreover, he said, doing so in sparsely populated parts of Africa will be even easier than in busy megacities like Moscow, where the sheer number of cars, people, and obstacles is, in Kalanick's view, the main barrier to adoption. Elon Musk gave a more optimistic forecast: "I believe that autonomous driving is basically a solved problem. There is only one area where a slightly delicate situation remains: driving at speeds of 40–60 km/h." The head of Tesla noted that fully capable self-driving cars could be created in less than a few years. In other words, the planet's chief innovator has declared, in effect, that "ordinary" cars have only a couple of years left. Whether that is true, we will soon see for ourselves. In the meantime, let's look at how a self-driving car works.
In 2016, a team of former Google engineers founded Otto, a company specializing in autonomous freight transport. Its specialists are developing kits that let existing trucks drive themselves; Volvo trucks served as the test vehicles.
Sensors and Cameras
The concept of an advanced autopilot is best examined using Waymo as an example. The system works with information collected by Google Street View, the Google service that provides panoramic views of city streets from a height of about 2.5 meters. As for the vehicle itself, the main subsystems are:
- LIDAR sensor
- Front and rear radars
- Position sensor
- Video camera
The LIDAR sensor is mounted on the roof of the vehicle for the widest field of view. It rotates, scanning the area within a radius of 60 m. Four radars installed in the front and rear bumpers let the car identify obstacles effectively: they report each object's range, height, direction of movement, and speed, allowing the car to "see" far enough ahead to react to changes on the road. A dedicated sensor connected to one of the rear wheels determines the car's own position on the map, capturing geographic coordinates: latitude, longitude, and altitude. When the geostationary satellites broadcasting GPS offset corrections are "visible" to the vehicle, the receiver enters differential GPS mode, which provides high accuracy; when no correction signal is available, it falls back to standard GPS accuracy. Finally, a video camera located next to the rear-view mirror "sees" traffic lights and moving objects.
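The differential-versus-standard GPS fallback described above is easy to picture as a small decision: if a correction broadcast is available, apply it; otherwise use the raw fix. This is only a sketch of the idea; the function name, the tuple format, and the sign convention of the correction are all assumptions made for the example:

```python
def gps_fix(raw_position, correction=None):
    """Return a position estimate and the mode it was computed in.

    In differential mode, an offset correction broadcast by satellite
    is applied to the raw fix; without it the receiver falls back to
    standard GPS accuracy.
    """
    lat, lon, alt = raw_position
    if correction is not None:
        dlat, dlon, dalt = correction
        # Subtracting the broadcast offset is an illustrative convention
        return (lat - dlat, lon - dlon, alt - dalt), "differential"
    return (lat, lon, alt), "standard"

pos, mode = gps_fix((37.4220, -122.0841, 30.0),
                    correction=(0.0001, -0.0002, 1.5))
print(mode)  # → differential
```

In practice a differential fix can be accurate to within centimeters, while a standard fix is accurate only to a few meters, which is why the car prefers the corrected mode whenever the correction satellites are in view.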
Now let's look more closely at how the car's systems interact and how a self-driving vehicle handles a variety of situations. The "heart" of the system is the Velodyne LIDAR sensor discussed above, which generates a detailed three-dimensional map of the surrounding area. The on-board computer merges the information received from the sensor with the map of the area stored in memory. A special algorithm then assesses the situation, taking into account how other road users may behave, and calculates the trajectory the vehicle should follow. It evaluates not only each object's type but also details such as a police officer's gestures, which is essential for normal driving.
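One cycle of that perception-to-planning loop can be caricatured in a few lines. Everything here is a toy stand-in for the real algorithms: the dictionary format for detected objects, the 2-second time-gap rule, and the special-casing of a police officer's "stop" gesture are invented for the sketch:

```python
def plan_step(ego_speed, objects):
    """One simplified planning cycle: check for overriding commands,
    then keep a safe time gap to the nearest object ahead, and return
    a speed command (m/s) for the next tick."""
    # Some object types override everything else.
    for obj in objects:
        if obj["type"] == "police_gesture" and obj.get("command") == "stop":
            return 0.0                        # obey the officer
    # Otherwise maintain a time gap to whatever is ahead.
    distances = [o["distance_m"] for o in objects
                 if o.get("distance_m") is not None]
    if distances:
        nearest = min(distances)
        if nearest / max(ego_speed, 0.1) < 2.0:  # less than a 2 s gap
            return ego_speed * 0.5               # ease off
    return ego_speed                             # hold speed

print(plan_step(10.0, [{"type": "car", "distance_m": 15.0}]))  # → 5.0
```

The real planner reasons over full predicted trajectories of every road user rather than a single distance, but the shape is the same: special cases first, then a continuous safety margin.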
Google's engineers started with the simplest case: they simulated driving on a road with relatively little traffic. In such a situation the autopilot has few choices: turn left or right, brake, or accelerate. Both the computer model and the first physical Google cars handled all of this. Everything becomes much harder, however, once such a car is moved to a large city. The problem is not just that thousands of different objects surround the "smart" car; it must also be able to tell them apart. A self-driving car must respond to an accident ahead and to police officers nearby, and a school bus in front should make it drive more carefully.
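Why does telling objects apart matter so much? Because each class of object implies a different degree of caution. A minimal sketch of that idea, with class names and caution factors invented for the example, might map every recognized class to a speed scaling:

```python
# Per-class caution factors: how strongly the planner scales down its
# target speed when that object class is detected nearby (illustrative).
CAUTION = {
    "school_bus": 0.5,      # pass very carefully
    "pedestrian": 0.6,
    "cyclist": 0.7,
    "accident_ahead": 0.4,
    "car": 1.0,             # ordinary traffic: no extra caution
}

def target_speed(base_speed, detected_classes):
    """Scale the target speed by the most cautious factor among all
    currently recognized object classes; unknown classes get a
    conservative default."""
    factor = min((CAUTION.get(c, 0.8) for c in detected_classes),
                 default=1.0)
    return base_speed * factor

print(target_speed(50.0, ["car", "school_bus"]))  # → 25.0
```

Taking the minimum means the single most sensitive object in view dictates the car's behavior, which is exactly why misclassifying a school bus as an ordinary car would be dangerous.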
The system also has its drawbacks. As of 2013, Google cars could not drive reliably in heavy rain or on snow-covered terrain. The area around the car is identified by comparing previously captured imagery with a rendering of the landscape surrounding the car at that moment. In normal weather this approach distinguishes one object from another well; in bad weather, however, the system struggles to tell, say, a person from a pole. Even so, what we have now is the result of long, hard work.
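The weather problem comes down to matching: the stored map and the live scan must look similar enough to line up. A toy illustration (the feature vectors and noise levels are invented; real systems match rich 3D point clouds, not eight numbers) shows how added sensor noise erodes a similarity score:

```python
import math
import random

def cosine_similarity(a, b):
    """Standard cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

random.seed(0)
# A stored "landscape feature vector" captured in good weather:
stored = [1.0, 0.2, 0.8, 0.1, 0.9, 0.4, 0.7, 0.3]
# The same scene re-observed with mild vs. heavy sensor noise:
clear = [x + random.gauss(0, 0.02) for x in stored]  # clear day
rainy = [x + random.gauss(0, 0.6) for x in stored]   # heavy rain scatters the laser

sim_clear = cosine_similarity(stored, clear)
sim_rainy = cosine_similarity(stored, rainy)
print(round(sim_clear, 3), round(sim_rainy, 3))  # rain typically lowers the match score
```

When the match score drops below whatever threshold the system trusts, it can no longer say with confidence which stored object it is looking at, which is exactly the person-versus-pole ambiguity described above.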
Cars with advanced autopilot may appear not only on the roads but also... overhead. Not long ago, Terrafugia introduced the TF-X concept, an autonomous flying vehicle. It is to be equipped with folding wings carrying two propellers, allowing the craft to take off and land vertically. A working sample is expected in about ten years.
Self-driving cars "learn" quickly because the information each one gathers is sent to a shared database from which all the cars draw. In principle, a self-driving car should be able to handle any road situation it encounters, but genuinely unexpected things do happen. During one road test, a woman in a wheelchair was chasing a duck in front of the car. Naturally, no such scenario existed in the database. We don't know how the bird's fate unfolded, but despite the lack of precedent the car made the right decision: it braked. Birds in general are awkward for a self-driving car. Suppose one flies across the road ahead: what should the car do? The obvious option is to stop, but the car cannot freeze every time it sees a bird, so Google's developers had to work hard to teach it to respond to obstacles appropriately.
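The wheelchair-and-duck story illustrates a general engineering principle: when the shared knowledge base has no entry for what the car sees, fall back to the safest action. A minimal sketch (the response table and its entries are invented for the example):

```python
# A toy stand-in for the fleet's shared knowledge base of known
# obstacle types and the learned response to each (illustrative).
KNOWN_RESPONSES = {
    "car": "follow",
    "cyclist": "give_space",
    "bird": "ignore_if_airborne",
    "traffic_cone": "steer_around",
}

def respond(obstacle_type):
    """Look the obstacle up in the knowledge base; anything the fleet
    has never seen before gets the safe default: brake."""
    return KNOWN_RESPONSES.get(obstacle_type, "brake")

print(respond("car"))                      # → follow
print(respond("wheelchair_chasing_duck"))  # → brake
```

The safe default is what saved the duck: no matching scenario existed, so the car braked rather than guessing.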
Experts have repeatedly said that Google's autopilot needs to be "humanized": it was behaving "too well." Early versions, seeing a person near the road, would stop the car, assuming the person was about to cross. But a pedestrian might not be crossing at all; they might simply be standing there waiting for someone. Google's engineers decided that in such situations the car should slow down rather than stop completely, since a sudden stop for no apparent reason can itself cause an accident: someone might rear-end the Google car.
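That design change, slow down for a pedestrian at the curb, stop only for one actually stepping out, is a simple two-level policy. The function below is a sketch of the idea only; the flags and the 0.6 slowdown factor are assumptions made for the example:

```python
def speed_command(current_speed, pedestrian_near_road, pedestrian_entering_road):
    """'Humanized' pedestrian handling: a person merely standing at the
    curb triggers a slowdown, not a full stop, because a sudden
    unexplained stop risks the car being rear-ended."""
    if pedestrian_entering_road:
        return 0.0                    # actually stepping out: stop
    if pedestrian_near_road:
        return current_speed * 0.6    # just waiting at the curb: ease off
    return current_speed              # nobody nearby: carry on

print(speed_command(30.0, True, False))  # → 18.0
```

The interesting part is the middle branch: the early, "too well-behaved" versions effectively collapsed it into the first one, and removing that over-caution is what the "humanizing" amounted to.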
As part of "humanizing" their creation, the developers also gave the car the ability to honk. A special algorithm triggers the horn automatically when danger to someone increases: a careless pedestrian running into the road, say, or a cyclist making a dangerous maneuver. The car is also expected to honk if it is itself close to causing an accident, and it gives different signals in different situations. If, for example, another driver in a parking lot is slowly backing up without seeing the Google car, the system gives two short, abrupt beeps; in a more serious situation, it uses a long continuous blast. In May 2016, researchers at the Georgia Institute of Technology (USA) presented a control approach that allows an autonomous vehicle to maneuver in extreme situations. The concept is based on model predictive path integral control (MPPI), designed to handle the nonlinear dynamics of controlling a car driving in close proximity to obstacles. Among other things, this system should let the autopilot overtake effectively.
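The two-tier horn behavior described above is essentially a small policy mapping assessed threat to a sound pattern. This sketch is not Google's code; the threat labels and return strings are invented for the illustration:

```python
def horn_signal(threat):
    """Choose a horn pattern from the assessed threat level, mirroring
    the behavior described above: two short beeps as a gentle warning,
    a long continuous blast for imminent danger."""
    if threat == "imminent_collision":
        return "loooong"          # long continuous signal
    if threat in ("driver_backing_blindly", "inattentive_pedestrian"):
        return "beep beep"        # two short, abrupt beeps
    return ""                     # no horn needed

print(horn_signal("driver_backing_blindly"))  # → beep beep
```

Grading the signal matters for the same reason the slowdown-instead-of-stop policy does: an unnecessarily alarming blast can itself startle other road users into mistakes.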
Of course, what concerns experts most is safety; it is, after all, the whole point. If self-driving cars cannot prove they are less dangerous than human-driven ones, the entire concept will collapse. The likelihood of that, however, is negligible. Accidents involving self-driving cars are rare: as of early July 2015, fourteen accidents involving Google vehicles had been reported. That figure may sound high, but remember that for self-driving cars every incident, however insignificant, was recorded; ordinary drivers might never have reported them at all.
Overall, the outlook for self-driving cars is very good. None of the difficulties described above poses a serious obstacle to their eventual dominance.