Olga Uskova: "Fully autonomous cars are a question of 2025, but they are unlikely to become widespread in 2025"

Naked Science spoke with Olga Uskova, President of Cognitive Technologies, one of the leaders of the modern IT industry and a Russian developer of control systems for unmanned vehicles, and asked her in detail about the present and future of unmanned vehicles, how their computer vision systems work, and about the company's plans.


Olga Anatolyevna, unmanned vehicles have been talked about for a long time. So when will it be possible to walk into a car dealership and buy a fully autonomous car - one that leaves the garage or parking space in the morning, meets you at the entrance and takes you to work?

Fully autonomous cars are a question of 2025, and this is a fairly balanced opinion. Almost all leading manufacturers of automobiles, of artificial intelligence systems and of components agree with it.

Today developers have realized that there are no fundamentally insoluble technical problems, and they can predict with reasonable accuracy when the main work will be completed. These timeframes are determined both by the technological and legal state of the market and by the pace of its development.

However, this phenomenon is unlikely to become widespread in 2025. It should be understood that a roughly ten-year period of complex adaptation of fully robotic vehicles still lies ahead, including their socialization in interaction with people, much as happened during the transition from horse-drawn to automobile transport. This will not happen smoothly, or at the same time everywhere in the world. The process will inevitably depend on the country in which it takes place and on its economic, social, gender, religious and other factors.

It will also inevitably be accompanied by the absorption, or even colonization, by market leaders of those countries that lag far behind in any of these aspects. For example, if a country has not resolved the legal issues surrounding the use of unmanned vehicles, it will obviously, after a while, have to accept the rules of the game established by those who have already worked them out and tested them: to accept external regulations on certification and standardization, and to pay for them. Finally, as practice shows, third markets will receive simplified models deemed suitable for the population of that country.

In this sense, I would call the situation a war, because this is one of the most lucrative and strategically important issues for a country's economy and for the market as a whole. In terms of mass use, the sale of cars and their components can only be compared with mobile telephony.

What else needs to happen in the remaining time? What are the challenges facing software and hardware developers in the coming years?

The main problem facing developers is creating systems that ensure 100% road safety, or as close to that figure as possible. Intelligent driver assistance systems must guarantee the accuracy of detecting road scene objects not at 98% or 99%, but at 99.999999%. The loss of one or two nines after the decimal point means tens or hundreds of thousands of lives around the world.
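
To make the weight of those "nines" concrete, here is a minimal back-of-the-envelope calculation in Python; the annual number of detections is a hypothetical figure chosen only to illustrate the scale, not a number from the interview.

# Illustrative arithmetic: how many missed detections different reliability
# levels imply. DETECTIONS_PER_YEAR is an assumed, hypothetical volume.
DETECTIONS_PER_YEAR = 10_000_000_000  # assumption: 10 billion detections a year
for accuracy in (0.99, 0.9999, 0.99999999):
    misses = DETECTIONS_PER_YEAR * (1 - accuracy)
    print(f"accuracy {accuracy:.8f} -> ~{misses:,.0f} missed detections per year")
# 0.99 -> ~100,000,000 misses; 0.9999 -> ~1,000,000; 0.99999999 -> ~100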

Today it is impossible to create solutions that meet such requirements; experts allot at least seven to eight years for this. Software developers will have to collect data for training detectors, including neural-network-based ones, and test their algorithms over millions, even tens of millions, of kilometers of roads in real traffic situations.

Hardware manufacturers will have to solve the problem of miniaturizing sensors and computers without losing their basic characteristics and performance. Another challenge on the road to mass use is a significant reduction in the cost of cameras, radars, lidars and other devices.

What unsolved problems are there in the field of legislation?

Legislation is the main problem preventing the mass spread of unmanned technologies. The United States has advanced furthest in this respect: legislation has been passed at the level of individual states allowing cars with an autopilot mode onto public roads. Europe and the leading Asian countries, including Japan and China, are moving slightly behind the Americans; they, too, allow vehicles with intelligent driver assistance systems to operate in certain territories. However, the Europeans still have to amend existing documents such as the Vienna Convention on Road Traffic, which states that every car must have a driver who is obliged to control the vehicle.

Has Russian legislation lagged far behind the legislation of the United States or individual states in this regard?

In Russian legislation, not a single one of these problems has been solved, and no systematic work is being carried out. There are no clearly defined groups dealing with these issues, only isolated declarative statements and small initiative groups trying to raise such questions. Russia therefore risks falling into complete dependence, into the second echelon, where it will be forced to adopt international legislation and pay for someone else's certification and someone else's standardization.

What has to change so that self-driving cars can drive on public roads?

For unmanned vehicles to be allowed onto public roads, the very concept of an unmanned vehicle must first be introduced at the legislative level, because at the moment we only have the concept of a modified vehicle. It is also necessary to determine who bears responsibility in an accident involving an unmanned vehicle.

In the absence of legislation, domestic developers work semi-legally.

Will self-driving cars be affordable for the mass buyer? How much more will a driverless car cost than one that has to be driven manually?

Unmanned vehicles will absolutely be available to the mass buyer, because all major manufacturers, without exception, are now aiming at exactly that.

The economic calculations in which we participate assume that the price of a driverless car should not exceed the price of a conventional car by more than 15-20%. And these are entirely realistic numbers.

As you can see, there is still time before the appearance of fully unmanned vehicles. Which of the technologies you develop can be useful today? What can you offer to help the driver right now?

Today we offer solutions based on our intelligent driver assistance platform C-Pilot. The system currently corresponds to Level 1 in SAE International's classification of driver assistance systems (ADAS, advanced driver assistance systems). In this ranking, Level 0 is manual control with the possibility of intelligent prompts in dangerous situations, and Level 5 is fully autonomous driving.
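
For orientation, the SAE levels mentioned here can be summarized in a small lookup table; this is a minimal sketch whose one-line descriptions are informal paraphrases rather than official SAE wording.

# Informal summary of the SAE automation levels referred to above.
SAE_LEVELS = {
    0: "No automation: the driver does everything; the system may only warn",
    1: "Driver assistance: steering or speed is assisted (e.g. lane keeping)",
    2: "Partial automation: steering and speed assisted, driver supervises",
    3: "Conditional automation: system drives, driver must take over on request",
    4: "High automation: no driver needed within a limited operating domain",
    5: "Full automation: no driver needed anywhere, in any conditions",
}
print(SAE_LEVELS[1])  # the level C-Pilot is said to correspond to today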

We now offer solutions based on our intelligent driver assistance system primarily for commercial vehicle fleets (freight transport, taxis, logistics, etc.). According to experts, such systems already make it possible to improve transportation safety by 20-25%. Similar solutions are offered for transport operating in limited, closed areas (quarries, in-plant zones).

Then, in the first half of this year, together with KAMAZ PTC we plan to present a prototype truck equipped with our ADAS. It is expected to include modules that warn of the danger of collision with cars and pedestrians and of lane marking crossings, as well as road sign recognition. This will be a genuine pre-production prototype that can be brought to industrial production by the end of the year.

Finally, by the end of the year we will be ready to present to potential customers a system for automatically guiding a combine harvester along the edge of a field or swath. This is the first domestic intelligent driver assistance system for the tasks of "smart" agriculture. The presence of a driver in the cab will still be mandatory: he will be able to take his hands off the steering wheel, but he will need to monitor the process and be ready to take control at certain moments.

It is worth saying that "smart" agriculture is one of the most promising areas, since it does not require solving legislative issues for the use of unmanned systems.

What international projects are you planning to implement in the near future?

We are actively entering the international market and expect a significant breakthrough in this area in 2017. Our immediate plans include opening an American office. We are also actively cooperating with Chinese companies; China is second among our priorities. We are working with Europe separately, primarily France and Germany, and we already have preliminary agreements with the largest European car manufacturers for the sale of our ADAS system. It is worth saying that after the Tesla accident, many well-known automakers began to view us as a real alternative to the market leader, the Israeli company Mobileye. We have started negotiations on the sale of C-Pilot with the world's largest owners of commercial fleets, and we are also negotiating the use of our intelligent driver assistance system with one of the largest suppliers of agricultural equipment.

The approaches of developers of unmanned control systems differ: some bet on a passive model, others on an active one. What is the C-Pilot approach? And what is your competitive advantage over other developers?

There are two different approaches to building AI systems for self-driving cars today. One assumes that the maximum amount of information about the traffic situation comes from sensors (radars, lidars, etc.), while computer vision plays a purely auxiliary, secondary role. Some sources call this approach active, since its concept implies the predominant use of emitting devices. A professor at the University of Cambridge, Harry Bhadeshia, gave it a fairly accurate definition: "smart cities and smart roads."

The second approach is based on the anthropomorphic model. It simulates how the human brain works, and here computer vision is essential, as if a human driver were sitting inside the car. Bhadeshia also gave it an apt name: "ordinary cities and thinking machines." This approach is also called passive, since it is based on a model of absorbing information.

This is the approach we took from the outset. The smart-city approach does not work in our realities, while the anthropomorphic model allows us to achieve the required level of safety in bad road and weather conditions: in the absence of markings, on a damaged roadway, in snow, rain, thunderstorms and so on. For Russia this situation is typical. Therefore, to create artificial intelligence systems capable of "understanding" it adequately while guaranteeing the required level of road safety, we had to build more sophisticated mathematics than competing models that operate in regions where it is +25 °C all year round, with perfect markings and smooth roads.

It should be said that about 70% of the world's roads are of the same quality as Russia's, or close to it. We therefore have a competitive advantage over foreign companies that rely on near-ideal urban conditions. Moreover, having trained our mathematics on bad roads and in unfavorable weather, we are hardly going to detect road scene objects any worse than our competitors in near-ideal conditions.

However, the second approach is gaining more and more supporters around the world. At the end of last year, after the Tesla accident, the market leader in driver assistance systems, the Israeli company Mobileye, whose solutions are used by many well-known automakers, including Volvo and BMW, also announced the possibility of operating in poor conditions.

You have an interesting approach to training the neural network underlying C-Pilot: you use videos recorded by dashcams and posted on the Internet. How does it work?

There are two main types of data on which neural networks are trained. The first is real data about the traffic situation obtained from dashcams. The second is so-called synthetic, or virtual, data. The point is that it is extremely difficult to recreate from real data alone the entire set of situations that can happen on the road, so developers can work with so-called virtual polygons, in which the road scene or its elements are simulated artificially. The disadvantage of this approach is that not all characteristics of the road scene and its objects will be adequate, so the results of training a neural network may differ from training on real data, and at some point this can lead to undesirable consequences. That is why we advocate learning from real data. We train neural networks by driving repeatedly across territories in different weather and climatic conditions. And to collect additional information, we recently put out a call for volunteers to help us by sending in dashcam footage containing complex, dangerous and critical road situations, on which we will be able to train our neural networks in the future.

We are not dismissive of virtual polygons, but we use them only to simulate individual situations, for example dangerous ones, that rarely occur in real life.
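
As an illustration of what such a mixed training diet could look like in code, here is a minimal sketch; the file names, the helper function and the 90/10 split between real and simulated clips are assumptions for the example, not details from the interview.

import random
# Hypothetical sketch: build a training sample mostly from real dashcam
# clips, with a small share of simulated clips covering rare dangerous events.
REAL_CLIPS = ["dashcam/rain_001.mp4", "dashcam/snow_017.mp4"]   # real footage
SYNTHETIC_CLIPS = ["sim/pedestrian_cutin_004.mp4"]              # staged rarities
def sample_training_clip(synthetic_share: float = 0.1) -> str:
    """Pick one clip: mostly real data, a small share of simulated edge cases."""
    if SYNTHETIC_CLIPS and random.random() < synthetic_share:
        return random.choice(SYNTHETIC_CLIPS)
    return random.choice(REAL_CLIPS)
batch = [sample_training_clip() for _ in range(8)]
print(batch)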

What technological results have been achieved by the company to date?

In recent years, we have created a number of computer vision technologies that allow us to maintain the required level of road safety in bad weather, road and climatic conditions.

This is, first of all, the virtual tunnel technology, which detects the roadway in domestic conditions regardless of whether there are markings, asphalt or any other pavement. The technology also works successfully in snow, in rain and simply off-road. It is based on a feature model: our algorithms can distinguish the road space from the curb and other elements of the road scene by certain features. The set of spatial blocks receding into the distance that share similar features and resemble one another forms geometric shapes resembling a tunnel.

Virtual Tunnel Technology / © Cognitive Technologies
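
The description above gives only the general idea. As a toy sketch of feature-based road segmentation in that spirit (the per-cell features, the grid and the similarity thresholds are my assumptions, not the company's algorithm), one could grow the "tunnel" upward from a seed block at the bottom of the frame:

from statistics import mean, pvariance
# Toy sketch loosely inspired by the "virtual tunnel" idea: cells whose
# brightness/texture features match a road seed block are marked as road.
def cell_features(cell):
    flat = [p for row in cell for p in row]          # cell = 2D block of pixels
    return (mean(flat), pvariance(flat))             # brightness, texture proxy
def is_road_like(feat, seed_feat, max_diff=(25.0, 200.0)):
    return all(abs(f - s) <= d for f, s, d in zip(feat, seed_feat, max_diff))
def grow_tunnel(cells):
    """cells[row][col] are pixel blocks; the bottom-center block seeds the road."""
    rows, cols = len(cells), len(cells[0])
    seed = cell_features(cells[rows - 1][cols // 2])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows - 1, -1, -1):                # grow from the bottom upward
        for c in range(cols):
            if is_road_like(cell_features(cells[r][c]), seed):
                mask[r][c] = True
    return mask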

The next technology is foveal vision modeling, which allows us to see in high resolution only those objects of the road scene that matter most for analyzing the current situation. We act on the same principle as human vision: when we look into the distance, thanks to foveal vision we see sharply only a narrow area, about 5-7% of the whole picture (Fig. 1, 2), where our gaze is actually directed. The rest, peripheral vision, is fuzzy and vague. In this way we process only the essential information, discarding "garbage" data and thereby saving considerable computing resources.


Fig 1 / © Cognitive Technologies

Fig 2 / © Cognitive Technologies
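
To make the principle concrete, here is a minimal sketch of foveated processing: a small window around the gaze point is kept at full resolution, while the periphery is kept only as a coarse, downsampled copy. The window size and downsampling stride are arbitrary assumptions for illustration.

# Minimal sketch of foveated processing: sharp fovea, coarse periphery.
def foveate(image, gaze_row, gaze_col, fovea=32, stride=8):
    """image is a 2D list of pixels; returns (sharp fovea crop, coarse periphery)."""
    h, w = len(image), len(image[0])
    r0 = max(0, min(gaze_row - fovea // 2, h - fovea))
    c0 = max(0, min(gaze_col - fovea // 2, w - fovea))
    fovea_crop = [row[c0:c0 + fovea] for row in image[r0:r0 + fovea]]
    periphery = [row[::stride] for row in image[::stride]]   # every stride-th pixel
    return fovea_crop, periphery
# Only fovea_crop would be fed to the expensive detector; the coarse periphery
# is enough to notice that something new has entered the scene.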

There is also a technology that models the function of the human hippocampus, giving the autonomous vehicle the ability to interpret with high accuracy complex situations that arise, as a rule, at critical moments (the sudden appearance on the road of other road users, pedestrians, foreign objects, etc.).

In essence, the hippocampus model is an analogue of human short-term memory: out of the stream of external signals it selects and retains the most important information about the current situation, acting as a short-term store, much like a computer's RAM.

It is worth saying that this very technology would have avoided the landmark Tesla accident of last year (Fig. 3). Our system would have been able to detect a vehicle approaching from the side. Information retrieved from memory, captured earlier from a greater distance and with a wider angle of view of the object, would have allowed the wheels, suspension and other elements of the truck to be "seen", which would ultimately have made it possible to identify the approaching object as a vehicle and to gain the fraction of a second that is so badly needed to make the right decision in a critical situation.


Fig 3 / © Cognitive Technologies

Modeling the hippocampus function in such situations makes it possible to use the most complete data about the traffic environment. In Fig. 3, in the frame ABCD that falls into the field of view of the vehicle's cameras, it is difficult to recognize a light object against a light background; however, the data reproduced "from memory" (frame A'B'C'D'), including the wheels and other elements of the truck, make it possible to accurately detect the object as a vehicle.
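
A highly simplified way to picture that idea is a short-term memory buffer of recent confident detections that can corroborate weak evidence in the current frame; a minimal sketch follows, in which the buffer length, thresholds and confidence boost are illustrative assumptions rather than the described technology.

from collections import deque
# Toy "hippocampus-like" short-term memory: keep recent confident detections
# and let them back up a weak detection in the current frame.
MEMORY = deque(maxlen=30)                    # e.g. ~3 s of history at 10 Hz
def remember(label, confidence, position):
    if confidence >= 0.8:                    # store only confident observations
        MEMORY.append((label, confidence, position))
def corroborate(label, weak_confidence):
    """Boost a weak detection if memory has recently seen the same object class."""
    support = sum(1 for stored_label, _, _ in MEMORY if stored_label == label)
    return min(1.0, weak_confidence + min(0.3, 0.05 * support))
remember("truck", 0.93, (120, 40))           # seen clearly from farther away
print(corroborate("truck", 0.55))            # weak frame-level evidence gets lifted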

At the beginning of this year, we received high peer reviews of the quality and capabilities of our computer vision technologies from leading scientists at the universities of Stanford, Iowa, Toronto, Tel Aviv and others. The scientific community was especially interested in the company's results in modeling the processes of human thinking and vision, which make it possible to recognize the road scene more effectively and significantly increase the safety of an autonomous vehicle, especially in critical situations and in bad weather and road conditions. In effect, the experts called us number two in the market after the Israeli company Mobileye, which we regard as our main competitor.

Most recently, Cognitive Technologies led the OpenPower Foundation consortium to create a unified software standard for unmanned vehicles. The consortium, as you know, includes such giants as IBM, Google, Nvidia, Tyan. Why your company? What will this give Cognitive Technologies, including in terms of gaining market share?

Last year we had many contacts with well-known companies and market participants about partnerships. The fact is that developing modern technologies for unmanned vehicles involves working with large volumes of data. We needed a technological partnership within which we could build on our experience in artificial intelligence, in particular in deep learning of neural networks and system testing, and also effectively solve the issues of hardware implementation of our software solutions.

We were most satisfied with IBM's offer to join the consortium of developers of solutions based on POWER technology. Having become a member of this high-profile community, we gained the opportunity to interact promptly, at any level, with the world's leading companies on development issues, and to receive high-quality expert review of our software and hardware solutions.

We consider the Israeli company Mobileye, the leading player in this market, to be our main competitor. We plan to occupy about 3-5% of the market for autonomous driving systems by 2022.

In your opinion, what are the main trends in the field of unmanned vehicles and artificial intelligence both in Russia and in the world?

Today, when AI technologies are becoming more widespread and entering our lives ever more deeply (products are appearing with a fundamentally new quality: from objects they are becoming thinking subjects), the main trend is the merging of a person's life and work with an intelligent device, which in effect becomes an organic extension of the person. Researchers are beginning to perceive the person as a universal core to which a functional supplement in the form of artificial intelligence is attached, improving one or another area of activity. In the case of a driverless car, the improvement is in the area of movement.

Market trends include the widespread commercial adoption of first-level ADAS technologies, including in commercial fleets, as well as growth in the market for "smart" telematics solutions. Analysts predict average annual growth in these segments of at least 28% through 2020.

Serious dynamics are also seen in the area of connected cars. Other trends here include the growing personalization of development and the transformation of the driverless car from a simple "driver" into a full-fledged assistant. According to PwC forecasts, the global connected car market will more than triple in monetary terms over the next five years and exceed 122 billion euros by 2021.

Infotainment is another trend: information resources that are and will be installed in cars for the comfort and leisure time of passengers. Experts predict dynamic growth in the market for automotive infotainment applications in the coming years. PwC expects that by 2021 this segment will become the third largest in terms of potential, after safety and autonomous driving support, with an estimated volume of 13.4 billion euros.

As far as we know, Cognitive Technologies will open an office in California in the near future. Are you entering the US market? Do you have projects for the American market, and what kind of projects are they?

These are projects in the field of connected cars. Americans, unlike us, spend practically half their lives in cars, and this segment of the US market is in high demand. We are already ready to integrate the solutions of American companies with our ADAS technologies. In addition, we are considering other opportunities in this segment, for example, creating social networks for passengers of cars with an autopilot function.

Another area where we see application for our solutions in the American market is intelligent driver assistance systems for commercial fleets. American companies are also showing serious interest in our unmanned technologies for "smart" agriculture.

At the end of last year and the beginning of this one, with the assistance of our American partner, we even conducted a sociological survey to understand the real state of the consumer segment of the US market for unmanned vehicles and its readiness to use our solutions. On the whole, our forecasts were confirmed; we could even say that we did not notice a global difference between the behavior of Russian and American consumers. Americans are ready to give preference to driver assistance systems that can work not only in ideal but also in bad weather and road conditions. True, there were some surprises in the answers; for example, Americans were not particularly frightened by the possibility of hackers breaking into a driverless car's software.

What do you think of Vehicle-to-Vehicle technology? Do you plan to work in this direction?

Of course we are planning to. For an unmanned vehicle to operate reliably, it needs as much information as possible about the current situation on the road, including information that is difficult to obtain from sensors installed on board. In this sense, data received from other vehicles can usefully complement information from on-board sensors.
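
As an illustration of how V2V data could supplement on-board perception, here is a minimal sketch of merging a received hazard message with local detections; the message format and field names are invented for the example and do not correspond to any real V2V standard.

import json
# Minimal sketch: add a V2V-reported hazard that on-board sensors cannot yet see.
onboard_detections = [{"object": "car", "distance_m": 42.0}]
v2v_message = json.loads(
    '{"sender": "vehicle_17", "hazard": "stopped_truck", "distance_m": 180.0}'
)
def merge(onboard, v2v):
    merged = list(onboard)
    farthest_seen = max((d["distance_m"] for d in onboard), default=0.0)
    if v2v["distance_m"] > farthest_seen:     # beyond current sensor range
        merged.append({"object": v2v["hazard"],
                       "distance_m": v2v["distance_m"],
                       "source": v2v["sender"]})
    return merged
print(merge(onboard_detections, v2v_message))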

In addition, V2V technology is of great interest to companies developing infotainment applications: information resources that occupy the leisure time of passengers in cars with an autopilot function. Analysts at the International Center for Robotics have calculated that more than a third of motorists today express a desire to communicate with one another while driving.

Your system performs well in poor road conditions. But if we talk about the rich countries of the West, then there is a trend of "smart" cities. Are you providing C-Pilot with the ability to interact with smart city infrastructure?

We do not plan to carry out special developments in this direction. When experts and developers began to estimate the real costs of maintaining the infrastructure of a smart city, they turned out to be prohibitive. Therefore, many well-known experts in the field of self-driving cars predict the possibility of using smart city technologies only within limited areas.

In addition, we believe that in practice the requirements for unmanned devices within a smart city will be drastically reduced, so there will be no need to offer solutions with powerful mathematics. As world practice shows, such projects are usually financed by the state, and with such schemes one should expect strong affiliation with "their own" companies. It will be very difficult for us to enter these markets.

But we will certainly consider using such systems when they appear. Additional information obtained from the elements of such an infrastructure will undoubtedly be useful for improving the safety of an autonomous vehicle.

Unmanned cars will probably have to share roads with ordinary drivers for quite a long time. But, as you know, drivers often communicate with each other using gestures, nods or, for example, flashes of high beams. Without knowing these conditioned signals and understanding them, it is difficult to merge into a dense stream and change lanes within it. What should an unmanned vehicle do in this case? Will it be able to adapt to this way of "communicating" on the road?

The transition period of joint use of unmanned and manually driven vehicles will, by our estimates, last until about 2035, and developing communication technologies between their drivers (operators) is certainly a promising direction. True, its turn has not yet come.

Experts expect the greatest dynamics in its development closer to the mid-2020s, when a significant share of cars with an autopilot function appears on the roads. Even now, though, a number of developers are offering various formats of emotional communication between road users and pedestrians, from sound and light signals to emoji on the bumpers of driverless cars. We are not losing sight of this topic in our R&D workshops either: at one of the most recent ones we analyzed cases in which the emotional state of passengers could be determined automatically and then taken into account when selecting infotainment modes.

Speaking of autonomous driving, people mean first of all control systems for trucks and cars. But you are also working on agricultural machinery. What will this project look like?

In general, you need to understand that controlling a "smart" harvester, in comparison with an unmanned car, has significant specifics: in addition to organizing autonomous driving itself, dozens of parameters of the technological process have to be controlled automatically, from choosing the required header angle to determining grain quality (adjusting threshing clearances), and so on.

As part of the project to create a "smart" agricultural holding, we are responsible for developing computer-vision-based technologies for unmanned driving of agricultural machinery, an automated process control system, and an integrated information system for monitoring agricultural machinery and crops, including monitoring field processing and the conditions of crop growth.

The most important feature of the project is the ability to work in domestic realities, determined by the harsh climatic conditions in most of the country, low soil productivity, complex logistics, etc., which implies the creation of a more powerful mathematical apparatus in comparison with the solutions of many well-known foreign companies.

Last year we demonstrated a pilot model of a computer vision system that allows agricultural machinery to "see" objects in the field that pose a danger to it (stones, poles, metal structures, etc.) and to use information about them and their coordinates to protect the machinery during harvesting.

By the end of the year, we plan to release a prototype of an unmanned harvester capable of autonomous driving. Our partners in this project are the global manufacturer of agricultural machinery "Rostselmash" and one of the leading agricultural holdings in Tatarstan "Soyuz-Agro". The state, represented by the Russian Ministry of Education and Science, allocated 68 million rubles for the implementation of the technological part of the project.

No plans to venture into related areas such as unmanned aerial vehicles, sea and river vessels? Or, for example, railway transport, because it would seem that making an unmanned locomotive is much easier - it just rides on rails.

There are no such plans. There is a dialogue with the head structures of railway transport, but since this area is an absolute state monopoly, it is, in its pure form, a matter of the state's activity in this zone. We recently made our proposal, but it will become clear soon how and how quickly events develop, and to what extent the outcome will be determined by real technological readiness rather than by conditions tied to previous contracts with Siemens.

The question of how an autopilot should choose in a situation where human casualties cannot be avoided is now being actively discussed. Of course, this is a question for the legislator. But what selection criteria would you suggest?

Safety is the main issue in creating any autonomous vehicle. Last year we conducted a fairly serious study of how a driverless car should act in possible critical situations that have no unambiguous solution, in other words, when the autopilot must decide which of the road users will have to be sacrificed. The overall conclusion of the survey is this: artificial intelligence must act so that the number of victims is minimal. This means, for example, that if several drunken homeless people run out onto the road in front of you, the car you are riding in will have to drive into a wall or a ditch. Tellingly, when respondents in personal interviews realized that in such situations they themselves could end up in the role of the victim, they changed their answers.

Here we hold the position that a sufficiently broad referendum is needed to resolve such situations. And only after society approves the rules of conduct can they be adopted at the legislative level.

What opportunities does the massive introduction of unmanned technologies give us?

First of all, safety. According to experts, the use of unmanned vehicles will reduce the accident rate by orders of magnitude. If today more than 1.2 million people die on the world's roads, it is assumed that this figure will fall to a few dozen. Accidents linked to alcohol, emotions and behavioral factors will disappear. Today, analysts name among the main causes of road accidents in countries such as the United States, France, Canada and Germany drunk driving (35%), distraction from driving due to excessive sociability (20%) and speeding (12-15%). In addition, the mass appearance of autonomous vehicles on the roads will bring order to the transport system as a whole, to a logical state. Traffic jams will disappear. The result will be a completely different, rational ability to move.

Finally, unmanned technology is about freeing up time; it is an optimization of living space. Instead of spending time on the rather nerve-wracking mechanical work of controlling a machine, a person gains the opportunity for personal development, for mastering professional, aesthetic and moral skills, and for communication.

By the way, our latest opinion poll included questions about how people plan to occupy themselves in self-driving cars. Of all the options offered, almost half of the respondents, 48%, chose rest. Another 27% wished to devote their time to work, and about 17% preferred to have fun. Of all the ways to spend free time in a driverless car, Russians preferred communication on social networks and games (33%) and watching videos (35%). More than 21% are ready to devote their leisure time to reading.

One of the horror stories associated with the future is technological unemployment, because, according to some experts, self-driving cars will deprive many people of jobs for whom driving is the only way to earn money. What do you think about this?

Indeed, the development of artificial intelligence will lead to major social changes. McKinsey experts believe that technology can already automate 45% of human activities. For example, our E1 Euphrates business process and document workflow automation system is today capable of replacing a person in more than 45% of routine office operations. And this process is in full swing all over the world. Experts predict that professions built around routine procedures will die out first; unemployment growth in these sectors is projected at 36% over the next seven to eight years. What is frightening in such a situation is that humanity does not control the dynamics of these processes. For example, in China, whose citizens have recently suffered most from the closure of conventional factories as their robotic counterparts are opened in Europe, there has been a significant rise in suicides.

It is no accident that brokers and financiers, in whose field most operations are already performed with the participation of robots, are sounding the alarm. Most likely, the same fate awaits drivers in the foreseeable future. Humanity will most probably be divided into the smart and the rest. Those who now have a low level of education and work in the service sector risk turning, roughly speaking, into an inferior race. To feel confident in a period of economic change, people will need to retrain, master new professions, such as operator of unmanned vehicles, and learn to work with new technologies and services.
