Aerospace technology at the wheel of your car
As a young girl growing up in the 1980s, I was fascinated by movies like Back to the Future and, above many others, the Knight Rider series. I knew intuitively that those futuristic technologies were just science fiction, but they were definitely inspiring. Since then, I have been craving the opportunity to help make some of these high-tech dreams come true. And what better opportunity than combining two of my passions, space and telecommunications, in the satellite navigation technology that is crucial for the next generation of self-driving cars? Driverless cars that, according to the Israeli historian and professor Yuval Noah Harari, “are likely to save the lives of one million people every year.”
Almost forty years have passed, and we are almost there. Self-driving technology does not happen overnight, however; it is a relatively long process. The good news is that the lowest levels of autonomy (Levels 0, 1, and 2) are already present in most of the cars on the road today. Features such as Automatic Emergency Braking, Lane Keep Assist, and Adaptive Cruise Control are estimated to reduce accidents by 40%.
The levels of self-driving technology are classified from 0 to 5. Level 0 corresponds to the classic cars we have known for many years, where the driver is solely responsible for every operation of the car. Level 5 means that no human intervention is required under any conditions, in any place.
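As a sketch, this 0-to-5 classification can be captured as a simple enumeration. The names and the supervision rule below are an illustrative reading of the levels described here, not a reference implementation of any standard:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Driving-automation levels 0-5, as described above."""
    NO_AUTOMATION = 0           # the driver does everything
    DRIVER_ASSISTANCE = 1       # e.g. adaptive cruise control alone
    PARTIAL_AUTOMATION = 2      # combined steering and speed assistance
    CONDITIONAL_AUTOMATION = 3  # system drives; driver takes over on request
    HIGH_AUTOMATION = 4         # no driver needed within a defined domain
    FULL_AUTOMATION = 5         # no human intervention, anywhere

def driver_must_supervise(level: SAELevel) -> bool:
    # Up to Level 2 the human driver must supervise at all times;
    # from Level 3 upward the system drives itself for long periods.
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))      # True
print(driver_must_supervise(SAELevel.CONDITIONAL_AUTOMATION))  # False
```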
Self-driving Level 3, Conditional Automation, which is the current target, means that the ADAS (Advanced Driver Assistance Systems) can take control of the vehicle for long periods without supervision. The vehicle may ask the driver to take back control if the environment is not considered appropriate for the function. As the level of self-driving technology increases, the vehicle’s capability to make independent decisions becomes a must. But the artificial intelligence (AI) algorithms at the basis of this independent decision-making impose several requirements on self-driving cars: powerful on-board computers, real-time information, continuous connectivity, safety mechanisms, precise localization, obstacle detection and identification, and more. Furthermore, smart infrastructure will also feed the AI algorithms running in self-driving cars.
Thanks to Moore’s law (the observation that the number of transistors in a dense integrated circuit doubles about every two years), on-board computers are now powerful enough to run the complex software that implements ADAS functions. Jaguar Land Rover estimated that self-driving will require 1,000 times the amount of source code used in Apollo 11. Precisely positioning the vehicle requires processing the information gathered by its sensors (camera, lidar, radar, GNSS, inertial sensors, odometer, etc.), a huge amount of data amounting to several terabytes per day. Additionally, the evolution of electronics has allowed the integration of on-board equipment and devices that were not affordable for mass-market applications just a few years ago. In parallel, the revolution in telecommunications (connectivity anywhere, anytime) and the proliferation of technologies such as C-V2X over 5G and ITS-G5 make it possible to access real-time information in your car with a bandwidth that was unbelievable not so long ago.
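To give a feel for that “several terabytes per day” figure, here is a back-of-the-envelope sketch. The per-sensor data rates and daily driving time below are illustrative assumptions, not measurements from any particular vehicle:

```python
# Back-of-the-envelope estimate of daily sensor data volume.
# All rates are assumed, order-of-magnitude values.
SENSOR_RATES_MB_PER_S = {
    "cameras (several HD streams)": 180.0,
    "lidar": 70.0,
    "radar": 15.0,
    "GNSS + inertial + odometer": 0.1,
}

HOURS_DRIVEN = 8  # assumed daily operating time

total_mb = sum(SENSOR_RATES_MB_PER_S.values()) * 3600 * HOURS_DRIVEN
total_tb = total_mb / 1_000_000  # MB -> TB (decimal units)

print(f"about {total_tb:.1f} TB per day")  # about 7.6 TB per day
```

Even with deliberately conservative numbers, the total lands squarely in the multi-terabyte-per-day range the text mentions.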
But even with all these advances, self-driving cars would not be feasible without high-accuracy absolute localization. This is where satellite navigation technology comes off the bench and joins the game. Other technologies, such as radar and cameras, are used to detect obstacles or locate the vehicle with respect to other objects, but GNSS (Global Navigation Satellite Systems) algorithms are needed to position the car on a map with an accuracy good enough to distinguish the lanes of a road. GPS and Galileo, widely known in the aerospace sector, are now enablers of many applications in other markets such as precision agriculture, the Internet of Things, and, of course, automobiles. GNSS techniques have therefore had to be adapted to work with low-cost, automotive-grade equipment in challenging driving conditions (dense foliage alongside roads, traffic signs, heavy traffic, and tunnels) while simultaneously providing instantaneous, robust, and safe high-accuracy positioning solutions. It is amazing how GNSS, which 15 years ago was used only by highly specialized industries (aerospace, aeronautics, defense, etc.), has turned into an asset present in many aspects of our daily life.
And the journey towards full self-driving technology has not ended yet! Initial use cases of Level 3 self-driving technology are often publicized in the media; videos of Last Mile Delivery systems for residential goods and packages, or of First/Last Mile Shuttles (fixed-route people carriers), can easily be found on YouTube. In these cases the operational environment is restricted, in the sense that the route, speed, and environmental conditions can be limited. The role of GNSS is crucial, not only because it provides a highly accurate absolute positioning solution, but also because of the level of integrity it can guarantee. Knowing the exact position of the vehicle is not enough; self-driving algorithms must also ensure that the probability of a positioning fix falling outside a certain safe area is lower than one in several million. Civil aviation has been leveraging GNSS integrity for landing approach operations for years, and thanks to that experience, similar strategies have been derived to use GNSS integrity for autonomous driving.
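The core of that aviation-inspired idea can be sketched in a few lines: the receiver computes a protection level (a statistical bound on its position error) and compares it against an alert limit (the largest error the operation can tolerate). The numeric values below are illustrative assumptions, not certified requirements:

```python
# Sketch of a GNSS integrity check, borrowing the civil-aviation notion of
# comparing a protection level against an operation's alert limit.

def position_is_safe_to_use(protection_level_m: float,
                            alert_limit_m: float) -> bool:
    """The fix may be used only if the error bound fits within the
    operation's tolerance; otherwise the system must raise an alert
    (and, in a car, hand control back to the driver)."""
    return protection_level_m <= alert_limit_m

# Lane-keeping example: highway lanes are roughly 3.5 m wide, so a
# sub-metre alert limit is a plausible (assumed) requirement.
ALERT_LIMIT_M = 0.75

print(position_is_safe_to_use(0.4, ALERT_LIMIT_M))  # open-sky fix: True
print(position_is_safe_to_use(2.1, ALERT_LIMIT_M))  # urban canyon: False
```

The protection level is deliberately conservative: it is sized so that the true error exceeds it only with a tiny probability, on the order of the one-in-several-million figure mentioned above.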
So, what’s next? Level 4 of self-driving technology, High Driving Automation, implies that the vehicle’s capability increases further: the vehicle can intervene if things go wrong or if there is a system failure, even though the driver still has the option to override manually. This is the challenge now and for the coming years. The operational conditions will have to be extended and will no longer be as constrained. The number of use cases will also grow: fully autonomous vehicles on different types of roads, mobile robotics, Industry 4.0, and more. This means that the amount of information to be processed and the complexity of the positioning algorithms are growing exponentially. There is still work to be done, but hopefully we will see the results quite soon.
The outlook is promising, but note that the discussion so far has been purely technical. We cannot forget other issues such as legislation, regulation, adaptation of roads and infrastructure, and ethics. Quoting Yuval Noah Harari again: “which means that when designing their self-driving car, Toyota or Tesla will be transforming a theoretical problem in the philosophy of ethics into a practical problem of engineering.” I wonder whether full autonomy will become a reality at the pace of technical progress or at that of regulation and legislation. But one thing is clear to me: aerospace technology is now at the wheel of our cars.
Author: Irma Rodríguez