Self-driving cars are changing lives by delivering unprecedented personal mobility. In fact, they represent a paradigm shift in the definition of a vehicle. Ensuring that autonomous vehicles (AVs) deliver a safe, efficient, and enjoyable mode of travel requires complex new technologies in the cloud and at the edge. In addition, along with data connectivity and powertrain electrification, AVs will drive groundbreaking semiconductor content. This, combined with mandatory requisites for ultra-high performance and reliable sensing, processing, and communication, represents an exceptional opportunity for technology innovation today and well into the future.
In terms of components, lidar, radar, and camera-based systems are essential elements in the suite of sensor technologies required for the safe operation of AVs. And, while there is a significant overlap in the functions served by radar and lidar, they are likely to coexist in AV systems for some time because of the advantages of sensor fusion and the need for redundant systems in safety-critical applications.
Already, camera-based sensing supports autonomous emergency braking, adaptive cruise control, and lane-departure warnings on many passenger vehicles. In AVs, cameras will be used in combination with the other sensing technologies to produce a detailed 3D representation of the vehicle’s surroundings. While radar can measure the relative position and velocity of an object, and lidar can produce precise 3D mapping of objects, camera-based sensing systems use rich visual information to identify what an object is: another car, a large truck or bus, a bicycle, a pedestrian, or even the meaning of a street sign.
As camera-equipped vehicles become more pervasive, they will become an essential mechanism for gathering crucial data on road conditions, traffic congestion, safety hazards, availability of parking spaces, and more.
Advances in image sensors, image processing algorithms, and high-performance computing hardware have enabled the use of camera-based sensing in advanced driver-assistance systems (ADAS) and AV systems. These will remain key areas of innovation in future AV development.
A number of systems have been developed based on field-programmable gate arrays (FPGAs) and graphics processing units (GPUs), which are well suited to the high degree of parallelism required by vision processing algorithms. One of the most successful automotive vision processing solutions is the Mobileye EyeQ series, a dedicated hardware accelerator application-specific integrated circuit (ASIC). Prior to releasing its ADAS applications, Mobileye conducted extensive testing in real-world conditions. This enabled continuous refinement of the algorithms and silicon over successive generations of the chip. Recognizing the strategic position that Mobileye occupied, Intel acquired the company in August 2017 in a deal valued at over $15 billion.
Lidar’s ability to produce a very precise mapping of objects in relation to the vehicle makes it a critical sensing technology for AVs. It is useful in detecting road features such as curbs and lane markings, as well as tracking other objects in proximity to the vehicle. Lidar sensors transmit a laser pulse and detect the backscattered or reflected light energy, then calculate the distance to the object based on the elapsed time. Early AV development platforms used scanning lidar systems that employed a rotating mirror assembly to direct the laser pulse. While they performed well with a good range, they were too bulky and costly for production AVs.
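The time-of-flight calculation described above reduces to a simple relation: the pulse travels to the target and back, so the one-way distance is half the round-trip path. A minimal sketch (the function name and example timing value are illustrative, not drawn from any particular sensor's specification):

```python
# Time-of-flight range calculation for a lidar pulse.
# d = c * t / 2, since the measured time covers the round trip.

C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """Distance to a target given the round-trip time of a laser pulse."""
    return C * round_trip_s / 2.0

# A return detected ~667 ns after emission corresponds to roughly 100 m,
# a typical detection range requirement discussed for highway AV lidar.
print(round(lidar_range_m(667e-9), 1))  # → 100.0
```

The same relation also shows why timing precision matters: resolving distance to a few centimeters requires timing the return to within a few hundred picoseconds.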
Lidar innovation is focused primarily on reducing the size and cost of the system while maintaining the required high levels of detection range and resolution. This focus resulted in the development of solid-state lidar systems, which greatly reduce mechanical complexity, and therefore the size and cost, of the system. The challenge in developing solid-state lidar is achieving the requisite range and resolution. This in turn is driving innovation in laser emitters, optics, photodetectors, and signal processing across a wide range of technologies, including InGaAs photodetectors, virtual beam steering using MEMS technology, and advanced signal processing algorithms. As a result, there are dozens of startups developing solid-state lidar solutions for AVs, including LeddarTech, Innoviz, Luminar, and Quanergy, to name just a few.
Automotive radar was first used in adaptive cruise control systems in the early 2000s. As such, it is one of the more mature sensing technologies used in AVs. While lidar offers a wider field of view and higher precision, radar is less susceptible to many forms of visual interference, such as smoke, fog, and glare. Despite its relative maturity, there is still room for innovation. High-frequency radar, in the 77 GHz band, improves long-range performance and has high reflectivity with non-metallic objects, which is essential for detecting pedestrians and animals. Advances in signal processing algorithms are enabling high-resolution radar systems, and the use of RF CMOS technology will allow higher functional integration. These advances result in more compact system designs, as seen in the automotive radar system-on-chip (SoC) solutions recently introduced by NXP Semiconductors and Texas Instruments.
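One reason the 77 GHz band supports high-resolution radar is that range resolution in a chirped (FMCW) radar is set by the sweep bandwidth: the wider the sweep, the finer the resolution. A brief sketch of that relationship (the 4 GHz bandwidth below is illustrative of the 76–81 GHz automotive allocation, not a specific product's specification):

```python
# FMCW radar range resolution: delta_R = c / (2 * B),
# where B is the frequency sweep bandwidth of the chirp.

C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest range separation at which two targets can be distinguished."""
    return C / (2.0 * bandwidth_hz)

# A 4 GHz sweep in the 76-81 GHz band gives centimeter-scale resolution,
# versus tens of centimeters for a narrow 24 GHz-band sweep.
print(f"{range_resolution_m(4e9) * 100:.1f} cm")  # → 3.7 cm
print(f"{range_resolution_m(0.2e9) * 100:.1f} cm")  # → 74.9 cm
```

This is why the move from the older 24 GHz band, with its narrow available bandwidth, to 77 GHz is significant for distinguishing closely spaced objects such as a pedestrian next to a parked car.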
A large portion of AV system development centers on the optimization of the “virtual driver,” or the brain of the vehicle. The virtual driver system consists of machine learning algorithms and middleware that connects to the sensing, actuation, and communication subsystems of the vehicle. This technology is core to the functioning of the AV; in fact, one possible future scenario would have leading AV developers licensing their virtual driver software stack to vehicle manufacturers who integrate it onto their platform using standard interfaces for sensors, actuators, and data communication protocols. However, this requires the development of industry standards and more mature sensing technologies that can be separated from the control system.
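The licensing scenario above hinges on standard interfaces separating the virtual driver from the vehicle platform. A hedged sketch of what such a separation might look like; every name here (SensorFrame, ActuationCommand, VirtualDriver) is a hypothetical illustration, since no such industry standard yet exists:

```python
# Sketch of a "virtual driver" decoupled from the vehicle platform
# behind typed interfaces. All types and names are hypothetical.

from dataclasses import dataclass, field
from typing import Protocol

@dataclass
class SensorFrame:
    """One synchronized snapshot from the vehicle's sensor suite."""
    timestamp_s: float
    camera_images: list = field(default_factory=list)  # per-camera frames
    lidar_points: list = field(default_factory=list)   # 3D point cloud
    radar_tracks: list = field(default_factory=list)   # range/velocity hits

@dataclass
class ActuationCommand:
    """Low-level command the vehicle platform executes."""
    steering_angle_rad: float
    throttle: float  # 0.0 .. 1.0
    brake: float     # 0.0 .. 1.0

class VirtualDriver(Protocol):
    """Interface a licensed driving stack would implement."""
    def step(self, frame: SensorFrame) -> ActuationCommand: ...

class KeepLaneDriver:
    """Trivial placeholder stack, for illustration only."""
    def step(self, frame: SensorFrame) -> ActuationCommand:
        return ActuationCommand(steering_angle_rad=0.0,
                                throttle=0.1, brake=0.0)

# The vehicle manufacturer's integration code depends only on the
# interface, so any conforming stack could be swapped in.
driver: VirtualDriver = KeepLaneDriver()
cmd = driver.step(SensorFrame(timestamp_s=0.0))
print(cmd.steering_angle_rad, cmd.throttle)
```

The design point is that the platform side never imports the stack's internals; in practice this boundary would also need agreed data formats, timing guarantees, and safety certification, which is what the text means by requiring more mature standards.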
Automotive technical intelligence and intellectual property management can help automakers and suppliers protect their market position and identify additional revenue streams through competitive benchmarking, patent licensing negotiation, and indemnification. This process assists in market entry due diligence and helps companies understand the intellectual property and technology strengths of their competitors to enable differentiation, inform key technical design decisions, and guide patent acquisitions that lower the risk of market entry. Experienced, objective third-party analysis at the component, circuit, and system levels helps support the assertion of patent claims and the valuation of patent transactions. Finally, an understanding of components’ underlying cost structure aids in price negotiation with vendors.