Electrical Engineering News and Products

How does multi-sensor calibration work in autonomous vehicles?

June 26, 2025 By Jeff Shepard

Multi-sensor calibration of advanced driver assistance systems (ADAS) in autonomous vehicles ensures accurate data fusion by aligning various sensors (like cameras, LiDAR, and radar) into a unified coordinate system, enabling precise perception and localization.

This article reviews the types and techniques of multi-sensor calibration of ADAS.

ADAS sensor calibration uses a combination of intrinsic considerations for individual sensors and extrinsic system-level factors. Intrinsic calibration considers general factors, such as linearity, output slope, and offset, as well as sensor-specific specifications. Examples of sensor-specific factors include:

  • Camera — focal length, lens distortion, resolution, high dynamic range (HDR), autofocus speed, sensitivity and low-light performance, LED flicker mitigation (to minimize the effects of pulsed light sources such as traffic lights), and low latency.
  • LiDAR — laser beam angle, field of view (FOV), scan rate, ranging accuracy, angular resolution, and internal coordinate system.
  • Radar — antenna gain, frequency, pulse characteristics, range, FOV, resolution, speed measurement accuracy, and the ability to detect various types of objects/materials.
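Several of the camera-specific intrinsic factors above (focal length, lens distortion, the sensor's principal point) combine into a projection model. The following is a minimal sketch of a pinhole camera with one-parameter radial distortion; all numeric values are illustrative placeholders, not parameters of any real ADAS camera:

```python
# Sketch of an intrinsic camera model: pinhole projection plus a simple
# one-parameter radial distortion term. Parameter values are illustrative.

def project_point(x, y, z, fx=800.0, fy=800.0, cx=640.0, cy=360.0, k1=-0.1):
    """Project a 3D point (in the camera frame, z forward) to pixel coordinates."""
    # Pinhole model: normalize by depth
    xn, yn = x / z, y / z
    # Radial distortion: scale normalized coordinates by (1 + k1 * r^2)
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2
    xd, yd = xn * scale, yn * scale
    # Map to pixels using focal lengths (fx, fy) and principal point (cx, cy)
    return fx * xd + cx, fy * yd + cy

# A point 10 m ahead and slightly off-axis:
u, v = project_point(1.0, 0.5, 10.0)
```

Intrinsic camera calibration amounts to estimating parameters such as `fx`, `fy`, `cx`, `cy`, and `k1` so that projections like this match what the sensor actually records.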

Extrinsic calibration examines the spatial relationship between sensors, encompassing both translation and rotation. It involves calibrating the camera, LiDAR, and radar to ensure that their coordinate systems are aligned. For example, it can validate object-level tracking and how the data is fused using a track-level fusion scheme.
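The translation-plus-rotation relationship described above can be sketched as a rigid transform that maps a point from one sensor's frame into another's. The yaw angle and translation vector below are made-up mounting offsets for illustration:

```python
import math

# Sketch of an extrinsic transform: rotate a LiDAR point about the z-axis,
# then translate it into the camera coordinate frame. The yaw angle and
# translation are illustrative placeholders, not real mounting parameters.

def lidar_to_camera(p, yaw_rad, t):
    """Apply a z-axis rotation by yaw_rad, then a translation t, to point p."""
    x, y, z = p
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    xr = c * x - s * y
    yr = s * x + c * y
    return (xr + t[0], yr + t[1], z + t[2])

# A point 10 m ahead of the LiDAR, with a 90-degree yaw offset and a
# small lever arm between the two sensors:
p_cam = lidar_to_camera((10.0, 0.0, 0.0), math.pi / 2, (0.2, -0.1, 0.3))
```

Extrinsic calibration is the process of estimating the rotation and translation parameters of exactly this kind of transform for each sensor pair.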

Extrinsic calibration of ADAS systems can be implemented using a combination of target-based and targetless methodologies or employing only targetless techniques.

Target-based calibration

Target-based calibration, also called controlled environment or static calibration, uses targets with specific shapes and sizes at specified distances to calibrate ADAS sensor performance in a static setting.

Static ADAS calibration requires specific lighting conditions and the absence of reflective surfaces to avoid sensor confusion. Calibration targets are used to calibrate and align the sensors (Figure 1).

Figure 1. Typical target-based ADAS calibration system. (Image: John Bean)

The highly controlled conditions for target-based ADAS calibration support high-accuracy calibrations. However, the use of a controlled environment is also a limitation, as ADAS is often operated in uncontrolled environments on roadways. As a result, target-based ADAS calibration is generally used in combination with targetless calibration.
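A common way to quantify the accuracy that a controlled, target-based setup enables is a reprojection-error check: compare where known target corners are predicted to appear in the image against where they were actually detected. The corner coordinates below are invented for illustration:

```python
# Sketch of a target-based calibration check: RMS pixel distance between
# predicted and detected calibration-target corners. Coordinates are made up.

def rms_reprojection_error(predicted, detected):
    """RMS pixel distance between predicted and detected target corners."""
    total = 0.0
    for (pu, pv), (du, dv) in zip(predicted, detected):
        total += (pu - du) ** 2 + (pv - dv) ** 2
    return (total / len(predicted)) ** 0.5

predicted = [(100.0, 200.0), (300.0, 200.0), (100.0, 400.0)]
detected = [(101.0, 199.0), (299.0, 201.0), (100.0, 401.0)]
err = rms_reprojection_error(predicted, detected)
```

A calibration run would typically be accepted only if this error falls below a threshold set by the application's accuracy requirements.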

Targetless calibration

Targetless calibration must contend with the diverse types of data produced by the three sensor modalities. Cameras produce 2D images, LiDAR produces dense 3D point clouds, and radar provides sparse 4D point clouds in which the fourth dimension represents the object’s speed.
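The three data types can be pictured as records of increasing dimensionality. The field names below are assumptions chosen for illustration, not a standard interchange format:

```python
from dataclasses import dataclass

# Illustrative records for the three sensor modalities. Field names are
# assumptions for illustration, not a standard data format.

@dataclass
class CameraPixel:        # 2D image sample
    u: int
    v: int
    intensity: int

@dataclass
class LidarPoint:         # dense 3D point-cloud sample
    x: float
    y: float
    z: float

@dataclass
class RadarPoint:         # sparse 4D sample: position plus radial speed
    x: float
    y: float
    z: float
    doppler_mps: float    # the "fourth dimension": speed along the radar beam

r = RadarPoint(12.0, -1.5, 0.4, -3.2)
```

Fusing these heterogeneous records into one coordinate system is precisely what a targetless calibration algorithm must accomplish without a controlled environment.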

Targetless calibration can be implemented by attaching a scan tool to the car’s computer and driving at specified speeds, following other vehicles, and navigating on clearly marked roads. The scan tool detects objects and road markings and uses an algorithm to calibrate the sensors based on the real-world environment.
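One simple instance of the idea above is estimating a constant angular offset between two sensors from repeated observations of the same road feature, such as a lane-marking bearing seen by both camera and LiDAR. The observations below are simulated; a real system would use the features its algorithm detects:

```python
# Sketch of a targetless calibration idea: estimate a constant yaw offset
# between two sensors from repeated observations of the same road feature.
# The bearing values (radians) are simulated for illustration.

def estimate_yaw_offset(camera_bearings, lidar_bearings):
    """Least-squares estimate of a constant angular offset: the mean difference."""
    diffs = [c - l for c, l in zip(camera_bearings, lidar_bearings)]
    return sum(diffs) / len(diffs)

# Simulated lane-marking bearings seen by each sensor on a marked road:
camera = [0.10, 0.15, 0.12, 0.18]
lidar = [0.08, 0.13, 0.11, 0.16]
offset = estimate_yaw_offset(camera, lidar)
```

Averaging many such observations while driving is what lets a targetless approach converge on a stable correction without any calibration target.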

A newer targetless calibration approach has been proposed based on self-supervised learning (SSL) and deep neural networks, in which one part of the signal is used to predict another part. It has been applied to super-resolving a radar array and to up-sampling camera frames or LiDAR measurements, improving calibration results.

Figure 2a shows camera-LiDAR calibration based on SSL. The projected LiDAR point cloud (colored by range) and the camera image are clearly misaligned in the left-hand image; they can be aligned using SSL, as shown in the right-hand image.

Figure 2. Examples of using SSL for camera-lidar calibration (top) and camera-radar calibration (bottom). (Image: Scientific Reports)

Figure 2b illustrates camera-radar calibration. Before calibration, the radar detections, represented by the cyan ‘+’ markers, are misaligned with the moving vehicles in the left-hand image. SSL can be used to calibrate the camera-radar pair and align their outputs, as shown in the image on the right.

Summary

ADAS are complex systems with multiple sensor modalities that require different types of intrinsic and extrinsic calibration. Additionally, overall ADAS operation requires multimodal calibration, utilizing a combination of target-based and targetless calibration methods. Recently, SSL techniques have been applied to targetless ADAS sensor calibration to deliver improved results.

References

A Multi-sensor Calibration Toolbox for Autonomous Driving, arXiv
An Auto-Calibrating System for Sensors in Autonomous Vehicles, KPIT Technologies
Enhancing lane detection with a lightweight collaborative late fusion model, ScienceDirect
How to Calibrate Sensors with MSA Calibration Anywhere for NVIDIA Isaac Perceptor, NVIDIA Developer
Joint Calibration of a Multimodal Sensor System for Autonomous Vehicles, MDPI Sensors
Multilevel Data and Decision Fusion Using Heterogeneous Sensory Data for Autonomous Vehicles, MDPI Remote Sensing
Physics and semantic informed multi-sensor calibration via optimization theory and self-supervised learning, Scientific Reports
Probability-Based LIDAR–Camera Calibration Considering Target Positions and Parameter Evaluation Using a Data Fusion Map, MDPI Sensors
The Complete Guide to ADAS Calibration, John Bean


Copyright © 2025 · WTWH Media LLC and its licensors. All rights reserved.
The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media.

Privacy Policy