
Sensor fusion in action: How cameras and LiDAR integrate with radar for safer driving

Yuichi Motohashi - GlobalFoundries
February 2, 2026

Sense – analyze – act. This is the principle that advanced driver assistance systems (ADAS) operate on. Modern vehicles rely on a network of sensors to build a more precise, reliable perception of their surroundings. Sensor fusion combines these inputs – from radar, camera, LiDAR, and ultrasound – with artificial intelligence and deep learning to deliver the environmental acuity required for vehicles to make split-second decisions. 
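
At its simplest, measurement-level fusion weights each sensor's estimate by how much it can be trusted. The sketch below is a minimal illustration of that idea under a simplifying assumption: each sensor reports a range plus a known noise variance, and the estimates are combined by inverse-variance weighting. All numeric values are hypothetical, and production systems use far richer schemes (for example, Kalman filters over full object tracks); this is not the specific architecture the article describes.

```python
# Minimal sketch of measurement-level sensor fusion via inverse-variance
# weighting: each sensor's range estimate counts in proportion to how
# little noise it carries. All variances below are hypothetical examples.

def fuse_range_estimates(measurements):
    """Fuse (range_m, variance) pairs into a single range estimate."""
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * r for (r, _), w in zip(measurements, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Radar is precise in range; a monocular camera estimate is noisier.
radar = (42.3, 0.04)   # metres, variance in m^2 (hypothetical)
camera = (41.1, 1.00)  # metres, variance in m^2 (hypothetical)
rng, var = fuse_range_estimates([radar, camera])
print(f"fused range: {rng:.2f} m, variance {var:.3f} m^2")
```

Note how the fused variance is smaller than either input's: combining complementary sensors yields an estimate more confident than any single sensor alone, which is the core argument for fusion.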

Since 1999, when Mercedes-Benz “taught the car to see,” radar has been a proven cornerstone of ADAS. However, camera and LiDAR technologies are rapidly advancing, adding new levels of detail and depth to a vehicle’s perception. LiDAR in particular has long been stuck in the gap between working demonstrations and scalable manufacturing. GlobalFoundries (GF) is closing that gap, using FinFET, advanced packaging and photonics to unlock the path to mass-market viability.

Together, complementary sensors provide high-resolution imagery, 3D mapping and object classification capabilities, each essential for the safer driving of today and the fully autonomous mobility of tomorrow.

Cameras: Sharpening your car’s view of the world 

Cameras capture high-quality images of the vehicle’s surroundings to detect lane markings, speed limits, turn signals, pedestrians and more. Sophisticated algorithms analyze these images to estimate the distance, size, and speed of objects, enabling the system to react appropriately.
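
One classic monocular cue behind such distance estimates is apparent size: under a pinhole camera model, distance is roughly the focal length (in pixels) times the object's real-world height divided by its height in the image. The sketch below illustrates that relationship; the focal length and pedestrian height are hypothetical values, not figures from the article.

```python
# Sketch of the pinhole-model size cue:
# distance ~= focal_px * real_height_m / pixel_height.
# The focal length and pedestrian height are hypothetical values.

def estimate_distance_m(focal_px: float, real_height_m: float,
                        pixel_height: float) -> float:
    """Approximate distance to an object from its apparent size in the image."""
    return focal_px * real_height_m / pixel_height

# A ~1.7 m pedestrian spanning 85 px, seen with a 1200 px focal length:
print(estimate_distance_m(1200, 1.7, 85))  # -> 24.0 metres
```

Tracking how that distance changes frame to frame then gives a speed estimate, which is one reason frame rate matters as much as resolution.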

Automotive cameras do not use the ultra-high megapixel counts found in mobile phones, because every additional pixel adds data for the vehicle’s computer system to process. Extremely high-resolution images would significantly expand the volume of data transmitted to the central processor, potentially exceeding the capability of the System on Chip (SoC) that must analyze this information in real time to ensure safety, slowing processing or overwhelming the system outright. Detection distance must therefore be carefully balanced against the processing power of the central SoC.
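
A back-of-the-envelope calculation makes the trade-off concrete. The sketch below computes the uncompressed pixel data rate for a few resolutions at 12 bits per pixel and 30 frames per second; the bit depth, frame rate and resolutions are illustrative assumptions, not quoted sensor specifications.

```python
# Back-of-the-envelope data-rate check for the resolution/bandwidth
# trade-off. Bit depth, frame rate and resolutions are illustrative
# assumptions, not quoted sensor specifications.

def raw_data_rate_gbps(megapixels: float, bits_per_pixel: int, fps: int) -> float:
    """Uncompressed pixel data rate in gigabits per second."""
    return megapixels * 1e6 * bits_per_pixel * fps / 1e9

for mp in (2, 8, 48):  # automotive-class vs. phone-class resolutions
    print(f"{mp:2d} MP @ 12 bit, 30 fps: {raw_data_rate_gbps(mp, 12, 30):5.1f} Gbit/s")
# 2 MP -> 0.7 Gbit/s; 8 MP -> 2.9 Gbit/s; 48 MP -> 17.3 Gbit/s, per camera
```

Multiply the per-camera figure by the half dozen or more cameras on a modern vehicle and the case for keeping pixel counts modest becomes clear.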

The primary image quality Key Performance Indicator (KPI) is dynamic range, which is vital for maintaining accuracy in difficult lighting and weather conditions, ranging from intense sunlight at dusk to darkness, heavy rainfall, or fog. Achieving such high dynamic range (HDR) imaging requires increasingly sophisticated readout ICs (ROICs) within automotive stacked CMOS Image Sensors (CIS), because system-level requirements for a high-performance automotive CIS ROIC translate directly into circuit-level and transistor-level requirements.
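
Dynamic range is conventionally quoted in decibels as 20·log10 of the ratio between the largest and smallest usable signal; for a single exposure that is the full-well capacity over the read noise. The sketch below works through that formula and shows how combining multiple exposures, one common function of an HDR readout path, extends the single-exposure figure. All electron counts and the exposure ratio are hypothetical, not specifications of any particular sensor.

```python
import math

# Dynamic range in dB: 20*log10(largest usable signal / smallest usable
# signal). For a single exposure that is full-well capacity over read
# noise, both in electrons. All figures below are hypothetical.

def dynamic_range_db(full_well_e: float, read_noise_e: float) -> float:
    return 20 * math.log10(full_well_e / read_noise_e)

single = dynamic_range_db(60_000, 2)   # ~89.5 dB for one exposure
hdr = single + 20 * math.log10(64)     # a hypothetical 64:1 exposure bracket
print(f"single exposure: {single:.0f} dB, multi-exposure HDR: {hdr:.0f} dB")
```

Numbers on this order of magnitude, well beyond 100 dB combined, are why automotive HDR imaging pushes ROIC complexity so hard.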
