A vehicle’s ability to “see” and interpret its surroundings comes from onboard sensors. The three major sensor systems in use, alone or in combination, by vehicles currently on the road are: (1) cameras, (2) radar, and (3) Light Detection and Ranging (LIDAR)…
…“The camera is very good at providing a huge amount of information,” says John Dolan, a principal systems scientist at Carnegie Mellon University’s Robotics Institute. “But interpreting that information accurately is difficult because of the lighting issue,” he says, referring to the loss of image quality that occurs in situations like direct sunlight, poor or extreme contrast, or fog.
Lasers like LIDAR aren’t disrupted by lighting issues, and are “very good at giving you shape information without too much difficulty in terms of the processing,” Dolan says. “But it gets confused easily by bad weather.” Radar is not confused by weather, “but it doesn’t give as much shape as a LIDAR [system]—it gives, basically, just range or distance, and the rate at which the distance is changing, or the velocity of the vehicle.”