Emerging From Stealth, NODAR Introduces “Hammerhead 3D Vision” Platform For Automated Driving

If you follow the self-driving vehicle world even marginally, you’re aware of the trifecta by which self-driving vehicles perceive the driving environment: camera, radar, and LiDAR. Radar and LiDAR bounce energy off the outside world and process the returning energy to detect objects, measuring range and velocity. Essential tracking and context for these objects are derived by processing data from individual cameras. Less well known is stereo vision, which compares the slightly different views from two cameras looking in roughly the same direction and uses that disparity to triangulate the range to objects ahead, a technique called stereopsis. Our eyes and brains do this, too; try maneuvering swiftly through a complex space with one eye covered and you’ll see how important that second eye is! Because cameras are super-cheap and high-performance radars and LiDARs are more expensive, stereo vision offers the potential to achieve high-quality perception at a much lower cost.
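
To make the triangulation idea concrete, here is a minimal sketch (not NODAR’s implementation) of the standard rectified-stereo range formula, where range is focal length times baseline divided by pixel disparity. The camera parameters below are illustrative assumptions, not NODAR specifications; they simply show why a wider baseline between the two cameras extends usable range.

```python
def depth_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Range Z = f * B / d for a rectified stereo camera pair."""
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive for a finite range.")
    return focal_length_px * baseline_m / disparity_px


# Illustrative numbers only: a camera with a ~1400-pixel focal length and a
# 1.2 m baseline sees an object at 8.4 pixels of disparity at roughly 200 m.
print(depth_from_disparity(8.4, 1400.0, 1.2))  # ~200.0 (meters)
```

The same formula also explains the sensitivity of stereo: at long range the disparity shrinks toward zero, so small matching errors translate into large range errors unless the baseline is wide and the cameras are precisely calibrated.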

If a couple of low-cost, standard automotive cameras can provide highly accurate long-range sensing, as NODAR claims, it’s a game changer.