|AEye’s iDAR combines LiDAR with HD cameras and AI to give vehicles a sense of human-like perception. (Image source: AEye)|
AEye’s president, Blair LaCorte, thinks his company’s sensor technology is not unlike the Indoraptor from Jurassic World: Fallen Kingdom. If you’ve seen the movie, you’ll recall the genetically engineered dinosaur, which attacked on command using a laser guidance system.
LaCorte showed a clip from the film to an audience at Automobility LA, where he discussed how AEye’s fusion of LiDAR and HD cameras – intelligent detection and ranging (iDAR) – is going to take sensing in autonomous vehicles to a level that exceeds even human capabilities.
LaCorte said he likes to show that clip – partly to wake people up, but also because it demonstrates a key tenet of AEye’s technology development. “It’s not enough to detect, you have to identify a target and acquire a target,” he said.
The team behind AEye comes from a defense background, having previously worked on missile defense systems. The defense industry taught them a lot about operating in scenarios “where you can’t afford to miss anything entering your scene and you have to determine what objects are quickly,” LaCorte said.
But AEye’s latest mission hits closer to home. The company wants to bring military-grade perception to autonomous vehicles – and it says iDAR can do just that. Rather than creating an entirely new sensing solution, LaCorte said AEye has taken cues from the biology of the human visual cortex to develop a system that not only senses objects, but perceives them, “a perception system that can out-perceive the human eye,” he said. “We’re using human biomimicry and, instead of finding things to kill, we’re finding out how not to kill things.”
Current LiDAR systems are great for identifying the presence of objects in the road. The problem is that they have no understanding of context. They don’t take environmental conditions, or how they may change, into account. They also can’t balance competing priorities. As far as LiDAR is concerned, a trash bag rolling across the road is no more important than a human or animal.
“Systems today bring in data passively,” LaCorte said. “They spend as much time on the sky as on the leaves on a tree, or the girl crossing the street.”
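The article doesn’t describe how iDAR weighs one object against another; as a rough illustration of the idea – spending attention on a pedestrian rather than the sky or a trash bag – here is a minimal sketch with hypothetical class weights (the labels and values are assumptions, not AEye’s):

```python
# Hypothetical per-class priority weights; a real system would learn or tune these.
PRIORITY = {"pedestrian": 1.0, "animal": 0.9, "vehicle": 0.8, "debris": 0.1, "sky": 0.0}

def rank_targets(detections):
    """Order detections so scan time goes to what matters most, not uniformly."""
    return sorted(detections, key=lambda d: PRIORITY.get(d["label"], 0.5), reverse=True)

scene = [{"label": "debris"}, {"label": "pedestrian"}, {"label": "vehicle"}]
print([d["label"] for d in rank_targets(scene)])  # ['pedestrian', 'vehicle', 'debris']
```

A passive scanner effectively gives every region equal weight; the point of a salience ordering like this is that the girl crossing the street is interrogated first and most often.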
iDAR overcomes this limitation by giving systems an understanding of the importance of objects. By combining a camera and LiDAR, iDAR is able to acquire additional information such as color and depth as it senses. With the additional data it captures, combined with computer vision algorithms, iDAR is able to classify an object and estimate the object’s center of mass, width, height, and depth, as well as its velocity.
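AEye hasn’t published iDAR’s data formats, but the attributes listed above can be pictured as a per-object record like the following sketch (the class and field names are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """Illustrative record of the attributes the article describes (not AEye's format)."""
    label: str            # classification, e.g. "pedestrian"
    center_of_mass: tuple # (x, y, z) position in meters, sensor frame
    dimensions: tuple     # (width, height, depth) in meters
    velocity: tuple       # (vx, vy, vz) in meters per second
    color: tuple          # RGB sample contributed by the HD camera

obj = DetectedObject(
    label="pedestrian",
    center_of_mass=(12.0, -1.5, 0.9),
    dimensions=(0.5, 1.7, 0.4),
    velocity=(0.0, 1.2, 0.0),
    color=(180, 40, 40),
)
```

Note that color comes from the camera and geometry from the LiDAR returns; fusing them at acquisition is what lets a single record carry both.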
And AEye emphasizes this is all done on the edge, within the sensor, not via sensor fusion or in the cloud. “You’re actually getting information at the point of acquisition,” LaCorte said. Doing this also cuts down on the processing time and compute power requirements of traditional LiDAR, he added. “By looking at how we comprehend things, we’ve been able to teach a computer to comprehend things better than a human does.”
Since its system needs less data than traditional LiDAR and requires less processing, AEye is promising that iDAR can increase the speed of a car’s artificial perception up to 10 times, and also reduce power consumption by five to 10 times.
But the secret that changes everything, LaCorte said, is in iDAR’s motion forecasting capability. The system will not only recognize where an object is, but also where it is headed. Is that other car about to drift into your lane? Is that ball bouncing down the sidewalk heading into the street? Maybe that dog is about to dart across traffic. With iDAR equipped, AEye says it’s possible for a self-driving car to detect and respond to these sorts of things.
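AEye hasn’t detailed how the forecasting works; a minimal sketch of the concept, assuming simple constant-velocity extrapolation in a 2D ground plane (all function names, lane bounds, and numbers here are hypothetical):

```python
def forecast_position(pos, vel, horizon_s):
    """Constant-velocity extrapolation: where will the object be in horizon_s seconds?"""
    return tuple(p + v * horizon_s for p, v in zip(pos, vel))

def will_enter_lane(pos, vel, lane_y=(-1.75, 1.75), horizon_s=2.0, steps=10):
    """Check whether the forecast path crosses the lane's lateral bounds in time."""
    for i in range(1, steps + 1):
        _, y = forecast_position(pos, vel, horizon_s * i / steps)
        if lane_y[0] <= y <= lane_y[1]:
            return True
    return False

# A dog 4 m to the side of the lane, moving toward it at 2 m/s, crosses within 2 s:
print(will_enter_lane(pos=(15.0, 4.0), vel=(-0.5, -2.0)))  # True
```

A production system would use far richer motion models, but the payoff is the same: the decision is about where the object will be, not just where it is.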
|The AE110 is AEye’s first hardware product for autonomous vehicle applications. (Image source: AEye)|
“No one has ever been able to do this at the edge of the network,” LaCorte said. “With this capability you give [engineers] the tools they need to match heterogeneous nodes so they can decide where the decision is made, whether that be in the sensor, the trunk [of the vehicle], or in the cloud. This is about data being turned into information.”
AEye currently offers one iDAR hardware product, the AE110, which fuses 1550-nanometer, solid-state agile MOEMS LiDAR with a low-light HD camera and embedded AI for autonomous vehicle applications. In 2020 the company is planning to roll out another sensor, the AE200, targeted at level 3 advanced driver assistance system (ADAS) applications. The company says the AE200 will be a modular system that provides long-range sensing up to 200 meters and a short-range performance configuration of 50 meters, both at 10% target reflectivity.
“We believe the power and intelligence of the iDAR platform transforms how companies can create and evolve business models around autonomy without having to wait for the creation of full level 5 robotaxis,” LaCorte said. “Automakers are now seeing autonomy as a continuum, and have identified the opportunity to leverage technology across this continuum. As the assets get smarter, OEMs can decide when to upgrade and leverage this intelligence. Technology companies that provide software-definable and modular hardware platforms now can support this automotive industry trend.”
Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.