News Article

AEye Perception Software Designed For Sensors Of Autonomous Vehicles


Artificial perception pioneer AEye has announced the world's first commercially available 2D/3D perception system designed to run in the sensors of autonomous vehicles. For the first time, basic perception can be distributed to the edge of the sensor network. This allows autonomy designers to use sensors not only to search for and detect objects, but also to acquire, classify and track them. The ability to collect this information in real time both enables and enhances existing centralized perception software platforms by reducing latency, lowering costs and supporting functional safety.

This in-sensor perception system is intended to accelerate the availability of autonomous features in vehicles across all SAE levels of human engagement, allowing automakers to enable the right amount of autonomy for any desired use case - including the most challenging edge cases - in essence, providing autonomy “on demand” for ADAS, mobility and adjacent markets.

AEye's achievement is the result of its flexible iDAR™ platform that enables intelligent and adaptive sensing. The iDAR platform is based on biomimicry (see white paper), and replicates the elegant perception design of human vision through a combination of agile LiDAR, fused camera and artificial intelligence. It is the first system to take a fused approach to perception - leveraging iDAR's unique Dynamic Vixels, which combine 2D camera data (pixels) with 3D LiDAR data (voxels) inside the sensor. This unique software-definable perception platform allows disparate sensor modalities to complement each other, enabling the camera and LiDAR to work together to make each sensor more powerful, while providing “informed redundancy” that ensures a functionally safe system.
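The article does not disclose the internal layout of a Dynamic Vixel, but the underlying idea of pixel/voxel fusion can be illustrated as a single record that carries both modalities for one return. The sketch below is a rough conceptual model only; all field and method names are assumptions, not AEye's API:

```python
from dataclasses import dataclass

@dataclass
class FusedSample:
    """Illustrative fused sample: one LiDAR return co-registered with a camera pixel."""
    x: float              # 3D position from LiDAR (metres, sensor frame)
    y: float
    z: float
    intensity: float      # LiDAR return intensity (normalised 0..1)
    r: int                # colour of the co-registered camera pixel
    g: int
    b: int

    def range(self) -> float:
        """Straight-line distance from the sensor to the return."""
        return (self.x ** 2 + self.y ** 2 + self.z ** 2) ** 0.5

# Example: a red object roughly 10 m ahead, slightly left and below sensor height
sample = FusedSample(x=10.0, y=-0.5, z=-0.2, intensity=0.8, r=200, g=30, b=30)
print(f"range: {sample.range():.2f} m, colour: ({sample.r}, {sample.g}, {sample.b})")
```

Keeping geometry, intensity and colour in one record is what lets a classifier inside the sensor reason over both modalities at once, rather than fusing two separate detection streams downstream.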

AEye's approach solves one of the most difficult challenges for the autonomous industry as it seeks to deliver perception at speed and at range: improving the reliability of detection and classification, while extending the range at which objects can be detected, classified and tracked. The sooner an object can be classified and its trajectory accurately forecasted, the more time the vehicle has to brake, steer or accelerate in order to avoid collisions. See AEye's white paper on Range, Resolution and Rate.
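The value of extra detection range is easy to quantify: at a given closing speed, each additional metre of range buys the vehicle a fixed amount of reaction time. A back-of-the-envelope calculation (the speeds and ranges below are illustrative, not AEye figures):

```python
def reaction_time_s(detection_range_m: float, closing_speed_kmh: float) -> float:
    """Seconds available between first detection and reaching the object."""
    speed_ms = closing_speed_kmh / 3.6   # convert km/h to m/s
    return detection_range_m / speed_ms

# At a 120 km/h closing speed, extending classification range
# from 150 m to 250 m adds three full seconds of decision time.
t_short = reaction_time_s(150, 120)   # ~4.5 s
t_long = reaction_time_s(250, 120)    # ~7.5 s
print(f"extra reaction time: {t_long - t_short:.1f} s")
```

Those extra seconds are the whole argument: the earlier an object is classified and its trajectory forecast, the more options the planner has to brake, steer or accelerate.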

Enabling Autonomy On-demand

First-generation robotic vision systems tried to solve the challenges of fully autonomous driving by capturing as much data as possible, which required significant time and power to process. Second-generation systems are designed to intelligently collect, manage and transform data into actionable information.

The unique intelligent capabilities of the iDAR platform allow for applications ranging from ADAS safety augmentation, such as collision avoidance, to selective autonomy (highway lane change), to fully autonomous use cases in closed-loop geo-fenced or open-loop scenarios.

Engineers can now experiment using software-definable sensors without waiting years for the next generation of hardware. They can adapt shot patterns in less than a second and simulate the impact to find optimal performance. They can also customize features or power usage through modular design, for instance using a smaller laser and no camera to create a specialized ADAS system for under $1,000, or mixing and matching short- and long-range LiDAR with camera and radar for more advanced 360-degree systems for under $15,000. Unlike previous generations of sensors, OEMs and Tier 1s can now also move algorithms into the sensors where appropriate.

“We believe the power and intelligence of the iDAR platform transforms how companies can create and evolve business models around autonomy without having to wait for the creation of full Level 5 robotaxis,” said Blair LaCorte, president of AEye. “Automakers are now seeing autonomy as a continuum, and have identified the opportunity to leverage technology across this continuum. As the assets get smarter, OEMs can decide when to upgrade and leverage this intelligence. Technology companies that provide software-definable and modular hardware platforms can now support this automotive industry trend.”

iDAR's 2D/3D Perception System

AEye's system more quickly and accurately searches, detects and segments objects and, as it acquires specific objects, validates that classification with velocity and orientation information. This enables the system to forecast the object's behavior, including inferring intent. By providing the smarts to capture better information faster, the system enables more accurate, timely, reliable perception, using far less power than traditional perception solutions.

This 2D/3D perception system is based on AEye's iDAR platform, whose perception advancements the company will make broadly available via a software reference library. That library includes the following features that will be resident in AEye's AE110 (Mobility) and AE200 sensors (ADAS):

Detection: Identification of objects (e.g. cars, pedestrians, etc.) in the 3D point cloud and camera. The system accurately estimates their centroids, width, height and depth to generate 3D bounding boxes for the objects.

Classification: Classifying the type of detected objects. This helps in further understanding the motion characteristics of those objects.

Segmentation: Further classifying each point in the scene to identify specific objects those points belong to. This is especially important to accurately identify finer details, such as lane divider markings on the road.

Tracking: Tracking objects through space and time. This helps keep track of objects that could intersect the vehicle's path.

Range/Orientation: Identifying where the object is relative to the vehicle, and how it's oriented relative to the vehicle. This helps the vehicle contextualize the scene around it.

True Velocity: Leveraging the benefits of agile LiDAR to capture the speed and direction of the object's motion relative to the vehicle. This provides the foundation for motion forecasting.

Motion Forecasting: Forecasting where the object will be at different times in the future. This helps the vehicle assess the risk of collision and chart a safe course.
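As a minimal illustration of that last step, a constant-velocity model (the simplest possible forecaster; AEye's actual algorithms are not disclosed) projects an object's future position from its measured position and true velocity:

```python
def forecast_position(pos, vel, dt):
    """Constant-velocity forecast: where the object will be after dt seconds.

    pos and vel are (longitudinal, lateral) tuples in metres and m/s,
    expressed relative to the ego vehicle.
    """
    return tuple(p + v * dt for p, v in zip(pos, vel))

# Object 40 m ahead in the adjacent lane, closing at 15 m/s
# and drifting toward the ego lane at 1 m/s.
pos = (40.0, 3.5)
vel = (-15.0, -1.0)
for t in (0.5, 1.0, 2.0):
    print(t, forecast_position(pos, vel, t))
```

Even this crude model shows why true velocity matters: without the drift component, the object would be forecast to stay in its own lane, and the planner would miss the developing cut-in.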

“Combining sensor modes that can measure speed, distance, orientation and semantic understanding in real-time enables a more robust virtual driver system,” said Sam Abuelsamid, principal analyst at Navigant Research. “Creating a perception system at the sensor level can potentially deliver more depth, nuance and critical information for improved prediction to feed into path planning systems than is possible with a 2D image-based system, which will be a boon to ADAS and autonomous vehicle initiatives.”

AngelTech Online Summit - Tuesday 19th May

The health and well-being of AngelTech speakers, partners, employees and the overall community is our top priority. Due to the growing concern around the coronavirus (COVID-19), and in alignment with the best practices laid out by the CDC, WHO and other relevant entities, AngelTech decided to postpone the live Brussels event to 16th - 18th November 2020.

In the interim, we believe it is still important to connect the community, and we want to do this via an online summit, taking place live on Tuesday 19th May at 12:00 GMT, with content available on demand for 12 months. This will not replace the live event (we believe live face-to-face interaction, learning and networking can never be fully replaced by a virtual summit); it will supplement the event, add value for key players and bring the community together – digitally.

The event will feature four breakout sessions: CS International, PIC International, Sensors International and PIC Pilot Lines.

Key elements of the online summit:

  • Coverage of the industries' key topics
  • Live three-hour online summit
  • 10-minute presentations – learn from experts in the industry
  • Recorded product demos
  • Live audience questions
  • Enhanced discussion and audience interaction
  • Video panel sessions
  • Sponsor digital booths (intro video, company content, lead generation, one-on-one video meetings with attendees)
  • Live private video meetings between two or more attendees
  • After the live event, monthly keynotes to drive traffic to the event 24/7, 365

Innovation is in AngelTech’s DNA and we are leveraging this strength to bring you an immersive and inspiring online event without the risk of travel during this uncertain period.
