
Sensor Models For Virtual Testing Of Autonomous Vehicles

It is widely accepted that simulation will form part of the testing and validation process for autonomous vehicles, but this presents a number of challenges for the simulation tools involved. Mike Dempsey, Managing Director of Claytex, explains.

It is now well accepted that simulation will have to form part of the validation process for any autonomous vehicle, simply because it is not physically possible or practical to conduct all the required tests in the real world, given the sheer number of scenarios that need to be considered.

The RAND Corporation published a report in 2016 that looked into the problem of how many miles an autonomous vehicle would have to be driven to prove it was safer than human-operated vehicles. They produced the figure of 5 billion miles as the distance an autonomous vehicle would need to be driven in the real world to have 95% confidence that it was significantly safer than a human driver. “To drive this distance is not feasible, especially when you consider that this would have to be repeated every time a line of code or piece of hardware is changed,” says Mike Dempsey, Managing Director of Claytex.

The generally accepted approach, and the route that Claytex advocates, is the need to have a mix of field testing, proving ground testing and simulation to build up the validation process for an autonomous vehicle. The field trials allow the vehicle to be tested in the real world with all the random and unpredictable features and behaviours that exist on the roads today. “Proving ground testing allows us to control many of the scenarios and situations we expose the vehicle to so that we can start to objectively measure its behaviour. However, in both field tests and proving ground tests, there is little chance that you can exactly repeat a scenario and you have no control of environmental factors such as temperature, rain, and more,” Dempsey explains.

“Therefore, it is necessary to use virtual testing in combination with physical tests to allow the rapid and repeatable testing of the autonomous vehicle throughout the development and validation phases. However, in order to rely on virtual testing, it is important to be able to confirm that the simulation produces the same results as the real tests”.

He continues: “The simulation tools used need to be able to accurately recreate real-world locations so that we can build exactly the same scenarios we experience in the field in simulation. This means the simulation tool needs to be able to include traffic, pedestrians, cyclists and many other movable objects that might feature in the recorded scenarios. We also need our simulation tool to support running in real-time with the unmodified vehicle control system so that we are testing the same controller that runs in the vehicle”.

Driving in Montreal with complex scenarios including traffic, pedestrians and cyclists

“This is the challenge that we are working on at Claytex. We want to ensure that virtual testing is truly representative, and that the AV will respond the same on the road as it did in simulation. Just as a driving simulator must immerse the driver in a convincing virtual reality, the sensor models used to test an AV must accurately reproduce the signals communicated by real sensors in real situations,” Dempsey comments.

Claytex is an established distributor and systems integrator of the 3D driving simulation software rFpro. The software provides high quality graphics and accurate 3D models of real-world locations.

Using rFpro as the core simulation environment enables Claytex to build full autonomous vehicle simulators that allow the vehicle control system to be immersed in the virtual test environment. rFpro renders images using physically based modelling techniques, grounded in the laws of physics, rather than the computationally efficient special effects developed for the gaming and film industries. This means that rFpro images don't just convince human viewers: they are also suitable for use with machine-vision systems that must be fed sensor data correlating closely with the real world.

New sensor models help autonomous cars to ‘see' the road ahead more clearly

Dempsey explains that one of the key challenges of building a virtual testing environment for autonomous vehicles is the need to replace the sensors that the vehicle control system relies on with sensor models. “These sensor models need to generate the same output format messages as the real device and they must replicate how the real device perceives the environment. For a camera, this is relatively easy because rFpro already generates very high-quality images and it is capable of doing that at resolutions and frame rates that exceed those being used in autonomous vehicles today.”

He argues that it is the LiDAR, radar and numerous other sensors that need the most work. “rFpro has developed solutions for a number of the technical limitations that have constrained sensor modelling until now, including new approaches to rendering, beam divergence, sensor motion and camera lens distortion.”
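One of the effects mentioned, camera lens distortion, is commonly modelled with a radial (Brown-Conrady style) polynomial. The sketch below is illustrative only; the coefficient names and the two-term model are textbook conventions, not rFpro's actual parameters:

```python
def distort(x_n, y_n, k1, k2):
    """Apply two-term radial distortion to normalized image
    coordinates (x_n, y_n), measured from the principal point.
    Scale factor: 1 + k1*r^2 + k2*r^4."""
    r2 = x_n * x_n + y_n * y_n
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x_n * scale, y_n * scale

# Points further from the image centre are displaced more than
# points near it, which is what bends straight lines in real lenses.
print(distort(0.5, 0.0, k1=0.1, k2=0.01))
```

Feeding a perception stack pinhole-perfect images would make the virtual camera easier to interpret than the real one, so applying the same distortion the real lens exhibits is part of making the sensor model representative.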

“In order to create a representative model of a LiDAR sensor a sensor model is needed that captures the behaviour of the real device,” adds Dempsey. “So, if the real device is a scanning LiDAR and spins at 1200 rpm, then the sensor model needs to do the same and whilst it's spinning it needs to move through the virtual world. The model needs to measure the distance and intensity of the reflection to each point in the environment and pack this into the same UDP message format used by the real sensor”. This is all possible using rFpro and the VeSyMA - Autonomous SDK developed by Claytex.
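The packing step Dempsey describes can be sketched in a few lines. The point layout below is an assumption for illustration; real scanning LiDARs each define their own proprietary packet structure. Note that at 1200 rpm one revolution takes 50 ms, so a hypothetical 20 kHz firing rate yields 1000 points per revolution:

```python
import struct

# Hypothetical point layout: azimuth (deg), distance (m), intensity (0-255).
POINT_FMT = "<ffB"  # little-endian: float, float, unsigned byte

def pack_revolution(points):
    """Pack one revolution of (azimuth, distance, intensity) tuples
    into a single UDP payload (illustrative layout, not a real device's)."""
    return b"".join(struct.pack(POINT_FMT, az, d, i) for az, d, i in points)

# 1200 rpm = 20 rev/s = 50 ms per revolution; at 20 kHz firing rate
# that is 1000 points per revolution, 0.36 degrees apart.
points = [(az * 0.36, 10.0, 128) for az in range(1000)]
payload = pack_revolution(points)

# The payload would then be sent exactly as the real sensor does, e.g.:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(payload, ("127.0.0.1", 2368))
```

Because the control system only ever sees the UDP messages, matching the byte-level format means the unmodified controller cannot tell the simulated sensor from the real one.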

The VeSyMA - Autonomous SDK (Sensor Development Kit) has been created to make the creation of device-specific sensor models easy by encapsulating many of the common features needed by sensor models. Using the SDK, Claytex have created a suite of generic sensors (radar, LiDAR, camera, ultrasound and GPS) that can be used in the early stages of an autonomous vehicle development project to assess what kind of sensing is desirable.
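The idea of encapsulating common features before specialising per device can be sketched as a base class. All names below are illustrative, not the actual VeSyMA - Autonomous API:

```python
from abc import ABC, abstractmethod

class SensorModel(ABC):
    """Illustrative base class for the features every sensor model
    shares: a mounting pose on the vehicle and an update rate."""

    def __init__(self, pose, rate_hz):
        self.pose = pose              # mounting position/orientation
        self.period = 1.0 / rate_hz   # seconds between output messages

    @abstractmethod
    def sample(self, world, t):
        """Query the simulated world for raw measurements at time t."""

    @abstractmethod
    def encode(self, measurements):
        """Pack measurements into the device's native message format."""

class GenericLidar(SensorModel):
    def sample(self, world, t):
        # 'ray_cast_sweep' is an assumed simulator call, not a real API
        return world.ray_cast_sweep(self.pose, t)

    def encode(self, measurements):
        # device-specific byte packing would go here
        return bytes(len(measurements))
```

A generic sensor like this supports early architecture studies, while a device-specific model would override `encode` (and the sampling pattern) to match one real product exactly.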

In addition, Claytex are building a library of device-specific sensor models that support testing throughout the development process. The first of these models are already being used to support the virtual testing of an autonomous vehicle, and the range of available sensors will be continuously expanded.

Driving in Paris with a LiDAR sensor model hosted in rFpro providing data to a sensor supplier's own data visualisation tool

With sensor models representative of the real devices in place, the focus then shifts to the virtual world and the definition of the test scenarios. rFpro are able to provide a wide range of high-fidelity models of public roads, proving grounds and race tracks. These models are all built from detailed surveys of the real locations and include Paris, rural Warwickshire, Connecticut, Shanghai and many others around the world.

“Support for traffic, pedestrians and cyclists as well as other movable objects in the virtual world enable us to create a wide variety of test scenarios that we can then repeat under different weather and lighting conditions. These capabilities allow us to start building complex scenarios that we can use to test the vehicle control system and measure its performance”, concludes Mike Dempsey, Managing Director of Claytex.

The drive towards autonomy is continuing to push the development of virtual test environments into new and challenging areas. The challenge of producing physics-based sensor models that can run in real-time is one being tackled by Claytex, with the first full vehicle simulators now in operation.

