Sensor Models For Virtual Testing Of Autonomous Vehicles

It is widely accepted that simulation will form part of the testing and validation process for autonomous vehicles, but this presents a number of challenges for the simulation tools involved. Mike Dempsey, Managing Director of Claytex, explains.

It is now well accepted that simulation will have to form part of the validation process for any autonomous vehicle: it is simply not possible or practical to conduct all the required tests in the real world, given the sheer number of scenarios that need to be considered.

In 2016, the RAND Corporation published a report examining how many miles an autonomous vehicle would have to be driven to prove it was safer than human-operated vehicles. It put the figure at 5 billion miles: the distance an autonomous vehicle would need to cover in the real world to give 95% confidence that it was significantly safer than a human driver. “To drive this distance is not feasible, especially when you consider that this would have to be repeated every time a line of code or piece of hardware is changed,” says Mike Dempsey, Managing Director of Claytex.
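
As a rough illustration of the statistical reasoning behind mileage figures of this kind, consider the simplest case: if failures arrive at random at a constant rate, the number of failure-free miles needed to bound that rate at a given confidence follows directly. The sketch below is a simplification and does not reproduce RAND's full analysis.

    import math

    # Simplified sketch of the statistics behind such mileage figures.
    # The assumptions (constant failure rate, zero failures observed) are
    # illustrative and do not reproduce RAND's full analysis.
    def miles_for_confidence(failure_rate_per_mile, confidence):
        """Failure-free miles needed to show the true failure rate is
        below failure_rate_per_mile with the given confidence."""
        return -math.log(1.0 - confidence) / failure_rate_per_mile

    # US human-driven fatality rate of roughly 1.09 per 100 million miles:
    human_rate = 1.09e-8
    print(f"{miles_for_confidence(human_rate, 0.95):,.0f} miles")
    # Roughly 275 million miles merely to match the human fatality rate;
    # demonstrating a significantly lower rate, as the 5 billion mile
    # figure requires, demands far more driving.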

The generally accepted approach, and the route that Claytex advocates, is the need to have a mix of field testing, proving ground testing and simulation to build up the validation process for an autonomous vehicle. The field trials allow the vehicle to be tested in the real world with all the random and unpredictable features and behaviours that exist on the roads today. “Proving ground testing allows us to control many of the scenarios and situations we expose the vehicle to so that we can start to objectively measure its behaviour. However, in both field tests and proving ground tests, there is little chance that you can exactly repeat a scenario and you have no control of environmental factors such as temperature, rain, and more,” Dempsey explains.

“Therefore, it is necessary to use virtual testing in combination with physical tests to allow the rapid and repeatable testing of the autonomous vehicle throughout the development and validation phases. However, in order to rely on virtual testing, it is important to be able to confirm that the simulation produces the same results as the real tests”.

He continues: “The simulation tools used need to be able to accurately recreate real-world locations so that we can build exactly the same scenarios we experience in the field in simulation. This means the simulation tool needs to be able to include traffic, pedestrians, cyclists and many other movable objects that might feature in the recorded scenarios. We also need our simulation tool to support running in real-time with the unmodified vehicle control system so that we are testing the same controller that runs in the vehicle”.

Driving in Montreal with complex scenarios including traffic, pedestrians and cyclists

“This is the challenge that we are working on at Claytex. We want to ensure that virtual testing is truly representative, and that the AV will respond the same on the road as it did in simulation. Just as a driving simulator must immerse the driver in a convincing virtual reality, the sensor models used to test an AV must accurately reproduce the signals communicated by real sensors in real situations,” Dempsey comments.

Claytex is an established distributor and systems integrator of the 3D driving simulation software rFpro. The software provides high quality graphics and accurate 3D models of real-world locations.

Using rFpro as the core simulation environment enables Claytex to build full autonomous vehicle simulators that immerse the vehicle control system in the virtual test environment. rFpro renders images using physical modelling techniques based on the laws of physics, rather than the computationally efficient special effects developed for the gaming and film industries. This means that rFpro images don't just convince human viewers; they are also suitable for use with machine vision systems that must be fed sensor data correlating closely with the real world.
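
As a toy illustration of what rendering with “the laws of physics” means at the level of a single surface patch (rFpro's actual renderer is far more sophisticated, and this sketch is not its implementation), the light returned by an ideal diffuse surface follows Lambert's cosine law and the inverse-square law:

    import math

    # Toy example of physically based shading, not rFpro's implementation:
    # irradiance from a point source on a diffuse patch falls off with the
    # square of distance and the cosine of the angle of incidence.
    def diffuse_radiance(source_intensity, distance_m, incidence_deg, albedo):
        """Radiance reflected by an ideal Lambertian surface patch."""
        irradiance = (source_intensity
                      * math.cos(math.radians(incidence_deg))
                      / distance_m ** 2)
        return albedo * irradiance / math.pi  # Lambertian BRDF = albedo / pi

    print(diffuse_radiance(1000.0, 10.0, 30.0, albedo=0.5))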

New sensor models help autonomous cars to ‘see’ the road ahead more clearly

Dempsey explains that one of the key challenges of building a virtual testing environment for autonomous vehicles is the need to replace the sensors that the vehicle control system relies on with sensor models. “These sensor models need to generate the same output format messages as the real device and they must replicate how the real device perceives the environment. For a camera, this is relatively easy because rFpro already generates very high-quality images and it is capable of doing that at resolutions and frame rates that exceed those being used in autonomous vehicles today.”

He argues that it is the LiDAR, radar and numerous other sensors that need the most work. “rFpro has developed solutions for a number of the technical limitations that have constrained sensor modelling until now, including new approaches to rendering, beam divergence, sensor motion and camera lens distortion.”
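
rFpro's implementations of these techniques are proprietary, but to illustrate one of the effects listed, camera lens distortion is commonly modelled with the radial terms of the Brown-Conrady model. A minimal sketch, not rFpro's actual code:

    # Illustrative only: radial lens distortion (Brown-Conrady model)
    # applied to normalised pinhole image coordinates (x, y). The
    # coefficients k1..k3 would be calibrated against the real camera.
    def distort(x, y, k1, k2, k3):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
        return x * scale, y * scale

    # Mild barrel distortion (negative k1) pulls edge pixels inward:
    print(distort(0.8, 0.6, k1=-0.1, k2=0.01, k3=0.0))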

“In order to create a representative model of a LiDAR sensor, a sensor model is needed that captures the behaviour of the real device,” adds Dempsey. “So, if the real device is a scanning LiDAR that spins at 1200 rpm, then the sensor model needs to do the same, and whilst it's spinning it needs to move through the virtual world. The model needs to measure the distance and intensity of the reflection to each point in the environment and pack this into the same UDP message format used by the real sensor”. This is all possible using rFpro and the VeSyMA - Autonomous SDK developed by Claytex.
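
The packet layout below is purely hypothetical; every real sensor defines its own proprietary format, which a device-specific model must match exactly. It only sketches what packing simulated returns into UDP messages involves:

    import socket
    import struct

    # Hypothetical message layout for illustration: an 8-byte microsecond
    # timestamp followed by (azimuth deg, distance m, intensity 0-255)
    # triples. A real model must follow the vendor's documented format.
    def pack_returns(returns, timestamp_us):
        payload = struct.pack("<Q", timestamp_us)
        for azimuth, distance, intensity in returns:
            payload += struct.pack("<ffB", azimuth, distance, intensity)
        return payload

    # Example: send one block of simulated returns to a perception stack
    # listening on an arbitrary local port.
    returns = [(0.0, 12.5, 200), (0.2, 12.6, 198)]
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(pack_returns(returns, timestamp_us=1_000_000),
                ("127.0.0.1", 2368))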

The VeSyMA - Autonomous SDK (Sensor Development Kit) simplifies the creation of device-specific sensor models by encapsulating many of the common features that sensor models need. Using the SDK, Claytex have created a suite of generic sensors (radar, LiDAR, camera, ultrasound and GPS) that can be used in the early stages of an autonomous vehicle development project to assess what kind of sensing is desirable.
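
The SDK's internals are not public, so the sketch below is only a plausible shape for this kind of encapsulation, with every name invented for illustration: common behaviour such as mounting pose and update rate lives in a base class, while device-specific models define how the virtual world is sampled and how output is encoded.

    from abc import ABC, abstractmethod

    # Illustrative structure only; these names are not the actual
    # VeSyMA - Autonomous SDK API.
    class SensorModel(ABC):
        def __init__(self, mount_pose, update_rate_hz):
            self.mount_pose = mount_pose          # pose on the vehicle
            self.update_rate_hz = update_rate_hz

        @abstractmethod
        def sample(self, world, sim_time):
            """Query the virtual world and return raw measurements."""

        @abstractmethod
        def encode(self, measurements):
            """Serialise measurements into the device's native format."""

    class GenericLidar(SensorModel):
        def __init__(self, mount_pose, rpm, channels):
            super().__init__(mount_pose, update_rate_hz=rpm / 60.0)
            self.channels = channels              # vertical beam count

        def sample(self, world, sim_time):
            # world.raycast_sweep is a stand-in for the simulator query
            # that casts one revolution of rays as the sensor spins.
            return world.raycast_sweep(self.mount_pose, self.channels, sim_time)

        def encode(self, measurements):
            return bytes(measurements)            # placeholder serialisation

    lidar = GenericLidar(mount_pose=(0.0, 0.0, 1.8), rpm=1200, channels=32)
    print(lidar.update_rate_hz)  # 1200 rpm = 20 revolutions per second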

In addition, Claytex are building a library of device-specific sensor models that support testing throughout the development process. The first of these models are already being used to support the virtual testing of an autonomous vehicle, and the range of available sensors will be expanded continuously.

Driving in Paris with a LiDAR sensor model hosted in rFpro providing data to sensor suppliers own data visualisation tool

With sensor models representative of the real devices in place, the focus shifts to the virtual world and the definition of the test scenarios. rFpro provides a wide range of high-fidelity models of public roads, proving grounds and race tracks, all built from detailed surveys of the real locations, including Paris, rural Warwickshire, Connecticut, Shanghai and many other places around the world.

“Support for traffic, pedestrians and cyclists, as well as other movable objects in the virtual world, enables us to create a wide variety of test scenarios that we can then repeat under different weather and lighting conditions. These capabilities allow us to start building complex scenarios that we can use to test the vehicle control system and measure its performance”, concludes Dempsey.

The drive towards autonomy continues to push the development of virtual test environments into new and challenging areas. The challenge of producing physics-based sensor models that can run in real-time is one that Claytex are tackling, with the first full vehicle simulators now in operation.
