rFpro developing virtual model of Applus+ IDIADA proving ground for development of connected autonomous vehicles

UK software specialist rFpro is developing a highly accurate virtual model of Applus+ IDIADA’s proving ground to be used for the development of vehicles in simulation. The Digital Twin of the proving ground enables vehicle manufacturers to accelerate the development of ADAS and CAVs (Connected Autonomous Vehicles) by testing them in a fully representative virtual environment before validation on the track.

Using a virtual environment is the only cost-effective way to subject these self-learning systems to the limitless number of scenarios that can occur in the real world. Identical scenarios can be choreographed at the proving ground to validate the simulation results, allowing customers to confidently progress to real-world trials. Our virtual model is a vital part of the road map for the development of CAVs at Applus+ IDIADA within the regulatory framework.

—Chris Hoyle, Technical Director, rFpro

rFpro’s Applus+ IDIADA Digital Twin will be the latest addition to its library of digital models, the world’s largest, which includes proving grounds, test tracks and thousands of kilometres of varying real roads. rFpro says it is the industry’s most open simulation software package, compatible with a wide range of vehicle models and driving simulation platforms.

rFpro has developed a virtual model of Applus+ IDIADA’s proving ground to be used for the development of vehicles in simulation.

The Applus+ IDIADA facility already provides a safe environment for the controlled testing of autonomous functionality and a natural, real-world extension to customers’ software engineering processes. By investing in a digital model, we can also become an integrated part of our customers’ continuous software development tool-chain, significantly reducing the development and validation time, and therefore the cost, of autonomous systems.

—Javier Gutierrez, Applus+ IDIADA Project Manager, Chassis Development Vehicle Dynamics

The digital models created by rFpro can be populated by ego vehicles (the customer’s vehicles) as well as by semi-intelligent Swarm traffic and Programmed traffic. Vehicles and pedestrians can share the road network correctly with perfectly synchronized traffic and pedestrian signals, following the rules of the road, while also allowing ad-hoc behavior, such as pedestrians stepping into the road, to provoke an emergency. This allows digital experiments to precisely mirror the physical tests conducted on the proving ground with robot soft targets.
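As an illustration only (rFpro’s actual scenario format and API are not described in this article), a scenario of this kind could be expressed as a simple data structure combining an ego vehicle, background Swarm traffic and a scripted pedestrian step-out; every name and field below is hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical scenario description; rFpro's real interfaces are not shown in
# the article, so these classes and fields are illustrative only.

@dataclass
class EgoVehicle:
    model: str              # customer's vehicle dynamics model
    start_position: tuple   # (x, y) on the proving-ground road network, metres
    heading_deg: float

@dataclass
class PedestrianEvent:
    trigger_distance_m: float  # ego-to-pedestrian gap that triggers the step-out
    walk_speed_mps: float

@dataclass
class Scenario:
    ego: EgoVehicle
    swarm_vehicles: int = 20                      # semi-intelligent traffic obeying road rules
    programmed_routes: List[str] = field(default_factory=list)
    pedestrian_events: List[PedestrianEvent] = field(default_factory=list)

# Example: one ego vehicle, background swarm traffic, and a scripted pedestrian
# stepping into the road to provoke an emergency response.
scenario = Scenario(
    ego=EgoVehicle(model="customer_suv", start_position=(0.0, 0.0), heading_deg=90.0),
    swarm_vehicles=30,
    programmed_routes=["bus_route_A"],
    pedestrian_events=[PedestrianEvent(trigger_distance_m=25.0, walk_speed_mps=1.4)],
)
```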

Early adopters of testing autonomous vehicles in a virtual environment are already carrying out more than 2 million miles of testing per month, but in order to be effective as a development tool, the simulation must correlate accurately with the real world. rFpro is an industry leader in this respect; by utilizing phase-based laser scanning survey data it can create models with an accuracy of around 1mm in Z (height) and in X and Y (position).

A key element of rFpro’s software is its TerrainServer surface model, which enables a high-definition surface to be simulated. By capturing detailed surface information that is missed by point-based sampling methods, TerrainServer allows very high correlation with the actual road surfaces used during ‘real world’ testing. This extends the use of the digital model into Vehicle Dynamics applications, allowing ride and secondary ride experiments to be conducted by real-time models on driving simulators.
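The article does not document TerrainServer’s interface, but the underlying idea of sampling a dense road-surface height grid at arbitrary tyre contact positions, rather than relying on sparse point sampling, can be sketched generically; the function, grid spacing and data below are illustrative assumptions, not rFpro’s implementation.

```python
import numpy as np

# Illustrative sketch: sample a dense road-surface height grid at an arbitrary
# tyre contact point using bilinear interpolation.

def surface_height(height_grid: np.ndarray, grid_spacing_m: float,
                   x: float, y: float) -> float:
    """Return interpolated road height (m) at position (x, y) in metres."""
    gx, gy = x / grid_spacing_m, y / grid_spacing_m
    i0, j0 = int(np.floor(gx)), int(np.floor(gy))
    tx, ty = gx - i0, gy - j0
    z00 = height_grid[j0, i0]
    z10 = height_grid[j0, i0 + 1]
    z01 = height_grid[j0 + 1, i0]
    z11 = height_grid[j0 + 1, i0 + 1]
    return (z00 * (1 - tx) * (1 - ty) + z10 * tx * (1 - ty)
            + z01 * (1 - tx) * ty + z11 * tx * ty)

# Example: a 10 mm grid of synthetic surface texture, queried at one wheel's
# contact-patch position.
grid = np.random.default_rng(0).normal(0.0, 0.002, size=(500, 500))  # heights in metres
print(surface_height(grid, grid_spacing_m=0.01, x=1.234, y=2.345))
```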

rFpro uses exceptionally high-quality, realistic rendering, up to HDR32, which is essential for the training, testing and validation of deep-learning-based ADAS and autonomous systems, as Hoyle explains.

The resolution and dynamic range of camera sensors are increasing every year, so it is important to be able to render high resolution HDR32 video in real-time. Our system is also unique in avoiding the patterns and video artefacts that arise in synthetic simulation tools, which would otherwise impair deep learning training performance.

—Chris Hoyle

Lighting is modelled accurately for Applus+ IDIADA’s latitude and longitude, day of the year, time of day, atmospheric and weather conditions. This includes circumstances such as the transition between poorly-lit and well-lit roads, the effect of the sun low in the sky or the approaching headlights of oncoming traffic, all of which can be particularly challenging for ADAS and autonomous vehicle sensors.
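The following is not rFpro’s lighting model, only a standard approximation showing how sun elevation follows from the inputs the article mentions (site latitude, day of year and time of day); the latitude value used is an approximation for the IDIADA site in Spain.

```python
import math

# Simplified solar-elevation estimate, illustrating how low-sun conditions that
# challenge camera sensors follow directly from latitude, date and time.

def solar_elevation_deg(latitude_deg: float, day_of_year: int,
                        solar_time_hours: float) -> float:
    # Approximate solar declination (degrees) for the given day of the year.
    declination = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: 15 degrees per hour away from local solar noon.
    hour_angle = 15.0 * (solar_time_hours - 12.0)
    lat, dec, ha = map(math.radians, (latitude_deg, declination, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(dec)
                + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_elev))

# Example: late-afternoon sun in mid-December at roughly the proving ground's
# latitude (~41.2 N), giving a sun barely above the horizon.
print(solar_elevation_deg(latitude_deg=41.2, day_of_year=350, solar_time_hours=16.5))
```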

rFpro provides driving simulation software and 3D content for Deep Learning autonomous driving, ADAS and Vehicle Dynamics testing and validation. It is being used by OEMs and Tier-1s to train, test and validate Deep Learning systems for ADAS and autonomous applications. When developing systems based on machine learning from sensor feeds, such as camera, LiDAR and radar, the quality of the 3D environment model is very important: the more accurate the virtual world, the greater the correlation when progressing to real-world testing.

rFpro’s HiDef models are built around a graphics engine that includes a physically modeled atmosphere, weather and lighting, as well as physically modeled materials for every object in the scene. Hundreds of kilometers of public road models are available off-the-shelf from rFpro, spanning North America, Asia and Europe, including multi-lane highways and urban, rural and mountain routes, all copied faithfully from the real world. rFpro scales from a desktop workstation to a massively parallel real-time test environment connecting to customers’ autonomous driver models and human test drivers.
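The interfaces rFpro exposes to customers’ driver models are not described here, so the sketch below is only a generic fixed-timestep co-simulation loop showing how an environment and an external driver model (or a human driver paced in real time) could be advanced in lock-step; all classes and method names are placeholders.

```python
import time

# Generic fixed-timestep co-simulation loop; Environment and DriverModel are
# placeholders, not rFpro's actual API.

class DriverModel:
    def control(self, sensor_frame: dict) -> dict:
        # A real autonomous stack would consume camera/LiDAR/radar frames here.
        return {"steering": 0.0, "throttle": 0.2, "brake": 0.0}

class Environment:
    def __init__(self):
        self.t = 0.0
    def sensors(self) -> dict:
        return {"time": self.t, "camera": None, "lidar": None}
    def step(self, controls: dict, dt: float) -> None:
        self.t += dt

def run(env: Environment, driver: DriverModel, dt: float = 0.01, steps: int = 100):
    """Advance environment and driver model in lock-step at a fixed rate."""
    for _ in range(steps):
        start = time.perf_counter()
        controls = driver.control(env.sensors())
        env.step(controls, dt)
        # For real-time execution (e.g. a human driver in the loop), pace each step to dt.
        sleep = dt - (time.perf_counter() - start)
        if sleep > 0:
            time.sleep(sleep)

run(Environment(), DriverModel())
```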
