
ZF showcasing automotive sensor portfolio at CES 2019

At CES this year, ZF is highlighting its sensor portfolio, which can detect vehicle surroundings and thereby help to enhance the safety of conventional and automated vehicles. The resulting architecture, including a new full-range radar, solid-state LiDAR, innovative cameras and acoustic sensors, is combined with a powerful, scalable NVIDIA-based platform from the ZF ProAI product family to create a capable overall sensor system.


ZF ProAI.

The sensor set comprises ZF’s latest-generation cameras, radars, LiDAR and acoustic sensors, together with software tools and algorithms for detection, classification and vehicle control, all hosted in the ZF ProAI central control unit. The entire architecture is designed to meet demanding automotive requirements, including extreme temperatures and vibrations.

These highly advanced sensor systems are also important in helping to comply with future safety regulations and consumer safety ratings (e.g., NCAP).

Radar. Fitted to the front of the vehicle, ZF’s high-resolution Full-Range Radar features superior detection performance in the four dimensions of speed, distance, angular resolution and height. This high-performance 77-GHz sensor is designed for premium ADAS applications, and highly automated and autonomous driving (Level 3 and higher).

Like other radar systems, it transmits electromagnetic (radio) waves toward targets and determines the range, angle and velocity of objects from the returning echoes (echo principle). The high-resolution sensor, however, can also measure height more accurately to create a three-dimensional view of the environment. The radar works even in most poor weather, low light and bad visibility conditions—similar to ZF’s Medium-Range Radars, which support a range of ADAS functions.
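The echo principle described above can be illustrated with two textbook formulas: target range follows from the echo’s round-trip time, and radial speed from the Doppler shift at the 77-GHz carrier. This is a minimal sketch of the physics, not ZF’s actual signal processing; the function names and the ideal point-target assumption are ours.

```python
# Illustrative echo-principle calculations (assumes an ideal point target
# and a 77 GHz carrier; not ZF's actual signal-processing chain).

C = 299_792_458.0   # speed of light, m/s
F_CARRIER = 77e9    # radar carrier frequency, Hz

def range_from_delay(round_trip_s: float) -> float:
    """Target range (m) from the echo's round-trip time: the wave
    travels out and back, so the one-way distance is half of c * t."""
    return C * round_trip_s / 2.0

def radial_speed_from_doppler(doppler_hz: float) -> float:
    """Radial speed (m/s) of the target from the Doppler frequency
    shift: f_d = 2 * v * f_carrier / c for a monostatic radar."""
    return doppler_hz * C / (2.0 * F_CARRIER)

# A 1 microsecond round trip corresponds to roughly 150 m of range.
print(range_from_delay(1e-6))           # ≈ 149.9 m
# A 10 kHz Doppler shift at 77 GHz corresponds to ~19.5 m/s (~70 km/h).
print(radial_speed_from_doppler(10e3))  # ≈ 19.47 m/s
```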

LiDAR. Combined with software tools, LiDAR sensors based on laser technology can also create a more accurate 3D model of the vehicle’s environment. They can help to better recognize objects and free space, including complex traffic situations, and in virtually all lighting conditions.

The new, high-resolution Solid-State LiDAR—which ZF is developing together with IBEO—can also better detect pedestrians and small objects in 3D. This plays an important role for highly automated driving at Level 3 and above. The solid-state technology makes this innovation much more robust than previous solutions, and thanks to its modular design and field-of-view options, the sensor is suitable for a wide range of applications.

Vehicle cameras. ZF’s S-Cam4 highlights the further development and expansion of the S-Cam portfolio. With a 100-degree field-of-view and a 1.7-megapixel High Dynamic Range (HDR) image sensor, the technology offers high performance when it comes to detecting pedestrians and cyclists in a city environment. The cameras can also include ZF’s advanced longitudinal and transverse control algorithms for Adaptive Cruise Control (ACC), Automatic Emergency Braking (AEB) and Lane Keeping Assist (LKA), as well as other functions.

Remote Camera Heads, which can be installed in very small housings, can help to detect the surrounding vehicle environment and stream video to the driver, or classify objects. It is possible to combine up to 12 cameras to build a 360-degree view of the vehicle’s surroundings. For each remote camera, manufacturers can choose sensor resolutions of between 1.2 and 8 megapixels, and fields-of-view between 28 and 195 degrees. This means that a multi-camera system can be tailored to meet the customer’s specific requirements.
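The configurable ranges stated above (up to 12 cameras, 1.2–8 megapixel sensors, 28–195 degree fields of view) can be sketched as a simple validation routine. This is a hypothetical illustration of how such a tailored multi-camera rig might be checked against the published limits; the class and function names are ours, not part of any ZF API.

```python
# Hypothetical configuration check for a multi-camera rig, using only the
# ranges stated in the article. Names and structure are illustrative.

from dataclasses import dataclass
from typing import List

@dataclass
class CameraHead:
    megapixels: float  # sensor resolution, MP (article: 1.2 to 8)
    fov_deg: float     # field of view, degrees (article: 28 to 195)

def validate_rig(cameras: List[CameraHead]) -> None:
    """Raise ValueError if the rig falls outside the published limits."""
    if not 1 <= len(cameras) <= 12:
        raise ValueError("a rig combines between 1 and 12 camera heads")
    for cam in cameras:
        if not 1.2 <= cam.megapixels <= 8.0:
            raise ValueError(f"resolution out of range: {cam.megapixels} MP")
        if not 28.0 <= cam.fov_deg <= 195.0:
            raise ValueError(f"field of view out of range: {cam.fov_deg} deg")

# Example: four wide-angle heads for a 360-degree surround view.
validate_rig([CameraHead(megapixels=1.2, fov_deg=195.0)] * 4)
```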

Interior. Highly automated driving will give vehicle occupants more freedom of movement inside the vehicle. A 3D interior camera from ZF can enable new comfort and safety benefits. As part of the ZF Interior Observation System (IOS), it can collect real-time information about the size, position and posture of passengers.

As a result, the performance of various occupant safety systems in the vehicle can be adapted so that, in an emergency, the impact of a collision is better mitigated. Driver monitoring will also play a key role in handover scenarios between human driver and autopilot; the IOS can determine whether the driver has their hands on the steering wheel, is actively steering the vehicle and has their head facing the road.

Listening. With Sound.AI, ZF also helps enable cars to hear. Among other things, the system analyzes siren signals to determine what kind of emergency vehicle is approaching, and from which direction (siren detection). The system display can also provide the driver with important information including instructions such as “pull over to the right” or “move to an emergency lane”. Fully automated vehicles from Level 4 upwards can independently perform maneuvers such as this.
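One common way an acoustic system can estimate the direction of an approaching siren is from the time difference of arrival between spaced microphones. The sketch below shows that geometry for a two-microphone pair; it is a generic illustration under a far-field assumption, not Sound.AI’s actual method, and the names and parameters are ours.

```python
# Generic time-difference-of-arrival (TDOA) bearing estimate for a
# two-microphone pair (far-field assumption); illustrative only, not
# ZF Sound.AI's actual algorithm.

import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def bearing_from_tdoa(delay_s: float, mic_spacing_m: float) -> float:
    """Bearing (degrees off broadside) of a sound source, from the
    arrival-time difference between two microphones a known distance
    apart: sin(theta) = c * delta_t / d."""
    ratio = SPEED_OF_SOUND * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp numerical noise
    return math.degrees(math.asin(ratio))

# A 0.2 ms lag across microphones 0.2 m apart places the siren about
# 20 degrees off the broadside axis.
print(bearing_from_tdoa(0.0002, 0.2))  # ≈ 20.1 degrees
```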
