
Visteon showcasing advanced gesture recognition and HUD technologies at CES

Visteon will display two vehicles featuring advanced concepts at CES 2016 this coming week in Las Vegas: one with 3-D gesture technology and the other with a large field-of-vision windshield head-up-display (HUD). A separate driving simulator brings together the latest Visteon human-machine interaction (HMI) input technologies for an interactive experience using real-world driving scenarios.

The proprietary 3-D gesture recognition concept is demonstrated within a compact multi-purpose vehicle. The system quickly reads defined hand movements to command certain features, using time-of-flight camera technology and high-performance, image-processing algorithms.

Reading hand gestures faster and more precisely than today’s 2-D solutions helps reduce distraction when accessing driver information and infotainment systems. Because of its high spatial accuracy, the system can render any surface touch-sensitive, eliminating the need for touch panels on displays.

The system recognizes specific gestures such as holding up one, two or three fingers to perform different functions such as operating the windows, changing audio volume or opening the glove box. This provides quicker access, without the need to touch buttons or look for knobs. The system distinguishes between driver and passenger hand gestures, and also allows customizable gestures.

—Patrick Nebout, director, advanced technology and innovation for Visteon
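The finger-count gestures described in the quote amount to a mapping from a recognized gesture to a vehicle function. A minimal, hypothetical sketch of such a dispatch table (the function names and gesture assignments are illustrative, not Visteon's actual bindings):

```python
# Hypothetical mapping from a recognized finger-count gesture to a
# vehicle function, as described in the article. Names are illustrative.
GESTURE_ACTIONS = {
    1: "operate_windows",
    2: "change_audio_volume",
    3: "open_glove_box",
}

def dispatch(finger_count: int) -> str:
    """Return the action bound to a recognized finger-count gesture."""
    # Unrecognized gestures fall through to a safe no-op.
    return GESTURE_ACTIONS.get(finger_count, "no_action")

print(dispatch(2))  # change_audio_volume
```

In a production system the dispatch would also carry the driver/passenger distinction the article mentions, e.g. keying the table on (occupant, gesture) pairs to allow per-seat customization.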

Time-of-flight technology is based on the time it takes for light to travel from the source to the object and back to the camera’s sensor. By providing distance images in real time, the time-of-flight camera enables close-range gesture control in the cockpit.
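The underlying distance calculation is simple: the measured round-trip time of the emitted light, multiplied by the speed of light, gives twice the distance to the object. A minimal sketch of the per-pixel conversion:

```python
# Sketch of the time-of-flight distance principle: distance is half the
# round-trip travel time of the emitted light times the speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into a distance in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A hand about 0.5 m from the camera produces a round trip of ~3.34 ns,
# which illustrates the timing resolution a ToF sensor must achieve.
print(distance_from_round_trip(3.336e-9))
```

A ToF camera performs this conversion for every pixel simultaneously, which is what yields the real-time depth image the article refers to.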

Large field-of-view head-up display. Visteon delivers an extra-large windshield HUD image with rich color, contrast and brightness, enhancing content without requiring the user to look away from his or her usual viewpoint. The wide-field image—about twice the size of a normal windshield HUD—allows the driver to see information not usually displayed in HUD systems, such as menus for music, multimedia and simple maps. The full-color image is designed to be seen clearly even on very bright days, through a powerful backlight and large mirror inside the instrument panel.

The full-color display uses data analytics to understand the environment in and around the vehicle, such as rain or heavy traffic. As road and driving conditions change, the driver sees different information displayed on the windshield, prompting him or her to change speed, adjust climate controls, respond to warnings, navigate, select music or answer the phone – all while keeping his or her eyes on the road.

Contextual user experience cockpit. Several Visteon HMI technologies converge in this interactive cockpit simulator, including spatial gesture technology, pressure-sensitive touch pads and contextual or suggestive HMI—which helps drivers make quicker decisions based on past preferences and the changing environment around the vehicle. Features include:

  • Spatial gesture (swipe up, down, left, right, and rotary motions), available at significantly less cost than camera-based solutions.

  • Pressure-sensitive touch pad input zones for the driver and passenger, which sense the depth and location of button presses to trigger different features. Additionally, the surface can create a “virtual” touch screen and can also accommodate handwriting recognition. The pressure-sensitive pad works when the user is wearing gloves, and can accommodate various surface finishes, including leather, wood veneers, vinyl, plastic and composites.

  • Contextual or suggestive HMI—using data analytics, this feature creates shortcuts and suggestions, specific to each driver, based on past behavior for connectivity, entertainment, navigation and climate. For example, when the driver gets in the car to go home from work, he or she can set the destination for home, check traffic, call home, and turn on the defroster with one gesture, as opposed to navigating a menu for each option.

  • Time-of-Flight gesture system—Allows for more complex gestures such as hand/finger signs and swipes; enables the creation of virtual touch planes and/or surfaces in the vehicle.
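The contextual-HMI feature above amounts to mapping a recognized driving context (time, location, past behavior) onto a bundle of actions offered as a single shortcut. A hedged sketch of that idea, with entirely hypothetical context rules and action names:

```python
# Hypothetical contextual-HMI shortcut: given a simple (hour, location)
# context, bundle the driver's habitual actions into one suggestion.
# The rules and action names below are illustrative assumptions.
def suggest_shortcut(hour: int, location: str) -> list[str]:
    """Return a bundle of actions suggested for a recognized context."""
    if location == "work" and hour >= 17:
        # Evening commute: replay the end-of-day actions the article
        # describes (destination, traffic, call, defroster) as one gesture.
        return ["set_destination:home", "check_traffic",
                "call:home", "defroster_on"]
    # No learned shortcut for this context.
    return []

print(suggest_shortcut(18, "work"))
```

A real implementation would learn these bundles from logged behavior per driver rather than hard-coding them; the point of the sketch is only that one gesture can trigger the whole list instead of a menu traversal per item.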


