
Visteon’s Silicon Valley Technical Center to lead development of AI for autonomous vehicles

Visteon Corporation’s new technical center in Silicon Valley will lead the company’s development of artificial intelligence for autonomous vehicles. Visteon’s autonomous vehicle program will apply machine learning technology for accurately detecting and classifying objects in a vehicle’s path and planning the vehicle’s movements, resulting in fully trained driving control systems.

The recently opened facility in Santa Clara, California, will work closely with global Visteon tech centers to develop excellence in artificial intelligence software, advanced driver assistance systems (ADAS) and deep machine learning. These efforts will support Visteon’s approach to autonomous driving, which encompasses three key elements:

  • Creating fail-safe, centralized domain controller hardware leveraging Visteon’s industry-first cockpit domain controller, SmartCore. (Earlier post.)

  • Unlocking the innovation potential of algorithm developers through an easy-to-access open framework and test/simulation environment.

  • Applying artificial intelligence for object detection, classification, perception and decision-making in future autonomous vehicles.


    Most current advanced driver assistance systems based on radar and cameras are not capable of accurately detecting and classifying objects—such as cars, pedestrians or bicycles—at a level required for autonomous driving. We need to achieve virtually 100 percent accuracy for autonomous driving, which will require innovative solutions based on deep machine learning technology. Our Silicon Valley team, with its focus on machine learning software development, will be a critical part of our autonomous driving technology initiative.

    —Sachin Lawande, president and CEO of Visteon

Visteon’s recently opened facility in the heart of Silicon Valley will house a team of engineers specializing in artificial intelligence and machine learning. The center is located close to the West Coast offices of various automakers and tech companies, as well as Stanford University and the University of California, Berkeley—two of the leading universities for artificial intelligence and deep learning in the US.

In addition to leading Visteon’s artificial intelligence efforts, the Silicon Valley office will play a key role in delivering control systems, localization and vision processing—interpreting live camera data and converting it into the information required for autonomous driving. Visteon is targeting the launch of its first autonomous driving domain controller platform in 2018.

Leading Visteon’s artificial intelligence effort based in Silicon Valley is Vijay Nadkarni, who joined Visteon earlier this year from Chalkzen, Inc., where he developed and launched a SaaS (software as a service) and connected car services platform.

Visteon has been continually expanding its resources to support next-generation ADAS and autonomous driving development. On 1 April, Markus Schupfner joined the company as chief technology officer, bringing more than 20 years of experience leading software development for global automotive suppliers. Visteon recently announced that Matthias Schulze will join the company in January 2017 from Daimler AG; he will lead Visteon’s ADAS development, including overseeing the Silicon Valley facility.


Account Deleted

Humans have only two “cameras” in color (and some humans only black-and-white cameras), and we can still drive a car safely. I would not worry too much about what kind of sensors we need for self-driving cars. Some will make it work with cameras, radar and ultrasonics, like Tesla, and others will also rely on lidar.


Human drivers actively survey less than 180 degrees (in good weather conditions) and are responsible for about 2 million fatalities a year and roughly 10 times as many accidents.

ADVs with more (up to 10) sensors, to cover 360 degrees in all weather conditions, should do better than qualified human drivers by 2030 or so?

Dr. Strange Love

Too many obstacles and situations will not have been modeled. The human will have to be prepared to take over. Think of storms and fallen trees or power lines. Think of crowded streets, or Halloween partygoers on the crowded Georgetown streets of Washington, D.C. Think of a ditch, a muddy field, a sinkhole, or a reverse-camber snow-covered road next to a cliff with no guardrails.

I trust my own experience and instincts.


The learning period for ADVs may be longer than many posters expect, but by 2030 or so, ADVs will do better than the average human driver in most conditions.

ADVs on invisible ice patches (covered with light snow) may have the same difficulties as human drivers.

AWD ADVs may be an asset.


No one can accurately pick a year for major advances; that is just guessing. Right now people think we will have self-driving cars in a few years, and that is probably wrong.
