Nissan CEO announces LEAF with ProPilot autonomy; other autonomous and connected technologies & partnerships
06 January 2017
In his 2017 Consumer Electronics Show (CES) keynote, Nissan chairman of the board and chief executive officer Carlos Ghosn made five key announcements of new technologies and partnerships as part of the Nissan Intelligent Mobility blueprint for transforming how cars are driven, powered, and integrated into wider society.
Among these is the plan to launch a new Nissan LEAF with ProPILOT technology (earlier post), enabling autonomous drive functionality for single-lane highway driving. Nissan introduced ProPILOT on the new Serena in Japan last year. Employing advanced image-processing technology, ProPILOT understands road and traffic situations and executes precise steering, enabling the vehicle to perform naturally.
Nissan’s next step, planned for 2018, will be to add multiple-lane functionality, enabling automatic lane changes on highways. Stage three is autonomous city driving, which Nissan expects to be available in 2020. The fourth and final stage is fully autonomous and driverless vehicles.
This new LEAF will build on the company's leadership in electric vehicles, which includes more than 250,000 Nissan LEAFs sold worldwide since 2010. Ghosn said the new LEAF is coming in the near future and represents the next chapter of Nissan Intelligent Power.
Seamless Autonomous Mobility. To shorten the time it will take for fully autonomous vehicles to get on the road, Ghosn also announced a technology called “Seamless Autonomous Mobility,” or SAM. Developed from NASA technology, SAM partners in-vehicle artificial intelligence (AI) with human support to help autonomous vehicles make decisions in unpredictable situations and build the knowledge of in-vehicle AI. This technology could potentially enable millions of driverless cars to co-exist with human drivers in an accelerated timeline. This technology is part of Nissan Intelligent Integration.
A use case scenario: an autonomous vehicle is moving through city streets and comes across an accident, with police using hand signals to direct traffic, perhaps across double yellow lines and against traffic lights. The vehicle cannot, and should not, reliably judge what to do by itself.
Vehicle sensors (LIDAR, cameras, radars) can tell the car where obstacles are, the traffic light state, and even recognize some hand gestures, but human judgment is required to understand what other drivers and pedestrians are doing and decide on the appropriate course of action.
With SAM, the autonomous vehicle becomes smart enough to know when it should not attempt to negotiate the problem by itself, as in this instance. Instead, it brings itself to a safe stop and requests help from the command center.
The request is routed to the first available mobility manager—a person who uses vehicle images and sensor data (streamed over the wireless network) to assess the situation, decide on the correct action, and create a safe path around the obstruction.
The mobility manager does this by “painting” a virtual lane for the vehicle to drive itself through. When the policemen wave the vehicle past, the manager releases the car to continue on by itself along the designated route. Once clear of the area, the vehicle resumes fully autonomous operations, and the mobility manager is free to assist other vehicles calling for assistance.
As this is all happening, other autonomous vehicles in the area are also communicating with SAM. The system learns from the new information created by the mobility manager; once a solution is found, it is sent to the other vehicles.
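The escalation loop described above — safe stop, request to the command center, a human "painting" a virtual lane, and the solution being shared across the fleet — can be sketched in miniature. This is purely illustrative: all class and method names below are hypothetical stand-ins, not Nissan's actual SAM interfaces, and the "painted" path is a placeholder for what would really be human judgment over streamed sensor data.

```python
from dataclasses import dataclass


@dataclass
class Obstruction:
    """An ambiguous road situation the vehicle cannot resolve alone."""
    location: str


class MobilityManager:
    """Stands in for the human operator who reviews streamed imagery."""

    def paint_virtual_lane(self, obstruction: Obstruction) -> list:
        # In practice a human draws this path; here we return fixed waypoints.
        return [(0, 0), (1, 2), (2, 2), (3, 0)]


class CommandCenter:
    """Routes help requests and shares solutions fleet-wide."""

    def __init__(self, manager: MobilityManager):
        self.manager = manager
        self.fleet: list = []
        self.known_solutions: dict = {}

    def request_help(self, obstruction: Obstruction) -> list:
        # Reuse an existing solution if another vehicle already hit this spot,
        # so each situation needs a human only once.
        if obstruction.location not in self.known_solutions:
            path = self.manager.paint_virtual_lane(obstruction)
            self.known_solutions[obstruction.location] = path
            for vehicle in self.fleet:
                vehicle.learn(obstruction.location, path)
        return self.known_solutions[obstruction.location]


class AutonomousVehicle:
    def __init__(self, center: CommandCenter):
        self.center = center
        self.center.fleet.append(self)
        self.learned: dict = {}

    def learn(self, location: str, path: list) -> None:
        self.learned[location] = path

    def encounter(self, obstruction: Obstruction) -> list:
        # Known situation: proceed autonomously along the shared path.
        if obstruction.location in self.learned:
            return self.learned[obstruction.location]
        # Unknown situation: come to a safe stop and escalate.
        return self.center.request_help(obstruction)
```

In this toy model, the first vehicle to meet an obstruction ties up a mobility manager; every later vehicle at the same location proceeds without human help, which is the fleet-learning effect the article describes.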
As the system learns from experience, and autonomous technology improves, vehicles will require less assistance and each mobility manager will be able to guide a large number of vehicles simultaneously. There are several factors that will determine how many managers are necessary: for example, how busy the zone is, and what service the vehicle is providing, whether it's for robo-taxis, robo-shuttle, or a robo-delivery vehicle.
NASA’s Visual Environment for Remote Virtual Exploration (VERVE) open-source software, used to visualize and supervise interplanetary robots, was the starting point for Nissan’s SAM platform. NASA’s robots use autonomous technology to avoid obstacles and calculate safe driving paths through unpredictable and uncertain environments. Where the environment makes autonomous decision-making difficult, NASA supervisors draw the desired route and send it to the robot for execution.
The backbone of SAM is human/machine teaming. The goal is not to remove the human from the system, but rather to use the human intelligence more strategically to support a larger system of autonomous mobility—and to help improve the artificial intelligence of the vehicles in real-time.
SAM makes it possible for society to reap the benefits of the mass introduction of autonomous vehicles. In any single day, autonomous vehicles will encounter thousands of situations that should not be resolved autonomously. Without SAM, these vehicles will be stranded, causing traffic congestion, creating a public nuisance and failing to reach their destinations. SAM permits autonomous vehicles to integrate seamlessly into existing transportation infrastructure and society. SAM is a necessary component of any system with autonomous vehicles, Nissan argues. Without a technology such as SAM, the full integration of autonomous vehicles into society will be difficult.
SAM will also benefit companies that wish to deploy fleets of commercial autonomous vehicles, including delivery companies, taxi services and transportation systems.
Autonomy for commercial services. Taking the carmaker’s autonomous drive strategy another step further, Ghosn announced that Nissan and Japanese internet company DeNA, in an engagement led by Nissan for the Renault-Nissan Alliance, will begin tests aimed at developing driverless vehicles for commercial services.
The first phase of testing will begin this year in designated zones in Japan, with a focus on technology development. By 2020, Nissan and DeNA plan to expand the scope of their tests to include the commercial usage of driverless technology for mobility services in the Tokyo metropolitan area.
The Alliance will leverage its car-building expertise and advanced autonomous drive knowledge to build and provide the prototype vehicles, which will also be electric. DeNA will, in turn, provide its expertise in creating online and mobile user experiences to build and lead the information technology systems to provide a mobility service platform.
This is the first time the Alliance has announced a development plan for autonomous vehicles that includes driverless technology. The Alliance now has tests targeting all levels of autonomous drive in the United States, Europe, China and Japan.
Connected cars: innovations in intelligent driving & integration. On connected cars, which combine Nissan Intelligent Driving and Nissan Intelligent Integration, Ghosn announced that the Renault-Nissan Alliance is continuing its partnership on the development and deployment of advanced connected technologies, such as Microsoft Cortana, an in-vehicle virtual personal assistant. With features such as Cortana speech analytics, drivers can benefit from advanced in-vehicle voice recognition and an intuitive human-machine interface (HMI).
Cortana will allow the vehicle to adapt to personalized driver settings, even understanding different driver preferences in a shared vehicle.
The Renault-Nissan Alliance will develop and launch new connected services and applications that make it easier for people to stay connected to work, entertainment and social networks. It will also offer vehicle-centric services that can simplify and enhance engagement with the car through usage-based information, remote access, remote diagnostics and preventive maintenance.
The Renault-Nissan Alliance and Microsoft signed a global, multiyear contract focused on vehicle connectivity and connected services in September 2016. (Earlier post.)
Resilient Cities. To support the policy environment and planning needed to integrate these technologies into the world's cities, Ghosn announced a new partnership with 100 Resilient Cities - Pioneered by The Rockefeller Foundation (100RC). 100RC is a global non-profit working to help cities build resilience to physical, social and economic challenges. Together, Nissan and 100RC will help cities lay the groundwork for autonomous drive, electric vehicles, and new mobility services. Nissan is 100RC’s first automotive platform partner.
We invite others to join us, as well, from tech partners to e-commerce companies, ride-hailing and car-sharing platforms, and social entrepreneurs who can help us to test and develop new vehicles and services, and make sure everyone has access to the latest technologies and services that bring value to their lives.
—Carlos Ghosn
The SAM system outlined is the safe and smart way of introducing autonomy gradually.
Just like the Volvo system to trial this year, it makes intelligent use of a control centre.
But most importantly, it avoids the nonsense of expecting a driver to be ready to grab the wheel instantly, which is an inferior solution to no autonomy at all, as it of course adds the extra reaction time of the driver sussing that the automation is screwing up before taking avoiding action.
So the important thing is simply:
'it brings itself to a safe stop and requests help from the command center.'
Let's hope that in practice this is more like the Volvo system of safely parking up, as it could cause a jam if it just stops.
Posted by: Davemart | 06 January 2017 at 07:56 AM
Yeah, come back and tell me when you have a 20 mile range and then I'm interested in what else you have to say. Next.
Posted by: DaveD | 06 January 2017 at 02:57 PM
Sigh. Still no edit function on GCR after all these years. Clearly that was supposed to say "200 mile range".
Posted by: DaveD | 06 January 2017 at 04:47 PM
Yeah, the lack of edit function really bugs me.
Posted by: Brent Jatko | 09 January 2017 at 06:42 PM
The SAM thing is essential if you want to get the driver out of control.
You can imagine where you are trying to merge into busy traffic and have to barge out as the traffic won't give way. Probably need a human to do that; but the human doesn't need to be in the car.
However, it will add costs, especially if you need a lot of people around rush hour times.
Question: do you need different "call centres" for each city/country or could you base the whole thing in (say) India?
Posted by: mahonj | 10 January 2017 at 01:58 AM