Scania leading full-scale autonomous truck platoon project in Singapore; 4-truck convoys
January 11, 2017
Scania will design and operate the world’s first full-scale autonomous truck platooning operation, based on its own advanced technology. The platoons will travel on public roads while transporting containers between port terminals in Singapore. The aim is to organize convoys of four trucks, with the three trucks behind the lead truck driven autonomously, and to fully automate the processes for precise docking and undocking of cargo.
The multi-year project is organized by the Ministry of Transport and the Port of Singapore Authority (PSA Corporation). Toyota is also participating in this project.
Nissan unveils Vmotion 2.0 concept signaling future design, zero-emission and autonomous directions
January 10, 2017
At the 2017 North American International Auto Show, Nissan unveiled Vmotion 2.0, a new concept vehicle that signals the company’s future sedan design direction and Intelligent Mobility technology—Nissan’s roadmap to achieve zero emissions and zero fatalities. Nissan Vmotion 2.0 features Nissan Intelligent Driving, one of three core elements of Nissan Intelligent Mobility. Nissan Intelligent Driving helps provide a safe and comfortable driving experience through technology such as ProPILOT, which is envisioned to ultimately allow the vehicle to drive in autonomous mode—not just on the highway and in heavy traffic, but also on urban roads with intersections.
Nissan Vmotion 2.0 is the evolution of the “V-motion” front design signature seen on many current Nissan products, such as the Murano mid-size crossover and Maxima 4-door sports car. Vmotion 2.0 takes the design a step further by forming an intelligent three-dimensional shape to create the volume and architecture of the vehicle. In this concept, the V-motion grille becomes the main fuselage, allowing for extremely sharp yet expressive surface language, accented by crisp character lines that resonate throughout the body.
VW unveils fully autonomous, electric Microbus I.D. Buzz concept at Detroit; all-wheel drive, 270-mile range
January 09, 2017
Volkswagen is presenting a fully autonomous, electric Microbus concept at the North American International Auto Show in Detroit. Like the compact I.D. presented at the Paris Motor Show last year (earlier post), the I.D. Buzz is built off the Modular Electric Drive kit (MEB). With two-motor all-wheel drive, a fully autonomous driving mode (“I.D. Pilot”) and a new generation of display elements and controls, the concept follows the new Volkswagen brand strategy.
The all-wheel-drive system has a total output of 369 hp (275 kW) and an NEDC electric driving range of 600 kilometers (a predicted 270 miles on the U.S. driving cycle). One electric motor at the front axle and one at the rear each deliver 201 hp (150 kW), with power distributed between the two axles by an electric propshaft. The I.D. Buzz shown in Detroit can accelerate from 0 to 60 mph in about 5 seconds, with a governed top speed of 99 mph.
Koito and Quanergy collaborate to design automotive headlight concept with built–in solid-state LiDAR
January 07, 2017
Koito Manufacturing Co., Ltd., the largest global maker of automotive headlights, and Quanergy Systems, Inc., a leading provider of LiDAR sensors and smart sensing solutions, are collaborating to design an automotive headlight concept with built-in Quanergy S3 solid state LiDAR sensors (earlier post). The Koito headlight with built-in sensors is on display at CES 2017.
The Koito headlights, which will be located on the corners of a vehicle, each incorporate two compact Quanergy S3 solid-state LiDARs that sense forward and to the side, providing real-time, long-range 3D views of the environment around the vehicle and the ability to recognize and track objects.
Nissan CEO announces LEAF with ProPilot autonomy; other autonomous and connected technologies & partnerships
January 06, 2017
In his 2017 Consumer Electronics Show (CES) keynote, Nissan chairman of the board and chief executive officer Carlos Ghosn made five key announcements on technologies and partnerships that form part of the Nissan Intelligent Mobility blueprint for transforming how cars are driven, powered, and integrated into wider society.
Among these is the plan to launch a new Nissan LEAF with ProPILOT technology (earlier post), enabling autonomous drive functionality for single-lane highway driving. Nissan introduced ProPILOT on the new Serena in Japan last year. Employing advanced image-processing technology, ProPILOT understands road and traffic situations and executes precise steering, enabling the vehicle to perform naturally.
Honda introduces “Cooperative Mobility Ecosystem” at CES 2017; Honda Riding Assist, NeuV concept w/ AI emotion engine
Honda unveiled its Cooperative Mobility Ecosystem concept at CES 2017 in Las Vegas, connecting artificial intelligence, robotics and big data to transform the mobility experience of the future. Featuring a number of prototype and concept technology demonstrations at CES, the Honda concept envisions a future in which vehicles will communicate with each other and infrastructure to mitigate traffic congestion and eliminate traffic fatalities, while increasing the productivity of road users and delivering new types of in-vehicle entertainment experiences. Vehicles will create new value by autonomously providing services when not in use by their owners.
Honda also announced collaborations with Visa, DreamWorks Animation and start-ups through the Honda Developer Studio and Honda Xcelerator open innovation programs based out of Honda Silicon Valley Lab. Further, as part of its effort to accelerate open innovation in areas including AI, big data and robotics, Honda has established a new website. Interested companies and individuals can access the following URL: http://www.honda.co.jp/openinnovation/.
Next-gen Audi A8 to feature MIB2+, series debut of zFAS domain controller, Mobileye image recognition with deep learning; Traffic Jam Pilot
January 05, 2017
Audi’s next-generation A8, premiering this year, will feature the first implementation of the MIB2+ (Modular Infotainment Platform). The key element in this new implementation of the MIB is NVIDIA’s Tegra K1 processor (earlier post), which makes new functions possible and has the computing power needed to support several high-resolution displays—including the second-generation Audi virtual cockpit. Onboard and online information will merge, making the car part of the cloud to a greater degree than ever.
The A8 also marks the series debut of the central driver assistance controller (zFAS), which also features the K1; in the future, the X1 processor (earlier post) will be applied in this domain controller. The zFAS, developed in collaboration with TTTech, Mobileye, NVIDIA and Delphi, also integrates a Mobileye image processing chip. (Earlier post.)
Renesas Electronics and TTTech deliver highly automated driving platform
Renesas Electronics and TTTech Computertechnik AG, a global leader in robust networking and safety controls, have developed a highly automated driving platform (HADP). The new HADP is a prototype electronic control unit (ECU) for mass-production vehicles with integrated software and tools, demonstrating how Renesas and TTTech technologies can be combined in a real automotive environment for autonomous driving. The HADP accelerates the path to mass production for Tier 1 suppliers and OEMs.
The newly released HADP is the first outcome of the collaboration between TTTech and Renesas announced in January 2016 (earlier post), and is an extended version of the HAD solution kit released in October 2016. It is based on dual R-Car H3 system-on-chips (SoCs) (earlier post) and the RH850/P1H-C microcontroller (MCU).
Audi & NVIDIA partner to deliver fully automated driving with AI starting in 2020; piloted Q7 w/ neural network CES demo
Audi announced a partnership with NVIDIA to use artificial intelligence in delivering highly automated vehicles starting in 2020. Deep learning technology will enable skilled handling of real-road complexities, delivering safer automated vehicles earlier. The first phase of this expanded collaboration between the nearly decade-long partners focuses on NVIDIA DRIVE PX, which uses trained AI neural networks to understand the surrounding environment, and to determine a safe path forward. (Earlier post.)
Audi and NVIDIA have combined their engineering and visual computing technologies in the past on Audi innovations such as Audi MMI navigation and the Audi virtual cockpit. Later this year Audi will introduce the next-generation Audi A8 featuring Traffic Jam Pilot—the world’s first Level 3 automated vehicle (as defined by SAE International) equipped with a first-generation central driver assistance domain controller (zFAS) that integrates NVIDIA computing hardware and software. (Earlier post.)
BMW, Intel, Mobileye: 40 autonomous BMWs to be on road by 2H 2017; standards-based open platform for autonomy
BMW Group, Intel and Mobileye announced that a fleet of approximately 40 autonomous BMW vehicles will be on the roads by the second half of 2017, demonstrating the significant advancements made by the three companies towards fully autonomous driving. Revealing this at a podium discussion held during a joint press conference at CES, the companies further explained that the BMW 7 Series will employ advanced Intel and Mobileye technologies during global trials starting in the US and Europe.
In July 2016, BMW Group, Intel and Mobileye announced a collaboration to bring solutions for highly and fully automated driving into series production by 2021. The three said they would create a standards-based open platform—from door locks to the datacenter—for the next generation of cars. (Earlier post.) The companies have since developed a scalable architecture that can be adopted by other automotive developers and carmakers to pursue state of the art designs and create differentiated brands. The offerings scale from individual key integrated modules to a complete end-to-end solution providing a wide range of differentiated consumer experiences.
Renesas Electronics unveils RH850/V1R-M automotive radar solution for ADAS and autonomous driving vehicles
January 04, 2017
Advanced semiconductor supplier Renesas Electronics Corporation introduced the RH850/V1R-M—the first product in its new RH850-based, 32-bit automotive radar microcontroller (MCU) series—which delivers the high performance and features required to enable future advanced driver assistance systems (ADAS) and autonomous driving vehicles. The RH850/V1R-M includes a digital signal processor (DSP) and high-speed serial interfaces and is specifically designed for middle- to long-range radar.
Vehicles are being equipped with a broad spectrum of sensors such as cameras, LiDAR and ultrasonic sensors to support expanded advanced driver assistance systems (ADAS) and emerging autonomous driving functionality. Radar sensors are needed for ADAS applications—including advanced emergency braking and adaptive cruise control—because, unlike other sensors, radar is largely unaffected by environmental conditions such as rain, fog or changing light.
Qualcomm introducing Drive Data platform for sensor fusion
Qualcomm is introducing the Qualcomm Drive Data Platform to intelligently collect and analyze information from a vehicle’s sensors. Cars will be able to determine their location with lane-level accuracy, monitor and learn driving patterns, perceive their surroundings, and share this reliable and accurate data with the rest of the world.
These capabilities will be key for many connected car applications, from shared mobility and fleet management to 3D high-definition mapping and automated driving. Qualcomm Drive Data platform is built on three pillars: heterogeneous connectivity; precise positioning; and on-device machine learning, all integrated into the Qualcomm Snapdragon solution.
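The “precise positioning” pillar typically rests on fusing complementary position sources. As a generic illustration (not Qualcomm’s implementation), the sketch below combines two noisy estimates of the same quantity, say GNSS and camera-based lane localization, by inverse-variance weighting, a standard sensor-fusion building block; all numbers are hypothetical.

```python
def fuse(est_a, var_a, est_b, var_b):
    """Combine two independent estimates of the same quantity.

    Each estimate is weighted by the inverse of its variance; the
    fused variance is smaller than either input variance.
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical lateral position in the lane (meters): GNSS reports
# 1.8 m with variance 1.0 m^2; vision-based lane detection reports
# 1.2 m with variance 0.04 m^2.
pos, var = fuse(1.8, 1.0, 1.2, 0.04)
```

The fused variance is always smaller than either input, which is why combining even a coarse source with a precise one still improves the estimate.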
FCA unveils Chrysler Portal concept: next-gen family transportation for Millennials; electric, semi-autonomous, connected, flexible
January 03, 2017
FCA US unveiled the Chrysler Portal concept, an all-electric, semi-autonomous (SAE Level 3) interpretation of the “fifth generation” of family transportation focused toward the millennial generation. (Earlier post.)
Millennials—the demographic cohort following Generation X, defined here as people born between 1982 and 2001—are tech-savvy, environmentally aware and cost-conscious. To balance those needs and to be an integral part of millennials’ lives, Chrysler designed the Portal concept to maximize interior space with an impressive degree of flexibility between seating and cargo configurations; to enable users to include and expand their social media communities with seamless wireless integration between the vehicle and mobile devices; to take advantage of advances in battery-electric powertrains and the growth of a rapid recharging network; and to deliver these capabilities in a powerful, high-tech form.
Intel to acquire 15% of HD-mapping company HERE; collaborating on technology for autonomous vehicles and IoT
Intel will purchase a 15% ownership stake in HERE, a global provider of high-definition digital maps and location-based services, from HERE’s current indirect shareholders: Audi AG, BMW AG and Daimler AG. The three automakers acquired HERE from Nokia in 2015 (earlier post) with the goal of securing the long-term availability of HERE’s products and services as an open, independent and value-creating platform for cloud-based maps and other mobility services accessible to all customers from the automotive industry and other sectors. The three automakers hold an equal stake in HERE.
In conjunction with Intel’s acquisition of a stake in HERE, the two companies also signed an agreement to collaborate on the research and development of a highly scalable proof-of-concept architecture that supports real-time updates of high definition (HD) maps for highly and fully automated driving.
Volvo Cars and Autoliv announce the launch of Zenuity JV for AD and ADAS; first ADAS products in 2019
Volvo Cars and Autoliv signed a final agreement to establish a new joint venture called Zenuity to develop software for autonomous driving (AD) and driver assistance systems (ADAS), based on the letter of intent announced during fall 2016. (Earlier post.)
Both Volvo Cars and Autoliv will license and transfer the intellectual property for their ADAS systems to the joint venture. From this base, the company will develop new ADAS products and AD technologies. The new company is expected to have its first driver assistance products available for sale by 2019 with autonomous driving technologies following shortly thereafter.
MIT CSAIL, Cornell study finds ride-sharing theoretically could cut taxi traffic in NYC by 75%
A new modeling study by a team from MIT CSAIL (Computer Science and Artificial Intelligence Laboratory) and Cornell suggests that using ride-sharing from companies like Uber and Lyft theoretically could reduce the number of taxis on the road in New York City by 75% without significantly impacting travel time. A paper on their work will be published this week in Proceedings of the National Academy of Sciences (PNAS).
Led by Professor Daniela Rus of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), the researchers developed a dynamic ride-sharing algorithm that found that 3,000 four-passenger cars could serve 98% of taxi demand in New York City, with an average wait time of only 2.7 minutes. The team also found that 95% of demand would be covered by just 2,000 ten-person vehicles, compared to the nearly 14,000 taxis that currently operate in New York City.
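For intuition only, here is a toy greedy pooling sketch, not the authors’ algorithm (which solves a far richer vehicle-to-request matching problem over real trip data): trips are modeled as intervals along a single avenue, and up to four trips that overlap along the route share one car. All trip data below are made up.

```python
def pool_trips(trips, capacity=4):
    """Greedily pack trips into shared cars.

    Sort trips by start point, then place each trip in the first car
    whose current riders all still share a stretch of road with it.
    Returns the number of cars needed.
    """
    vehicles = []  # each entry: [shared_start, shared_end, rider_count]
    for start, end in sorted(trips):
        for v in vehicles:
            # Trips are sorted by start, so overlapping the shared
            # span only requires start < shared_end.
            if v[2] < capacity and start < v[1]:
                v[0] = max(v[0], start)
                v[1] = min(v[1], end)
                v[2] += 1
                break
        else:
            vehicles.append([start, end, 1])
    return len(vehicles)

# Six hypothetical trips (start, end) along one avenue:
trips = [(0, 30), (5, 25), (10, 40), (12, 35), (50, 80), (55, 90)]
cars_needed = pool_trips(trips)  # 2 shared cars instead of 6 solo rides
```

Even this crude heuristic cuts the fleet by two-thirds on the example; the study’s algorithm additionally re-routes vehicles in real time and bounds each rider’s added delay.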
Texas Automated Vehicle Proving Ground Partnership forms; applying for national designation
January 02, 2017
Cities and regions across Texas are partnering with the Texas A&M Transportation Institute (TTI), the University of Texas at Austin’s Center for Transportation Research (CTR), and Southwest Research Institute (SwRI) to form the Texas Automated Vehicle (AV) Proving Ground Partnership.
The partnership builds upon the momentum of the US Department of Transportation (USDOT) Smart City Challenge, in which Austin was a finalist, and is a direct outcome of the Texas Mobility Summit held 1–2 December 2016. The summit, hosted by the Texas Department of Transportation (TxDOT) Texas Technology Task Force, brought together nine teams representing 10 cities and three research institutions to galvanize key leadership in developing innovative solutions to the state’s mobility challenges. The teams are committed to continuing the collaboration, beginning with leveraging their collective resources, expertise and opportunities to advance AV technology.
Lucid Motors chooses Mobileye as partner for autonomous vehicle technology
December 30, 2016
Luxury EV developer Lucid Motors and Mobileye N.V. announced a collaboration to enable autonomous driving capability on Lucid vehicles. Lucid will launch its first car, the Lucid Air (earlier post), with a complete sensor set for autonomous driving from day one, including camera, radar and lidar sensors.
Lucid chose Mobileye to provide the primary compute platform, full 8-camera surround view processing, sensor fusion software, Road Experience Management (REM) crowd-based localization capability, and reinforcement learning algorithms for Driving Policy. These technologies will enable a full Advanced Driver Assistance System (ADAS) suite at launch, and then enable a logical and safe transition to autonomous driving functionality through over-the-air software updates.
Infiniti’s QX50 Concept to debut in Detroit; potential application for VC-Turbo variable compression ratio engine; autonomous drive support
Infiniti will stage the world premiere of the QX50 Concept mid-size premium SUV at the 2017 North American International Auto Show in Detroit; the concept showcases the brand’s vision for a next-generation mid-size premium SUV and also demonstrates a potential application for Infiniti’s production-ready VC-Turbo variable compression ratio engine (earlier post).
Infiniti’s QX50 Concept also previews the brand’s roll out of its autonomous drive support technologies. Central to the strategy for the development of all future autonomous drive support systems is that they should ensure the driver retains ultimate control over the vehicle—in keeping with Infiniti’s focus on driver engagement—while providing a proactive approach to safety.
Rinspeed Oasis autonomous electric concept first to be built on ZF Intelligent Rolling Chassis
At CES 2017 in Las Vegas next week, Rinspeed will unveil its Oasis autonomous electric concept vehicle. The two-seat runabout—somewhat reminiscent of a modern interpretation of the famous Star Wars icon R2-D2—can turn on its wheels with an almost zero turning radius thanks to a special steering angle, two in-wheel electric motors and torque vectoring, all developed by ZF on Lake Constance.
The “Intelligent Rolling Chassis” (IRC) from ZF offers a highly flexible platform for urban electric vehicles and features ZF’s all-electric drive based on the electric twist beam (eTB) rear axle. The aluminum electric motors at the rear axle are integrated with a single speed transmission that enables the Oasis to accelerate to 100 km/h in roughly nine seconds and, if needed, up to a speed of 150 km/h (93 mph).
HERE and Mobileye to partner on crowd-sourced HD mapping for automated driving
December 29, 2016
High-definition (HD) mapping company HERE and Mobileye, developer of computer vision and machine learning, data analysis, localization and mapping for Advanced Driver Assistance Systems and autonomous driving, plan to enter a strategic partnership that links their respective autonomous driving technologies into an enhanced industry-leading offering for automakers.
Under the partnership, Mobileye’s Roadbook—a detailed, cloud-based map of localized drivable paths and visual landmarks constantly updated in real-time—will be integrated as a data layer in HERE HD Live Map, HERE’s real-time cloud service for partially, highly and fully automated vehicles. Roadbook information will provide an important additional layer of real-time contextual awareness by gathering landmark and roadway information to assist in making a vehicle more aware of—and better able to react to—its surroundings, as well as to allow for more accurate vehicle positioning on the road.
Volvo Cars adds Microsoft’s Skype for Business to 90 Series; looking ahead to autonomous cars
Volvo Cars will introduce Skype for Business, Microsoft’s collaborative productivity app, to its new 90 Series cars. Volvo Cars is the first carmaker to launch such an in-car productivity tool.
Skype for Business is actively used by millions of people at work around the globe. In Volvo’s 90 Series cars people will be able to view their upcoming meetings and participant details, and join meetings with one click via the large center display.
Ford introducing next-gen Fusion Hybrid autonomous development vehicle at CES and NAIAS in January
December 28, 2016
Ford Motor Company is introducing its next-generation Fusion Hybrid autonomous development vehicle; the car will first appear at CES 2017 and the North American International Auto Show in January. The new vehicle uses the current Ford autonomous vehicle platform, but ups the processing power with new computer hardware.
Electrical controls are closer to production-ready, and adjustments to the sensor technology, including placement, allow the car to better see what’s around it. New LiDAR sensors have a sleeker design and more targeted field of vision, which enables the car to now use just two sensors rather than four, while still getting just as much data.
TriLumina to demo 256-pixel 3D solid-state LiDAR and ADAS systems for autonomous driving at CES 2017
December 27, 2016
At CES 2017, TriLumina (earlier post)—a spin-out from Sandia National Laboratories—will demonstrate, in collaboration with LeddarTech (earlier post), an innovative 256-pixel, 3D LiDAR solution for autonomous driving applications powered by TriLumina’s breakthrough laser illumination module and LeddarTech’s LeddarCore ICs.
TriLumina has developed eye-safe, vertical-cavity surface-emitting lasers (VCSELs). The TriLumina illumination modules replace the expensive, bulky scanning LiDARs used in current autonomous vehicle demonstration programs, providing high-resolution, long-range sensing in a small, robust and cost-effective package.
U of Waterloo Autonomoose autonomous vehicle on the road in Canada
December 23, 2016
Researchers from the University of Waterloo Center for Automotive Research (WatCAR) in Canada are modifying a Lincoln MKZ Hybrid to autonomous drive-by-wire operation. The research platform, dubbed “Autonomoose,” is equipped with a full suite of radar, sonar, lidar, inertial and vision sensors; the NVIDIA DRIVE PX 2 AI platform (earlier post) to run a complete autonomous driving system, integrating sensor fusion, path planning, and motion control software; and a custom autonomy software stack being developed at Waterloo as part of the research.
Recently, the Autonomoose autonomously drove a crew of Ontario Ministry of Transportation officials to the podium of a launch event to introduce the first car approved to hit the roads under the province’s automated vehicle pilot program.
Honda R&D and Alphabet’s Waymo enter discussions on technical collaboration on full autonomous vehicles
December 22, 2016
Honda R&D Co., Ltd., the R&D subsidiary of Honda Motor Co., Ltd., is entering into formal discussions with Waymo, an independent company of Alphabet Inc. (earlier post), to integrate its self-driving technology with Honda vehicles. This technical collaboration between Honda researchers and Waymo’s self-driving technology team would allow both companies to learn about the integration of Waymo’s fully self-driving sensors, software and computing platform into Honda vehicles.

As part of the discussion on technical collaboration, Honda could initially provide Waymo with vehicles modified to accommodate Waymo’s self-driving technology. These vehicles would join Waymo’s existing fleet, which is currently being tested in four US cities. (Waymo will also receive 100 Chrysler Pacifica PHEV minivans from FCA built uniquely to support self-driving operation. (Earlier post.) Waymo CEO John Krafcik will have one of the autonomous PHEV minivans onstage for his keynote at NAIAS in January.)
MIT systems analysis identifies optimal vehicle platooning strategies
December 21, 2016
MIT engineers have studied a simple vehicle-platooning scenario and determined the best ways to deploy vehicles in order to save fuel and minimize delays. Their analysis, presented this week at the International Workshop on the Algorithmic Foundations of Robotics (WAFR 2016), shows that relatively simple, straightforward schedules may be the optimal approach for saving fuel and minimizing delays for autonomous vehicle fleets. The findings may also apply to conventional long-distance trucking and even ride-sharing services.
Platooning for heavy-duty trucks is on the cusp of market introduction. Peloton plans to bring its first platooning system to market in 2017, and ARPA-E is funding research into platooning technology as part of its NEXTCAR project. (Earlier post.)
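A back-of-the-envelope sketch of one quantity such an analysis weighs: followers in a platoon save fuel by drafting while the lead truck saves nothing, so the per-trip saving grows with platoon size. The drafting-saving fraction and fuel-burn rate below are illustrative assumptions, not figures from the MIT paper.

```python
# Illustrative assumptions (not from the paper):
DRAFT_SAVING = 0.10  # assumed fractional fuel saving for a follower truck
FUEL_PER_KM = 0.33   # assumed liters/km burned by a loaded heavy truck solo

def follower_fuel_saved_liters(trip_km, n_trucks):
    """Fuel saved on one trip when n_trucks platoon together.

    The lead truck saves nothing; each of the (n_trucks - 1) followers
    saves DRAFT_SAVING of its solo fuel burn over the trip.
    """
    followers = max(n_trucks - 1, 0)
    return followers * DRAFT_SAVING * FUEL_PER_KM * trip_km

# A 500 km trip run as a 3-truck platoon:
saved = follower_fuel_saved_liters(trip_km=500, n_trucks=3)  # 33.0 liters
```

The scheduling question the analysis addresses is when that saving outweighs the delay cost of waiting at a hub for enough trucks to form the platoon.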
FCA delivers 100 uniquely built Chrysler Pacifica PHEVs to Waymo for autonomous driving test fleet
December 19, 2016
Waymo (formerly the Google self-driving car project, earlier post) and FCA announced that production of 100 Chrysler Pacifica Hybrid minivans (earlier post) uniquely built to enable fully self-driving operations has been completed.
The plug-in hybrid (PHEV) vehicles are currently being outfitted with Waymo’s fully self-driving technology, including a purpose-built computer and a suite of sensors, telematics and other systems, and will join Waymo’s self-driving test fleet in early 2017. Waymo and FCA also revealed today the first images of the fully self-driving Chrysler Pacifica Hybrid vehicle.
LeddarTech showcasing 2D and 3D solid-state LiDARs for mass-market autonomous driving deployments; Leddar Ecosystem
December 16, 2016
At CES 2017, LeddarTech will be showcasing 2D and 3D high-resolution LiDAR solutions for autonomous driving applications based on its next-generation LeddarCore ICs and developed with the collaboration of leading-edge suppliers and partners from the newly-established Leddar Ecosystem. (Earlier post.)
Presented publicly for the first time, these systems demonstrate the scalability of Leddar technology and its ability to meet the high levels of performance, resolution and cost-effectiveness required by Tier 1 suppliers and OEMs for mass-market autonomous driving applications. Production versions of these LiDAR systems will offer resolutions of up to 512×64 over a field of view of 120×20 degrees, with detection ranges exceeding 200 m for pedestrians and 300 m for vehicles.
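As a quick sanity check on the quoted specifications, simple division gives the implied angular resolution per pixel (assuming uniform pixel spacing across the field of view, which a real sensor may not have).

```python
# 512 x 64 pixels over a 120 x 20 degree field of view:
h_res_deg = 120 / 512  # horizontal degrees per pixel, ~0.23 deg
v_res_deg = 20 / 64    # vertical degrees per pixel, ~0.31 deg
```

Roughly a quarter-degree per pixel horizontally; at 200 m that corresponds to a spacing on the order of 0.8 m between adjacent returns, which frames why pedestrian detection at that range is the harder spec.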
BMW Group and IBM collaborate on research on future driver assistance systems; IBM Watson IoT
December 15, 2016
IBM and BMW Group researchers are collaborating to explore the role of Watson cognitive computing in personalizing the driving experience and creating more intuitive driver support systems for cars of the future. As part of an agreement between the two companies, the BMW Group will collocate a team of researchers at IBM’s global headquarters for Watson Internet of Things (IoT) in Munich, Germany, and the companies will work together to explore how to improve intelligent assistant functions for drivers.
IBM recently pledged to invest US$200 million to make its new Munich center one of the world’s most advanced facilities for collaborative innovation as part of a global investment of US$3 billion to bring Watson cognitive computing to the Internet of Things.
Uber launches self-driving pilot in San Francisco with Volvo Cars
December 14, 2016
Uber is expanding its self-driving pilot to San Francisco, California, using specially-converted self-driving Volvo XC90 premium SUVs. This marks the next phase in a deepening alliance between Volvo and Uber after the two companies signed an agreement in August 2016 to establish a jointly-owned project to build base vehicles that can be used to develop fully autonomous driverless cars. (Earlier post.) These cars were initially tested in Pittsburgh, Pennsylvania.
The latest cars to be used in San Francisco have been built by Volvo and sold to Uber, after which Uber’s own self-driving hardware and software package has been added, most visibly in the roof-mounted control apparatus. Volvo Cars and Uber are contributing a combined US$300 million to the project. Both Uber and Volvo will use the same base vehicle for the next stage of their own autonomous car strategies.
Visteon’s Silicon Valley Technical Center to lead development of AI for autonomous vehicles
December 13, 2016
Visteon Corporation’s new technical center in Silicon Valley will lead the company’s development of artificial intelligence for autonomous vehicles. Visteon’s autonomous vehicle program will apply machine learning technology for accurately detecting and classifying objects in a vehicle’s path and planning the vehicle’s movements, resulting in fully trained driving control systems.
The recently opened facility in Santa Clara, California, will work closely with global Visteon tech centers to develop excellence in artificial intelligence software, advanced driver assistance systems (ADAS) and deep machine learning. These efforts will support Visteon’s approach to autonomous driving, which encompasses three key elements.
Velodyne LiDAR announces new design for miniaturized, low-cost solid-state LiDAR sensors using GaN technology
Velodyne LiDAR announced a new design for a solid-state LiDAR sensor that can deliver a subsystem cost of less than US$50 when sold in high-volume manufacturing scale. The technology will impact the proliferation of LiDAR sensors in multiple industry sectors, including autonomous vehicles, ridesharing, 3D mapping, and drones.
LiDAR sensors that leverage this new design will be less expensive, easier to integrate due to their smaller size, and more reliable as a result of fewer moving parts. The technology can also be integrated in Velodyne LiDAR’s existing Puck form factors.
Hyundai Motor accelerating development of its connected car operating system
December 09, 2016
Hyundai Motor is accelerating development of its advanced ccOS (connected car Operating System), the core platform technology for its future connected cars. The software will optimize the high-speed transmission and reception of data within the vehicle to support increasingly complex features that will lead the connected car market.
Hyundai outlined the development of ccOS in November and has established an Infotainment Software Development Team in its Namyang Research and Development Center dedicated solely to developing ccOS technologies. At the time, Hyundai said that ccOS would incorporate open-source software, including the open In-Vehicle Infotainment (IVI) platform of the GENIVI Alliance.
Daimler joining MIT CSAIL Alliance Program for AI work; cognitive vehicles
December 07, 2016
Daimler is becoming a new member of the MIT CSAIL Alliance Program. MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is the largest research laboratory at MIT and one of the world’s most important centers of information technology research. With 1,000 members and more than 100 principal investigators coming from eight departments, CSAIL includes approximately 50 research groups organized into three focus areas: artificial intelligence, systems and theory.
Key CSAIL initiatives currently underway include tackling the challenges of big data, developing new models for wireless and mobile systems, securing computers and the cloud against cyber attacks, rethinking the field of artificial intelligence, and developing the next generation of robots. CSAIL Alliances is a gateway into the lab for organizations seeking a closer connection to the work, researchers and students of CSAIL.
Audi sponsors AI conference NIPS 2016, demonstrates how car develops intelligent parking strategies
Audi is a sponsor this year of the annual Conference and Workshop on Neural Information Processing Systems (NIPS) and is showcasing its expertise at the conference on artificial intelligence for the first time. Throughout this week, the automaker is showing, with the aid of a 1:8 scale model—the “Audi Q2 deep learning concept”—how a car develops intelligent parking strategies. On an area measuring 3 x 3 meters, the Q2 autonomously searches for and finds a suitable parking space in the form of a metal frame, and then parks itself there.
Self-learning systems are a key technology for piloted-driving cars, and Audi has already built up a body of know-how in machine learning. The company is the only automaker represented at NIPS 2016 with its own stand and a showcase. (Although engineers from Daimler are presenting a demonstration at NIPS 2016 on the detection of obstacles by an autonomous car.)
Honda to show EV concept with AI emotion engine from joint project with Softbank
December 06, 2016
At CES 2017, Honda will showcase what it calls a future technology path toward a redefined mobility experience. The exhibit will include the NeuV, a concept automated EV commuter vehicle equipped with artificial intelligence (AI) called the “emotion engine” that creates new possibilities for human interaction and new value for customers.
The emotion engine is the focus of a joint research project with Softbank Corporation that Honda announced in July 2016 to apply the AI technology in mobility products. (Earlier post.) The “emotion engine” is a set of AI technologies developed by cocoro SB Corp., which enable machines to artificially generate their own emotions.
Nissan introduces driverless towing system at Oppama Plant using modified LEAF
December 05, 2016
Nissan Motor has introduced Intelligent Vehicle Towing (IVT), a fully automated vehicle towing system, at its Oppama Plant. The IVT system uses a modified Nissan LEAF to autonomously tow trolleys carrying finished vehicles between designated loading and unloading points at the plant.
Unlike conventional automatic guided vehicle systems for transporting parts, which often require the installation of rails or extensive use of magnetic tape, this system does not need any special infrastructure to operate. This new project, which utilizes mapping and communication technologies to link an intelligent and all-electric car to infrastructure, is a step towards the realization of Nissan Intelligent Integration, a component of Nissan’s Intelligent Mobility vision.
BMW Group expands BMW i Ventures role with new €500M fund; widened scope of investment, greater independence
December 01, 2016
The BMW Group is expanding the remit of its BMW i Ventures venture capital unit and creating a new fund of up to €500 million (US$531 million) over ten years to support it. The new fund will allow BMW i Ventures to make investments in a wider range of areas, such as autonomous driving and digitalization, and to secure continued access to the technologies of the future.
BMW i Ventures’ previous focus on mobility services and electro-mobility will be expanded to cover the BMW Group’s full innovation spectrum in all areas of Strategy Number ONE > NEXT, even those outside of the traditional automotive value chain. Future topics for exploration will focus on “Enabling Technology and Digital Vehicle Technology”, “Mobility and Digital Services”, “Customer Experience” and “Advanced Production Technology”.
ORNL study finds even low penetration of CAVs delivers significant fuel economy benefits, but increases travel time slightly
A new study by a team at Oak Ridge National Laboratory (ORNL) suggests that low penetration rates of connected and automated vehicles (CAVs) can deliver significant fuel consumption benefits, but that total travel time increases slightly. The study, presented at the 9th ACM SIGSPATIAL International Workshop on Computational Transportation Science, also found that travel time benefits increase with higher penetration rates of CAVs.
The study—one of the first that captures the impact of different penetration rates of CAVs on fuel consumption and travel time—builds off of earlier work done by the team of Jackeline Rios-Torres and Andreas Malikopoulos to develop an optimization framework and an analytical closed-form solution that addresses the problem of optimally coordinating connected and automated vehicles (CAVs) at merging roadways to achieve smooth traffic flow without stop-and-go driving. (Earlier post.)
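As a simplified illustration of this kind of coordination (the scheduling rule and headway value below are assumed for illustration only and are not the ORNL optimization framework itself), a first-come-first-served scheduler can assign each CAV the earliest merge-zone entry time that preserves a safe time gap behind the previously scheduled vehicle:

```python
SAFE_HEADWAY_S = 1.5  # assumed minimum time gap at the merge point, seconds

def schedule_merge(arrival_times):
    """Given each CAV's earliest possible arrival time at the merge zone,
    return conflict-free entry times, assigned in arrival order."""
    scheduled = []
    for t in sorted(arrival_times):
        if scheduled and t < scheduled[-1] + SAFE_HEADWAY_S:
            t = scheduled[-1] + SAFE_HEADWAY_S  # delay just enough to keep the gap
        scheduled.append(t)
    return scheduled

print(schedule_merge([10.0, 10.5, 13.0]))  # [10.0, 11.5, 13.0]
```

Because entry times are smoothed rather than forced to zero, each vehicle's speed profile can stay continuous through the merge, which is what eliminates the stop-and-go driving the study targets.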
Delphi & Mobileye to showcase Centralized Sensing Localization and Planning (CSLP) autonomous driving system in public demo at CES 2017
November 30, 2016
Delphi Automotive PLC and Mobileye will showcase their Centralized Sensing Localization and Planning (CSLP) automated driving system—which will be ready for production by 2019—on a 6.3-mile urban and highway combined public route in Las Vegas for CES 2017. (Earlier post.)
The partners said that CSLP is the first turnkey, fully integrated automated driving solution with an industry-leading perception system and computing platform. (Intel will provide the system-on-a-chip (SOC) for the systems.) The Las Vegas drive will tackle everyday driving challenges such as highway merges, congested city streets with pedestrians and cyclists, and a tunnel.
Department of Transportation seeking proposals for automated vehicle proving grounds pilot
November 28, 2016
The US Department of Transportation (DOT) is requesting proposals from applicants (DOT-OST-2016-0233) to form an initial network of multiple proving grounds focused on the advancement of autonomous vehicle technology. The proving grounds will develop and share best practices around the safe testing, demonstration and deployment of autonomous vehicle technology.
The selected proving grounds will be designated “USDOT Automated Vehicle Proving Grounds.” DOT anticipates that the designation will encourage new levels of public safety while contributing to a foundation able to transform personal and commercial mobility and provide new opportunities to disadvantaged people and communities.
Volkswagen’s 10-year evolution of Park Assist; heading toward trained parking and higher levels of autonomy
November 26, 2016
Volkswagen first introduced a parking assistance system based on ultrasonic sensors in the early 1990s. However, it was the “Park Assist” Gen 1 system presented in the Touran in 2007 that marked a foundational point in the commercial development of the technology. After it was activated, Park Assist was able to detect parallel parking spaces on the left and right sides of the road as the car passed them using special, side-oriented ultrasonic sensors, enabling semi-automatic parking for the first time.
Volkswagen engineers have continued to enhance the functionality, leading to the release of Gen 3 Park Assist in 2014, with a clear roadmap to the deployment of higher levels of autonomy, including trained parking: fully automated parking with a one-off training process. At a recent visit to Volkswagen’s Ehra proving ground (Prüfgelände Ehra), Green Car Congress had the opportunity to see a prototype of trained parking in action.
Juniper Research: taxi sector to lead self-driving market to >22M consumer vehicles on the road by 2025
November 23, 2016
New findings from Juniper Research project that the annual production of self-driving cars will reach 14.5 million in 2025, up significantly from only a few thousand in 2020, resulting in a global installed base of more than 22 million consumer vehicles by 2025.
The new research, Autonomous Vehicles & ADAS: Adoption, Regulation & Business Models 2016-2025, found that the market adoption of AV (Autonomous Vehicle) technology is set to accelerate over the next few years, driven by increasingly stringent vehicle safety specifications; environmental pressures; and rapid technological developments.
nuTonomy to test its self-driving cars on specific public roads in Boston
November 21, 2016
nuTonomy, developer of software for self-driving cars, has signed a Memorandum of Understanding (MOU) with the City of Boston and the Massachusetts Department of Transportation that authorizes nuTonomy to begin testing its growing fleet of self-driving cars on specific public streets in a designated area of Boston.
nuTonomy will begin testing its self-driving Renault Zoe electric vehicle before the end of the year in the Raymond L. Flynn Marine Park in the Seaport section of the city. nuTonomy outfits its vehicles with a software system which has been integrated with high-performance sensing and computing components to enable safe operation without a driver. The company’s autonomous and robotics technology system grew out of research conducted in MIT labs run by nuTonomy co-founders Karl Iagnemma and Emilio Frazzoli.
U-Michigan, China enter two new automated and connected vehicle partnerships
November 17, 2016
The University of Michigan is entering two separate agreements with Chinese institutions targeting automated and connected vehicles. Together with a third new agreement focused on clean water, the three agreements add up to more than $54 million to advance research in these key areas.
First, a $27-million research agreement with Shenzhen-based investment firm Frontt Capital Management will advance autonomous, connected vehicles and robotic technologies. This agreement puts in place measures that U-M and Frontt agreed to in a memorandum of understanding signed last month in China. It establishes a Joint Research Center for Intelligent Vehicles at U-M. It contributes toward construction of the recently approved Robotics Laboratory and a vehicle garage on U-M’s North Campus near Mcity, the simulated urban-suburban environment for testing connected and automated vehicles.
Hyundai introduces new autonomous IONIQ concept at AutoMobility LA
November 16, 2016
Hyundai Motor Company introduced the Autonomous IONIQ concept during its press conference at AutoMobility LA (Los Angeles Auto Show). With a design resembling the rest of the IONIQ lineup (earlier post), the vehicle is one of the few self-driving cars in development to have a LiDAR system hidden in its front bumper instead of installed on the roof, enabling it to look like any other car on the road and not a high school science project.
Hyundai’s goal for the autonomous IONIQ concept was to keep the self-driving systems as simple as possible. This was accomplished by using the production car’s forward-facing Smart Cruise Control radar and Lane Keep Assist cameras, and integrating them with LiDAR technology.
Mitsubishi unveils new electric compact SUV concept; 400 km range
Mitsubishi Motors North America, Inc. (MMNA) introduced the Mitsubishi eX Concept, a compact SUV with a next-generation EV system, at the 2016 Los Angeles Auto Show. The concept car is a showcase of Mitsubishi’s electric vehicle (EV) technologies, a new iteration of the Dynamic Shield front design concept, autonomous driving capabilities, Artificial Intelligence (AI) as well as a range of other technologies.
The system is configured with a new drive battery that greatly improves on the energy density of previous batteries, together with compact, high-output front and rear motors. Combined with the lighter weight and higher efficiency of the new EV system, an uncompromising reduction in body weight gives the Mitsubishi eX Concept a cruising range of 400 km (248.5 miles).
U-M offers open-access automated cars to advance driverless research
New University of Michigan research vehicles will be open testbeds enabling academic and industry researchers to rapidly test self-driving and connected vehicle technologies at a world-class proving ground. These open connected and automated research vehicles (CAVs) are equipped with sensors including radar, LiDAR and cameras, among other features. They will be able to link to a robot operating system. An open development platform for connected vehicle communications will be added later.
The open CAVs are based at Mcity, U-M’s simulated urban and suburban environment for testing automated and connected vehicles. While a handful of other institutions may offer similar research vehicles, U-M says that it is the only one that also operates a high-tech, real-world testing facility.
Intel to invest more than $250M over next two years in autonomous driving; “Data is the new oil”
November 15, 2016
In a keynote address at the AutoMobility LA conference, Intel CEO Brian Krzanich announced that Intel Capital is targeting more than $250 million of additional new investments over the next two years to make fully autonomous driving a reality. This is the first time Intel is keynoting at an automotive conference, signifying how critical the automotive market has become for the company.
These investments will drive the development of technologies that push the boundaries on next-generation connectivity, communication, context awareness, deep learning, security, safety and more. Drilling down into the areas that will be fueled by the fresh investments, Krzanich highlighted technologies that will drive global Internet of Things (IoT) innovation in transportation; areas where technology can directly mitigate risks while improving safety, mobility, and efficiency at a reduced cost; and companies that harness the value of the data to improve reliability of automated driving systems.
Ford working with Bloomberg Aspen Initiative on Cities and Autonomous Vehicles
Bloomberg Philanthropies and the Aspen Institute recently launched the Bloomberg Aspen Initiative on Cities and Autonomous Vehicles, a new program for leading global mayors who will work together to prepare their cities for the emergence of autonomous vehicles.
The intention of the initiative is to galvanize experts and data to accelerate cities’ planning efforts, and to produce a set of principles and tools that participating cities, as well as cities around the world, can use to chart their own paths forward. The inaugural cities in the initiative include Austin, Texas; Buenos Aires, Argentina; Los Angeles, California; Paris, France; and Nashville, Tennessee. Five additional cities will be announced later this year. At AutoMobility LA, Ford CEO Mark Fields announced that his company is working with Bloomberg in this initiative.
Rolls-Royce and VTT Technical Research Centre partner to develop remote and autonomous ships
November 14, 2016
Rolls-Royce and VTT Technical Research Centre of Finland Ltd have formed a strategic partnership to design, test and validate the first generation of remote and autonomous ships. The new partnership will combine and integrate the two companies’ unique expertise to make such vessels a commercial reality. (Earlier post.)
Rolls-Royce is pioneering the development of remote-controlled and autonomous ships and believes a remote-controlled ship will be in commercial use by the end of the decade. The company is applying technology, skills and experience from across its businesses to this development.
Volkswagen unveils updated Golf; Millerized engines, semi-automated driving, digital cockpit and gesture control
November 10, 2016
Volkswagen presented a major update of the Golf in an event at the Autostadt in Wolfsburg. In addition to some design enhancements, the new Golf features new engines (including the Millerized 1.5L EA211 introduced at the Vienna Motor Symposium in April, earlier post), new assistance systems and a new generation of infotainment systems. As a world-first in the compact class, the top-of-the-range “Discover Pro” infotainment system can be operated by gesture control.
With its 9.2-inch screen, it forms a conceptual and visual unit with the Active Info Display (digital instrument panel), which is also new to the Golf. (Earlier post.) The updated Golf is also one of the first compact cars to be available with semi-automated driving functions: the new Traffic Jam Assist function can steer, brake and accelerate the Golf at speeds of up to 60 km/h (37 mph) in strenuous stop-and-go traffic.
Renesas Electronics delivers 2nd-gen ADAS view solution kit for surround view, electronic mirrors and driver monitoring for autonomous driving
November 08, 2016
Renesas Electronics Corporation has introduced a new all-in-one Advanced Driver Assistance Systems (ADAS) view solution kit. Expanding the success of the first-generation ADAS surround view kit that was launched in October 2015, Renesas’ second-generation ADAS view solution kit with up to eight cameras realizes next-generation electronic mirrors, driver monitoring and surround view systems at the same time.
It has become standard in autonomous driving and ADAS applications to use sensor fusion, combining and processing information collected from automotive cameras and radars so that vehicles can recognize their surroundings. 360-degree surround view is expected to become an essential feature in all vehicle segments. Additionally, mirrors will be replaced by cameras, and driver monitoring features will be required for autonomous driving and to increase safety.
Groupe Renault announces strategic partnership with computer vision innovator Chronocam
Groupe Renault has entered into a strategic development agreement with Chronocam SA (earlier post), a developer of biologically-inspired vision sensors and computer vision solutions for automotive applications. This agreement will focus on further developing and applying Chronocam’s innovative approach to sensing and processing visual inputs to Renault’s Advanced Driver Assistance Systems (ADAS) and autonomous driving developments.
Renault previously announced an investment in Chronocam’s Series B round of funding, which raised $15 million for the Paris-based start-up from a group of international venture capital funds including Intel Capital, Robert Bosch Venture Capital, iBionext, 360 Capital and CEA investissement.
NXP and DAF Trucks commit to set new benchmark in truck platooning: 30x faster than human reaction time
NXP Semiconductors N.V. and DAF Trucks announced plans to empower truck platoons to react 30 times faster than humans in 2017, enabling a reduced distance between platooning trucks. Achieving this goal would mark a significant milestone in the introduction of platooning to fleet operators who expect considerable efficiency and safety gains while maintaining a maximum level of data security.
In Munich, NXP and its partners are showcasing the progress of secure intelligent transport systems in advance of this year’s electronica show. The demonstrations include live platooning on Munich roads, traffic signal and vehicle synchronization, and technology that protects vulnerable road users based on secure vehicle-to-everything (V2X) technology. Platooning promises to increase fuel efficiency by up to 10%, improve road safety and reduce exhaust emissions.
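A back-of-envelope calculation shows why reaction time dominates the achievable gap between platooning trucks (the human reaction time below is an assumed typical value for illustration, not an NXP/DAF figure):

```python
# Distance consumed by reaction delay alone, before braking even begins.
# Assumed illustrative values; not taken from the NXP/DAF announcement.

def reaction_distance(speed_kmh: float, reaction_time_s: float) -> float:
    """Distance travelled (m) during the reaction delay, at constant speed."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return speed_ms * reaction_time_s

human_delay = 1.4             # typical human driver reaction time, seconds (assumed)
v2x_delay = human_delay / 30  # "30x faster" electronic, V2X-triggered reaction

print(reaction_distance(80, human_delay))  # ~31.1 m at 80 km/h
print(reaction_distance(80, v2x_delay))    # ~1.0 m
```

Cutting some 30 m of reaction distance to about 1 m is what allows the following trucks to close up into the lead truck's slipstream, which is where the fuel savings come from.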
Toshiba advances deep learning with extremely low-power neuromorphic processor; supporting IoT edge devices
November 07, 2016
Toshiba has developed what it calls the Time Domain Neural Network (TDNN)—a neural network using a time-domain analog and digital mixed-signal processing technique—based on a new, extremely low-power-consumption neuromorphic semiconductor circuit to perform processing for deep learning. (The acronym TDNN (time-delay neural network) is also used more broadly to describe a class of feed-forward neural networks first described in a 1989 paper by Waibel et al.)
Deep learning—as could be applied, for example, in autonomous driving—requires massive numbers of calculations, typically executed on high performance processors that consume a lot of power. However, bringing the power of deep learning to IoT edge devices such as sensors and smart phones requires highly energy-efficient ICs that can perform the large number of required operations while consuming extremely little energy.
New Telit autonomous navigation IoT module relies on internal sensors to deliver class-leading dead reckoning accuracy
November 06, 2016
Telit announced commercial availability of the SL869-3DR, a GNSS (global navigation satellite system) module for global use which leverages information from internal gyros, accelerometers and a barometric pressure sensor to perform dead reckoning (DR) navigation for application areas such as track & trace and in-vehicle systems.
The module delivers accurate position data either directly from its multi-constellation receiver or from a fully autonomous DR system, requiring no connections to external devices or components other than an antenna for satellite signal reception and power. The module allows integrators to design zero-installation, in-vehicle navigation and tracking devices for fleets and other commercial or consumer applications that operate simply perched on the dashboard, connected only to vehicle power.
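Conceptually, dead reckoning advances the last known pose by integrating the inertial measurements over time. A minimal planar sketch (ignoring the barometric altitude channel, accelerometer-derived speed, and all sensor-error modeling that a real module handles) might look like:

```python
import math

def dead_reckon(x, y, heading_rad, speed_ms, yaw_rate_rads, dt):
    """One dead-reckoning step: advance the 2D pose from speed and gyro yaw rate.
    A minimal planar sketch; a production module also fuses accelerometer and
    barometric data and corrects accumulated drift against GNSS fixes."""
    heading_rad += yaw_rate_rads * dt           # integrate gyro to update heading
    x += speed_ms * math.cos(heading_rad) * dt  # advance position along heading
    y += speed_ms * math.sin(heading_rad) * dt
    return x, y, heading_rad

# Straight-line example: 10 m/s due "east" for 2 s in 0.1 s steps -> ~20 m travelled
pose = (0.0, 0.0, 0.0)
for _ in range(20):
    pose = dead_reckon(*pose, speed_ms=10.0, yaw_rate_rads=0.0, dt=0.1)
print(pose)
```

Because each step compounds the previous one, small gyro biases grow into position drift over time, which is why such modules periodically re-anchor the dead-reckoned track to satellite fixes whenever reception allows.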
Ford developing new ADAS technologies for stress-free parking, collision avoidance, wrong-way driving alerts
November 04, 2016
Ford Motor Company is expanding its portfolio of driver-assist technologies with a range of next-generation features designed to ease parking hassles, improve collision avoidance, detect objects in the road and prevent wrong-way driving.
Cross-traffic alert with braking, a technology in development at Ford, is designed to help reduce parking stress by detecting people and objects about to pass behind the vehicle, warning the driver and then automatically braking if the driver does not respond. A rear wide-view camera will offer an alternative wide-angle view of the area behind the vehicle on the in-car display. Enhanced active park assist will parallel or perpendicular park at the push of a button.
ARPA-E awards $32M to 10 new projects to improve connected and automated vehicle efficiency
November 03, 2016
The Energy Department’s Advanced Research Projects Agency-Energy (ARPA-E) announced up to $32 million in funding for 10 innovative projects as part of the Next-Generation Energy Technologies for Connected and Autonomous On-Road Vehicles (NEXTCAR) program. (Earlier post.) With a goal of reducing individual vehicle energy usage by 20%, NEXTCAR projects will take advantage of the increasingly complex and connected systems in today’s—and tomorrow’s—cars and trucks to improve their energy efficiency.
Connected and automated vehicle (CAV) technology utilizes on-board or cloud-based sensors, data and computational capabilities to help a vehicle better process and react to its surrounding environment. This knowledge could include the location of stop signs and intersections, the actions of nearby vehicles, the location of congested areas, and much more. Currently, CAV technologies predominantly improve upon vehicle safety and add driving convenience. NEXTCAR projects will leverage these rapidly evolving technologies to greatly reduce vehicle energy use.
BlackBerry signs agreement with Ford for expanded use of BlackBerry’s QNX and security software
November 02, 2016
BlackBerry Limited has signed an agreement with Ford Motor Company for expanded use of BlackBerry’s QNX and security software. The deal signifies an acceleration in BlackBerry’s pivot from hardware to software in support of the automaker’s goal of providing connected vehicles and mobility to its customers.
As part of this agreement, BlackBerry will dedicate a team to work with Ford on expanding the use of BlackBerry’s QNX Neutrino Operating System, Certicom security technology, QNX hypervisor and QNX audio processing software. The terms of the deal are confidential.
Test deployment of new on-demand hub & shuttle mobility system at U Michigan; connected & automated vehicles & big data
November 01, 2016
A test deployment of a new hub-and-shuttle urban mobility system will take place on the University of Michigan’s North Campus. Its creators say the proposed system could deliver riders to their destinations in as little as half the time of the existing bus system at a lower cost, eventually using a fleet of autonomous shared vehicles. The limited test deployment would likely mark the world’s first on-the-ground implementation of such a system.
Called Reinventing Public Urban Transportation and Mobility (RITMO), the proposed system combines aspects of Uber-style ridesharing, fixed-route buses and light rail into the hub-and-shuttle system. It would combine high-frequency buses serving the busiest transportation hubs with a fleet of about 50 on-demand shared shuttles to get riders to and from those hubs.
Chronocam raises $15M in Series B; high-performance bio-inspired vision technology for autos and other machines
October 27, 2016
France-based Chronocam SA, a developer of biologically-inspired vision sensors and computer vision solutions for automotive, IoT and other applications requiring vision processing, raised $15 million in Series B financing. The funding comes from lead investor Intel Capital, along with iBionext, Robert Bosch Venture Capital GmbH, 360 Capital, CEAi and Renault Group.
Chronocam will use the investment to accelerate product development and commercialize its computer vision sensing and processing technology. The funding will also allow the company to expand into key markets, including the US and Asia.
Intel introducing new processor series dedicated for automotive applications
October 26, 2016
Intel is developing a new processor series dedicated to automotive applications. The A3900 series will enable a complete software-defined cockpit solution that includes in-vehicle infotainment (IVI), digital instrument clusters and advanced driver assistance systems (ADAS)—all in a single, compact and cost-effective SoC.
Intel announced the new automotive processor family along with its introduction of the new Intel Atom processor E3900 series for the Internet of Things (IoT). The A3900 series will allow car makers to offer new levels of determinism for real-time decision-making required in next-generation cars. It is currently sampling with customers and will be available in Q1 2017.
Infineon launches next gen AURIX hexa-core microcontroller for automotive applications; 3x more performance than current
October 24, 2016
Infineon Technologies AG launched the next generation of its AURIX microcontroller family. The TC3xx microcontrollers offer the highest level of integration on the market and real-time performance that is three times higher than that available today.
With a high-performing hexa-core architecture and advanced features for connectivity, security and embedded safety, the AURIX family TC3xx is suited for a wide field of automotive applications. In addition to engine management and transmission control, powertrain applications include new systems in electrical and hybrid drives. Specifically hybrid domain control, inverter control, battery management, and DC-DC converters will benefit from the new architecture.
Tesla putting hardware for full autonomy in all models; temporary loss of some Gen1 Autopilot functions
October 20, 2016
Tesla announced that effective immediately, new Tesla vehicles—including Model 3—will have the hardware needed to support full autonomous driving.
The required software for full autonomous driving is still under development and will need validation and regulatory approval. In fact, Teslas with the new hardware will temporarily lack certain features currently available on Teslas with first-generation Autopilot hardware, including some standard safety features such as automatic emergency braking, collision warning, lane holding and active cruise control.
Infineon and Argus demonstrate cyber security solution for connected and automated cars; central gateway protection
October 19, 2016
At the VDI Kongress in Baden-Baden this week, Infineon Technologies AG and cyber security company Argus will demonstrate an integrated cyber security solution for connected and autonomous vehicles. The system is based on an Infineon AURIX multicore microcontroller with the Intrusion Detection and Prevention System (IDPS) and remote cloud platform from Argus. At the heart of a vehicle’s central gateway, the cyber security solution protects the vehicle’s internal network from remote cyber-attacks.
The central gateway is crucial in the automotive security architecture. It interconnects all electronic control units (ECU) of in-vehicle domains, such as those used in the powertrain, driver assistance, chassis, as well as body and convenience control. The central gateway routes and controls the complete data communication between the ECUs. In addition, it is the central access point for software updates over the air (SOTA) and for diagnostics processes and maintenance updates via the On-Board Diagnostics (OBD) port.
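Illustratively, one layer of such protection can be thought of as rule-based frame filtering at the gateway. The message IDs and rules in this sketch are invented for illustration and are not Argus’s actual IDPS logic:

```python
# Hypothetical whitelist filter of the kind a central-gateway IDPS might apply.
# All IDs, domain names and rules below are invented for illustration.

ALLOWED = {
    0x1A0: {"source": "powertrain", "max_len": 8},
    0x2B4: {"source": "chassis",    "max_len": 8},
}

def gateway_filter(can_id: int, source: str, payload: bytes) -> bool:
    """Return True if the frame may be routed across vehicle domains."""
    rule = ALLOWED.get(can_id)
    if rule is None:
        return False                  # unknown message ID: drop
    if rule["source"] != source:
        return False                  # known ID arriving from the wrong domain
                                      # (possible spoofing): drop
    return len(payload) <= rule["max_len"]  # malformed oversized frame: drop

print(gateway_filter(0x1A0, "powertrain", b"\x01\x02"))    # routed
print(gateway_filter(0x1A0, "infotainment", b"\x01"))      # dropped (wrong domain)
```

A real IDPS layers behavioral anomaly detection and cloud-based reporting on top of such static rules; the point of the sketch is only that the gateway's position between domains is what makes this enforcement possible.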
Oryx Vision raises $17M to create novel depth-sensing solution for autonomous vehicles; LiDAR replacement
Oryx Vision has emerged from stealth with a veteran team from the Israeli high-tech industry to build a novel depth-sensing solution for autonomous vehicles that overcomes some of the limitations of current LiDAR systems. Oryx has raised $17 million in Series A funding led by Bessemer Venture Partners (BVP), with additional participation from Maniv Mobility and Trucks VC. BVP Partner Adam Fisher will join Oryx’s board of directors.
In order to drive accurately and safely, autonomous vehicles need a highly detailed 3D view of their environment. Existing depth-sensing solutions rely mostly on LiDAR devices, which send short laser pulses while rotating, receive the reflected light back with photo-electric sensors, and thus construct a 3D map of the car’s surroundings, pixel by pixel. However, current LiDAR is mechanically complicated, expensive and has a severe range limit due to eye-safety considerations, Oryx says.
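The pixel-by-pixel map construction described above amounts to converting each return’s measured range and beam angles into Cartesian coordinates. A textbook spherical-to-Cartesian conversion (omitting the per-laser calibration offsets real sensors apply) is:

```python
import math

def lidar_point(range_m, azimuth_deg, elevation_deg):
    """Convert one LiDAR return (range plus beam angles) to a Cartesian point
    in the sensor frame. A textbook conversion; real devices also apply
    per-laser calibration corrections."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

print(lidar_point(10.0, 0.0, 0.0))   # straight ahead: (10.0, 0.0, 0.0)
```

Sweeping the azimuth through 360 degrees across all laser elevations and applying this conversion to every return is what yields the rotating-LiDAR point cloud; it is exactly this mechanical sweep that Oryx aims to replace.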
Renesas Electronics delivers highly automated driving solution kit to accelerate development of autonomous vehicles
Renesas Electronics Corporation announced a highly automated driving (HAD) solution kit that delivers high computing performance targeted at automotive functional safety to reduce development time of electronic control units (ECUs).
The HAD solution kit is based on two Renesas R-Car H3 Starter Kit Premier boards and the automotive-control RH850/P1H-C microcontroller (MCU), and is compliant with both the ISO 26262 ASIL-B and ASIL-D functional safety requirements. Each ASIL (Automotive Safety Integrity Level) stipulates requirements under the ISO 26262 functional safety standard and safety measures for avoiding unacceptable residual risk. There are four safety levels, A to D, with ASIL-D being the strictest.
Audi presenting piloted driving and Car-to-X technologies from Digital Motorway Test Bed; LTE-V for V2X
October 18, 2016
Twelve months after the launch of the “Digital Motorway Test Bed” in Germany, Audi is presenting new technologies for piloted driving and Car-to-X-communication at the German Federal Ministry of Transport. Audi is involved in six projects on the test bed; three of them focus on structural measures; the remaining three on communication technologies.
The Digital Motorway Test Bed is on the A9 between Munich and Nuremberg and enables the automotive industry, suppliers, the telecommunication and software industry as well as research centers to field-test their systems under development in mixed traffic. The “Digital Motorway Test Bed” is a joint initiative between the Federal Ministry of Transport and Digital Infrastructure, the Free State of Bavaria, the automotive and supply industry as well as the IT sector.
DENSO & Toshiba partner on Deep Neural Network-IP for image recognition systems for ADAS & automated driving
October 17, 2016
DENSO Corporation and Toshiba Corporation have reached a basic agreement to jointly develop an artificial intelligence technology called Deep Neural Network-Intellectual Property (DNN-IP), which will be used in the image recognition systems that the two companies have independently developed to help achieve advanced driver assistance and automated driving technologies.
The partners expect DNN, an algorithm modeled after the neural networks of the human brain, to perform recognition processing as accurately as, or better than, the human brain.
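The core idea behind such a network can be illustrated in a few lines: layers of weighted sums followed by nonlinearities, ending in class scores. This toy forward pass uses random placeholder weights (a real DNN-IP learns its weights from data, and its architecture is far deeper; everything here is illustrative):

```python
# Minimal illustration of a deep neural network forward pass for a
# toy 3-class "image recognition" task on an 8x8 pixel patch.
# Weights are random placeholders, not learned parameters.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Two hidden layers (the "deep" in DNN), then an output layer.
W1, b1 = rng.standard_normal((16, 64)), np.zeros(16)
W2, b2 = rng.standard_normal((8, 16)), np.zeros(8)
W3, b3 = rng.standard_normal((3, 8)), np.zeros(3)

def classify(pixels):
    """Map a flattened 8x8 image patch to 3 class probabilities."""
    h1 = relu(W1 @ pixels + b1)
    h2 = relu(W2 @ h1 + b2)
    return softmax(W3 @ h2 + b3)

probs = classify(rng.random(64))
print(probs.sum())  # class probabilities sum to 1
```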
Chinese firm invests $27M with U of Michigan to advance autonomous vehicle research and development
October 16, 2016
Frontt Capital Management Ltd, a Shenzhen-based investment firm focused on developing the intelligent vehicle industry in China, is making a $27-million investment to advance autonomous, connected vehicles and robotic technologies with the University of Michigan, along with industry and government partners.
Under the terms of a memorandum of understanding signed between Frontt and U-M, the funding will establish a Joint Research Center for Intelligent Vehicles at U-M to support faculty projects on autonomous vehicle technologies. The funding will also contribute toward construction of the recently approved Robotics Laboratory and a vehicle garage on U-M's North Campus that would be located near Mcity, the simulated urban-suburban environment for testing connected and automated vehicles.
UC Riverside team developing nav system that uses signals of opportunity; support for autonomous vehicles
October 14, 2016
A team of researchers at the University of California, Riverside has developed a highly reliable and accurate navigation system that exploits existing environmental signals such as cellular and Wi-Fi, rather than the Global Positioning System (GPS). The technology can be used as a standalone alternative to GPS, or as a complement to current GPS-based systems to enable highly reliable, consistent, and tamper-proof navigation.
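The UC Riverside team has not published implementation details in this article, but the basic idea of positioning from ambient signals can be shown with least-squares trilateration: given range estimates to several transmitters at known locations, solve for the receiver's position. The sketch below (scenario and solver are our own illustration, not the team's algorithm) linearizes the circle equations by subtracting the first one:

```python
# Least-squares trilateration: estimate a 2D position from range
# estimates to transmitters (e.g. cell towers) at known locations.
import numpy as np

def trilaterate(anchors, ranges):
    """anchors: (n, 2) transmitter positions; ranges: (n,) distances."""
    anchors = np.asarray(anchors, float)
    ranges = np.asarray(ranges, float)
    x0, y0 = anchors[0]
    r0 = ranges[0]
    # Subtracting anchor 0's circle equation from each of the others
    # yields one linear equation in (x, y) per remaining anchor.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - x0**2 - y0**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

towers = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
truth = np.array([30.0, 40.0])
dists = [np.hypot(*(truth - t)) for t in towers]
print(trilaterate(towers, dists))  # ≈ [30. 40.]
```

With more than three signal sources the least-squares formulation averages out range noise, which is one reason ambient cellular and Wi-Fi signals can complement (or cross-check) GPS.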
The researchers say the technology could be used to develop navigation systems that meet the stringent requirements of fully autonomous vehicles, such as driverless cars and unmanned drones.
Infineon acquires Innoluce BV for high-performance solid-state LiDAR systems
October 11, 2016
Semiconductor company Infineon has acquired 100% of Innoluce BV. Based on the know-how of Innoluce, Infineon will develop chip components for high-performance light detection and ranging (LiDAR) systems. The companies agreed to keep the terms of the deal confidential.
Innoluce was founded in 2010 as an entrepreneurial spin-off of Royal Philips. The fabless semiconductor company is headquartered in Nijmegen, The Netherlands, near the Dutch-German border, and has strong expertise in micro-electro-mechanical systems (MEMS). Innoluce is a leading innovator of miniature laser scanning modules that integrate silicon-based solid-state MEMS micro-mirrors, which steer the laser beams in automotive LiDAR systems.
Autonomous vehicle tech company Nauto enters strategic relationships with 3 automakers, including BMW and Toyota, and Allianz
October 08, 2016
Autonomous vehicle technology company Nauto has entered into strategic agreements with three major auto companies, including BMW i Ventures and Toyota Research Institute, as well as with Allianz Ventures, part of the leading global financial service provider and insurance company Allianz Group.
These companies have invested in Nauto and are working with the company on autonomous vehicle development using the Nauto cloud-based data learning platform. Nauto’s deep-learning technology also runs on retrofit devices that can be mounted in any vehicle.
Renesas Electronics introduces V2V and V2I communications solutions
October 07, 2016
Renesas Electronics Corporation announced the global availability of its lineup of V2X solutions that will help accelerate the arrival of autonomous driving. The solutions include two system-on-chips (SoCs) that will ease the development process for vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication systems.
One of the solutions includes the R-Car W1R 760 megahertz (MHz) band wireless SoC for the Japanese market, and the new R-Car W2H SoC that features a high-performance security engine that is indispensable for V2X systems designed for the Japanese, US, and European markets. The other solution consists of the W2H SoC combined with the R-Car W2R 5.9 GHz band wireless communication SoC developed for US and European markets.
DENSO invests in deep learning and vision processing startup THINCI; vision processing and deep learning for automotive
October 06, 2016
DENSO International America, Inc. has entered into an investment agreement with THINCI Inc., a deep-learning, vision processing startup developing innovative machine learning technology that enables the application of deep learning and vision processing in the automotive industry.
With this investment, DENSO is looking to accelerate the final development and integration of THINCI’s silicon and software technology into electronic systems that help enable driver assistance and autonomous driving, improve the efficiency of thermal systems, and optimize the productivity of the vehicle’s powertrain.
Volkswagen’s MEB for EVs: long electric range, open-platform, open-space, pricing for the volume market; “tablet on wheels”
October 05, 2016
Brands within the Volkswagen Group have been rolling out modular component matrices, or assembly toolkits, for their light-duty vehicles over the past few years. Until recently, the four main modular toolkits (modularen Baukästen) of the Group were: the MQB (transverse, driven by the Volkswagen brand); the MLB (longitudinal, driven by the Audi brand); the MSB (standard drive, driven by Porsche); and the NSF (New Small Family).
Development work on these continues; Audi, for example, is refining MLB evo—the second-generation of MLB and the foundation for the battery-electric e-tron quattro SUV due out in 2018. (Earlier post.) These four main kits are now joined by the all-new Modularer Elektrifizierungsbaukasten (“Modular Electric Drive kit”, or MEB), being developed by the Volkswagen brand. The MEB will be the foundation for an entirely new generation of battery-electric vehicles designed not only to be electric and feature extended range, but to be connected, autonomous, open and priced for the volume market as required by Volkswagen’s positioning.
Mercedes-Benz gets on the CASE with Generation EQ close-to-production electric concept
September 29, 2016
At the Paris Motor Show, Mercedes-Benz unveiled its close-to-production concept Generation EQ electric vehicle—the forerunner of Mercedes-Benz’s new product brand for electric mobility, EQ. The name EQ stands for “Electric Intelligence” and is derived from the Mercedes-Benz brand values of “Emotion and Intelligence”.
Dr Dieter Zetsche, CEO of Daimler AG and Head of Mercedes‑Benz Cars, said that the mobility of the future at Mercedes-Benz will stand on four pillars: Connected, Autonomous, Shared and Electric (CASE), adding that Mercedes-Benz has formed a CASE team. The Generation EQ is the logical fusion of all four pillars, he said.
NVIDIA introduces Xavier AI supercomputer designed for autonomous driving
At the inaugural GPU Technology Conference Europe, NVIDIA CEO Jen-Hsun Huang unveiled Xavier, an all-new AI supercomputer designed for use in self-driving cars. Xavier is a complete system-on-chip (SoC), integrating a new GPU architecture called Volta, a custom 8-core CPU architecture, and a new computer vision accelerator.
The processor will deliver 20 TOPS (trillion operations per second) of performance while consuming only 20 watts of power. As the brain of a self-driving car, Xavier is designed to be compliant with critical automotive standards, such as the ISO 26262 functional safety specification.
Volkswagen unveils I.D. EV concept; 1st MEB-based vehicle, to launch in 2020; up to 373 miles
September 28, 2016
At the Paris Motor Show, Volkswagen staged the world premiere of the I.D., a concept vehicle presaging the first of a new fleet of Volkswagen electric cars and highlighting Volkswagen’s vision for the future in a number of areas, including autonomous driving and a new Open Space concept for the interior.
The compact I.D. is driven by a 125 kW electric motor powered by a battery pack, and has a range of 400–600 km (249–373 miles) under European test conditions. The I.D. will be the first Volkswagen built off the Modular Electric Drive kit (MEB). The production version is due to launch in 2020 at a price on a par with comparably powerful and well-equipped Golf models. The I.D. concept car at the show also demonstrates a concept of autonomous driving for the year 2025.
NVIDIA and TomTom developing cloud-to-car mapping system for self-driving cars; DRIVE PX 2
NVIDIA and TomTom, the Dutch mapping and navigation group, are partnering to develop artificial intelligence to create a cloud-to-car mapping system for self-driving cars.
The work combines TomTom’s extensive HD map coverage, which already spans more than 120,000 km (75,000 miles) of highways and freeways, with the NVIDIA DRIVE PX 2 computing platform (earlier post). Together, the solution accelerates support for real-time in-vehicle localization and mapping for driving on the highway.
HERE unveils next-generation open platform real-time data services for automotive industry
On the eve of the Paris Motor Show, HERE, the high-definition mapping and location services business acquired by Audi, BMW and Daimler (earlier post), announced next-generation vehicle-sourced data services. The HERE Open Location Platform will harness real-time data generated by the on-board sensors of connected vehicles—even from competing car brands—to create a live depiction of the road environment.
Drivers will be able to access this view of the road through four services that provide information on traffic conditions, potential road hazards, traffic signage and on-street parking at high quality. The goal is to ensure that drivers have more accurate and timely information with which they can make better driving decisions. HERE plans to make the services commercially available to any customers both within and outside the automotive industry from the first half of 2017.
DOT issues Federal Policy for safe testing and deployment of highly automated vehicles (SAE levels 3-5)
September 20, 2016
The US Department of Transportation issued Federal policy for highly automated vehicles (HAVs)—i.e., SAE Levels 3-5 vehicles with automated systems that are responsible for monitoring the driving environment as defined by SAE J3016.
Although the primary focus of the Federal Automated Vehicle Policy is on highly automated vehicles, or those in which the vehicle can take full control of the driving task in at least some circumstances, portions of the policy also apply to lower levels of automation, including some of the driver-assistance systems already being deployed by automakers today. The newly released policy embodies four key elements:
OmniVision introduces OV491 and OV495 companion chips for automotive image processing applications
September 19, 2016
OmniVision Technologies, Inc., a developer of advanced digital imaging solutions, is introducing the OV491 and OV495, two new companion chips that deliver performance and advanced features in combination with OmniVision’s portfolio of automotive RAW image sensors.
The OV491 is a compact image signal processor (ISP) companion chip specified to enable best-in-class image quality in surround view system architectures. The OV495 contains the same ISP features but is also equipped with electronic distortion and perspective correction, making it suited for rear video mirror and camera monitor system (CMS) applications. The OV491 and OV495 are both compatible with OmniVision’s OV2775, OV10650, and OV10640 automotive image sensors.
Ford embeds researchers in new U-M robotics lab to accelerate autonomous vehicle research
September 16, 2016
Ford and the University of Michigan are teaming up to accelerate autonomous vehicle research and development with a first-time arrangement that embeds Ford researchers and engineers into a new state-of-the-art robotics laboratory on U-M’s Ann Arbor campus.
While the new robotics laboratory opens in 2020, Ford will move a dozen researchers into the North Campus Research Complex (NCRC) by the end of this year, kicking off the first phase of its expanded presence.
Renesas to acquire Intersil for ~$3.2B; looking to lead in automotive, industrial, IoT system solutions
September 14, 2016
Renesas Electronics Corporation, a premier supplier of advanced semiconductor solutions, and Intersil Corporation, a leading provider of innovative power management and precision analog solutions, signed a definitive agreement for Renesas to acquire Intersil for US$22.50 per share in cash, representing an aggregate equity value of approximately US$3.2 billion. The transaction has been unanimously approved by the boards of directors of both companies. Closing of the transaction is expected in the first half of 2017, following approval by Intersil shareholders and the relevant governmental authorities.
Together, Renesas’ and Intersil’s deep expertise across a number of technologies and end markets will enable the combined company to become a complete solution provider of embedded systems to customers. By combining Renesas’ market-proven microcontroller (MCU) and system-on-chip (SoC) products and technologies and Intersil’s leading power management and precision analog capability, Renesas will be well positioned to address some of the most exciting opportunities in key areas such as automotive, industrial, cloud computing, healthcare, and the Internet of Things (IoT).
NVIDIA unveils single-processor configuration of DRIVE PX 2 for AutoCruise functions; Baidu deploying
September 13, 2016
NVIDIA unveiled a single-processor configuration of the NVIDIA DRIVE PX 2 AI computing platform (earlier post) that automakers can use to power automated and autonomous vehicles for driving and mapping.
The new computing platform for AutoCruise functions—which include highway automated driving and HD mapping—consumes just 10 watts of power and enables vehicles to use deep neural networks to process data from multiple cameras and sensors. It will be deployed by China’s Baidu as the in-vehicle car computer for its self-driving cloud-to-car system.
Tesla leans on radar for Autopilot in Version 8 software
September 12, 2016
With the release of Version 8 of its software, Tesla has made many updates to its Autopilot function. The most significant change, however, is its new heavy reliance on the onboard radar. The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was originally meant only to be a supplementary sensor to the primary camera and image processing system.
Now, however, Tesla is using more advanced signal processing to use the radar as a primary control sensor without requiring the camera to confirm visual image recognition. The changes come in the wake of the fatal crash (earlier post) in which Autopilot apparently did not detect the white side of the trailer against a brightly lit sky; nor did the driver. As a result, no brake was applied. In the aftermath of that incident, Tesla CEO Elon Musk tweeted that the company was “Working on using existing Tesla radar by itself (decoupled from camera) w temporal smoothing to create a coarse point cloud, like lidar”. (Earlier post.)
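The "temporal smoothing" Musk describes can be illustrated with a simple exponential moving average over successive radar sweeps: detections that persist across frames accumulate confidence, while one-frame false returns (an overhead road sign, say) decay away. This is purely an illustration of the general technique, not Tesla's actual algorithm; all parameters are made up:

```python
# Illustrative temporal smoothing of radar returns: accumulate
# detections over successive sweeps in a coarse grid. Persistent
# objects gain confidence; transient noise decays, yielding a
# stable, lidar-like coarse occupancy map.

ALPHA = 0.3        # smoothing factor: weight given to the newest sweep
THRESHOLD = 0.6    # confidence needed to treat a cell as occupied

def smooth_frames(frames, grid_size):
    """frames: list of sets of occupied (row, col) cells, one per sweep."""
    conf = [[0.0] * grid_size for _ in range(grid_size)]
    for frame in frames:
        for r in range(grid_size):
            for c in range(grid_size):
                hit = 1.0 if (r, c) in frame else 0.0
                conf[r][c] = (1 - ALPHA) * conf[r][c] + ALPHA * hit
    return {(r, c) for r in range(grid_size) for c in range(grid_size)
            if conf[r][c] >= THRESHOLD}

# A vehicle ahead appears in every sweep; a spurious return appears
# once and is filtered out of the final occupancy set.
sweeps = [{(2, 3)}, {(2, 3), (0, 5)}, {(2, 3)}, {(2, 3)}, {(2, 3)}]
print(smooth_frames(sweeps, 8))  # {(2, 3)}
```

The trade-off is latency: a higher threshold or lower smoothing factor suppresses more false positives (like that overhead sign) but takes more sweeps to confirm a genuine obstacle.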
DENSO looks to increase holding in FUJITSU TEN, making it a group company
September 10, 2016
Auto parts supplier DENSO Corporation, Fujitsu Limited, and Toyota Motor Corporation have reached a basic agreement to start consideration of changing the capital structure of automotive electronics manufacturer FUJITSU TEN, in which the three companies have stakes. DENSO is part of the Toyota Group.
In the automotive field, the interface between the driver and vehicle is becoming increasingly important due to remarkable technological innovations. Against this backdrop, DENSO has agreed with Fujitsu and Toyota to review specific changes to make FUJITSU TEN a group company of DENSO and to enhance cooperation between the two companies in developing in-vehicle ECUs, millimeter-wave radar (earlier post), advanced driver assistance / automated driving technologies, and basic electronic technologies, among others.
NSF awards $4.6M to improve human control of automated cars, drones
September 08, 2016
NSF has awarded (Award Nº 1545126) $4.6 million to a team led by UC Berkeley exploring human cyber-physical systems (h-CPS)—systems that operate in concert with human operators—with the aim of improving the interaction between humans, computers and the physical world. The research outcome of the project, called Verified Human Interfaces, Control, and Learning for Semi-Autonomous Systems, or VeHICaL, will have applications in emerging technologies such as semi-autonomous cars and autonomous aerial vehicles (drones).
The award was part of a total of $13 million NSF awarded to three five-year “Frontier” projects to advance cyber-physical systems (CPS). The other two projects are tackling monitoring and mitigating noise pollution in cities and quickly identifying and overcoming problems in manufacturing environments.
LeddarTech launches LeddarVu, a new scalable platform towards high-resolution LiDAR; Vu8 solid-state LiDAR
September 07, 2016
LeddarTech, a developer of solid-state LiDAR technology (earlier post), introduced LeddarVu, a new platform for the next generation of its Leddar detection and ranging modules. The LeddarVu platform combines the benefits of a very compact, modular architecture with superior performance, robustness and cost efficiency for high-resolution LiDAR applications, such as autonomous driving.
Leveraging LeddarTech’s advanced, patented signal processing and algorithms, LeddarVu sensors will evolve along with the future generations of the LeddarCore ICs. As previously announced with the company’s development roadmap, upcoming iterations of LeddarCore ICs are expected to deliver ranges reaching 250 m, fields of view up to 140°, and up to 480,000 points per second (with a resolution down to 0.25° both horizontal and vertical), enabling the design of affordable LiDARs for all levels of autonomous driving, including the capability of mapping the environment over 360° around the vehicle.
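The quoted targets imply concrete scan geometry. For instance, a 140° field of view sampled at 0.25° gives 560 beam positions per scan line, and the 480,000 points-per-second budget then bounds the achievable frame rate. A back-of-the-envelope check (the figures come from the article; the vertical line count and frame-rate derivation are our own illustration, not a LeddarTech spec):

```python
# Scan geometry implied by the quoted LeddarCore targets.
FOV_DEG = 140.0          # horizontal field of view (from the article)
RES_DEG = 0.25           # angular resolution (from the article)
POINTS_PER_SEC = 480_000  # point budget (from the article)

h_points = FOV_DEG / RES_DEG       # beam positions per scan line
print(h_points)                    # 560.0

# With a hypothetical 20-line vertical scan, the point budget
# supports roughly this many full frames per second:
V_LINES = 20
frame_points = h_points * V_LINES
print(POINTS_PER_SEC / frame_points)  # ≈ 42.9 frames/s
```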
Volvo Cars and Autoliv to create joint venture for next-gen autonomous driving software
September 06, 2016
Volvo Cars and Autoliv have signed a letter of intent to set up a new jointly-owned company to develop next-generation autonomous driving software. The planned new company will have its headquarters in Gothenburg, Sweden, and an initial workforce taken from both companies of around 200, increasing to over 600 in the medium term. The company is expected to start operations at the beginning of 2017.
The new company, which has yet to be named, will develop advanced driver assistance systems (ADAS) and autonomous drive (AD) systems for use in Volvo cars and for sale exclusively by Autoliv to all car makers globally, with revenues shared by both companies.
Japan 3D map JV beginning process of creating high-def 3-D maps for autonomous vehicles; dynamic mapping
September 05, 2016
The Nikkei reports that Tokyo-based Dynamic Map Planning (DMP), formed earlier this year by Mitsubishi Electric, mapmaker Zenrin and nine automakers, will begin creating high-definition 3-D maps for self-driving cars this month.
The project is backed by the Japanese government through SIP-adus (Strategic Innovation Promotion Program: Innovation of Automated Driving for Universal Services), as part of an effort to have autonomous vehicles on the road by 2020 equipped with a dynamic mapping system that combines real-time, “dynamic” information with a high-definition 3D map base. The result, say the planners, is a “one stop” information source that can provide necessary information for automated driving.
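The layering idea behind such dynamic mapping (a slow-changing HD base overlaid by fast-changing data, queried as a single map) can be sketched as follows; the layer names, fields, and structure are hypothetical, not DMP's actual schema:

```python
# Layered "dynamic map" lookup: a static HD base layer overlaid by
# fast-changing layers (congestion, hazards), merged at query time.
static_base = {
    "lane_42": {"speed_limit": 100, "curvature": 0.002},
}
dynamic_layers = [
    {"lane_42": {"congestion": "heavy"}},            # minute-fresh data
    {"lane_42": {"obstacle": "stalled vehicle"}},    # second-fresh data
]

def query(segment_id):
    """Merge static attributes with every dynamic layer's updates."""
    info = dict(static_base.get(segment_id, {}))
    for layer in dynamic_layers:
        info.update(layer.get(segment_id, {}))
    return info

print(query("lane_42"))
# {'speed_limit': 100, 'curvature': 0.002,
#  'congestion': 'heavy', 'obstacle': 'stalled vehicle'}
```

Keeping the static survey data separate from the dynamic overlays is what lets the expensive HD base be updated rarely while the live layers refresh continuously.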
Renesas Electronics and TSMC announce 28nm MCU collaboration for next-generation green and autonomous vehicles
September 02, 2016
Renesas Electronics Corporation and TSMC are collaborating on 28nm (nanometer) embedded flash (eFlash) process technology for manufacturing microcontrollers (MCUs) targeted at next-generation green and autonomous vehicles. The automotive MCUs employing this new 28nm process technology are slated for sample shipment and mass production in 2017 and 2020, respectively.
More specifically, Renesas’ reliable and fast Metal-Oxide-Nitride-Oxide-Silicon (MONOS) eFlash technology will combine with TSMC’s high-performance, low-power 28nm high-K metal gate process technology to produce automotive MCUs for a broader range of applications such as autonomous vehicle sensor control; coordinated control among electronic control units (ECUs); fuel-efficient engine control for green vehicles; and highly efficient motor inverter control for electric vehicles.
Baidu and NVIDIA team up on cloud-to-car platform for self-driving cars; HD maps, Level 3 control, automated parking
September 01, 2016
At the Baidu World Conference in Beijing, Baidu CEO Robin Li together with NVIDIA CEO Jen-Hsun Huang announced a partnership to use artificial intelligence (AI) in the creation of a cloud-to-car autonomous car platform for local Chinese and global car makers.
NVIDIA and Baidu have a long history of working together on AI. The latest addition to their partnership will combine Baidu’s cloud platform and mapping technology with NVIDIA’s self-driving computing platform to develop solutions for HD maps, Level 3 autonomous vehicle control and automated parking.
Quanergy acquires Otus People Tracker software from Raytheon BBN for advanced autonomous driving and security LiDAR applications
August 29, 2016
Quanergy Systems, Inc., the provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), has acquired the Otus People Tracker software from Raytheon BBN Technologies. The software complements Quanergy’s existing software portfolio and, when used with Quanergy’s LiDAR sensors, creates an integrated hardware and software solution for advanced people detection and tracking applications within the security and autonomous driving markets.
Otus (named after a genus of owls) uses advanced algorithms to identify and track people for safety and security in crowded environments at ranges exceeding 100 meters when used with Quanergy LiDAR sensors. The system features segmentation techniques for identifying humans; background extraction; object clustering; sophisticated merge-and-split algorithms; persistent tracking algorithms; and other advanced features supporting robust crowd control. Support for multiple zones of interest is included, giving users fine control over active monitoring.
ORNL team presents solution for coordinating connected and automated vehicles at merging roadways; reduced fuel consumption and travel time
A team of researchers at Oak Ridge National Laboratory (ORNL) has developed an optimization framework and an analytical closed-form solution that addresses the problem of optimally coordinating connected and automated vehicles (CAVs) at merging roadways to achieve smooth traffic flow without stop-and-go driving.
They validated the effectiveness and efficiency of their proposed solution through simulation, showing that coordinating vehicles can significantly reduce both fuel consumption and travel time. A paper on the work is published in IEEE Transactions on Intelligent Transportation Systems.
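The article does not reproduce the closed-form solution, but the scheduling idea (assign each connected vehicle a conflict-free crossing time at the merge zone so it adjusts speed smoothly rather than stopping) can be sketched with a simple first-in-first-out slot assigner. This toy coordinator is our own illustration, not the ORNL framework:

```python
# Toy merge-zone coordinator: each arriving CAV is assigned the
# earliest crossing time that preserves a fixed headway after the
# previously scheduled vehicle, FIFO by free-flow arrival time.
HEADWAY_S = 1.5  # assumed minimum time gap in the merge zone, seconds

def schedule(arrivals):
    """arrivals: free-flow arrival times at the merge zone (any lane).
    Returns the assigned crossing times, in FIFO order."""
    assigned = []
    last = float("-inf")
    for t in sorted(arrivals):
        slot = max(t, last + HEADWAY_S)
        assigned.append(slot)
        last = slot
    return assigned

# Vehicles arriving nearly together have their crossings spread out
# ahead of time, instead of braking to a stop at the ramp.
print(schedule([10.0, 10.2, 10.4, 12.0]))
# [10.0, 11.5, 13.0, 14.5]
```

Because each vehicle knows its slot well in advance, it can absorb the delay with a small speed adjustment upstream, which is where the fuel and travel-time savings in the simulation come from.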
NVIDIA dives into Parker mobile processor for next generation of autonomous vehicles
August 23, 2016
Speaking at the Hot Chips conference in Cupertino, California, NVIDIA revealed the architecture and underlying technology of its new Parker processor, which is suited for automotive applications such as self-driving cars and digital cockpits. Hot Chips, a symposium on high performance chips, is sponsored by the IEEE Technical Committee on Microprocessors and Microcomputers in cooperation with ACM SIGARCH.
NVIDIA mentioned Parker at CES 2016 earlier this year, when it introduced the NVIDIA DRIVE PX 2 platform (earlier post). That platform uses two Parker processors and two Pascal architecture-based GPUs to power deep learning applications.
Mobileye and Delphi to partner on SAE Level 4/5 automated driving solution for 2019
Mobileye and Delphi Automotive PLC are partnering to develop a complete SAE Level 4/5 automated driving solution. The program will result in an end-to-end production-intent fully automated vehicle solution, with the level of performance and functional safety required for rapid integration into diverse vehicle platforms for a range of customers worldwide.
The partners’ “Central Sensing Localization and Planning” (CSLP) platform will be demonstrated in combined urban and highway driving at the 2017 Consumer Electronics Show in Las Vegas and production ready for 2019.