BMW Group expands BMW i Ventures role with new €500M fund; widened scope of investment, greater independence
December 01, 2016
The BMW Group is expanding the remit of its BMW i Ventures venture capital unit and creating a new fund of up to €500 million (US$531 million) over ten years to support it. The new fund will allow BMW i Ventures to make investments in a wider range of areas, such as autonomous driving and digitalization, and to secure continued access to the technologies of the future.
BMW i Ventures’ previous focus on mobility services and electro-mobility will be expanded to cover the BMW Group’s full innovation spectrum in all areas of Strategy NUMBER ONE > NEXT, even those outside of the traditional automotive value chain. Future topics for exploration will focus on “Enabling Technology and Digital Vehicle Technology”, “Mobility and Digital Services”, “Customer Experience” and “Advanced Production Technology”.
ORNL study finds even low penetration of CAVs delivers significant fuel economy benefits, but increases travel time slightly
A new study by a team at Oak Ridge National Laboratory (ORNL) suggests that even low penetration rates of connected and automated vehicles (CAVs) can deliver significant fuel consumption benefits, although total travel time increases slightly. The study, presented at the 9th ACM SIGSPATIAL International Workshop on Computational Transportation Science, also found that travel-time benefits increase with higher penetration rates of CAVs.
The study—one of the first to capture the impact of different CAV penetration rates on fuel consumption and travel time—builds on earlier work by the team of Jackeline Rios-Torres and Andreas Malikopoulos to develop an optimization framework and an analytical closed-form solution for optimally coordinating CAVs at merging roadways, achieving smooth traffic flow without stop-and-go driving. (Earlier post.)
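As a rough illustration of the coordination idea (a toy first-come-first-served scheduler, not ORNL’s actual optimization framework), a central controller can assign each approaching CAV an arrival time at the merge zone that respects a minimum headway, so vehicles adjust speed gradually upstream instead of stopping at the merge point:

```python
# Toy illustration of centralized merge coordination (not ORNL's framework):
# assign each CAV a merge-zone arrival time with a minimum time gap.

HEADWAY = 1.5  # s, assumed minimum gap between successive vehicles at the merge

def schedule_merge(earliest_arrivals):
    """earliest_arrivals: unconstrained arrival times (s) at the merge zone.
    Returns assigned arrival times, first-come-first-served with headway."""
    order = sorted(range(len(earliest_arrivals)),
                   key=lambda i: earliest_arrivals[i])
    assigned = [0.0] * len(earliest_arrivals)
    prev = float("-inf")
    for i in order:
        t = max(earliest_arrivals[i], prev + HEADWAY)  # delay only if too close
        assigned[i] = t
        prev = t
    return assigned

# Three vehicles from two streams would otherwise reach the merge almost
# simultaneously; the scheduler spaces them out:
print(schedule_merge([10.0, 10.2, 10.4, 13.0]))  # → [10.0, 11.5, 13.0, 14.5]
```

Each delayed vehicle can meet its assigned time by easing off the accelerator well before the merge, which is where the fuel savings come from.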
Delphi & Mobileye to showcase Centralized Sensing Localization and Planning (CSLP) autonomous driving system in public demo at CES 2017
November 30, 2016
Delphi Automotive PLC and Mobileye will showcase their Centralized Sensing Localization and Planning (CSLP) automated driving system—which will be ready for production by 2019—on a 6.3-mile urban and highway combined public route in Las Vegas for CES 2017. (Earlier post.)
The partners said that CSLP is the first turnkey, fully integrated automated driving solution with an industry-leading perception system and computing platform. (Intel will provide the system-on-a-chip (SoC) for the systems.) The Las Vegas drive will tackle everyday driving challenges such as highway merges, congested city streets with pedestrians and cyclists, and a tunnel.
Department of Transportation seeking proposals for automated vehicle proving grounds pilot
November 28, 2016
The US Department of Transportation (DOT) is requesting proposals from applicants (DOT-OST-2016-0233) to form an initial network of multiple proving grounds focused on the advancement of autonomous vehicle technology. The proving grounds will develop and share best practices around the safe testing, demonstration and deployment of autonomous vehicle technology.
The selected proving grounds will be designated “USDOT Automated Vehicle Proving Grounds.” DOT anticipates that the designation will encourage new levels of public safety while contributing to a foundation able to transform personal and commercial mobility and provide new opportunities to disadvantaged people and communities.
Volkswagen’s 10-year evolution of Park Assist; heading toward trained parking and higher levels of autonomy
November 26, 2016
Volkswagen first introduced a parking assistance system based on ultrasonic sensors in the early 1990s. However, it was the “Park Assist” Gen 1 system presented in the Touran in 2007 that marked a foundational point in the commercial development of the technology. Once activated, Park Assist used dedicated side-oriented ultrasonic sensors to detect parallel parking spaces on the left and right sides of the road as the car passed them, enabling semi-automatic parking for the first time.
Volkswagen engineers have continued to enhance the functionality, leading to the release of Gen 3 Park Assist in 2014, with a clear roadmap to the deployment of higher levels of autonomy, including trained parking: fully automated parking with a one-off training process. At a recent visit to Volkswagen’s Ehra proving ground (Prüfgelände Ehra), Green Car Congress had the opportunity to see a prototype of trained parking in action.
Juniper Research: taxi sector to lead self-driving market to >22M consumer vehicles on the road by 2025
November 23, 2016
New findings from Juniper Research project that the annual production of self-driving cars will reach 14.5 million in 2025, up significantly from only a few thousand in 2020, resulting in a global installed base of more than 22 million consumer vehicles by 2025.
The new research, Autonomous Vehicles & ADAS: Adoption, Regulation & Business Models 2016-2025, found that the market adoption of AV (Autonomous Vehicle) technology is set to accelerate over the next few years, driven by increasingly stringent vehicle safety specifications; environmental pressures; and rapid technological developments.
nuTonomy to test its self-driving cars on specific public roads in Boston
November 21, 2016
nuTonomy, developer of software for self-driving cars, has signed a Memorandum of Understanding (MOU) with the City of Boston and the Massachusetts Department of Transportation that authorizes nuTonomy to begin testing its growing fleet of self-driving cars on specific public streets in a designated area of Boston.
nuTonomy will begin testing its self-driving Renault Zoe electric vehicle before the end of the year in the Raymond L. Flynn Marine Park in the Seaport section of the city. nuTonomy outfits its vehicles with a software system which has been integrated with high-performance sensing and computing components to enable safe operation without a driver. The company’s autonomous and robotics technology system grew out of research conducted in MIT labs run by nuTonomy co-founders Karl Iagnemma and Emilio Frazzoli.
U-Michigan, China enter two new automated and connected vehicle partnerships
November 17, 2016
The University of Michigan is entering two separate agreements with Chinese institutions targeting automated and connected vehicles. Together with a third new agreement focused on clean water, the three agreements add up to more than $54 million to advance research in these key areas.
First, a $27-million research agreement with Shenzhen-based investment firm Frontt Capital Management will advance autonomous, connected vehicles and robotic technologies. This agreement puts in place measures that U-M and Frontt agreed to in a memorandum of understanding signed last month in China. It establishes a Joint Research Center for Intelligent Vehicles at U-M. It contributes toward construction of the recently approved Robotics Laboratory and a vehicle garage on U-M’s North Campus near Mcity, the simulated urban-suburban environment for testing connected and automated vehicles.
Hyundai introduces new autonomous IONIQ concept at AutoMobility LA
November 16, 2016
Hyundai Motor Company introduced the Autonomous IONIQ concept during its press conference at AutoMobility LA (Los Angeles Auto Show). With a design resembling the rest of the IONIQ lineup (earlier post), the vehicle is one of the few self-driving cars in development to have a LiDAR system hidden in its front bumper instead of installed on the roof, enabling it to look like any other car on the road and not a high school science project.
Hyundai’s goal for the autonomous IONIQ concept was to keep the self-driving systems as simple as possible. This was accomplished by using the production car’s forward-facing Smart Cruise Control radar and Lane Keep Assist cameras and integrating them with LiDAR technology.
Mitsubishi unveils new electric compact SUV concept; 400 km range
Mitsubishi Motors North America, Inc. (MMNA) introduced the Mitsubishi eX Concept, a compact SUV with a next-generation EV system, at the 2016 Los Angeles Auto Show. The concept car is a showcase of Mitsubishi’s electric vehicle (EV) technologies, a new iteration of the Dynamic Shield front design concept, autonomous driving capabilities, Artificial Intelligence (AI) as well as a range of other technologies.
The system pairs a new drive battery that greatly improves on the energy density of previous batteries with compact, high-output front and rear motors. Together with the lighter, more efficient EV system, an uncompromising reduction in the weight of the body gives the Mitsubishi eX Concept a cruising range of 400 km (248.5 miles).
U-M offers open-access automated cars to advance driverless research
New University of Michigan research vehicles will be open testbeds for academic and industry researchers to rapidly test self-driving and connected vehicle technologies at a world-class proving ground. These open connected and automated research vehicles (CAVs) are equipped with sensors including radar, LiDAR and cameras, among other features. They will be able to link to a robot operating system. An open development platform for connected vehicle communications will be added later.
The open CAVs are based at Mcity, U-M’s simulated urban and suburban environment for testing automated and connected vehicles. While a handful of other institutions may offer similar research vehicles, U-M says that it is the only one that also operates a high-tech, real-world testing facility.
Intel to invest more than $250M over next two years in autonomous driving; “Data is the new oil”
November 15, 2016
In a keynote address at the AutoMobility LA conference, Intel CEO Brian Krzanich announced that Intel Capital is targeting more than $250 million of additional new investments over the next two years to make fully autonomous driving a reality. This is the first time Intel is keynoting at an automotive conference, signifying how critical the automotive market has become for the company.
These investments will drive the development of technologies that push the boundaries on next-generation connectivity, communication, context awareness, deep learning, security, safety and more. Drilling down into the areas that will be fueled by the fresh investments, Krzanich highlighted technologies that will drive global Internet of Things (IoT) innovation in transportation; areas where technology can directly mitigate risks while improving safety, mobility, and efficiency at a reduced cost; and companies that harness the value of the data to improve reliability of automated driving systems.
Ford working with Bloomberg Aspen Initiative on Cities and Autonomous Vehicles
Bloomberg Philanthropies and the Aspen Institute recently launched the Bloomberg Aspen Initiative on Cities and Autonomous Vehicles, a new program for leading global mayors who will work together to prepare their cities for the emergence of autonomous vehicles.
The intention of the initiative is to galvanize experts and data to accelerate cities’ planning efforts, and to produce a set of principles and tools that participating cities, as well as cities around the world, can use to chart their own paths forward. The inaugural cities in the initiative include Austin, Texas; Buenos Aires, Argentina; Los Angeles, California; Paris, France; and Nashville, Tennessee. Five additional cities will be announced later this year. At AutoMobility LA, Ford CEO Mark Fields announced that his company is working with Bloomberg in this initiative.
Rolls-Royce and VTT Technical Research Centre partner to develop remote and autonomous ships
November 14, 2016
Rolls-Royce and VTT Technical Research Centre of Finland Ltd have formed a strategic partnership to design, test and validate the first generation of remote and autonomous ships. The new partnership will combine and integrate the two companies’ unique expertise to make such vessels a commercial reality. (Earlier post.)
Rolls-Royce is pioneering the development of remote-controlled and autonomous ships and believes a remote-controlled ship will be in commercial use by the end of the decade. The company is applying technology, skills and experience from across its businesses to this development.
Volkswagen unveils updated Golf; Millerized engines, semi-automated driving, digital cockpit and gesture control
November 10, 2016
Volkswagen presented a major update of the Golf in an event at the Autostadt in Wolfsburg. In addition to some design enhancements, the new Golf features new engines (including the Millerized 1.5L EA211 introduced at the Vienna Motor Symposium in April, earlier post), new assistance systems and a new generation of infotainment systems. As a world-first in the compact class, the top-of-the-range “Discover Pro” infotainment system can be operated by gesture control.
With its 9.2-inch screen, it forms a conceptual and visual entity with the Active Info Display (digital instrument panel), which is also new to the Golf. (Earlier post.) The updated Golf is also one of the first compact cars to be available with semi-automated driving functions—the new Traffic Jam Assist function can guide the Golf at speeds of up to 60 km/h (37 mph) in strenuous stop-and-go traffic, steering, braking and accelerating the car.
Renesas Electronics delivers 2nd-gen ADAS view solution kit for surround view, electronic mirrors and driver monitoring for autonomous driving
November 08, 2016
Renesas Electronics Corporation has introduced a new all-in-one Advanced Driver Assistance Systems (ADAS) view solution kit. Building on the success of the first-generation ADAS surround view kit launched in October 2015, Renesas’ second-generation ADAS view solution kit supports up to eight cameras and realizes next-generation electronic mirrors, driver monitoring and surround view systems at the same time.
Sensor fusion—combining and processing the information collected from automotive cameras and radars so that vehicles can recognize their surroundings—has become standard in autonomous driving and ADAS applications. 360-degree surround view is expected to become an essential feature available in all vehicle segments. Additionally, mirrors will be replaced by cameras, and driver monitoring features will be required for autonomous driving and to increase safety.
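In its simplest textbook form (a generic inverse-variance average, not Renesas’ implementation, with illustrative numbers), sensor fusion weights each sensor’s estimate of the same quantity by its confidence, so the combined estimate is more precise than either input:

```python
# Minimal sensor-fusion sketch: inverse-variance weighted average of two
# independent range estimates (e.g., radar and camera) of the same object.
# Numbers are assumed for illustration only.

def fuse(est_a, var_a, est_b, var_b):
    """Combine two independent estimates; returns (fused value, fused variance)."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # always smaller than either input variance
    return fused, fused_var

# Radar range is precise (variance 0.1 m^2); camera range less so (1.0 m^2):
fused_range, fused_var = fuse(24.8, 0.1, 26.0, 1.0)
print(fused_range, fused_var)  # result sits close to the more precise sensor
```

The fused variance is lower than both inputs, which is the basic reason multi-sensor perception outperforms any single sensor.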
Groupe Renault announces strategic partnership with computer vision innovator Chronocam
Groupe Renault has entered into a strategic development agreement with Chronocam SA (earlier post), a developer of biologically-inspired vision sensors and computer vision solutions for automotive applications. This agreement will focus on further developing and applying Chronocam’s innovative approach to sensing and processing visual inputs to Renault’s Advanced Driver Assistance Systems (ADAS) and autonomous driving developments.
Renault previously announced an investment in Chronocam’s Series B round of funding, which raised $15 million for the Paris-based start-up and includes a group of international venture capital funds including: Intel Capital, Robert Bosch Venture Capital, iBionext, 360 Capital and CEA investissement.
NXP and DAF Trucks commit to set new benchmark in truck platooning: 30x faster than human reaction time
NXP Semiconductors N.V. and DAF Trucks announced plans to enable truck platoons in 2017 that react 30 times faster than human drivers, allowing a reduced distance between platooning trucks. Achieving this goal would mark a significant milestone in the introduction of platooning to fleet operators, who expect considerable efficiency and safety gains while maintaining a maximum level of data security.
In Munich, NXP and its partners are showcasing the progress of secure intelligent transport systems in advance of this year’s electronica show. The demonstrations include platooning live on Munich roads, traffic signal and vehicle synchronization, and technology that protects vulnerable road users based on secure vehicle-to-everything technology (V2X). Platooning promises to increase fuel efficiency up to 10%, improve road safety and reduce exhaust emissions.
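A back-of-the-envelope calculation shows why reaction time dominates the achievable gap (the speed and the 1.4 s human reaction time below are assumed for illustration, not NXP figures): the distance a truck covers before braking even begins is simply speed times reaction time.

```python
# Illustrative arithmetic: distance travelled during the reaction delay alone.
# A 30x faster (V2X-linked) reaction shrinks that distance proportionally.

def reaction_distance(speed_kmh, reaction_s):
    """Distance (m) travelled before braking begins."""
    return speed_kmh / 3.6 * reaction_s

human  = reaction_distance(80, 1.4)       # assumed typical human reaction
system = reaction_distance(80, 1.4 / 30)  # 30x faster automated reaction
print(round(human, 1), round(system, 1))  # → 31.1 1.0
```

At 80 km/h the human-driven truck travels some 31 m before braking starts, versus about 1 m for the automated system, which is what makes much closer (and more fuel-efficient) platooning gaps plausible.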
Toshiba advances deep learning with extremely low-power neuromorphic processor; supporting IoT edge devices
November 07, 2016
Toshiba has developed what it calls a Time Domain Neural Network (TDNN)—a neural network using a time-domain analog and digital mixed-signal processing technique—based on a new, extremely low-power-consumption neuromorphic semiconductor circuit that performs processing for deep learning. (The acronym TDNN, for time-delay neural network, is also used broadly to describe the feed-forward neural networks first described in a 1989 paper by Waibel et al.)
Deep learning—as could be applied, for example, in autonomous driving—requires massive numbers of calculations, typically executed on high performance processors that consume a lot of power. However, bringing the power of deep learning to IoT edge devices such as sensors and smart phones requires highly energy-efficient ICs that can perform the large number of required operations while consuming extremely little energy.
New Telit autonomous navigation IoT module relies on internal sensors to deliver class-leading dead reckoning accuracy
November 06, 2016
Telit announced commercial availability of the SL869-3DR, a GNSS (global navigation satellite system) module for global use which leverages information from internal gyros, accelerometers and a barometric pressure sensor to perform dead reckoning (DR) navigation for application areas such as track & trace and in-vehicle systems.
The module delivers accurate position data either directly from its multi-constellation receiver or from a fully autonomous DR system, requiring no connections to external devices or components other than an antenna for satellite signal reception and power. The module allows integrators to design zero-installation, in-vehicle navigation and tracking devices for fleets and other commercial or consumer applications that operate simply perched on the dashboard, connected only to vehicle power.
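The core of dead reckoning can be sketched in a few lines (a minimal 2D illustration; the SL869-3DR’s actual sensor-fusion filter is proprietary and far more elaborate): between satellite fixes, the module integrates the gyro’s yaw rate into a heading and the measured speed into a position.

```python
import math

# Minimal 2D dead-reckoning sketch (illustrative only): integrate yaw rate
# into heading and speed into position between GNSS fixes.

def dead_reckon(x, y, heading_rad, samples, dt=0.1):
    """samples: list of (speed_mps, yaw_rate_rad_s) readings at interval dt.
    Returns the propagated (x, y, heading)."""
    for speed, yaw_rate in samples:
        heading_rad += yaw_rate * dt           # gyro integration
        x += speed * math.cos(heading_rad) * dt  # odometry integration
        y += speed * math.sin(heading_rad) * dt
    return x, y, heading_rad

# Drive straight "east" at 10 m/s for 1 s (10 samples, no turning):
x, y, h = dead_reckon(0.0, 0.0, 0.0, [(10.0, 0.0)] * 10)
print(round(x, 2), round(y, 2))  # → 10.0 0.0
```

Because integration errors accumulate, real modules like this one continuously recalibrate the inertial solution against GNSS whenever satellite signals are available.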
Ford developing new ADAS technologies for stress-free parking, collision avoidance, wrong-way driving alerts
November 04, 2016
Ford Motor Company is expanding its portfolio of driver-assist technologies with a range of next-generation features designed to ease parking hassles, improve collision avoidance, detect objects in the road and prevent wrong-way driving.
Cross-traffic alert with braking, a technology in development at Ford, is designed to help reduce parking stress by detecting people and objects about to pass behind the vehicle, warning the driver and then automatically braking if the driver does not respond. A rear wide-view camera will offer an alternative wide-angle view of the area behind the vehicle on the in-car display. Enhanced active park assist will parallel or perpendicular park at the push of a button.
ARPA-E awards $32M to 10 new projects to improve connected and automated vehicle efficiency
November 03, 2016
The Energy Department’s Advanced Research Projects Agency-Energy (ARPA-E) announced up to $32 million in funding for 10 innovative projects as part of the Next-Generation Energy Technologies for Connected and Autonomous On-Road Vehicles (NEXTCAR) program. (Earlier post.) With a goal of reducing individual vehicle energy usage by 20%, NEXTCAR projects will take advantage of the increasingly complex and connected systems in today’s—and tomorrow’s—cars and trucks to improve their energy efficiency.
Connected and automated vehicle (CAV) technology utilizes on-board or cloud-based sensors, data and computational capabilities to help a vehicle better process and react to its surrounding environment. This knowledge could include the location of stop signs and intersections, the actions of nearby vehicles, the location of congested areas, and much more. Currently, CAV technologies predominantly improve upon vehicle safety and add driving convenience. NEXTCAR projects will leverage these rapidly evolving technologies to greatly reduce vehicle energy use.
BlackBerry signs agreement with Ford for expanded use of BlackBerry’s QNX and security software
November 02, 2016
BlackBerry Limited has signed an agreement with Ford Motor Company for expanded use of BlackBerry’s QNX and security software. The deal signifies an acceleration in BlackBerry’s pivot from hardware to software in support of the automaker’s goal of providing connected vehicles and mobility to its customers.
As part of this agreement, BlackBerry will dedicate a team to work with Ford on expanding the use of BlackBerry’s QNX Neutrino Operating System, Certicom security technology, QNX hypervisor and QNX audio processing software. The terms of the deal are confidential.
Test deployment of new on-demand hub & shuttle mobility system at U Michigan; connected & automated vehicles & big data
November 01, 2016
A test deployment of a new hub-and-shuttle urban mobility system will take place on the University of Michigan’s North Campus. Its creators say the proposed system could deliver riders to their destinations in as little as half the time of the existing bus system at a lower cost, eventually using a fleet of autonomous shared vehicles. The limited test deployment would likely mark the world’s first on-the-ground implementation of such a system.
Called Reinventing Public Urban Transportation and Mobility (RITMO), the proposed system combines aspects of Uber-style ridesharing, fixed-route buses and light rail into the hub-and-shuttle system. It would combine high-frequency buses serving the busiest transportation hubs with a fleet of about 50 on-demand shared shuttles to get riders to and from those hubs.
Chronocam raises $15M in Series B; high-performance bio-inspired vision technology for autos and other machines
October 27, 2016
France-based Chronocam SA, a developer of biologically-inspired vision sensors and computer vision solutions for automotive, IoT and other applications requiring vision processing, raised $15 million in Series B financing. The funding comes from lead investor Intel Capital, along with iBionext, Robert Bosch Venture Capital GmbH, 360 Capital, CEAi and Renault Group.
Chronocam will use the investment to accelerate product development and commercialize its computer vision sensing and processing technology. The funding will also allow the company to expand into key markets, including the US and Asia.
Intel introducing new processor series dedicated for automotive applications
October 26, 2016
Intel is developing a new processor series dedicated to automotive applications. The A3900 series will enable a complete software-defined cockpit solution that includes in-vehicle infotainment (IVI), digital instrument clusters and advanced driver assistance systems (ADAS)—all in a single, compact and cost-effective SoC.
Intel announced the new automotive processor family along with its introduction of the new Intel Atom processor E3900 series for the Internet of Things (IoT). The A3900 series will allow car makers to offer new levels of determinism for real-time decision-making required in next-generation cars. It is currently sampling with customers and will be available in Q1 2017.
Infineon launches next gen AURIX hexa-core microcontroller for automotive applications; 3x more performance than current
October 24, 2016
Infineon Technologies AG launched the next generation of its AURIX microcontroller family. The TC3xx microcontrollers offer the highest level of integration on the market and real-time performance that is three times higher than that available today.
With a high-performing hexa-core architecture and advanced features for connectivity, security and embedded safety, the AURIX TC3xx family is suited to a wide range of automotive applications. In addition to engine management and transmission control, powertrain applications include new systems in electrical and hybrid drives. Specifically, hybrid domain control, inverter control, battery management and DC-DC converters will benefit from the new architecture.
Tesla putting hardware for full autonomy in all models; temporary loss of some Gen1 Autopilot functions
October 20, 2016
Tesla announced that effective immediately, new Tesla vehicles—including Model 3—will have the hardware needed to support full autonomous driving.
The required software for full autonomous driving is still under development and will need validation and regulatory approval. In fact, Teslas with the new hardware will temporarily lack certain features currently available on Teslas with first-generation Autopilot hardware, including some standard safety features such as automatic emergency braking, collision warning, lane holding and active cruise control.
Infineon and Argus demonstrate cyber security solution for connected and automated cars; central gateway protection
October 19, 2016
At the VDI Kongress in Baden-Baden this week, Infineon Technologies AG and cyber security company Argus will demonstrate an integrated cyber security solution for connected and autonomous vehicles. The system is based on an Infineon AURIX multicore microcontroller with the Intrusion Detection and Prevention System (IDPS) and remote cloud platform from Argus. At the heart of a vehicle’s central gateway, the cyber security solution protects the vehicle’s internal network from remote cyber-attacks.
The central gateway is crucial in the automotive security architecture. It interconnects all electronic control units (ECU) of in-vehicle domains, such as those used in the powertrain, driver assistance, chassis, as well as body and convenience control. The central gateway routes and controls the complete data communication between the ECUs. In addition, it is the central access point for software updates over the air (SOTA) and for diagnostics processes and maintenance updates via the On-Board Diagnostics (OBD) port.
Oryx Vision raises $17M to create novel depth-sensing solution for autonomous vehicles; LiDAR replacement
Oryx Vision has emerged from stealth with a veteran team from the Israeli high-tech industry to build a novel depth-sensing solution for autonomous vehicles that overcomes some of the limitations of current LiDAR systems. Oryx has raised $17 million in Series A funding led by Bessemer Venture Partners (BVP), with additional participation from Maniv Mobility and Trucks VC. BVP Partner Adam Fisher will join Oryx’s board of directors.
In order to drive accurately and safely, autonomous vehicles need a highly detailed 3D view of their environment. Existing depth-sensing solutions rely mostly on LiDAR devices, which send short laser pulses while rotating, receive the reflected light back with photo-electric sensors, and thus construct a 3D map of the car’s surroundings, pixel by pixel. However, current LiDAR is mechanically complicated, expensive and has a severe range limit due to eye-safety considerations, Oryx says.
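The pixel-by-pixel map construction described above comes down to simple geometry (generic spherical-to-Cartesian conversion, not any vendor’s implementation): each laser return yields a range plus the scanner’s azimuth and elevation angles, which convert to a 3D point in the vehicle’s frame.

```python
import math

# Sketch of how a scanning LiDAR builds its 3D map one return at a time:
# convert (range, azimuth, elevation) into Cartesian coordinates.

def lidar_point(range_m, azimuth_deg, elevation_deg):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return x, y, z

# A return at 20 m, dead ahead, level with the sensor:
print(lidar_point(20.0, 0.0, 0.0))  # → (20.0, 0.0, 0.0)
```

Sweeping the angles over the full scan pattern and repeating this conversion for every return is what produces the point cloud; the mechanical rotation needed to sweep those angles is exactly what solid-state alternatives aim to eliminate.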
Renesas Electronics delivers highly automated driving solution kit to accelerate development of autonomous vehicles
Renesas Electronics Corporation announced a highly automated driving (HAD) solution kit that delivers high computing performance targeted at automotive functional safety to reduce development time of electronic control units (ECUs).
The HAD solution kit is based on two Renesas R-Car H3 Starter Kit Premier boards and the automotive-control RH850/P1H-C microcontroller (MCU), and is compliant with both the ISO 26262 ASIL-B and ASIL-D functional safety requirements. Each ASIL (Automotive Safety Integrity Level) stipulates requirements under the ISO 26262 functional safety standard and safety measures for avoiding unacceptable residual risk. There are four safety levels, A to D, with ASIL-D being the strictest.
Audi presenting piloted driving and Car-to-X technologies from Digital Motorway Test Bed; LTE-V for V2X
October 18, 2016
Twelve months after the launch of the “Digital Motorway Test Bed” in Germany, Audi is presenting new technologies for piloted driving and Car-to-X communication at the German Federal Ministry of Transport. Audi is involved in six projects on the test bed; three of them focus on structural measures, the remaining three on communication technologies.
The Digital Motorway Test Bed is on the A9 between Munich and Nuremberg and enables the automotive industry, suppliers, the telecommunication and software industry as well as research centers to field-test their systems under development in mixed traffic. The “Digital Motorway Test Bed” is a joint initiative between the Federal Ministry of Transport and Digital Infrastructure, the Free State of Bavaria, the automotive and supply industry as well as the IT sector.
DENSO & Toshiba partner on Deep Neural Network-IP for image recognition systems for ADAS & automated driving
October 17, 2016
DENSO Corporation and Toshiba Corporation have reached a basic agreement to jointly develop an artificial intelligence technology called Deep Neural Network-Intellectual Property (DNN-IP), for use in the image recognition systems the two companies have independently developed to help achieve advanced driver assistance and automated driving technologies.
The partners expect DNN, an algorithm modeled after the neural networks of the human brain, to perform recognition processing as accurately as, or even better than, the human brain.
Chinese firm invests $27M with U of Michigan to advance autonomous vehicle research and development
October 16, 2016
Frontt Capital Management Ltd, a Shenzhen-based investment firm focused on developing the intelligent vehicle industry in China, is making a $27-million investment to advance autonomous, connected vehicles and robotic technologies with the University of Michigan, along with industry and government partners.
Under the terms of a memorandum of understanding signed between Frontt and U-M, the funding will establish a Joint Research Center for Intelligent Vehicles at U-M to support faculty projects on autonomous vehicle technologies. The funding will also contribute toward construction of the recently approved Robotics Laboratory and a vehicle garage on U-M's North Campus near Mcity, the simulated urban-suburban environment for testing connected and automated vehicles.
UC Riverside team developing nav system that uses signals of opportunity; support for autonomous vehicles
October 14, 2016
A team of researchers at the University of California, Riverside has developed a highly reliable and accurate navigation system that exploits existing environmental signals such as cellular and Wi-Fi, rather than the Global Positioning System (GPS). The technology can be used as a standalone alternative to GPS, or as a complement to current GPS-based systems to enable highly reliable, consistent, and tamper-proof navigation.
The researchers say the technology could be used to develop navigation systems that meet the stringent requirements of fully autonomous vehicles, such as driverless cars and unmanned drones.
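The geometric core of navigating from ambient signals can be sketched with toy multilateration (illustrative math only, with made-up coordinates; the UC Riverside system handles real-world timing, noise and signal identification, which this omits): given ranges to three transmitters of known position, the receiver’s 2D position falls out of a small linear system.

```python
import math

# Toy 2D multilateration: linearize the three range equations by
# subtracting pairs, leaving a 2x2 linear system in (x, y).

def multilaterate(anchors, ranges):
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero when anchors are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Hypothetical cell towers at known positions; true receiver is at (3, 4):
towers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist((3.0, 4.0), t) for t in towers]
print(multilaterate(towers, dists))  # → (3.0, 4.0)
```

The hard part in practice is estimating those ranges from signals never designed for navigation, which is where the team’s contribution lies.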
Infineon acquires Innoluce BV for high-performance solid-state LiDAR systems
October 11, 2016
Semiconductor company Infineon has acquired 100% of Innoluce BV, a fabless semiconductor company headquartered in Nijmegen. Based on the know-how of Innoluce, Infineon will develop chip components for high-performance light detection and ranging (LiDAR) systems. Both companies agreed on confidentiality on the terms.
Innoluce was founded in 2010 as an entrepreneurial spin-off of Royal Philips and is based in Nijmegen, The Netherlands, near the Dutch-German border. The company has strong expertise in micro-electro-mechanical systems (MEMS) and is a leading innovator of miniature laser scanning modules that integrate silicon-based solid-state MEMS micro-mirrors. Such micro-mirrors are needed to steer the laser beams in automotive LiDAR systems.
Autonomous vehicle tech company Nauto enters strategic relationships with 3 automakers, including BMW and Toyota, and Allianz
October 08, 2016
Autonomous vehicle technology company Nauto has entered into strategic agreements with three major auto companies, including BMW i Ventures and Toyota Research Institute, as well as with Allianz Ventures, part of the leading global financial service provider and insurance company Allianz Group.
These companies have invested in Nauto and are working with the company on autonomous vehicle development using the Nauto cloud-based data learning platform. Nauto’s deep-learning technology also runs on retrofit devices that can be mounted in any vehicle.
Renesas Electronics introduces V2V and V2I communications solutions
October 07, 2016
Renesas Electronics Corporation announced the global availability of its lineup of V2X solutions that will help accelerate the arrival of autonomous driving. The solutions include two system-on-chips (SoCs) that will ease the development process for vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication systems.
One solution combines the R-Car W1R 760 megahertz (MHz) band wireless SoC for the Japanese market with the new R-Car W2H SoC, which features a high-performance security engine indispensable for V2X systems designed for the Japanese, US, and European markets. The other solution pairs the W2H SoC with the R-Car W2R 5.9 GHz band wireless communication SoC developed for the US and European markets.
DENSO invests in deep learning and vision processing startup THINCI; vision processing and deep learning for automotive
October 06, 2016
DENSO International America, Inc. has entered into an investment agreement with THINCI Inc., a deep-learning, vision processing startup developing innovative machine learning technology that enables the application of deep learning and vision processing in the automotive industry.
With this investment, DENSO is looking to accelerate the final development and integration of THINCI’s silicon and software technology into electronic systems that help enable driver assistance and autonomous driving, improve the efficiency of thermal systems, and optimize the productivity of the vehicle’s powertrain.
Volkswagen’s MEB for EVs: long electric range, open-platform, open-space, pricing for the volume market; “tablet on wheels”
October 05, 2016
Brands within the Volkswagen Group have been rolling out modular component matrices, or assembly toolkits, for their light-duty vehicles over the past few years. Until recently, the four main modular toolkits (modularen Baukästen) of the Group were: the MQB (transverse, driven by the Volkswagen brand); the MLB (longitudinal, driven by the Audi brand); the MSB (standard drive, driven by Porsche); and the NSF (New Small Family).
Development work on these continues; Audi, for example, is refining MLB evo—the second-generation of MLB and the foundation for the battery-electric e-tron quattro SUV due out in 2018. (Earlier post.) These four main kits are now joined by the all-new Modularer Elektrifizierungsbaukasten (“Modular Electric Drive kit”, or MEB), being developed by the Volkswagen brand. The MEB will be the foundation for an entirely new generation of battery-electric vehicles designed not only to be electric and feature extended range, but to be connected, autonomous, open and priced for the volume market as required by Volkswagen’s positioning.
Mercedes-Benz gets on the CASE with Generation EQ close-to-production electric concept
September 29, 2016
At the Paris Motor Show, Mercedes-Benz unveiled its close-to-production concept Generation EQ electric vehicle—the forerunner of Mercedes-Benz’s new product brand for electric mobility, EQ. The name EQ stands for “Electric Intelligence” and is derived from the Mercedes-Benz brand values of “Emotion and Intelligence”.
Dr Dieter Zetsche, CEO of Daimler AG and Head of Mercedes‑Benz Cars, said that the mobility of the future at Mercedes-Benz will stand on four pillars: Connected, Autonomous, Shared and Electric (CASE), adding that Mercedes-Benz has formed a CASE team. The Generation EQ is the logical fusion of all four pillars, he said.
NVIDIA introduces Xavier AI supercomputer designed for autonomous driving
At the inaugural GPU Technology Conference Europe, NVIDIA CEO Jen-Hsun Huang unveiled Xavier, an all-new AI supercomputer designed for use in self-driving cars. Xavier is a complete system-on-chip (SoC), integrating a new GPU architecture called Volta, a custom 8-core CPU architecture, and a new computer vision accelerator.
The processor will deliver 20 TOPS (trillion operations per second) of performance while consuming only 20 watts of power. As the brain of a self-driving car, Xavier is designed to be compliant with critical automotive standards, such as the ISO 26262 functional safety specification.
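The stated figures imply a power efficiency of 1 TOPS per watt, or roughly a picojoule per operation; a quick sketch of the arithmetic:

```python
# Power efficiency implied by the stated Xavier figures:
# 20 TOPS (trillion operations per second) at 20 W.
perf_ops_per_s = 20e12  # 20 TOPS
power_w = 20.0          # watts

tops_per_watt = (perf_ops_per_s / 1e12) / power_w
joules_per_op = power_w / perf_ops_per_s  # energy spent per operation

print(tops_per_watt)                 # → 1.0 TOPS/W
print(f"{joules_per_op:.2e} J/op")   # → 1.00e-12 J/op (about 1 pJ)
```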
Volkswagen unveils I.D. EV concept; 1st MEB-based vehicle, to launch in 2020; up to 373 miles
September 28, 2016
At the Paris Motor Show, Volkswagen staged the world premiere of the I.D., a concept vehicle presaging the first of a new fleet of Volkswagen electric cars and highlighting Volkswagen’s vision for the future in a number of areas, including autonomous driving and a new Open Space concept for the interior.
The compact I.D. is driven by a 125 kW electric motor powered by a battery pack, and has a range of 400 - 600 km (249 - 373 miles) under European test conditions. I.D. will be the first Volkswagen built off the Modular Electric Drive kit (MEB). The production version of the I.D. is due to be launched in 2020 at a price on a par with comparably powerful and well-equipped Golf models. The I.D. concept car at the show further demonstrates a concept of autonomous driving for the year 2025.
NVIDIA and TomTom developing cloud-to-car mapping system for self-driving cars; DRIVE PX 2
NVIDIA and TomTom, the Dutch mapping and navigation group, are partnering to develop artificial intelligence to create a cloud-to-car mapping system for self-driving cars.
The work combines TomTom’s extensive HD map coverage, which already spans more than 120,000 km (75,000 miles) of highways and freeways, with the NVIDIA DRIVE PX 2 computing platform (earlier post). Together, the solution accelerates support for real-time in-vehicle localization and mapping for driving on the highway.
HERE unveils next-generation open platform real-time data services for automotive industry
On the eve of the Paris Motor Show, HERE, the high-definition mapping and location services business company acquired by Audi, BMW and Daimler (earlier post), announced next-generation vehicle-sourced data services. The HERE Open Location Platform will harness real-time data generated by the on-board sensors of connected vehicles—even from competing car brands—to create a live depiction of the road environment.
Drivers will be able to access this view of the road through four services that provide information on traffic conditions, potential road hazards, traffic signage and on-street parking at high quality. The goal is to ensure that drivers have more accurate and timely information with which they can make better driving decisions. HERE plans to make the services commercially available to any customers both within and outside the automotive industry from the first half of 2017.
DOT issues Federal Policy for safe testing and deployment of highly automated vehicles (SAE levels 3-5)
September 20, 2016
The US Department of Transportation issued Federal policy for highly automated vehicles (HAVs)—i.e., SAE Levels 3-5 vehicles with automated systems that are responsible for monitoring the driving environment as defined by SAE J3016.
Although the primary focus of the Federal Automated Vehicle Policy is on highly automated vehicles, or those in which the vehicle can take full control of the driving task in at least some circumstances, portions of the policy also apply to lower levels of automation, including some of the driver-assistance systems already being deployed by automakers today. The newly released policy embodies four key elements:
OmniVision introduces OV491 and OV495 companion chips for automotive image processing applications
September 19, 2016
OmniVision Technologies, Inc., a developer of advanced digital imaging solutions, is introducing the OV491 and OV495, two new companion chips that deliver performance and advanced features in combination with OmniVision’s portfolio of automotive RAW image sensors.
The OV491 is a compact image signal processor (ISP) companion chip specified to enable best-in-class image quality in surround view system architectures. The OV495 contains the same ISP features but is also equipped with electronic distortion and perspective correction, making it suited for rear video mirror and camera monitor system (CMS) applications. The OV491 and OV495 are both compatible with OmniVision’s OV2775, OV10650, and OV10640 automotive image sensors.
Ford embeds researchers in new U-M robotics lab to accelerate autonomous vehicle research
September 16, 2016
Ford and the University of Michigan are teaming up to accelerate autonomous vehicle research and development with a first-time arrangement that embeds Ford researchers and engineers into a new state-of-the-art robotics laboratory on U-M’s Ann Arbor campus.
While the new robotics laboratory opens in 2020, by the end of this year Ford will move a dozen researchers into the North Campus Research Complex (NCRC), kicking off the first phase of expanded presence.
Renesas to acquire Intersil for ~$3.2B; looking to lead in automotive, industrial, IoT system solutions
September 14, 2016
Renesas Electronics Corporation, a premier supplier of advanced semiconductor solutions, and Intersil Corporation, a leading provider of innovative power management and precision analog solutions, signed a definitive agreement for Renesas to acquire Intersil for US$22.50 per share in cash, representing an aggregate equity value of approximately US$3.2 billion. The transaction has been unanimously approved by the boards of directors of both companies. Closing of the transaction is expected in the first half of 2017, following approval by Intersil shareholders and the relevant governmental authorities.
Together, Renesas’ and Intersil’s deep expertise across a number of technologies and end markets will enable the combined company to become a complete solution provider of embedded systems to customers. By combining Renesas’ market-proven microcontroller (MCU) and system-on-chip (SoC) products and technologies and Intersil’s leading power management and precision analog capability, Renesas will be well positioned to address some of the most exciting opportunities in key areas such as automotive, industrial, cloud computing, healthcare, and the Internet of Things (IoT).
NVIDIA unveils single-processor configuration of DRIVE PX 2 for AutoCruise functions; Baidu deploying
September 13, 2016
NVIDIA unveiled a single-processor configuration of the NVIDIA DRIVE PX 2 AI computing platform (earlier post) that automakers can use to power automated and autonomous vehicles for driving and mapping.
The new computing platform for AutoCruise functions—which include highway automated driving and HD mapping—consumes just 10 watts of power and enables vehicles to use deep neural networks to process data from multiple cameras and sensors. It will be deployed by China’s Baidu as the in-vehicle car computer for its self-driving cloud-to-car system.
Tesla leans on radar for Autopilot in Version 8 software
September 12, 2016
With the release of Version 8 of its software, Tesla has made many updates to its Autopilot function. The most significant change however, is its new heavy reliance on the onboard radar. The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but originally was only meant to be a supplementary sensor to the primary camera and image processing system.
Now, however, Tesla is using more advanced signal processing to use the radar as a primary control sensor without requiring the camera to confirm visual image recognition. The changes come in the wake of the fatal crash (earlier post) in which Autopilot apparently did not detect the white side of the trailer against a brightly lit sky; nor did the driver. As a result, no brake was applied. In the aftermath of that incident, Tesla CEO Elon Musk tweeted that the company was “Working on using existing Tesla radar by itself (decoupled from camera) w temporal smoothing to create a coarse point cloud, like lidar”. (Earlier post.)
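Musk's tweet gives only the broad idea; the minimal sketch below illustrates what temporal smoothing of noisy per-frame radar returns into a coarse occupancy estimate could look like. All names, grid cells, and parameters here are invented for illustration and do not reflect Tesla's actual implementation.

```python
# Illustrative only: exponential moving average over per-cell radar hits,
# so persistent returns accumulate confidence while clutter decays away.
ALPHA = 0.3        # smoothing factor: higher weights the newest frame more
THRESHOLD = 0.5    # confidence needed to treat a cell as occupied

def smooth(occupancy, frame_hits, alpha=ALPHA):
    """Blend this frame's hit indicators into the running per-cell estimate."""
    return {cell: alpha * (1.0 if cell in frame_hits else 0.0)
                  + (1 - alpha) * occupancy.get(cell, 0.0)
            for cell in set(occupancy) | set(frame_hits)}

# A real object returns consistently at cell (4, 2); clutter flickers elsewhere.
frames = [{(4, 2), (7, 9)}, {(4, 2)}, {(4, 2), (1, 1)}, {(4, 2)}]

occupancy = {}
for hits in frames:
    occupancy = smooth(occupancy, hits)

coarse_point_cloud = {c for c, p in occupancy.items() if p >= THRESHOLD}
print(coarse_point_cloud)  # → {(4, 2)}: the persistent return survives
```

The design point is that a single radar frame is too noisy to brake on, but a return that persists across frames earns enough confidence to act on.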
DENSO looks to increase holding in FUJITSU TEN, making it a group company
September 10, 2016
Auto parts supplier DENSO Corporation, Fujitsu Limited, and Toyota Motor Corporation have reached a basic agreement to start consideration of changing the capital structure of automotive electronics manufacturer FUJITSU TEN, in which the three companies have stakes. DENSO is part of the Toyota Group.
In the automotive field, the interface between the driver and vehicle is becoming increasingly important due to remarkable technological innovations. Against this backdrop, DENSO has agreed with Fujitsu and Toyota to review specific changes to make FUJITSU TEN a group company of DENSO and to enhance cooperation between the two companies in developing in-vehicle ECUs, millimeter-wave radar (earlier post), advanced driver assistance / automated driving technologies, and basic electronic technologies, among others.
NSF awards $4.6M to improve human control of automated cars, drones
September 08, 2016
NSF has awarded (Award Nº 1545126) $4.6 million to a team led by UC Berkeley exploring human cyber-physical systems (h-CPS)—systems that operate in concert with human operators—with the aim of improving the interaction between humans, computers and the physical world. The research outcome of the project, called Verified Human Interfaces, Control, and Learning for Semi-Autonomous Systems, or VeHICaL, will have applications in emerging technologies such as semi-autonomous cars and autonomous aerial vehicles (drones).
The award was part of a total of $13 million NSF awarded to three five-year “Frontier” projects to advance cyber-physical systems (CPS). The other two projects are tackling monitoring and mitigating noise pollution in cities and quickly identifying and overcoming problems in manufacturing environments.
LeddarTech launches LeddarVu, a new scalable platform towards high-resolution LiDAR; Vu8 solid-state LiDAR
September 07, 2016
LeddarTech, a developer of solid-state LiDAR technology (earlier post), introduced LeddarVu, a new platform for the next generation of its Leddar detection and ranging modules. The LeddarVu platform combines the benefits of a very compact, modular architecture with superior performance, robustness and cost efficiency for high-resolution LiDAR applications, such as autonomous driving.
Leveraging LeddarTech’s advanced, patented signal processing and algorithms, LeddarVu sensors will evolve along with the future generations of the LeddarCore ICs. As previously announced with the company’s development roadmap, upcoming iterations of LeddarCore ICs are expected to deliver ranges reaching 250 m, fields of view up to 140°, and up to 480,000 points per second (with a resolution down to 0.25° both horizontal and vertical), enabling the design of affordable LiDARs for all levels of autonomous driving, including the capability of mapping the environment over 360° around the vehicle.
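The stated specs can be related to a scan pattern; the sketch below back-computes an implied frame rate from the 480,000-point-per-second budget. Note the 20° vertical field of view is an assumption made purely for illustration, not a LeddarTech specification.

```python
H_FOV_DEG = 140.0       # stated maximum horizontal field of view
V_FOV_DEG = 20.0        # ASSUMED vertical field of view, illustration only
RES_DEG = 0.25          # stated finest resolution, horizontal and vertical
POINT_BUDGET = 480_000  # stated points per second

# Points needed for one full frame at full resolution over the whole FOV.
points_per_frame = (H_FOV_DEG / RES_DEG) * (V_FOV_DEG / RES_DEG)
frame_rate_hz = POINT_BUDGET / points_per_frame

print(points_per_frame, round(frame_rate_hz, 1))  # → 44800.0 10.7
```

The calculation shows why the point budget matters: resolution, coverage and refresh rate trade off against each other within it.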
Volvo Cars and Autoliv to create joint venture for next-gen autonomous driving software
September 06, 2016
Volvo Cars and Autoliv have signed a letter of intent to set up a new jointly-owned company to develop next-generation autonomous driving software. The planned new company will have its headquarters in Gothenburg, Sweden, and an initial workforce of around 200 drawn from both companies, increasing to over 600 in the medium term. The company is expected to start operations in the beginning of 2017.
The new company, which has yet to be named, will develop advanced driver assistance systems (ADAS) and autonomous drive (AD) systems for use in Volvo cars and for sale exclusively by Autoliv to all car makers globally, with revenues shared by both companies.
Japan 3D map JV beginning process of creating high-def 3-D maps for autonomous vehicles; dynamic mapping
September 05, 2016
The Nikkei reports that Tokyo-based Dynamic Map Planning (DMP), formed earlier this year by Mitsubishi Electric, mapmaker Zenrin and nine automakers, will begin creating high-definition 3-D maps for self-driving cars this month.
The project is supported by the Japanese government (SIP-adus, Strategic Innovation Promotion Program Innovation of Automated Driving for Universal Services) in support of an effort to have autonomous vehicles on the road by 2020 equipped with a dynamic mapping system that combines real-time, “dynamic” information with a high-definition 3D map base. The result, say the planners, is a “one stop” information source that can provide necessary information for automated driving.
Renesas Electronics and TSMC announce 28nm MCU collaboration for next-generation green and autonomous vehicles
September 02, 2016
Renesas Electronics Corporation and TSMC are collaborating on 28nm (nanometer) embedded flash (eFlash) process technology for manufacturing microcontrollers (MCUs) targeted at next-generation green and autonomous vehicles. The automotive MCUs employing this new 28nm process technology are slated for sample shipment and mass production in 2017 and 2020, respectively.
More specifically, Renesas’ reliable and fast Metal-Oxide-Nitride-Oxide-Silicon (MONOS) eFlash technology will combine with TSMC’s high-performance, low-power 28nm high-K metal gate process technology to produce automotive MCUs for a broader range of applications such as autonomous vehicle sensor control; coordinated control among electronic control units (ECUs); fuel-efficient engine control for green vehicles; and highly efficient motor inverter control for electric vehicles.
Baidu and NVIDIA team up on cloud-to-car platform for self-driving cars; HD maps, Level 3 control, automated parking
September 01, 2016
At the Baidu World Conference in Beijing, Baidu CEO Robin Li together with NVIDIA CEO Jen-Hsun Huang announced a partnership to use artificial intelligence (AI) in the creation of a cloud-to-car autonomous car platform for local Chinese and global car makers.
NVIDIA and Baidu have a long history of working together on AI. The latest addition to their partnership will combine Baidu’s cloud platform and mapping technology with NVIDIA’s self-driving computing platform to develop solutions for HD maps, Level 3 autonomous vehicle control and automated parking.
Quanergy acquires Otus People Tracker software from Raytheon BBN for advanced autonomous driving and security LiDAR applications
August 29, 2016
Quanergy Systems, Inc., the provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), has acquired the Otus People Tracker software from Raytheon BBN Technologies. The software complements Quanergy’s existing software portfolio and, when used with Quanergy’s LiDAR sensors, creates an integrated hardware and software solution for advanced people detection and tracking applications within the security and autonomous driving markets.
Otus (named after a genus of owls) uses advanced algorithms to identify and track people for safety and security in crowded environments at ranges exceeding 100 meters when used with Quanergy LiDAR sensors. The system features segmentation techniques for identifying humans; background extraction; object clustering; sophisticated merge-and-split algorithms; persistent tracking algorithms; and other advanced features supporting robust crowd control. Support for multiple zones of interest is included, giving users fine control over active monitoring.
ORNL team presents solution for coordinating connected and automated vehicles at merging roadways; reduced fuel consumption and travel time
A team of researchers at Oak Ridge National Laboratory (ORNL) has developed an optimization framework and an analytical closed-form solution that addresses the problem of optimally coordinating connected and automated vehicles (CAVs) at merging roadways to achieve smooth traffic flow without stop-and-go driving.
They validated the effectiveness of their proposed solution through simulation, showing that coordination of vehicles can significantly reduce both fuel consumption and travel time. A paper on the work is published in IEEE Transactions on Intelligent Transportation Systems.
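The paper's optimization framework and closed-form controller are not reproduced here, but the coordination idea can be illustrated with a toy scheduler that assigns conflict-free merging-zone arrival times in order of estimated arrival, so vehicles adjust speed early instead of stopping at the merge. The vehicle IDs, ETAs, and the minimum-headway value are invented for illustration.

```python
MIN_HEADWAY_S = 1.5  # illustrative minimum time gap between vehicles at the merge

def schedule_merge(arrival_estimates, headway=MIN_HEADWAY_S):
    """Assign merge times in order of estimated arrival, enforcing a minimum
    headway so vehicles flow through the merge without stop-and-go driving."""
    assigned = []
    last = float("-inf")
    for vid, eta in sorted(arrival_estimates, key=lambda x: x[1]):
        t = max(eta, last + headway)  # delay only as much as the gap requires
        assigned.append((vid, t))
        last = t
    return assigned

# Two roadways feeding one merge point; ETAs (s) if each drove unimpeded.
etas = [("main-1", 10.0), ("ramp-1", 10.4), ("main-2", 11.0), ("ramp-2", 13.5)]
assigned = schedule_merge(etas)
print(assigned)  # → [('main-1', 10.0), ('ramp-1', 11.5), ('main-2', 13.0), ('ramp-2', 14.5)]
```

Each vehicle can then plan a smooth speed profile to hit its assigned time, which is where the fuel savings come from relative to braking at the merge.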
NVIDIA dives into Parker mobile processor for next generation of autonomous vehicles
August 23, 2016
Speaking at the Hot Chips conference in Cupertino, California, NVIDIA revealed the architecture and underlying technology of its new Parker processor, which is suited for automotive applications such as self-driving cars and digital cockpits. Hot Chips, a symposium on high performance chips, is sponsored by the IEEE Technical Committee on Microprocessors and Microcomputers in cooperation with ACM SIGARCH.
NVIDIA mentioned Parker at CES 2016 earlier this year, when it introduced the NVIDIA DRIVE PX 2 platform (earlier post). That platform uses two Parker processors and two Pascal architecture-based GPUs to power deep learning applications.
Mobileye and Delphi to partner on SAE Level 4/5 automated driving solution for 2019
Mobileye and Delphi Automotive PLC are partnering to develop a complete SAE Level 4/5 automated driving solution. The program will result in an end-to-end production-intent fully automated vehicle solution, with the level of performance and functional safety required for rapid integration into diverse vehicle platforms for a range of customers worldwide.
The partners’ “Central Sensing Localization and Planning” (CSLP) platform will be demonstrated in combined urban and highway driving at the 2017 Consumer Electronics Show in Las Vegas and production ready for 2019.
Solid-state LiDAR company Quanergy raises $90M in Series B; valuation passes $1B
Quanergy Systems, Inc., a leading provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), raised $90 million in Series B funding at a valuation well over $1 billion. Sensata Technologies, Delphi Automotive, Samsung Ventures, Motus Ventures and GP Capital participated in the round. This investment brings the company’s total funds raised to approximately $150 million.
Quanergy intends to use the investment and leverage its intellectual property to work with its partners in ramping up production of its solid-state LiDAR sensors. These sensors use standard semiconductor manufacturing processes and have no moving parts on either a macro or micro scale, offering significantly lower cost, higher reliability, superior performance, increased capability, smaller size and lower weight compared with traditional mechanical sensors (sometimes called hybrid solid-state sensors).
3 autonomous vehicle startups move into U-M incubator
August 22, 2016
Three startups from the West Coast—Zendrive (earlier post), PolySync (earlier post) and Civil Maps (earlier post)—will join TechLab at Mcity this fall, moving resources to develop their driverless vehicle technologies in Ann Arbor.
The move is part of an expansion of an innovative program designed to drive the future of mobility, the University of Michigan Center for Entrepreneurship announced in partnership with the U-M Mobility Transformation Center.
Volvo Cars and Uber join forces to develop next-gen autonomous driving cars; $300M joint project
August 18, 2016
Volvo Cars and ride-sharing company Uber have signed an agreement to establish a joint project that will develop new base vehicles that will be able to incorporate the latest developments in AD technologies, up to and including fully autonomous driverless cars. The base vehicles will be manufactured by Volvo Cars and then purchased from Volvo by Uber. Volvo Cars and Uber are contributing a combined $300 million to the project.
Uber also announced that it is acquiring Otto, a start-up focusing on self-driving trucks. (Earlier post.) The announcement of the Volvo-Uber partnership also comes shortly after Ford CEO Mark Fields announced that his company intends to have a high-volume, highly autonomous SAE level 4-capable vehicle in commercial operation in 2021 in a ride-hailing or ride-sharing service. (Earlier post.)
ABI Research: highly automated driving to spark adoption of centralized advanced driver assistance systems
August 17, 2016
As vehicles become highly independent and begin to drive and react to traffic on their own, autonomous systems will aggregate and process data from a variety of on-board sensors and connected infrastructure. This will force the industry to hit a hard reset on advanced driver assistance systems (ADAS) architectures, currently dominated by distributed processing and smart sensors.
Automotive OEMs will need to adopt new platforms based on powerful, centralized processors and high-speed low latency networking (e.g., Audi zFAS, earlier post). ABI Research forecasts 13 million vehicles with centralized ADAS platforms will ship in 2025.
Ford targeting highly autonomous vehicle for ride-sharing in 2021; new tech company investments, staffing up in Silicon Valley
Ford intends to have a high-volume, highly autonomous SAE level 4-capable vehicle in commercial operation in 2021 in a ride-hailing or ride-sharing service. To achieve this, the company is investing in or collaborating with four startups to enhance its autonomous vehicle development, doubling its Silicon Valley team and more than doubling its Palo Alto campus.
Autonomous vehicles in 2021 are part of Ford Smart Mobility, the company’s plan to be a leader in autonomous vehicles, as well as in connectivity, mobility, the customer experience, and data and analytics.
Ford and Baidu invest $150M in Velodyne LiDAR
August 16, 2016
Velodyne LiDAR, Inc., a global leader in LiDAR (Light Detection and Ranging) technology, announced the completion of a combined $150 million investment from co-investors Ford Motor Company and China’s leading search engine company Baidu, Inc. The investment will allow Velodyne to rapidly expand the design and production of high-performance, cost-effective automotive LiDAR sensors, accelerating mass adoption in autonomous vehicle and ADAS applications, and with it the critical, transformative benefits they provide.
Over the last decade, Velodyne developed four generations of hybrid solid-state LiDAR systems incorporating the company’s proprietary software and algorithms that interpret rich data gathered from the environment via highly accurate laser-based sensors to create high-resolution 3D digital images used for mapping, localization, object identification and collision avoidance.
Toyota Research Institute invests $22M in research on AI, robotics and autonomous driving at University of Michigan
August 10, 2016
Toyota Research Institute (TRI) (earlier post) is making a $22-million investment in research focused on artificial intelligence, robotics and autonomous driving at the University of Michigan (U-M). TRI CEO Gill Pratt made the announcement in an address to the U-M faculty.
Under the agreement, TRI will provide an initial $22 million over four years for research collaborations with the U-M faculty in the areas of enhanced driving safety, partner robotics and indoor mobility, autonomous driving and student learning and diversity.
2017 Range Rover picks up suite of ADAS and semi-autonomous driving technologies
For 2017, Range Rover receives a suite of advanced ADAS and semi-autonomous driving technologies. Additionally, Land Rover will introduce the new Range Rover SVAutobiography Dynamic model for the 2017 lineup.
Standard Range Rover driver assistance technologies include Rear Park Distance Control; Cruise Control and Speed Limiter; Lane Departure Warning; and Autonomous Emergency Braking (AEB). This intelligent technology uses a forward-facing camera to detect a risk of collision and warn the driver, initiating full emergency braking if the driver fails to respond.
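The warn-then-brake behavior described above is commonly built on a time-to-collision (TTC) estimate; the generic sketch below shows the decision structure. The thresholds are illustrative values, not Land Rover's calibration.

```python
WARN_TTC_S = 2.5   # illustrative threshold for warning the driver
BRAKE_TTC_S = 1.0  # illustrative threshold for full emergency braking

def aeb_decision(range_m, closing_speed_mps):
    """Generic AEB logic: warn first, brake only if the gap keeps closing."""
    if closing_speed_mps <= 0:
        return "no action"  # not closing on the target ahead
    ttc = range_m / closing_speed_mps  # seconds until impact at current rate
    if ttc < BRAKE_TTC_S:
        return "full emergency braking"
    if ttc < WARN_TTC_S:
        return "warn driver"
    return "no action"

print(aeb_decision(30.0, 10.0))  # TTC 3.0 s → "no action"
print(aeb_decision(20.0, 10.0))  # TTC 2.0 s → "warn driver"
print(aeb_decision(8.0, 10.0))   # TTC 0.8 s → "full emergency braking"
```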
Panasonic acquires German software company OpenSynergy; virtualization for integrated cockpit solutions
August 09, 2016
Panasonic Corporation has acquired all the outstanding shares of the German software company OpenSynergy GmbH, making it a subsidiary. OpenSynergy specializes in embedded automotive software for cockpit solutions; its products enable the convergence of instrument clusters, head units, advanced driver assistance systems, and car connectivity systems.
A current cockpit is controlled by separate systems: one handles multimedia functions such as navigation and audio; another handles driver-support functions such as warnings on the head-up display. To integrate such different functions going forward, and to provide better, more intuitive driving support, system integration needs to be achieved at the operating-system level, Panasonic said.
Fujitsu develops technology to enable high-speed deep learning
Fujitsu Laboratories Ltd. has developed software technology that uses multiple GPUs to enable high-speed deep learning by applying supercomputer software parallelization technology. The newly developed technology was implemented in the Caffe deep learning framework; in a test measuring learning time using AlexNet on 64 GPU-equipped computers, it achieved learning speeds 27 times faster than with a single GPU.
Compared with its performance before the technology was applied, learning speed improved by 46% with 16 GPUs and 71% with 64 GPUs (according to internal comparisons). Using this technology, the time required for deep learning R&D can be shortened, such as in the development of unique neural network models for the autonomous control of robots and automobiles, or for healthcare and finance applications such as pathology classification or stock price forecasting, enabling the development of higher-quality models.
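The reported numbers imply the scaling efficiency shown below; the speedup and improvement factors come straight from the article, and the rest is arithmetic.

```python
speedup_64 = 27.0  # reported: 27x faster on 64 GPUs than on a single GPU
gpus = 64

# Fraction of ideal linear scaling achieved (ideal would be 64x on 64 GPUs).
parallel_efficiency = speedup_64 / gpus
print(f"{parallel_efficiency:.0%}")  # → 42%

# The reported 71% improvement for 64 GPUs implies the speedup the same
# setup achieved before the new parallelization technology was applied.
prior_speedup_64 = speedup_64 / 1.71
print(round(prior_speedup_64, 1))  # → 15.8
```

In other words, 64-GPU training went from roughly 16x to 27x a single GPU, still well short of linear scaling, which is typical of communication-bound distributed training.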
ZF and ibeo to develop new 3D LiDAR technology; ZF takes 40% stake
August 02, 2016
ZF has acquired a 40% stake in ibeo Automotive Systems GmbH. The Hamburg-based company, which was founded in 2009, is a developer of LiDAR technology and environmental recognition software with a particular focus on applications for autonomous driving (earlier post). Ibeo’s customers include several major global vehicle manufacturers.
The LiDAR generation being developed by ibeo in cooperation with ZF will reproduce a three-dimensional image of the environment without the rotating mirrors contained in current LiDAR systems. Using solid state technology, LiDAR technology will become more compact and easier to integrate into the vehicle.
Valeo Cruise4U car sets off on 13,000-mile partially automated drive across US
Valeo’s Cruise4U partially automated car, previously demonstrated at CES in Las Vegas in January 2016, set off from San Francisco on a 13,000-mile road trip around the US that is scheduled to conclude back in San Francisco on 15 September.
In partially automated driving mode, the trip will include stops in Los Angeles, Las Vegas, Seattle, Chicago, Detroit, Boston, New York, Miami, San Antonio and San Diego. The vehicle will travel both day and night in real traffic conditions.
Singapore LTA selects nuTonomy for trials of autonomous mobility-on-demand transportation service; new AV testing center
August 01, 2016
In addition to Delphi (earlier post), Singapore’s Land Transport Authority (LTA) has also selected nuTonomy to begin trials of an autonomous mobility-on-demand transportation service. The partnership will expand and accelerate nuTonomy’s development efforts in Singapore as it progresses towards the launch of a commercial autonomous vehicle (AV) service in 2018.
Delphi and nuTonomy were shortlisted from several participants that submitted proposals for autonomous mobility-on-demand concepts under the Request for Information (RFI) issued by LTA in June last year. (Besides mobility-on-demand services, LTA is also exploring self-driving vehicles (SDVs) for other public transport applications, such as self-driving buses.)
Singapore Land Transport Authority selects Delphi for autonomous vehicle mobility-on-demand program
The Singapore Land Transport Authority (LTA) has selected Delphi Automotive PLC as a strategic partner to implement autonomous mobility concepts.
Delphi will provide a fleet of fully autonomous vehicles (AVs) and will develop a cloud-based autonomous mobility-on-demand (AMoD) software suite, opening up new potential autonomous markets for Delphi’s customers. Delphi will conduct a trial of an urban, point-to-point, low-speed, autonomous, mobility-on-demand service in Singapore’s Autonomous Vehicles Test Bed located at one-north, a business park in the western area of the city.
Report: Uber to invest $500M in global mapping project
July 31, 2016
The Financial Times reports that Uber will invest $500 million into a global mapping project in an effort to decrease its dependence on Google Maps and to prepare for autonomous driving. The FT cited “a person familiar with Uber’s plans” as the source.
Uber has already hired Brian McClendon, formerly the head of Google Maps for more than a decade. In a post on the Uber site last week, McClendon noted that “Accurate maps are at the heart of our service and the backbone of our business.”
NTSB issues preliminary report for investigation into Tesla Autopilot fatal crash
July 27, 2016
The US National Transportation Safety Board issued its preliminary report for the investigation of the fatal 7 May 2016, highway crash in Florida involving the Tesla Model S and Autopilot. The preliminary report does not contain any analysis of data and does not state probable cause for the crash.
The preliminary report details the collision involving a 53-foot semitrailer in combination with a 2014 Freightliner Cascadia truck tractor and the 2015 Tesla Model S. According to system performance data downloaded from the car, the indicated vehicle speed was 74 mph (119 km/h) just prior to impact; the posted speed limit was 65 mph (105 km/h).
NIRA Dynamics, InfoCar expand availability of Road Surface Information software with OBD plug-in
July 25, 2016
Sweden-based NIRA Dynamics, a software company developing sensor-fusion-based systems for different vehicle applications, is rolling out its Road Surface Information (RSI) software more broadly, in partnership with InfoCar AB.
Road Surface Information (RSI) by NIRA continuously monitors the quality and tire grip level of the road surface—without stereo cameras, adaptive suspension or other expensive sensors. With sensor-fusion-based algorithms, RSI can determine the level of road roughness and road friction.
Oxbotica launches Selenium mobile autonomy software
July 23, 2016
Oxbotica, a spin-out from Oxford University’s Mobile Robotics Group, launched its new Selenium mobile autonomy software solution with a purpose-built concept vehicle named Geni.
Selenium can work in pedestrianized environments as well as roads and motorways, and is not reliant on GPS to operate—i.e., it can easily transition between indoor and outdoor settings, over ground or underground. The system has been developed to be “vehicle agnostic”—it can be applied to cars, self-driving pods (e.g. for campuses and airports), and warehouse truck fleets.
Musk’s “Master Plan, Part Deux”; expands Tesla to heavy-duty electric trucks and urban transport; integrated energy generation and storage
July 21, 2016
Master Plan Part 1—public now for ten years—outlined (1) the creation of an expensive, low-volume electric car (Roadster) to fund (2) a medium-volume electric car at a lower price (Model S, Model X) to create (3) an affordable, high-volume car (Model 3), and (4) the provision of solar power. Master Plan v2.0 takes Tesla into integrated energy generation and storage (i.e., Tesla’s acquisition of SolarCity, earlier post) as well as into heavy-duty electric vehicles and urban transport.
CMU study: even partially-automated crash avoidance delivers financial and safety benefits
July 19, 2016
A new cost-benefit analysis by researchers at Carnegie Mellon University’s College of Engineering shows that the public could derive economic and social benefits today if partially-automated collision avoidance features were deployed in all cars.
In a paper published in the journal Accident Analysis & Prevention, they evaluated the benefits and costs of fleet-wide deployment of three such technologies: blind spot monitoring; lane departure warning; and forward collision warning crash avoidance systems within the US light-duty vehicle fleet.
Daimler showcases semi-automated Mercedes-Benz Future Bus with CityPilot; driving 20 km BRT route
July 18, 2016
Daimler is highlighting its development of the semi-automated Mercedes-Benz Future Bus with CityPilot. The technology of the CityPilot in the Mercedes-Benz Future Bus is based on that of the autonomously driving Mercedes-Benz Actros truck with Highway Pilot presented two years ago. (Earlier post.)
The Mercedes-Benz Future Bus with CityPilot, which Daimler will showcase at the upcoming IAA 2016 in Hannover in September, is already making its first public journey on part of Europe’s longest BRT (Bus Rapid Transit) route in the Netherlands, linking Amsterdam’s Schiphol airport with the town of Haarlem.
Ford takes stake in Civil Maps; 3D mapping technologies for fully autonomous vehicles; AI and voxel hashing
July 16, 2016
Civil Maps, a start-up developing 3D mapping technology for fully autonomous vehicles, raised a $6.6-million seed funding round, led by Motus Ventures and including investment from Ford Motor Company, Wicklow Capital, StartX Stanford and Yahoo cofounder Jerry Yang’s AME Cloud Ventures.
Civil Maps’ mission is to make it possible for fully autonomous vehicles (SAE Levels 4-5) to drive anywhere smoothly and safely. The company’s focus is on building continental-scale maps for autonomous vehicles and providing precise localization using voxel-hashing algorithms. Although GPS and IMU (inertial measurement unit) technologies can in theory determine both position and orientation of vehicles, the accuracy is limited by atmospheric distortion, the start-up notes.
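Civil Maps has not published its algorithms; as a hedged sketch of the general voxel-hashing idea it references, 3D points can be quantized into integer voxel coordinates and hashed into a sparse map, so only occupied voxels consume memory. The 0.5 m voxel size and the point data below are illustrative assumptions:

```python
from collections import defaultdict

def voxel_key(x: float, y: float, z: float, voxel_size: float = 0.5):
    """Quantize a 3D point into integer voxel coordinates (assumed 0.5 m voxels)."""
    return (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))

def build_voxel_map(points, voxel_size: float = 0.5):
    """Hash points into a sparse voxel grid; only occupied voxels are stored."""
    grid = defaultdict(list)
    for p in points:
        grid[voxel_key(*p, voxel_size)].append(p)
    return grid

# Toy point cloud: the first two points land in the same voxel, the third elsewhere.
points = [(1.0, 2.0, 0.1), (1.1, 2.2, 0.0), (30.0, -4.0, 1.5)]
grid = build_voxel_map(points)
```

Localizing against such a map then reduces to matching observed voxel keys against stored ones, which is far cheaper than searching raw point clouds.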
NSF leads $400M federal effort to boost advanced wireless research
July 15, 2016
The National Science Foundation (NSF) will invest more than $400 million over the next seven years to support fundamental wireless research and to develop platforms for advanced wireless research in support of the White House’s Advanced Wireless Research Initiative. These investments will support the research community in experimenting with and testing novel technologies, applications and services capable of making wireless communication faster, smarter, more responsive and more robust.
This new program will enable the deployment and use of four city-scale testing platforms for advanced wireless research over the next decade and builds upon the Federal Communications Commission’s (FCC) Spectrum Frontiers vote yesterday. (Earlier post.)
TTTech using VectorCAST platform for development of Audi zFAS to ISO 26262 ASIL-D compliance; domain controller for piloted driving
TTTech Computertechnik AG (TTTech) has selected Vector Software’s VectorCAST software test automation platform for use within TTTech’s development of Audi’s zFAS (zentrales Fahrerassistenzsteuergerät)—the domain controller for Audi piloted driving systems. (Earlier post.) VectorCAST provides TTTech with the tools necessary to ensure ISO 26262 compliance up to ASIL D level on all microcontrollers used in zFAS.
Under the guidance of Audi AG, TTTech developed the zFAS electronic control unit (ECU) that integrates various functionalities of advanced driver assistance systems (ADAS). The ECU uses numerous technology components from TTTech for various automotive assistance functions, such as piloted parking or autonomous driving.
Consumer Reports calls on Tesla to disable and update auto steering function, remove “Autopilot” name
July 14, 2016
Consumer Reports is calling on Tesla to disable the automatic steering function in the Autopilot driving-assist system available in its Model S vehicles until the company updates the function to confirm that the driver’s hands remain on the steering wheel at all times.
The consumer organization, which has owned and tested three Teslas (2013 Model S 85, 2014 Model S P85D, and 2016 Model X 90D), said that Tesla should also change the name of the Autopilot feature because it promotes a potentially dangerous assumption that the Model S is capable of driving on its own.
JLR to field more than 100 research vehicles in UK for wide range of connected and autonomous vehicle technologies
Jaguar Land Rover (JLR) plans to create a fleet of more than 100 research vehicles over the next four years, to develop and test a wide range of Connected and Autonomous Vehicle (CAV) technologies. The first of these research cars will be driven on a new 41-mile test route on motorways and urban roads around Coventry and Solihull in the UK later this year.
The initial tests will involve vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communications technologies that will allow cars to talk to each other and roadside signs, overhead gantries and traffic lights. Ultimately, data sharing between vehicles would allow future connected cars to co-operate and work together to assist the driver and make lane changing and crossing junctions easier and safer.
Nissan’s ProPILOT autonomous drive technology debuts in new Serena; autonomous drive first for Japanese automakers
July 13, 2016
Nissan Motor’s new Serena, scheduled to go on sale in Japan in late August, will come equipped with the company’s ProPILOT autonomous drive technology. ProPILOT is designed for highway use in single-lane traffic. Nissan is the first Japanese automaker to introduce a combination of steering, accelerator and braking that can be operated in full automatic mode, easing driver workload in heavy highway traffic and long commutes.
Employing advanced image-processing technology, the car’s ProPILOT system understands road and traffic situations and executes precise steering, enabling the vehicle to perform naturally.
Jaguar Land Rover demonstrates all-terrain self-driving technology; off-road connected convoy
July 12, 2016
Jaguar Land Rover has demonstrated a range of innovative research technologies that would allow a future autonomous car to drive itself over any surface or terrain.
The multi-million pound autonomous all-terrain driving research project aims to make the self-driving car viable in the widest range of real life, on- and off-road driving environments and weather conditions. To enable this level of autonomous all-terrain capability, Jaguar Land Rover’s researchers are developing next-generation sensing technologies that will be the eyes of the future autonomous car.
Google teaching autonomous vehicles to share road safely with cyclists
July 11, 2016
Google currently has 24 Lexus RX450h SUVs and 34 prototype vehicles on the road with autonomous driving capability. The Google autonomous fleet has so far racked up 1,725,911 miles driven in autonomous mode, with 1,158,921 miles driven manually. The fleet is averaging 15,000 to 17,000 autonomous miles per week, with testing locations in Mountain View, California; Kirkland, Washington; Phoenix, Arizona; and Austin, Texas.
Automotive and telecoms sectors to launch EU project for connected and automated driving
July 07, 2016
Europe’s leading trade associations for the telecommunications and the automotive sectors intend to launch a large-scale, pre-deployment project to test connected and automated driving at the EU level.
The industry-led project will focus on use cases and test functionalities in three main areas: automated driving; road safety and traffic efficiency; and the digitalisation of transport and logistics. Functions under consideration include high-density platooning, cooperative collision avoidance, remote-control parking, local-hazard warnings and traffic-flow optimization. High-definition maps will be updated via fast internet connections on phones or other mobile devices.
Renault-Nissan Alliance delivers €4.3B synergy target early; autonomous drive and connectivity expected to deliver major savings
July 06, 2016
Renault-Nissan Alliance generated €4.3 billion (US$4.8 billion) in synergies in 2015, one year ahead of schedule and an increase of 13% from 2014. Purchasing, engineering and manufacturing were the main contributors.
The Common Module Family and cross-production continue to reduce costs, while the development of technologies including autonomous drive and vehicle connectivity is expected to generate major savings moving forward. With this convergence, the Alliance expects to generate at least €5.5 billion (US$6.1 billion) in synergies in 2018.
NHTSA begins preliminary evaluation of Tesla Model S Autopilot fatality
July 01, 2016
The National Highway Traffic Safety Administration’s (NHTSA) Office of Defects Investigation (ODI) has begun a preliminary evaluation of a fatal highway crash involving a 2015 Tesla Model S operating with Autopilot activated. ODI is opening the preliminary evaluation (PE16007) to examine the design and performance of any automated driving systems in use at the time of the crash.
In a blog post, Tesla Motors was quick to point out that this is the first known fatality in more than 130 million miles driven with Autopilot activated. Tesla also pointed out that among all vehicles in the US, there is a fatality every 94 million miles; worldwide, there is a fatality approximately every 60 million miles.
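The mileage figures Tesla cites can be normalized to fatalities per 100 million miles for comparison; this is simple arithmetic on Tesla’s stated numbers, not an independent safety analysis (and a single Autopilot fatality is a very small sample):

```python
def fatalities_per_100m_miles(fatalities: int, miles: float) -> float:
    """Normalize a fatality count to a rate per 100 million miles driven."""
    return fatalities / miles * 100e6

# Tesla's stated figures: 1 fatality in 130M Autopilot miles,
# vs. 1 per 94M miles (US average) and 1 per 60M miles (worldwide).
autopilot = fatalities_per_100m_miles(1, 130e6)  # ~0.77 per 100M miles
us_avg    = fatalities_per_100m_miles(1, 94e6)   # ~1.06 per 100M miles
world_avg = fatalities_per_100m_miles(1, 60e6)   # ~1.67 per 100M miles
```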
Report: combination of new mobility technologies creates opportunities for cutting emissions, but requires strategic policy interventions
June 30, 2016
The combination of connectivity, automation and shared vehicle ownership and use has the potential to make car travel greener and cheaper, cutting energy use and helping accelerate the introduction of low-carbon vehicles. However, these energy and carbon benefits are by no means guaranteed and will require strategic policy interventions to maximize, according to a new report by the Institute for Transport Studies (ITS) at the University of Leeds, commissioned by the Low Carbon Vehicle Partnership (LowCVP) and the Institution of Mechanical Engineers (IMechE).
The study—Automated Vehicles: Automatically Low Carbon?—was presented at the Low Carbon Vehicle Partnership Conference at the Olympic Park in London. According to the study, better coordination and connectivity between vehicles and infrastructure is likely to improve energy efficiency, as well as potentially make road transport safer and quicker.
WABCO and ZF demonstrate prototype of Evasive Maneuver Assist for commercial vehicles; connecting active braking and steering capability
WABCO Holdings and ZF have developed and demonstrated a prototype of a new collision avoidance technology for commercial vehicles. The Evasive Maneuver Assist (EMA) combines WABCO’s world-class braking, stability and vehicle dynamics control systems on trucks and trailers with ZF’s active steering technology—an industry first. EMA marks another step toward enabling autonomous driving in the commercial vehicle industry.
Evasive Maneuver Assist leverages the capabilities of WABCO’s OnGuardACTIVE, its radar-only collision mitigation system. A radar sensor identifies moving or stationary vehicles ahead and alerts the driver via visual, audio and haptic signals of an impending rear-end collision. When braking alone, whether driver-initiated or autonomous, cannot avoid a rear-end collision, Evasive Maneuver Assist engages to help the driver steer safely around the obstructing vehicle and to bring truck and trailer to a complete and safe stop.
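A hedged sketch of the decision flow described above, with hypothetical thresholds and deliberately simplified physics (not WABCO/ZF values): warn first, brake when braking alone suffices, and assist evasive steering otherwise:

```python
def ema_decision(gap_m: float, closing_speed_mps: float,
                 max_decel_mps2: float = 6.0) -> str:
    """Return the assistance stage for an impending rear-end collision.
    All thresholds are illustrative assumptions, not production values."""
    if closing_speed_mps <= 0:
        return "no_action"            # not closing on the vehicle ahead
    # Distance needed to shed the closing speed at maximum deceleration
    braking_distance = closing_speed_mps ** 2 / (2 * max_decel_mps2)
    if gap_m > 1.5 * braking_distance:
        return "warn_driver"          # visual / audio / haptic alerts
    if gap_m >= braking_distance:
        return "brake"                # collision avoidable by braking alone
    return "assist_evasive_steering"  # support steering around the obstacle

# Closing at 20 m/s with only a 10 m gap: braking alone cannot stop in time.
stage = ema_decision(gap_m=10.0, closing_speed_mps=20.0)
```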
Beijing Foton launches “China Internet Super Truck Global Innovation Alliance”; Auman Energy Super Truck
June 23, 2016
Chinese truck and utility vehicle manufacturer Beijing Foton has launched the China Internet Super Truck Global Innovation Alliance and showcased the Auman EST (Energy Super Truck) at an event in Athens, Greece.
The Internet super truck project is a collaboration that brings together Foton Motor Group, Cummins, and Daimler AG and aims at building Internet-driven super trucks that are green, efficient, safe, and intelligent through the integration of global resources, the effective use of new energies, the establishment of vehicle networks, and the implementation of intelligent truck-loading technology.
Avoiding Obstacles Safely at High Speed: An Optimal Control Approach for Driving Heavy Autonomous Ground Vehicles Close to their Dynamic Limits
June 22, 2016
by Paul N. Blumberg, PhD
No one who follows automotive news and technology trends can be unaware of the intense research and experimentation taking place related to vehicles that drive themselves, i.e., “Autonomous Vehicles.” Among other potential benefits, these vehicles offer the promise of increased safety on the nation’s roadways and personal mobility to those who are not able or do not wish to drive themselves.
The Society of Automotive Engineers has already developed Standard J3016, which describes six levels of “driving automation”, increasing from Level 0 (No Automation) to Level 5 (Full Automation). At Level 0, a human driver performs all driving tasks, even if assisted by warning or other momentary intervention systems. At Level 5, a system comprising sensors, actuators and control algorithms performs all driving tasks, such as steering, braking, acceleration and monitoring the vehicle and roadway, under all modes of driving; a human driver is no longer necessary.
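The six J3016 levels map naturally onto an ordered enumeration; the level names below paraphrase the standard’s categories rather than quoting it:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (names paraphrased)."""
    NO_AUTOMATION = 0           # human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # system assists steering OR speed
    PARTIAL_AUTOMATION = 2      # system controls steering AND speed; driver monitors
    CONDITIONAL_AUTOMATION = 3  # system drives; human must take over on request
    HIGH_AUTOMATION = 4         # system drives within a defined operating domain
    FULL_AUTOMATION = 5         # system drives under all conditions

def driver_required(level: SAELevel) -> bool:
    """Only at Level 5 is a human driver no longer necessary."""
    return level < SAELevel.FULL_AUTOMATION
```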
Lux Research: Toyota, Daimler, Honda leading auto OEMs for self-driving cars; innovative business models key
June 21, 2016
Although every major automotive company has announced plans for a car with self-driving capabilities, only five carmakers—Daimler, Honda, Hyundai, Toyota and Volvo—earn a positive rating in Lux Research’s analysis of OEMs’ autonomous vehicle efforts.
In an emerging scenario of few significant technical differentiations and near-ubiquitous systems capabilities, Lux Research evaluated 12 carmakers and offered a “positive” rating based on three key criteria: demonstrated capability, investment and partnerships.
NVIDIA deep learning software platform gets trio of big updates
June 20, 2016
NVIDIA announced three major updates for its deep learning software platform. NVIDIA DIGITS 4 introduces a new object detection workflow, enabling data scientists to train deep neural networks to find faces, pedestrians, traffic signs, vehicles and other objects in a sea of images. This workflow enables advanced deep learning solutions—such as tracking objects from satellite imagery, security and surveillance, advanced driver assistance systems and medical diagnostic screening.
When training a deep neural network, researchers must repeatedly tune various parameters to get high accuracy out of a trained model. DIGITS 4 can automatically train neural networks across a range of tuning parameters, significantly reducing the time required to arrive at the most accurate solution.
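The underlying idea, training across a grid of tuning parameters and keeping the most accurate model, can be sketched generically; DIGITS 4 automates this internally, and the `sweep` helper and toy accuracy function below are illustrative assumptions:

```python
from itertools import product

def sweep(train_fn, learning_rates, batch_sizes):
    """Train across a grid of tuning parameters and return the best
    (accuracy, learning_rate, batch_size) triple. train_fn is a
    user-supplied function returning validation accuracy."""
    best = None
    for lr, bs in product(learning_rates, batch_sizes):
        acc = train_fn(lr, bs)
        if best is None or acc > best[0]:
            best = (acc, lr, bs)
    return best

# Toy stand-in for a real training run: accuracy peaks at lr=0.01, bs=64.
toy = lambda lr, bs: 0.9 - abs(lr - 0.01) - abs(bs - 64) / 1000
best_acc, best_lr, best_bs = sweep(toy, [0.001, 0.01, 0.1], [32, 64, 128])
```

In practice the `train_fn` would be a full network-training run, so frameworks parallelize or prune the grid rather than exhausting it naively.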