Tesla leans on radar for Autopilot in Version 8 software
September 12, 2016
With the release of Version 8 of its software, Tesla has made many updates to its Autopilot function. The most significant change, however, is the new heavy reliance on the onboard radar. The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was originally intended only as a supplementary sensor to the primary camera and image processing system.
Now, however, Tesla is using more advanced signal processing to use the radar as a primary control sensor without requiring the camera to confirm visual image recognition. The changes come in the wake of the fatal crash (earlier post) in which Autopilot apparently did not detect the white side of the trailer against a brightly lit sky; nor did the driver. As a result, no brake was applied. In the aftermath of that incident, Tesla CEO Elon Musk tweeted that the company was “Working on using existing Tesla radar by itself (decoupled from camera) w temporal smoothing to create a coarse point cloud, like lidar”. (Earlier post.)
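The "coarse point cloud" idea in Musk's tweet can be illustrated with a minimal sketch. This is not Tesla's implementation; the frame format, window length, and grid cell size are assumptions for illustration. Radar returns from the last few frames are accumulated and snapped to a coarse 3-D grid:

```python
from collections import deque

def coarse_point_cloud(frames, window=5, cell=0.5):
    """Accumulate the last `window` radar frames and snap each return
    to a coarse 3-D grid cell, yielding a lidar-like point cloud."""
    recent = deque(frames[-window:], maxlen=window)  # temporal window
    cells = set()
    for frame in recent:
        for x, y, z in frame:  # each return is an (x, y, z) position in metres
            cells.add((round(x / cell), round(y / cell), round(z / cell)))
    return sorted(cells)
```

Returns that persist across frames reinforce the same cells, while single-frame noise produces isolated cells that downstream logic can discard.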
DENSO looks to increase holding in FUJITSU TEN, making it a group company
September 10, 2016
Auto parts supplier DENSO Corporation, Fujitsu Limited, and Toyota Motor Corporation have reached a basic agreement to start consideration of changing the capital structure of automotive electronics manufacturer FUJITSU TEN, in which the three companies have stakes. DENSO is part of the Toyota Group.
In the automotive field, the interface between the driver and vehicle is becoming increasingly important due to remarkable technological innovations. Against this backdrop, DENSO has agreed with Fujitsu and Toyota to review specific changes to make FUJITSU TEN a group company of DENSO and to enhance cooperation between the two companies in developing in-vehicle ECUs, millimeter-wave radar (earlier post), advanced driver assistance / automated driving technologies, and basic electronic technologies, among others.
LeddarTech launches LeddarVu, a new scalable platform towards high-resolution LiDAR; Vu8 solid-state LiDAR
September 07, 2016
LeddarTech, a developer of solid-state LiDAR technology (earlier post), introduced LeddarVu, a new platform for the next generation of its Leddar detection and ranging modules. The LeddarVu platform combines the benefits of a very compact, modular architecture with superior performance, robustness and cost efficiency towards high-resolution LiDAR applications, such as autonomous driving.
Leveraging LeddarTech’s advanced, patented signal processing and algorithms, LeddarVu sensors will evolve along with the future generations of the LeddarCore ICs. As previously announced with the company’s development roadmap, upcoming iterations of LeddarCore ICs are expected to deliver ranges reaching 250 m, fields of view up to 140°, and up to 480,000 points per second (with a resolution down to 0.25° both horizontally and vertically), enabling the design of affordable LiDARs for all levels of autonomous driving, including the capability of mapping the environment over 360° around the vehicle.
Quanergy acquires Otus People Tracker software from Raytheon BBN for advanced autonomous driving and security LiDAR applications
August 29, 2016
Quanergy Systems, Inc., the provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), has acquired the Otus People Tracker software from Raytheon BBN Technologies. The software complements Quanergy’s existing software portfolio and, when used with Quanergy’s LiDAR sensors, creates an integrated hardware and software solution for advanced people detection and tracking applications within the security and autonomous driving markets.
Otus (named after a genus of owls) uses advanced algorithms to identify and to track people for safety and security in crowded environments at ranges exceeding 100 meters when used with Quanergy LiDAR sensors. The system features segmentation techniques identifying humans; background extraction; object clustering; sophisticated merge and split algorithms; persistent tracking algorithms; and other advanced features supporting robust crowd control. Support for multiple zones of interest is included, allowing users fine control over active monitoring.
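The pipeline described above, clustering followed by persistent tracking, can be sketched in miniature. The greedy single-link clustering and nearest-neighbour gating below are generic stand-ins, not the Otus algorithms, and the `eps`/`gate` thresholds are illustrative:

```python
import math

def cluster(points, eps=0.8):
    """Greedy single-link clustering: a point joins the first cluster
    containing a neighbour closer than `eps` metres."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(math.dist(p, q) < eps for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def associate(tracks, centroids, gate=1.0):
    """Assign each new centroid to the nearest existing track within
    `gate` metres, or start a new track ID (persistent-tracking step)."""
    remaining = dict(tracks)
    next_id = max(tracks, default=-1) + 1
    updated = {}
    for c in centroids:
        best = min(remaining, key=lambda t: math.dist(remaining[t], c), default=None)
        if best is not None and math.dist(remaining[best], c) < gate:
            updated[best] = c          # same person, updated position
            del remaining[best]
        else:
            updated[next_id] = c       # new person enters the scene
            next_id += 1
    return updated
```

A real tracker would add motion prediction and the merge/split handling mentioned above; this sketch only shows why clustering and ID persistence are separate stages.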
Mobileye and Delphi to partner on SAE Level 4/5 automated driving solution for 2019
August 23, 2016
Mobileye and Delphi Automotive PLC are partnering to develop a complete SAE Level 4/5 automated driving solution. The program will result in an end-to-end production-intent fully automated vehicle solution, with the level of performance and functional safety required for rapid integration into diverse vehicle platforms for a range of customers worldwide.
The partners’ “Central Sensing Localization and Planning” (CSLP) platform will be demonstrated in combined urban and highway driving at the 2017 Consumer Electronics Show in Las Vegas and will be production-ready for 2019.
Solid-state LiDAR company Quanergy raises $90M in Series B; valuation passes $1B
Quanergy Systems, Inc., a leading provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), raised $90 million in Series B funding at a valuation well over $1 billion. Sensata Technologies, Delphi Automotive, Samsung Ventures, Motus Ventures and GP Capital participated in the round. This investment brings the company’s total funds raised to approximately $150 million.
Quanergy intends to use the investment and leverage its intellectual property to work with its partners in ramping up the production of its solid-state LiDAR sensors. These sensors use standard semiconductor manufacturing processes and have no moving parts on a macro scale or a micro scale, offering significantly lower cost, higher reliability, superior performance, increased capability, smaller size and lower weight when compared to traditional mechanical LiDAR sensors, sometimes called hybrid solid-state sensors.
TU Graz team uses monocrystalline Si as Li-ion anode; integrated micro batteries for on-board sensors
August 21, 2016
Electrochemists at TU Graz have used single crystalline acceptor-doped Si—as ubiquitously used in the semiconductor industry—as anode material for rechargeable Li-ion batteries. In an open access paper in the journal Scientific Reports, the team suggests that the use of such patterned monocrystalline Si (m-Si) anodes directly shaped out of the Si wafer is a highly attractive route to realize miniaturized, on-board fully integrated, power supplies for Si-based chips.
The microchip not only houses the electronics, but is at the same time an important part of a mini battery providing electrical energy, e.g. for sending and receiving information.
ABI Research: highly automated driving to spark adoption of centralized advanced driver assistance systems
August 17, 2016
As vehicles become highly independent and begin to drive and react to traffic on their own, autonomous systems will aggregate and process data from a variety of on-board sensors and connected infrastructure. This will force the industry to hit a hard reset on advanced driver assistance systems (ADAS) architectures, currently dominated by distributed processing and smart sensors.
Automotive OEMs will need to adopt new platforms based on powerful, centralized processors and high-speed low latency networking (e.g., Audi zFAS, earlier post). ABI Research forecasts 13 million vehicles with centralized ADAS platforms will ship in 2025.
Ford and Baidu invest $150M in Velodyne LiDAR
August 16, 2016
Velodyne LiDAR, Inc., a global leader in LiDAR (Light Detection and Ranging) technology, announced the completion of a combined $150 million investment from co-investors Ford Motor Company and China’s leading search engine company Baidu, Inc. The investment will allow Velodyne to rapidly expand the design and production of high-performance, cost-effective automotive LiDAR sensors, accelerating mass adoption in autonomous vehicle and ADAS applications and therefore accelerating the critical, transformative benefits they provide.
Over the last decade, Velodyne developed four generations of hybrid solid-state LiDAR systems incorporating the company’s proprietary software and algorithms that interpret rich data gathered from the environment via highly accurate laser-based sensors to create high-resolution 3D digital images used for mapping, localization, object identification and collision avoidance.
ZF and ibeo to develop new 3D LiDAR technology; ZF takes 40% stake
August 02, 2016
ZF has acquired a 40% stake in ibeo Automotive Systems GmbH. The Hamburg-based company, which was founded in 2009, is a developer of LiDAR technology and environmental recognition software with a particular focus on applications for autonomous driving (earlier post). Ibeo’s customers include several major global vehicle manufacturers.
The LiDAR generation being developed by ibeo in cooperation with ZF will reproduce a three-dimensional image of the environment without the rotating mirrors contained in current LiDAR systems. Using solid state technology, LiDAR technology will become more compact and easier to integrate into the vehicle.
Valeo Cruise4U car sets off on 13,000-mile partially automated drive across US
Valeo’s Cruise4U partially automated car, previously demonstrated at CES in Las Vegas in January 2016, set off from San Francisco on a 13,000-mile road trip around the US that is scheduled to conclude back in San Francisco on 15 September.
In partially automated driving mode, the trip will include stops in Los Angeles, Las Vegas, Seattle, Chicago, Detroit, Boston, New York, Miami, San Antonio and San Diego. The vehicle will travel both day and night in real traffic conditions.
Audi AG developing automotive driver health as new business area; leveraging digitalization, connected vehicles
August 01, 2016
Audi AG has become a founding partner in Berlin’s “Flying Health Incubator”, a center supporting startups that develop digital innovations in the healthcare sector. The investment highlights Audi’s interest in developing “automotive health”—enhancing the customer’s health and fitness while driving—as a new business area. With the Audi Fit Driver offering, the brand is already testing innovative services and functionalities in this field.
In the Flying Health Incubator, Audi AG is entering into dialog with decision-makers from the startup scene and from the healthcare industry. Together, the partners will strive to identify trends, technical solutions and business models in the digital health market at an early stage.
Ford, MIT project uses LiDAR, cameras to measure pedestrian traffic & predict demand for new, on-demand electric shuttles
July 27, 2016
Ford Motor Company and MIT are collaborating on a new research project that measures how pedestrians move in urban areas to improve certain public transportation services, such as ride-hailing and point-to-point shuttles services.
The project will introduce a fleet of on-demand electric vehicle shuttles that operate on both city roads and campus walkways on the university’s Cambridge, Massachusetts, campus. The vehicles use LiDAR sensors and cameras to measure pedestrian flow, which ultimately helps predict demand for the shuttles. This, in turn, helps researchers and drivers route shuttles toward areas with the highest demand to better accommodate riders.
NTSB issues preliminary report for investigation into Tesla Autopilot fatal crash
The US National Transportation Safety Board issued its preliminary report for the investigation of the fatal 7 May 2016 highway crash in Florida involving the Tesla Model S and Autopilot. The preliminary report does not contain any analysis of data and does not state probable cause for the crash.
The preliminary report details the collision involving a 53-foot semitrailer in combination with a 2014 Freightliner Cascadia truck tractor and the 2015 Tesla Model S. According to system performance data downloaded from the car, the indicated vehicle speed was 74 mph (119 km/h) just prior to impact; the posted speed limit was 65 mph (105 km/h).
NIRA Dynamics, InfoCar expand availability of Road Surface Information software with OBD plug-in
July 25, 2016
Sweden-based NIRA Dynamics, a software company developing sensor-fusion-based systems for different vehicle applications, is rolling out its Road Surface Information (RSI) software more broadly, in partnership with InfoCar AB.
Road Surface Information (RSI) by NIRA continuously monitors the quality and tire grip level of the road surface—without stereo cameras, adaptive suspension or other expensive sensors. With sensor-fusion-based algorithms, RSI can determine the level of road roughness and road friction.
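As a toy illustration of the principle (RSI fuses several existing vehicle signals; this single-signal variance proxy is an assumption for the sketch), road roughness can be proxied by how strongly a wheel-speed trace fluctuates about its mean:

```python
def roughness_index(wheel_speed):
    """Variance of a wheel-speed trace about its mean: rough surfaces
    excite larger high-frequency fluctuations than smooth ones."""
    mean = sum(wheel_speed) / len(wheel_speed)
    return sum((v - mean) ** 2 for v in wheel_speed) / len(wheel_speed)
```

Comparing the index for a smooth stretch against a rough one would show a clear separation, which is the kind of signal a fusion algorithm can then map to roughness and friction classes.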
Oxbotica launches Selenium mobile autonomy software
July 23, 2016
Oxbotica, a spin-out from Oxford University’s Mobile Robotics Group, launched its new Selenium mobile autonomy software solution with a purpose-built concept vehicle named Geni.
Selenium can work in pedestrianized environments as well as roads and motorways, and is not reliant on GPS to operate—i.e., it can easily transition between indoor and outdoor settings, over ground or underground. The system has been developed to be “vehicle agnostic”—it can be applied to cars, self-driving pods (e.g. for campuses and airports), and warehouse truck fleets.
Ford takes stake in Civil Maps; 3D mapping technologies for fully autonomous vehicles; AI and voxel hashing
July 16, 2016
Civil Maps, a start-up developing 3D mapping technology for fully autonomous vehicles, raised a $6.6-million seed funding round, led by Motus Ventures and including investment from Ford Motor Company, Wicklow Capital, StartX Stanford and Yahoo cofounder Jerry Yang’s AME Cloud Ventures.
Civil Maps’ mission is to make it possible for fully autonomous vehicles (SAE Levels 4-5) to drive anywhere smoothly and safely. The company’s focus is on building continental-scale maps for autonomous vehicles and providing precise localization using voxel-hashing algorithms. Although GPS and IMU (inertial measurement unit) technologies can in theory determine both position and orientation of vehicles, the accuracy is limited by atmospheric distortion, the start-up notes.
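Voxel hashing itself is a standard technique: quantize each 3-D point to a voxel index, then hash that index into a flat table so only occupied voxels consume memory. A minimal sketch follows; the 0.2 m cell size and the large-prime hash constants are conventional choices, not Civil Maps’ parameters:

```python
import math

def voxel_key(x, y, z, voxel=0.2):
    """Quantize a 3-D point (metres) to integer voxel indices."""
    return (math.floor(x / voxel), math.floor(y / voxel), math.floor(z / voxel))

def voxel_hash(key, table_size=2**20):
    """Spatial hash of a voxel index using large primes, a common
    choice for scattering 3-D keys across a flat table."""
    i, j, k = key
    return ((i * 73856093) ^ (j * 19349663) ^ (k * 83492791)) % table_size

# sparse map: only voxels that actually contain map points get an entry
occupied = {voxel_key(*p): p for p in [(1.23, 4.56, 0.10), (1.25, 4.57, 0.11)]}
```

Because nearby points collapse into the same key, a continental-scale map stays sparse: empty space costs nothing, and lookup of the voxel around a live sensor return is a constant-time hash probe.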
TTTech using VectorCAST platform for development of Audi zFAS to ISO 26262 ASIL-D compliance; domain controller for piloted driving
July 15, 2016
TTTech Computertechnik AG (TTTech) has selected Vector Software’s VectorCAST software test automation platform for use within TTTech’s development of Audi’s zFAS (zentrales Fahrerassistenzsteuergerät)—the domain controller for Audi piloted driving systems. (Earlier post.) VectorCAST provides TTTech with the tools necessary to ensure ISO 26262 compliance up to ASIL D level on all microcontrollers used in zFAS.
Under the guidance of Audi AG, TTTech developed the zFAS electronic control unit (ECU) that integrates various functionalities of advanced driver assistance systems (ADAS). The ECU uses numerous technology components from TTTech for various automotive assistance functions, such as piloted parking or autonomous driving.
Jaguar Land Rover demonstrates all-terrain self-driving technology; off-road connected convoy
July 12, 2016
Jaguar Land Rover has demonstrated a range of innovative research technologies that would allow a future autonomous car to drive itself over any surface or terrain.
The multi-million pound autonomous all-terrain driving research project aims to make the self-driving car viable in the widest range of real life, on- and off-road driving environments and weather conditions. To enable this level of autonomous all-terrain capability, Jaguar Land Rover’s researchers are developing next-generation sensing technologies that will be the eyes of the future autonomous car.
Neos and Lockheed Martin to develop enhanced next-gen airborne gravity gradiometer to advance ability to find oil, gas & minerals
July 06, 2016
In partnership with Lockheed Martin, Neos Inc. will develop a new generation sensor to be used to find oil, gas and minerals beneath the earth’s surface from the air. The new Full Tensor Gradiometry (FTG) Plus technology has 20 times the sensitivity and 10 times the bandwidth of current gravity gradiometers, according to Neos.
Gravity gradiometers have been commercially used for more than 20 years and militarily longer than that. The technology is based on the principle that earth’s gravity field varies with location, local topography and sub-surface geologic features. Measuring the gravity variation caused by items beneath the earth’s surface can help identify unique underground and undersea geologic structures. The new airborne FTG Plus sensor is so advanced it could find a 10-meter tall hill buried one kilometer below the earth’s surface.
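A point-mass back-of-the-envelope calculation gives a feel for how small the signal from such a buried hill is; the volume and density contrast below are assumed values for illustration:

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def point_mass_anomaly(volume_m3, density_contrast, depth_m):
    """Vertical gravity anomaly directly above a buried body,
    treating it as a point mass at the given depth (m/s^2)."""
    return G * volume_m3 * density_contrast / depth_m ** 2

# 10 m hill approximated as a 10 m cube with a 2000 kg/m^3 density
# contrast, buried 1 km deep
anomaly = point_mass_anomaly(10 ** 3, 2000.0, 1000.0)  # ~1.3e-10 m/s^2
```

That is on the order of a hundredth of a microGal, which illustrates why sensitivity and bandwidth are the figures of merit Neos cites.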
BMW Group, Intel and Mobileye partner on open platform to bring fully autonomous driving to market by 2021
July 01, 2016
BMW Group, Intel, and Mobileye are collaborating to bring solutions for highly and fully automated driving into series production by 2021. The three are creating a standards-based open platform—from door locks to the datacenter—for the next generation of cars.
The goal of the collaboration is to develop future-proofed solutions that enable drivers not only to take their hands off the steering wheel, but to reach the “eyes off” (level 3) and ultimately the “mind off” (level 4) level, transforming the driver’s in-car time into leisure or work time.
NHTSA begins preliminary evaluation of Tesla Model S Autopilot fatality
National Highway Traffic Safety Administration’s (NHTSA) Office of Defects Investigation (ODI) has begun a preliminary evaluation of a fatal highway crash involving a 2015 Tesla Model S operating with Autopilot activated. ODI is opening the preliminary evaluation (PE16007) to examine the design and performance of any automated driving systems in use at the time of the crash.
In a blog post, Tesla Motors was quick to point out that this is the first known fatality in more than 130 million miles driven with Autopilot activated. Tesla also pointed out that among all vehicles in the US, there is a fatality every 94 million miles; worldwide, there is a fatality approximately every 60 million miles.
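Tesla’s comparison normalizes a single event against fleet mileage. Per 100 million miles, the figures quoted work out as follows (note that a single event means the Autopilot rate carries large statistical uncertainty):

```python
def fatalities_per_100m_miles(fatalities, miles):
    """Normalize a fatality count to the per-100-million-mile rate
    commonly used in road-safety statistics."""
    return fatalities / miles * 100e6

autopilot = fatalities_per_100m_miles(1, 130e6)  # ~0.77
us_fleet = fatalities_per_100m_miles(1, 94e6)    # ~1.06
worldwide = fatalities_per_100m_miles(1, 60e6)   # ~1.67
```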
Renesas Electronics develops two-port on-chip SRAM for improved video processing for autonomous vehicles
June 16, 2016
Renesas Electronics has developed a new two-port on-chip Static Random Access Memory (SRAM) for use in system-on-chips (SoCs) for in-vehicle infotainment systems. The new on-chip SRAM will be used as video processing buffer memory in high-performance SoCs that will play an important role in making the autonomous-driving vehicles of the future safer and more reliable.
The new SRAM is optimized for parallel processing of video data and will enable advanced video data processing such as obstacle recognition utilizing real-time processing of high-resolution vehicle camera videos and augmented reality (AR) display on the windshield.
Valeo to offer new low-cost solid-state LiDAR; co-developed with LeddarTech; mass production in 2018
June 01, 2016
Tier 1 supplier Valeo is adding a new low-cost solid-state LiDAR, developed together with LeddarTech, a specialist in advanced detection and ranging solutions, to its portfolio for driving and parking assistance. The new sensor will be ready for mass production in 2018.
The solid-state LiDAR will have no mechanical moving parts and will be the least expensive LiDAR sensor on the market, Valeo said. With a proprietary receiver ASIC with 16 discrete detection segments, the sensor will provide best-in-class sensing performance, able to detect pedestrians, bicycles, motorcycles or cars that are only partly in the same lane.
RoMulus project: developing intelligent multi-sensor systems for Industry 4.0
May 29, 2016
The RoMulus (Robust multi-sensor technology for status monitoring in Industry 4.0 applications) research project, sponsored by the German Federal Ministry of Education and Research (BMBF), is focused on simplifying and accelerating the development and use of intelligent multi-sensor systems for Industry 4.0: the digitalization of production processes based on devices autonomously communicating with each other along the value chain. (Earlier post.)
Multi-sensor systems are key components for the success of Industry 4.0 applications. They record, process, and transmit a number of measurement parameters, such as pressure, acceleration, and temperature, all in a highly compact space. Machines are not the only ones to receive such sensors; workpieces are also increasingly being fitted with the intelligent sensor systems so that each product can provide its blueprint and report its manufacturing status. Based on this information, production is largely able to organize and monitor itself.
Audi A7 piloted driving concept “Jack” now driving more naturally
May 13, 2016
Audi’s latest version of its piloted driving research car, the Audi A7 concept “Jack,” has not only learned how to autonomously perform all of its driving maneuvers on the expressway, it has also learned how to show consideration for other road users. Jack exhibits a driving style that is adaptive to the given situation, safe and especially interactive, Audi says.
“Jack”, the internal nickname for the Audi A7 piloted driving concept technology platform, now passes trucks with a slightly wider lateral gap. It also signals upcoming lane changes by activating the turn signal and moving closer to the lane marking first—just as human drivers would do to indicate their intentions.
Ford expands Smart Mobility pilot program to deliver improved access to healthcare in The Gambia; motorcycles with sensors
Pregnant women, children and those with medical conditions in The Gambia—one of Africa’s smallest, poorest countries—may have better access to healthcare through an expansion of a Ford Smart Mobility pilot program. Ford has equipped 50 motorcycles serving Riders for Health with sensor technology so the medical services group can collect a variety of data, including mapping coordinates, to improve the delivery of medical services and supplies—particularly in remote areas of the West African country.
The project uses Ford’s OpenXC (earlier post) sensor kits fitted to the motorcycles to gather information. OpenXC technology records every trip, and is accessed via an application on a mobile phone provided by Ford.
Bosch’s new electronic driver assistance system for trams adds collision warning with automatic braking; derived from automotive
May 03, 2016
Bosch has developed a new electronic driver assistance system for trams that not only warns tram drivers of an impending collision but engages the brakes independently to stop the tram and avoid an accident if the driver reacts too late or not at all.
Bosch Engineering successfully adapted the company’s large-scale automotive production technology for its new and enhanced collision warning system for city rail transportation. The new collision warning system combines a video sensor, a radar sensor, and a high-performance rail control unit.
SwRI to showcase Ranger precision localization technology for automated driving; non-GPS system with 2cm precision
April 27, 2016
Southwest Research Institute (SwRI) will showcase its award-winning Ranger precision localization solution at the AUVSI XPONENTIAL 2016 conference and trade show in New Orleans 2-5 May.
Ranger is a patented approach to vehicle localization that enables precise navigation for automated vehicles using commercially available hardware in combination with SwRI algorithms. The latest Ranger kit can be used for automated driving, valet parking in garages and structures, freight distribution, and docking of buses and large trucks.
Velodyne LiDAR introduces 32-channel ULTRA Puck VLP-32A high definition real-time 3D LiDAR
April 13, 2016
Velodyne LiDAR introduced the ULTRA Puck VLP-32A, combining best-in-class 32-channel performance with a small form factor and high reliability, at the 2016 SAE World Congress in Detroit. The ULTRA Puck VLP-32A is the company’s most advanced LiDAR sensor to date, delivering high performance at a cost-effective price point of around $500 at automotive-scale production.
The ULTRA Puck doubles the range and resolution (via number of laser channels) of its predecessor to 200 meters and 32 channels, providing enhanced resolution to identify objects easily. The 32 channels in the ULTRA Puck are deployed over a vertical field of view of 28° and are configured in a unique pattern to provide improved resolution near the horizon, making the sensor even more useful for automotive applications. By contrast, the earlier unit used equidistant 2° spacing of the channels.
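For comparison, a uniform layout of 32 channels across the same 28° field of view would space the beams roughly 0.9° apart, versus the predecessor’s fixed 2°. The sketch below computes only the equidistant baseline; the ULTRA Puck’s actual non-uniform pattern is not public:

```python
def equidistant_channels(n, fov_deg):
    """Elevation angles for `n` beams spread evenly over a vertical FOV,
    centred on the horizon (0 degrees)."""
    step = fov_deg / (n - 1)
    return [-fov_deg / 2 + i * step for i in range(n)]

angles = equidistant_channels(32, 28.0)
spacing = angles[1] - angles[0]  # ~0.90 degrees between adjacent beams
```

Concentrating channels near 0° trades this uniform spacing for finer sampling where distant vehicles and pedestrians appear.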
Ford tests Fusion Hybrid autonomous research vehicles driving in complete darkness
April 11, 2016
As part of its LiDAR sensor development, Ford has tested Fusion Hybrid autonomous research vehicles in complete darkness without headlights on desert roads, demonstrating the capability to perform beyond the limits of human drivers.
Driving in pitch black at Ford Arizona Proving Ground marks the next step in the company’s efforts to deliver fully autonomous vehicles. The development shows that even without cameras, which rely on light, Ford’s LiDAR (units from Velodyne), working with the car’s virtual driver software, is robust enough to steer flawlessly around winding roads. While it’s ideal to have all three modes of sensors—radar, cameras and LiDAR—the latter can function independently on roads without stoplights.
DENSO invests in semiconductor laser technology startup TriLumina; speeding up LiDAR adoption for ADAS, autonomous driving
April 08, 2016
DENSO International America, Inc. has invested in TriLumina Corp., a semiconductor laser technology company that focuses on providing light sources for LiDAR and interior illumination products. DENSO is looking to speed up the adoption of LiDAR and driver monitoring technologies in advanced driver assistance systems (ADAS) and in autonomous vehicles. This strategic investment will enable TriLumina to gain broader access to the automotive market. The laser technology company also received an investment last year from Caterpillar Ventures.
TriLumina has developed eye-safe semiconductor lasers that are among the most versatile laser illuminator solutions available in the market. TriLumina is hoping to accelerate the automotive industry’s adoption of semi-autonomous and autonomous vehicles by providing lasers for 100% solid-state LiDAR products and advanced driver monitoring systems (DMS).
2017 Ford Fusion offers adaptive cruise control with automatic stop-and-go technology
April 05, 2016
The 2017 Ford Fusion offers a new stop-and-go technology—piggybacking on the existing adaptive cruise control feature—which automatically accelerates and brakes for the driver while maintaining a safe distance from the vehicle ahead.
Using dedicated steering wheel buttons, adaptive cruise control with stop-and-go allows drivers to set cruise control speed and following distance from the vehicle ahead. The semi-autonomous technology can automatically adjust the set speed for comfortable travel—much like a human driver would—bringing the car to a full stop when traffic halts.
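The control logic can be sketched as taking the more conservative of two commands: cruise toward the set speed, or keep the time-gap behind the lead vehicle. This is a toy proportional law with illustrative gains, not Ford’s controller:

```python
def acc_command(set_speed, own_speed, gap, lead_speed,
                time_gap=1.8, kp_gap=0.4, kp_speed=0.8):
    """Toy adaptive-cruise law: returns an acceleration command (m/s^2).
    Speeds in m/s, gap in m; gains and time gap are assumed values."""
    desired_gap = max(5.0, time_gap * own_speed)  # keep 5 m at standstill
    # candidate commands: hold the set speed vs. follow the lead vehicle
    cruise = kp_speed * (set_speed - own_speed)
    follow = kp_gap * (gap - desired_gap) + kp_speed * (lead_speed - own_speed)
    return min(cruise, follow)  # take the more conservative of the two
```

When traffic halts ahead, the follow term dominates and the command goes negative (braking to a stop); on an open road the cruise term governs and the car holds the set speed.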
Saarbrücken engineers developing networked self-analyzing electric motors
March 23, 2016
Engineers from Saarland University are developing intelligent motor systems that function without the need for additional sensors. By essentially transforming the motor itself into a sensor, the team led by Professor Matthias Nienhaus is creating smart motors that can tell whether they are still running smoothly, can communicate and interact with other motors and can be efficiently controlled.
By using data collected from the motor while it is operating, the researchers are able to calculate quantities that in other systems would need to be measured by additional sensors. Further, they are teaching the drive how to make use of this knowledge.
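As a simple example of computing a quantity from signals the motor already provides (a steady-state DC-motor model used for illustration, not the Saarland team’s method), rotor speed can be estimated from voltage and current alone via the back-EMF:

```python
def estimated_speed(voltage_v, current_a, resistance_ohm, ke):
    """Sensorless speed estimate for a DC motor: back-EMF = V - I*R,
    and speed is back-EMF divided by the voltage constant Ke (V·s/rad)."""
    back_emf = voltage_v - current_a * resistance_ohm
    return back_emf / ke  # rad/s
```

For example, with assumed parameters, `estimated_speed(12.0, 2.0, 0.5, 0.05)` gives 220 rad/s (about 2100 rpm), with no shaft encoder involved; deviations between such estimates and expected values are exactly the kind of signal a self-diagnosing drive can exploit.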
Daimler demonstrates autonomous truck platooning; Highway Pilot Connect delivers ~7% lower fuel consumption
March 22, 2016
Daimler Trucks demonstrated the new Highway Pilot Connect system for autonomous truck platooning on the A52 autobahn near Düsseldorf. Three WiFi-connected, autonomously driving trucks operated on the autobahn with authorization for public traffic in a platoon formation.
Such a combination can reduce fuel consumption by up to 7% and the road space requirement on motorways by almost half, while improving traffic safety at the same time, Daimler said. Based on the Daimler Trucks Highway Pilot system for autonomously driving heavy trucks (earlier post), the three trucks link up to form an aerodynamically optimized, fully automated platoon.
Munich Re America launches transit bus collision avoidance pilot in Washington with Mobileye Shield+ system
March 17, 2016
Munich Reinsurance America, one of the largest reinsurers in the US, in collaboration with the Washington State Transit Insurance Pool (WSTIP), has launched a pilot program equipping transit buses with the award-winning collision avoidance system Mobileye Shield+. Rosco Vision Systems is the official North American provider and driver-interface manufacturer of this system.
Mobileye is a technology leader in the area of software algorithms, system-on-chips and customer applications that are based on processing visual information for the market of driver assistance systems (DAS). Shield+, designed for large vehicles operating in urban environments, enables early detection of cyclists and pedestrians by using an array of strategically placed artificial vision smart cameras.
20 automakers commit to make automatic emergency braking standard on new vehicles no later than 2022; faster than regulatory process
The US Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) and the Insurance Institute for Highway Safety (IIHS) announced the commitment by 20 automakers representing more than 99% of the US auto market to make automatic emergency braking (AEB) a standard feature on virtually all new cars in the US no later than NHTSA’s 2022 reporting year, which begins 1 Sept 2022.
Automakers making the commitment are Audi; BMW; FCA US LLC; Ford; General Motors; Honda; Hyundai; Jaguar Land Rover; Kia; Maserati; Mazda; Mercedes-Benz; Mitsubishi Motors; Nissan; Porsche; Subaru; Tesla Motors; Toyota; Volkswagen; and Volvo Car USA. The unprecedented commitment means that this important safety technology will be available to more consumers more quickly than would be possible through the regulatory process.
Honda R&D using IBM Watson IoT technology for real-time monitoring and data analysis in F1 racers
Honda R&D is monitoring and analyzing data from more than 160 sensors in Formula One (F1) cars using IBM Watson Internet of Things (IoT) technology. Drivers and crews can apply data and analytics in real-time to help streamline performance and improve fuel efficiency, enabling drivers to make real-time racing decisions based on this data, such as speed adjustments and pit stops.
To help mark its return to Formula One racing and reach new milestones in efficiency for both race cars and future consumer models, Honda R&D developed a new system to analyze data from the hybrid power units quickly and efficiently to check residual fuel levels and estimate the possibility of mechanical problems. Honda is using the IBM IoT for Automotive solution, based on IBM Watson IoT technology, to deliver data generated from cars, including temperature, pressure and power levels, directly to the cloud for real-time analysis.
New Buick LaCrosse upgrades computing power from 17 to 31 ECUs; new electronic control system
March 16, 2016
The all-new Buick LaCrosse, which launches this week in China, features significant upgrades in computing power and networking to advance connectivity and safety features.
There are 31 ECUs distributed throughout the all-new Buick LaCrosse—its predecessor used only 17. This 82% increase in the number of ECUs helps to optimize computing efficiency. To facilitate the handling of large quantities of data, a dedicated data bus connects the ECUs, each of which can process data independently.
Continental acquires Hi-Res 3D Flash LIDAR business from ASC; highly or fully automated driving
March 03, 2016
International automotive supplier Continental has acquired the Hi-Res 3D Flash LIDAR business from Advanced Scientific Concepts, Inc. (ASC) based in Santa Barbara, California. The technology will further enhance the company’s Advanced Driver Assistance Systems (ADAS) product portfolio with a future-oriented solution, adding to the suite of surround sensors needed to achieve highly and fully automated driving.
The Hi-Res 3D Flash LIDAR sensor technology provides both real-time machine vision as well as environmental mapping functions. This technology will help to enable a significantly more detailed and accurate field of vision around the entire vehicle, independent of day or night time and robust in adverse weather conditions.
New algorithm improves speed and accuracy of pedestrian detection; cascade detection + deep learning
February 08, 2016
Researchers at the University of California, San Diego have developed a pedestrian detection system that performs in near real-time (2-4 frames per second) and with higher accuracy (close to half the error) compared to existing systems. The technology, which incorporates deep learning models, could be used in “smart” vehicles, robotics and image and video search systems.
The new pedestrian detection algorithm developed by Nuno Vasconcelos, electrical engineering professor at the UC San Diego Jacobs School of Engineering, and his team combines a traditional computer vision classification architecture—cascade detection—with deep learning models.
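The core idea of cascade detection is that cheap classifiers reject most candidate image windows early, so the expensive deep model only scores the few survivors. A minimal sketch of that control flow in Python (the stage functions here are illustrative placeholders, not the UCSD team's actual classifiers):

```python
def cheap_stage(window):
    # Fast hand-crafted test (stand-in for an early cascade stage):
    # reject windows whose mean intensity is too low to plausibly
    # contain a pedestrian.
    return sum(window) / len(window) > 0.5

def deep_stage(window):
    # Expensive scorer standing in for a deep CNN; a fixed weighted
    # sum is used here purely as a placeholder.
    weights = [0.2, 0.3, 0.5, 0.1]
    score = sum(w * v for w, v in zip(weights, window))
    return score > 0.4

def detect(windows):
    # Cascade detection: run the cheap test on every window, then
    # invoke the "deep" stage only on the survivors.
    survivors = [w for w in windows if cheap_stage(w)]
    return [w for w in survivors if deep_stage(w)]

# Only the bright window survives both stages.
windows = [[0.9, 0.9, 0.9, 0.9], [0.1, 0.1, 0.1, 0.1]]
print(detect(windows))  # -> [[0.9, 0.9, 0.9, 0.9]]
```

Because the cheap stage filters the vast majority of windows, the deep model runs on only a small fraction of them, which is what makes near real-time rates achievable.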
Renesas Electronics and TTTech collaborate on new ADAS-ECU development platform with high computing performance and advanced functional safety
Renesas Electronics Corporation and TTTech Computertechnik AG, a global leader in robust networking and safety controls, have agreed to collaborate on the development of a new automotive platform solution aimed at providing a future-proof, high-performance advanced electronic control unit (ECU) development platform for advanced driver assistance systems (ADAS) and automated driving functionality.
The automotive platform solution will integrate Renesas’ automotive control microcontroller (MCU), the RH850/P1x, and high-performance R-Car system-on-chips (SoCs) with TTTech’s TTIntegration software platform to enable highly complex automotive solutions, including highly automated driving. In addition to the physical integration, the development platform supports parallel, multi-vendor development and integration of individual software components.
Renesas camera video processing circuit block with low latency, high performance, and low power consumption for SOC for autonomous driving
At the International Solid-State Circuits Conference (ISSCC) held in San Francisco earlier this month, Renesas Electronics announced the development of a new video processing circuit block for use in automotive computing system-on-chips (SoCs) that will support autonomous vehicles.
The newly developed video processing circuit block will enable automotive computing systems that integrate vehicle information systems and driving safety support systems, performing massive video processing with real-time performance, low power consumption, and low latency without imposing additional load on the CPU or GPU. Renesas intends to incorporate the new block into its future automotive computing SoCs to contribute to a safer and more convenient driving experience.
Continental urea sensors for efficient SCR NOx aftertreatment in diesels; measuring level, quality and temperature
February 05, 2016
Continental has begun production of urea sensors for the first time to support more efficient exhaust-gas aftertreatment in diesel engines. The sensor measures the level, quality, and temperature of the aqueous urea solution in the “AdBlue” tank used in conjunction with selective catalytic reduction (SCR) for NOx reduction.
The sensor-aided denitrification supports fulfillment of the legal requirements and reinforces drivers’ trust that their car emits no more than the maximum permissible level of nitrogen oxides.
New QNX software platform enables ADAS and automated driving
January 23, 2016
QNX Software Systems Limited, a subsidiary of BlackBerry Limited, earlier this month introduced the QNX Platform for ADAS (advanced driver assistance systems), expanding its portfolio of automotive software products. The QNX Platform for ADAS is scheduled for general release in Q2 2016.
Designed for scalability, the platform will enable automotive companies to build a full range of automated driving systems, from informational ADAS modules that provide a 360° surround view of the vehicle, to sensor fusion systems that combine data from multiple sources such as cameras and radar, to high-performance processors that make control decisions in fully autonomous vehicles.
Ricardo white paper outlines needed developments to realize autonomous driving
January 14, 2016
Engineering firm Ricardo has published a white paper—Key Enablers for the Fully Autonomous Vehicle—highlighting the technologies and development processes that are needed to develop commercially feasible self-driving cars that meet consumer expectations while also achieving compliance with likely future transport regulations.
According to the Boston Consulting Group, the projected size of the global autonomous vehicle market in 2025 will be $36 billion for partially autonomous vehicles (levels 1–3) and $6 billion for fully autonomous vehicles (level 4). This includes both passenger and commercial vehicle uses. The realization of fully autonomous vehicles will require further evolution in software, sensors, integration and efficient system testing beyond what is in place for current advanced driver assistance systems.
Ford, U Michigan collaborating on enablers for autonomous driving in the snow; high-resolution 3D mapping
January 12, 2016
Typical autonomous vehicle sensors are useless on snow-covered roads, but researchers at the University of Michigan and Ford are collaborating on a solution. In Michigan and on U-M’s 32-acre Mcity simulated urban environment, they have conducted what they believe are the industry’s first tests of autonomous vehicles in wintry conditions.
Fully autonomous driving can’t rely on GPS, which is accurate only to several yards—not enough to localize or identify the position of the vehicle. It is essential for an autonomous vehicle to know its precise location, not just within a city or on a road, but in its actual driving lane—a variation of a few inches makes a big difference.
Kia Motors introduces new DRIVE WISE sub-brand for advanced driver assistance and autonomous driving technologies
January 06, 2016
At CES 2016, Kia Motors launched a new sub-brand—DRIVE WISE—to encompass its future Advanced Driver Assistance Systems (ADAS). Kia recently announced plans to manufacture partially-autonomous cars by 2020, and aims to bring its first fully-autonomous vehicle to market by 2030. (Earlier post.)
A preliminary $2-billion investment by Kia by 2018 is intended to fast-track development of the new DRIVE WISE technologies. The state of Nevada recently granted Kia a special license to test the new technologies on public roads. Kia’s battery-electric Soul EV is acting as the brand’s testbed for the development of next-generation DRIVE WISE technologies.
NVIDIA introduces DRIVE PX 2 platform for autonomous driving
At CES 2016, NVIDIA introduced NVIDIA DRIVE PX 2—a high-performance computing platform for in-vehicle artificial intelligence applied to the complexities inherent in autonomous driving. DRIVE PX 2 utilizes deep learning on NVIDIA’s most advanced GPUs for 360-degree situational awareness around the car, to determine precisely where the car is and to compute a safe, comfortable trajectory.
DRIVE PX 2—which delivers processing power equivalent to 150 MacBook Pros—uses two next-generation Tegra processors plus two next-generation discrete GPUs, based on the Pascal architecture, to deliver up to 24 trillion deep learning operations per second: specialized instructions that accelerate the math used in deep learning network inference. That is more than 10 times the computational horsepower of the previous-generation DRIVE PX.
Ford tripling autonomous vehicle development fleet; Gen 3 autonomous dev platform; new Velodyne LiDAR
January 05, 2016
Ford is tripling its fleet of fully autonomous Ford Fusion Hybrid test vehicles—making it the largest in the automotive industry, according to the company—and will use a new-generation sensor technology as the company further accelerates its autonomous vehicle development plans.
This year, Ford will add 20 Fusion Hybrid autonomous vehicles, bringing the company’s autonomous fleet to about 30 vehicles being tested on roads in California, Arizona and Michigan. The newest vehicles are on Ford’s third-generation autonomous vehicle development platform, built using Fusion Hybrid sedans, similar to the second-generation platform.
Toyota developing new cloud-based high-precision map generation system using data from production vehicles
December 22, 2015
To aid the safe implementation of automated driving, Toyota is developing a high-precision map generation system that will use data from on-board cameras and GPS devices installed in production vehicles. The new system will go on display at CES (Consumer Electronics Show) 2016 in Las Vegas next month.
Toyota’s new system uses camera-equipped production vehicles to gather road images and vehicle positional information. This information is sent to data centers, where it is automatically pieced together, corrected and updated to generate high precision road maps that cover a wide area.
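One reason probe data from many production vehicles helps is statistical: averaging independent, noisy position fixes of the same road feature shrinks the error as observations accumulate. The sketch below illustrates that principle only; it is a hypothetical simplification, not Toyota's actual correction pipeline:

```python
def fuse_fixes(fixes):
    # fixes: (lat, lon) observations of the same road feature
    # reported by many probe vehicles. Averaging independent,
    # roughly zero-mean position errors reduces the residual error
    # approximately with the square root of the observation count.
    n = len(fixes)
    lat = sum(f[0] for f in fixes) / n
    lon = sum(f[1] for f in fixes) / n
    return (lat, lon)

# Two fixes with symmetric noise around the true position (35.0, 139.0).
print(fuse_fixes([(35.001, 139.0), (34.999, 139.0)]))
```

A production system would additionally align camera-derived landmarks across traces before averaging, but the aggregation step rests on the same idea.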
2nd annual Bosch analysis of ADAS finds emergency braking and lane assist systems on the rise
December 21, 2015
An analysis of 2014 new car registrations in Germany reveals that driver assistance systems play an increasingly important role in car-purchase decisions. The importance of lane assist and automatic emergency braking systems in particular has grown significantly, according to the analysis by Bosch—the company’s second annual assessment of adoption of the technology.
According to the evaluation, based on the 2014 registration statistics, one in five of the nearly three million passenger cars newly registered in Germany last year was equipped with such systems. By comparison, the evaluation for 2013 found the two assistance systems in only one in ten new cars.
Ford to begin autonomous vehicle testing on California roads; Silicon Valley Lab accelerates Smart Mobility Plan
December 16, 2015
Fully autonomous Ford Fusion Hybrid sedans will begin testing on California streets next year, as Ford Research and Innovation Center Palo Alto continues growing.
Ford is officially enrolled in the California Autonomous Vehicle Tester Program to test autonomous vehicles on public roads. The testing is further advancement of Ford’s 10-year autonomous vehicle development program and a key element of Ford Smart Mobility, the plan to take the company to the next level in connectivity, mobility, autonomous vehicles, the customer experience, and data and analytics.
Mercedes adding semi-autonomous Active Lane Change Assist to new E-Class in spring
December 09, 2015
Mercedes-Benz will expand the new Driving Assistance package of the future E-Class for the market launch in spring with the semi-autonomous Active Lane Change Assist function. The radar- and camera-based assistance system supports the driver in changing lanes—for example, when passing on multi-lane roads. The system offers a significant further increase in comfort and can help to prevent collisions.
Active Lane Change Assist is a sub-function of DRIVE PILOT and thus a component of the Driving Assistance package from Mercedes-Benz, which will have its world premiere in the new E-Class next year. The new E-Class will be launched in Germany in spring 2016.
Audi’s production electric e-tron quattro may set a new standard for vehicle handling; advanced vehicle dynamics control
December 01, 2015
The production version of Audi’s battery electric e-tron quattro SUV (earlier post), due out in 2018, is an important vehicle for the brand, and especially for Audi of America, which is driving many of the requirements for the C-segment electric SUV. At the LA Auto Show, Audi of America President Scott Keogh said that the brand is targeting at least 25% of its US sales to be e-tron—i.e., plug-in hybrid or full electric—models by 2025. (Earlier post.)
He also observed, in a conversation on the eve of the LAAS, that if an automaker can launch a defining product, the game changes. Audi intends the e-tron quattro to be such a defining product: an electric SUV in the high-volume C-segment that is reasonably priced (for the luxury sector) and delivers longer range along with all the “known” benefits of electric drive, from speedy acceleration and quiet operation to lower fueling cost and zero tailpipe emissions (plus the high-value HOV sticker in some markets). To make it a benchmark product, Audi intends to go further, leveraging the three-motor electric quattro powertrain through very advanced software controls to deliver a ride and handling experience not possible even in an all-wheel-drive electric vehicle equipped with a torque-vectoring differential mechanism (e.g., a Tesla).
IHS Automotive sees Google leading technology, testing, software development for autonomous driving
November 13, 2015
The key to self-driving cars is software that can interpret data from all of a vehicle’s sensors and learn to mimic the driving skills and experiences of the very best drivers. Google is the current technology leader in this arena, according to a report from industry analysts IHS Automotive: “Google Self-Driving Car Strategy and Implications”. IHS estimates suggest Google has invested nearly $60 million so far in autonomous vehicle research and development, at a run rate of nearly $30 million per year.
Unlike traditional vehicle manufacturers, Google also has the ability to leverage adjacent technologies and learnings from its other projects and investments—including robotics, drones and related technologies that help automotive operations, such as neural networks, artificial intelligence (AI), machine learning and machine vision. This provides Google researchers additional expertise not available directly to traditional OEMs.
Ford first to test autonomous vehicle at U Michigan Mcity
Ford is the first automaker to test autonomous vehicles at Mcity—the full-scale simulated real-world urban environment at the University of Michigan. (Earlier post.) The 32-acre facility is part of the university’s Mobility Transformation Center.
Ford has been testing autonomous vehicles for more than 10 years and is now expanding testing on the diversity of roads and realistic neighborhoods of Mcity near the North Campus Research Complex to accelerate research of advanced sensing technologies.
Marvell introduces 1st 1000BASE-T1 automotive Ethernet PHY transceiver; Gigabit Ethernet for connected cars
October 20, 2015
Marvell, a leading fabless semiconductor company, introduced the Marvell 88Q2112, the industry’s first 1000BASE-T1 automotive Ethernet physical layer (PHY) transceiver compliant with the draft IEEE 802.3bp 1000BASE-T1 standard—i.e., Gigabit Ethernet for cars. The 88Q2112 supports the industry’s highest in-vehicle connectivity bandwidth and is designed to meet the rigorous EMI requirements of an automotive system.
The 1000BASE-T1 standard allows high-speed, bi-directional data traffic over lightweight, low-cost, single-pair cable harnesses. The Marvell 88Q2112 will sample to Marvell’s global customers starting in November 2015.
Tesla v7.0 software update boosts self-driving capabilities; Autopilot arrives
October 15, 2015
Beginning in October 2014, Tesla began equipping Model S vehicles with hardware to support the incremental introduction of self-driving technology: a forward radar; a forward-looking camera; 12 long-range ultrasonic sensors positioned to sense 16 feet around the car in every direction at all speeds; and a high-precision digitally-controlled electric assist braking system. (Earlier post.)
The company’s latest software release—Tesla Version 7.0—enables a range of new active safety and convenience features based on that hardware, designed to work in conjunction with the automated driving capabilities already offered in Model S: Lane Departure Warning; Blind Spot Warning; Intelligent Speed Assist; Traffic-Aware Cruise Control (TACC); Forward Collision Warning; and Automatic Emergency Braking.
Renesas Electronics America introduces new vehicle-level platform for development of autonomous and V2X driving
October 13, 2015
Renesas Electronics America has introduced a new, comprehensive, vehicle-level driving platform to accelerate the development of autonomous and connected driving systems.
Working in close collaboration with Harbrick, NewFoundry, Arada Systems, eTrans, and Cogent Embedded, Renesas developed a fleet of cars to operate as a modular and open laboratory for automotive customers. This strategy enables safe integrated solutions beyond silicon, while reducing design risk and accelerating time to market through pre-testing and integration.
Panasonic highlighted automated and connected vehicle technologies at ITS World Congress; anti-hacker security
October 11, 2015
Panasonic exhibited its portfolio of automated and connected vehicle technologies and Intelligent Transport Systems (ITS) big data solutions at ITS World Congress 2015, Bordeaux. Panasonic offers automotive and device solutions for pedestrians and cars, infrastructure solutions for homes, roads and more.
In the report “Analysis of the Advanced Driver Assistance Systems (ADAS) Market in Europe”, Frost & Sullivan Research Analyst Cathy Brown points out that “Driven by legislation and cost effective vehicles, the ADAS market in Europe is entering a new phase of applications integrating both safety and comfort.” According to Masahisa Shibata, President of Panasonic Automotive and Industrial Systems Europe, Panasonic’s global strategy focuses on three key business areas:
$90M UR:BAN research initiative presenting results on ADAS and traffic management for cities; intelligent vehicles
October 07, 2015
In Düsseldorf, the 31 partners—automobile and electronics manufacturers, suppliers, communication technology and software companies, research institutes and cities—involved in the UR:BAN research initiative (Urban Space: user-friendly assistance systems and network management) presented the results of four years of work in a two-day event.
UR:BAN’s goal is to develop advanced driver assistance and traffic management systems for cities, with a focus on the human element in all aspects of mobility and traffic. The project pursued its objectives in three main thematic target areas: Cognitive Assistance; Networked Traffic System; and Human Factors in Traffic.
Toyota testing new Highway Teammate automated driving vehicle; aiming for commercialization around 2020
October 06, 2015
Toyota has been testing a new automated driving platform, a modified Lexus GS called Highway Teammate, with the aim of launching related products by around 2020. In addition to demonstrating the capabilities of next-generation safety technologies, the vehicle represents Toyota’s view of the evolving driver-car relationship in the age of artificial intelligence.
Toyota believes that interactions between drivers and cars should mirror those between close friends who share a common purpose, sometimes watching over each other and sometimes helping each other out. Toyota refers to this approach as the Mobility Teammate Concept. Highway Teammate represents an important first effort to give form to this concept.
Cohda introduces new 360˚ radar for V2X connected vehicles
October 04, 2015
Cohda Wireless is introducing a low-cost, 360-degree radar for vehicles fitted with V2X connected car systems. The new V2X-Radar can detect buildings, road signs and older vehicles that are not equipped with V2X technology. Unlike current sensing technologies, it is unaffected by rain, snow or fog, and can “see” around corners.
V2X-Radar takes advantage of current V2X systems that use IEEE 802.11-compliant wireless signals to share sensor information between vehicles and infrastructure. These radio signals bounce off many objects—walls, road signs and other vehicles—as they travel from transmitter to receiver. V2X-Radar analyzes these reflections to detect the objects that produced them.
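In a bistatic arrangement like this, the receiver sees both the direct line-of-sight signal and delayed copies reflected off nearby objects; the extra delay of a reflection constrains where the reflector can be. A minimal sketch of that time-of-flight arithmetic (illustrative only, not Cohda's implementation):

```python
C = 299_792_458.0  # speed of light in m/s

def excess_path_m(direct_delay_s, reflected_delay_s):
    # Extra distance the reflected 802.11 signal travelled compared
    # with the line-of-sight path. The reflector lies on an ellipse
    # whose foci are the transmitter and receiver and whose total
    # path length exceeds the baseline by this amount.
    return (reflected_delay_s - direct_delay_s) * C

# A reflection arriving 100 ns after the direct signal travelled ~30 m further.
print(round(excess_path_m(0.0, 100e-9), 3))  # -> 29.979
```

Combining such constraints from multiple transmitters (or multiple receive antennas) is what lets a passive system localize reflectors in 360 degrees.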
Daimler Trucks testing the first series-production semi-autonomous truck on public roads
October 02, 2015
Daimler Trucks today demonstrated the first series-production truck to operate on an automated basis—automation level 2 (partially automated driving)—on the A8 public highway between Denkendorf and Stuttgart. The standard Mercedes-Benz Actros equipped with the Highway Pilot system is approved as a test vehicle in accordance with §19/6 StVZO (German road traffic type approval law).
The version of the Highway Pilot installed in the Actros allows semi-autonomous driving. Although the Highway Pilot is able to steer the truck by itself on motorways, the driver retains full responsibility; needs to monitor the traffic at all times; and must be able to intervene at any time. Daimler compares the Highway Pilot to the autopilot commonly used in aviation.
Renesas introducing R-Car W2R 5.9 GHz automotive wireless communication SoC for V2X communication
September 30, 2015
Renesas Electronics Corporation announced the R-Car W2R system-on-a-chip (SoC), the first member of the new Renesas R-Car Family of devices developed specifically for V2X applications. The new automotive wireless communication SoC is designed for Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication in the 5.9-gigahertz (GHz) band.
Through the use of Renesas’ exclusive RF system design technology, the R-Car W2R is the first in the world to suppress out-of-band transmission signal noise below the -65 dBm limit stipulated by the European Telecommunications Standards Institute (ETSI), giving products that use the R-Car W2R the best noise characteristics in the world. This enables high-quality, low-interference signal transmission, making it possible to implement vehicle-to-everything (V2X) communication at a practical level in a variety of ADAS applications that support safe driving.
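For a sense of scale, dBm is an absolute power level referenced to 1 mW, so the -65 dBm emission limit converts to milliwatts as 10^(dBm/10). This is the standard unit conversion, not anything specific to the ETSI rule:

```python
def dbm_to_mw(dbm):
    # Standard conversion: 0 dBm = 1 mW; every -10 dB divides the
    # power by 10.
    return 10 ** (dbm / 10.0)

# -65 dBm is roughly 3.2e-7 mW, i.e. about a third of a nanowatt.
print(dbm_to_mw(-65))
```

Out-of-band emissions this small are what keep adjacent 5.9 GHz channels usable by other safety traffic.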
DOE announces $70M for Innovation Institute on Smart Manufacturing; advanced sensors, controls, platforms, and modeling for manufacturing
September 18, 2015
The US Department of Energy announced up to $70 million in funding (DE-FOA-0001263) for the next Clean Energy Manufacturing Innovation Institute, which will be focused on smart manufacturing. With this investment, the DOE aims to support research and development advancements that can reduce the cost of deployment for technologies such as advanced sensors, controls, platforms, and modeling for manufacturing by as much as 50%. As part of President Obama’s National Network of Manufacturing Innovation (NNMI) institutes, the institute will also demonstrate these technologies in manufacturing processes with a goal to increase energy efficiency by at least 15% and improve energy productivity by at least 50%.
“Energy intensive industries, such as steelmaking, could see a 10 to 20 percent reduction in the cost of production, making products such as solar panels and chemical materials, such as plastics, as well as the cars and other products they go into, more affordable for American consumers,” said Energy Secretary Ernest Moniz. The goals of the Smart Manufacturing Institute are to: