Koito and Quanergy collaborate to design automotive headlight concept with built-in solid-state LiDAR
January 07, 2017
Koito Manufacturing Co., Ltd., the largest global maker of automotive headlights, and Quanergy Systems, Inc., a leading provider of LiDAR sensors and smart sensing solutions, are collaborating to design an automotive headlight concept with built-in Quanergy S3 solid-state LiDAR sensors (earlier post). The Koito headlight with built-in sensors is on display at CES 2017.
The Koito headlights, which will be located at the corners of a vehicle, each incorporate two compact Quanergy S3 solid-state LiDARs that sense forward and to the side, providing real-time, long-range 3D views of the environment around the vehicle and the ability to recognize and track objects.
Next-gen Audi A8 to feature MIB2+, series debut of zFAS domain controller, Mobileye image recognition with deep learning; Traffic Jam Pilot
January 05, 2017
Audi’s next-generation A8, premiering this year, will feature the first implementation of the MIB2+ (Modular Infotainment Platform). The key element in this new implementation of the MIB is NVIDIA’s Tegra K1 processor (earlier post), which makes new functions possible and has the computing power needed to support several high-resolution displays—including the second-generation Audi virtual cockpit. Onboard and online information will merge, making the car part of the cloud to a greater degree than ever.
The A8 also marks the series debut of the central driver assistance controller (zFAS), which also features the K1; in the future, the X1 processor (earlier post) will be used in this domain controller. The zFAS, developed in collaboration with TTTech, Mobileye, NVIDIA and Delphi, also integrates a Mobileye image processing chip. (Earlier post.)
Renesas Electronics and TTTech deliver highly automated driving platform
Renesas Electronics and TTTech Computertechnik AG, a global leader in robust networking and safety controls, have developed a highly automated driving platform (HADP). The new HADP is a prototype electronic control unit (ECU) for mass-production vehicles with integrated software and tools, which demonstrates how Renesas and TTTech technologies can be combined in a true automotive environment for autonomous driving. The HADP accelerates the path to mass production for Tier 1s and OEMs.
The newly released HADP is the first outcome of the collaboration between TTTech and Renesas announced in January 2016 (earlier post), and is an extended version of the HAD solution kit released in October 2016. It is based on dual R-Car H3 system-on-chips (SoCs) (earlier post) and the RH850/P1H-C microcontroller (MCU).
Audi & NVIDIA partner to deliver fully automated driving with AI starting in 2020; piloted Q7 w/ neural network CES demo
Audi announced a partnership with NVIDIA to use artificial intelligence in delivering highly automated vehicles starting in 2020. Deep learning technology will enable skilled handling of real-road complexities, delivering safer automated vehicles earlier. The first phase of this expanded collaboration between the nearly decade-long partners focuses on NVIDIA DRIVE PX, which uses trained AI neural networks to understand the surrounding environment, and to determine a safe path forward. (Earlier post.)
Audi and NVIDIA have combined their engineering and visual computing technologies in the past on Audi innovations such as Audi MMI navigation and the Audi virtual cockpit. Later this year Audi will introduce the next-generation Audi A8 featuring Traffic Jam Pilot—the world’s first Level 3 automated vehicle (as defined by SAE International) equipped with a first-generation central driver assistance domain controller (zFAS) that integrates NVIDIA computing hardware and software. (Earlier post.)
BMW, Intel, Mobileye: 40 autonomous BMWs to be on road by 2H 2017; standards-based open platform for autonomy
BMW Group, Intel and Mobileye announced that a fleet of approximately 40 autonomous BMW vehicles will be on the roads by the second half of 2017, demonstrating the significant advancements made by the three companies towards fully autonomous driving. Revealing this at a podium discussion held during a joint press conference at CES, the companies further explained that the BMW 7 Series will employ advanced Intel and Mobileye technologies during global trials starting in the US and Europe.
In July 2016, BMW Group, Intel and Mobileye announced a collaboration to bring solutions for highly and fully automated driving into series production by 2021. The three said they would create a standards-based open platform—from door locks to the datacenter—for the next generation of cars. (Earlier post.) The companies have since developed a scalable architecture that can be adopted by other automotive developers and carmakers to pursue state-of-the-art designs and create differentiated brands. The offerings scale from individual key integrated modules to a complete end-to-end solution providing a wide range of differentiated consumer experiences.
Renesas Electronics unveils RH850/V1R-M automotive radar solution for ADAS and autonomous driving vehicles
January 04, 2017
Advanced semiconductor supplier Renesas Electronics Corporation introduced the RH850/V1R-M—the first product in its new RH850-based, 32-bit automotive radar microcontroller (MCU) series—which will deliver the high performance and features required to enable future advanced driver assistance systems (ADAS) and autonomous driving vehicles. The RH850/V1R-M includes a digital signal processor (DSP) and high-speed serial interfaces and is specifically designed for middle- to long-range radars.
Vehicles are being equipped with a broad spectrum of sensors such as cameras, LiDAR and ultrasonic sensors to support expanded advanced driver assistance systems (ADAS) and emerging autonomous driving functionality. Radar sensors are needed for ADAS applications—including advanced emergency braking and adaptive cruise control—because, unlike other sensors, radar is not degraded by external environmental conditions such as adverse weather (rain or fog) or changing light.
Qualcomm introducing Drive Data platform for sensor fusion
Qualcomm is introducing the Qualcomm Drive Data Platform to intelligently collect and analyze information from a vehicle’s sensors. Cars will be able to determine their location with lane-level accuracy, monitor and learn driving patterns, perceive their surroundings, and share this reliable and accurate data with the rest of the world.
These capabilities will be key for many connected car applications, from shared mobility and fleet management to 3D high-definition mapping and automated driving. Qualcomm Drive Data platform is built on three pillars: heterogeneous connectivity; precise positioning; and on-device machine learning, all integrated into the Qualcomm Snapdragon solution.
Mitsubishi Electric showcasing 3D Advanced Mobile Mapping System at CES 2017
January 03, 2017
Mitsubishi Electric Corporation, along with Mitsubishi Electric US, Inc., will display a future concept of the recently released new model of its Mitsubishi Mobile Mapping System, the MMS-G220, at CES 2017. The MMS-G220 is a highly accurate measuring system using car-mounted GPS antennas, laser scanners and cameras. (Earlier post.)
The system gathers 3-D positioning data of road surfaces and roadside features to an absolute accuracy of 4 inches (10 cm), allowing the creation of comprehensive 3D maps to the level of accuracy needed to support autonomous driving.
Lucid Motors chooses Mobileye as partner for autonomous vehicle technology
December 30, 2016
Luxury EV developer Lucid Motors and Mobileye N.V. announced a collaboration to enable autonomous driving capability on Lucid vehicles. Lucid will launch its first car, the Lucid Air (earlier post), with a complete sensor set for autonomous driving from day one, including camera, radar and lidar sensors.
Lucid chose Mobileye to provide the primary compute platform, full 8-camera surround view processing, sensor fusion software, Road Experience Management (REM) crowd-based localization capability, and reinforcement learning algorithms for Driving Policy. These technologies will enable a full Advanced Driver Assistance System (ADAS) suite at launch, and then enable a logical and safe transition to autonomous driving functionality through over-the-air software updates.
HERE and Mobileye to partner on crowd-sourced HD mapping for automated driving
December 29, 2016
High-definition (HD) mapping company HERE and Mobileye, developer of computer vision and machine learning, data analysis, localization and mapping for Advanced Driver Assistance Systems and autonomous driving, plan to enter a strategic partnership that links their respective autonomous driving technologies into an enhanced industry-leading offering for automakers.
Under the partnership, Mobileye’s Roadbook—a detailed, cloud-based map of localized drivable paths and visual landmarks constantly updated in real-time—will be integrated as a data layer in HERE HD Live Map, HERE’s real-time cloud service for partially, highly and fully automated vehicles. Roadbook information will provide an important additional layer of real-time contextual awareness by gathering landmark and roadway information to assist in making a vehicle more aware of—and better able to react to—its surroundings, as well as to allow for more accurate vehicle positioning on the road.
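To make the localization idea concrete: each visually detected landmark whose surveyed position is stored in the map constrains where the vehicle can be. Below is a minimal, one-dimensional sketch of that principle (a generic illustration with made-up numbers, not Mobileye’s REM algorithm):

```python
# Hypothetical 1D illustration: landmarks surveyed along the road in the map,
# each observed at some range ahead of the vehicle. Every (position, range)
# pair implies a vehicle location; averaging the estimates sharpens the fix.

def estimate_position(map_landmarks, observed_ranges):
    """Each landmark seen r meters ahead puts the vehicle at landmark - r."""
    estimates = [lm - r for lm, r in zip(map_landmarks, observed_ranges)]
    return sum(estimates) / len(estimates)

# Landmarks mapped at 150 m, 180 m and 240 m along the road, observed at
# ranges of 49.8 m, 80.3 m and 139.6 m:
print(estimate_position([150.0, 180.0, 240.0], [49.8, 80.3, 139.6]))
# -> ~100.1 m: noisy individual sightings combine into a tighter estimate
```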
Ford introducing next-gen Fusion Hybrid autonomous development vehicle at CES and NAIAS in January
December 28, 2016
Ford Motor Company is introducing its next-generation Fusion Hybrid autonomous development vehicle; the car will first appear at CES 2017 and the North American International Auto Show in January. The new vehicle uses the current Ford autonomous vehicle platform, but ups the processing power with new computer hardware.
Electrical controls are closer to production-ready, and adjustments to the sensor technology, including placement, allow the car to better see what’s around it. New LiDAR sensors have a sleeker design and more targeted field of vision, which enables the car to now use just two sensors rather than four, while still getting just as much data.
TriLumina to demo 256-pixel 3D solid-state LiDAR and ADAS systems for autonomous driving at CES 2017
December 27, 2016
At CES 2017, TriLumina (earlier post)—a spin-out from Sandia National Laboratories—will demonstrate, in collaboration with LeddarTech (earlier post), an innovative 256-pixel, 3D LiDAR solution for autonomous driving applications powered by TriLumina’s breakthrough laser illumination module and LeddarTech’s LeddarCore ICs.
TriLumina has developed eye-safe, vertical-cavity surface-emitting lasers (VCSELs). The TriLumina illumination modules replace the expensive, bulky scanning LiDARs used in current autonomous vehicle demonstration programs, delivering high-resolution, long-range sensing in a small, robust and cost-effective package.
U of Waterloo Autonomoose autonomous vehicle on the road in Canada
December 23, 2016
Researchers from the University of Waterloo Center for Automotive Research (WatCAR) in Canada are modifying a Lincoln MKZ Hybrid for autonomous drive-by-wire operation. The research platform, dubbed “Autonomoose,” is equipped with a full suite of radar, sonar, lidar, inertial and vision sensors; the NVIDIA DRIVE PX 2 AI platform (earlier post) to run a complete autonomous driving system integrating sensor fusion, path planning, and motion control software; and a custom autonomy software stack being developed at Waterloo as part of the research.
Recently, the Autonomoose autonomously drove a crew of Ontario Ministry of Transportation officials to the podium of a launch event to introduce the first car approved to hit the roads under the province’s automated vehicle pilot program.
LeddarTech showcasing 2D and 3D solid-state LiDARs for mass-market autonomous driving deployments; Leddar Ecosystem
December 16, 2016
At CES 2017, LeddarTech will be showcasing 2D and 3D high-resolution LiDAR solutions for autonomous driving applications based on its next-generation LeddarCore ICs and developed with the collaboration of leading-edge suppliers and partners from the newly-established Leddar Ecosystem. (Earlier post.)
Presented publicly for the first time, these systems demonstrate the scalability of Leddar technology and its ability to meet the high levels of performance, resolution, and cost-effectiveness required by Tier 1 suppliers and OEMs for mass-market autonomous driving applications. Production versions of these LiDAR systems will offer resolutions of up to 512×64 over a field of view of 120×20 degrees, and detection ranges exceeding 200 m for pedestrians and 300 m for vehicles.
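Those headline figures imply a per-pixel angular resolution of roughly a quarter of a degree. A quick back-of-envelope check, using only the numbers quoted above:

```python
# Our arithmetic from the quoted specs: 512 x 64 pixels spread over a
# 120 x 20 degree field of view.
h_res = 120 / 512   # degrees per pixel, horizontal
v_res = 20 / 64     # degrees per pixel, vertical
print(f"{h_res:.2f} deg H x {v_res:.2f} deg V per pixel")  # 0.23 x 0.31
```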
Velodyne LiDAR announces new design for miniaturized, low-cost solid-state LiDAR sensors using GaN technology
December 13, 2016
Velodyne LiDAR announced a new design for a solid-state LiDAR sensor that can deliver a subsystem cost of less than US$50 when sold at high-volume manufacturing scale. The technology is expected to accelerate the proliferation of LiDAR sensors in multiple industry sectors, including autonomous vehicles, ridesharing, 3D mapping, and drones.
LiDAR sensors that leverage this new design will be less expensive, easier to integrate due to their smaller size, and more reliable as a result of fewer moving parts. The technology can also be integrated in Velodyne LiDAR’s existing Puck form factors.
Daimler joining MIT CSAIL Alliance Program for AI work; cognitive vehicles
December 07, 2016
Daimler is becoming a new member of the MIT CSAIL Alliance Program. MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is the largest research laboratory at MIT and one of the world’s most important centers of information technology research. With 1,000 members and more than 100 principal investigators coming from eight departments, CSAIL includes approximately 50 research groups organized into three focus areas: artificial intelligence, systems and theory.
Key CSAIL initiatives currently underway include tackling the challenges of big data, developing new models for wireless and mobile systems, securing computers and the cloud against cyber attacks, rethinking the field of artificial intelligence, and developing the next generation of robots. CSAIL Alliances is a gateway into the lab for organizations seeking a closer connection to the work, researchers and students of CSAIL.
Delphi & Mobileye to showcase Centralized Sensing Localization and Planning (CSLP) autonomous driving system in public demo at CES 2017
November 30, 2016
Delphi Automotive PLC and Mobileye will showcase their Centralized Sensing Localization and Planning (CSLP) automated driving system—which will be ready for production by 2019—on a 6.3-mile urban and highway combined public route in Las Vegas for CES 2017. (Earlier post.)
The partners said that CSLP is the first turnkey, fully integrated automated driving solution with an industry-leading perception system and computing platform. (Intel will provide the system-on-a-chip (SOC) for the systems.) The Las Vegas drive will tackle everyday driving challenges such as highway merges, congested city streets with pedestrians and cyclists and a tunnel.
Hyliion developing hybrid system for semi-trailers
November 27, 2016
Hyliion, a start-up founded by a group of graduate students at Carnegie Mellon, has developed an add-on hybrid system for semi-trailers. Combining regenerative braking and power boost for the trailer to reduce on-the-road fuel consumption, the system also functions as an auxiliary power unit to reduce engine-on idling. The company’s patent application describing the basics of the system was published earlier this month.
The Hyliion System—comprising motor, battery and control electronics—can power a truck cab for 20 hours, outperforming an industry-standard, idle-free all-electric APU. Overall fuel savings are upwards of 30%, according to the company.
Volkswagen’s 10-year evolution of Park Assist; heading toward trained parking and higher levels of autonomy
November 26, 2016
Volkswagen first introduced a parking assistance system based on ultrasonic sensors in the early 1990s. However, it was the “Park Assist” Gen 1 system presented in the Touran in 2007 that marked a foundational point in the commercial development of the technology. Once activated, Park Assist used special side-oriented ultrasonic sensors to detect parallel parking spaces on the left and right sides of the road as the car passed them, enabling semi-automatic parking for the first time.
Volkswagen engineers have continued to enhance the functionality, leading to the release of Gen 3 Park Assist in 2014, with a clear roadmap to the deployment of higher levels of autonomy, including trained parking: fully automated parking with a one-off training process. At a recent visit to Volkswagen’s Ehra proving ground (Prüfgelände Ehra), Green Car Congress had the opportunity to see a prototype of trained parking in action.
nuTonomy to test its self-driving cars on specific public roads in Boston
November 21, 2016
nuTonomy, developer of software for self-driving cars, has signed a Memorandum of Understanding (MOU) with the City of Boston and the Massachusetts Department of Transportation that authorizes nuTonomy to begin testing its growing fleet of self-driving cars on specific public streets in a designated area of Boston.
nuTonomy will begin testing its self-driving Renault Zoe electric vehicle before the end of the year in the Raymond L. Flynn Marine Park in the Seaport section of the city. nuTonomy outfits its vehicles with a software system which has been integrated with high-performance sensing and computing components to enable safe operation without a driver. The company’s autonomous and robotics technology system grew out of research conducted in MIT labs run by nuTonomy co-founders Karl Iagnemma and Emilio Frazzoli.
Hyundai introduces new autonomous IONIQ concept at AutoMobility LA
November 16, 2016
Hyundai Motor Company introduced the Autonomous IONIQ concept during its press conference at AutoMobility LA (Los Angeles Auto Show). With a design resembling the rest of the IONIQ lineup (earlier post), the vehicle is one of the few self-driving cars in development to have a LiDAR system hidden in its front bumper instead of installed on the roof, enabling it to look like any other car on the road and not a high school science project.
Hyundai’s goal for the autonomous IONIQ concept was to keep the self-driving systems as simple as possible. This was accomplished by using the production car’s forward-facing radar from Smart Cruise Control and its Lane Keep Assist cameras, and integrating them with LiDAR technology.
Intel to invest more than $250M over next two years in autonomous driving; “Data is the new oil”
November 15, 2016
In a keynote address at the AutoMobility LA conference, Intel CEO Brian Krzanich announced that Intel Capital is targeting more than $250 million of additional new investments over the next two years to make fully autonomous driving a reality. This is the first time Intel is keynoting at an automotive conference, signifying how critical the automotive market has become for the company.
These investments will drive the development of technologies that push the boundaries on next-generation connectivity, communication, context awareness, deep learning, security, safety and more. Drilling down into the areas that will be fueled by the fresh investments, Krzanich highlighted technologies that will drive global Internet of Things (IoT) innovation in transportation; areas where technology can directly mitigate risks while improving safety, mobility, and efficiency at a reduced cost; and companies that harness the value of the data to improve reliability of automated driving systems.
Renesas Electronics delivers 2nd-gen ADAS view solution kit for surround view, electronic mirrors and driver monitoring for autonomous driving
November 08, 2016
Renesas Electronics Corporation has introduced a new all-in-one Advanced Driver Assistance Systems (ADAS) view solution kit. Building on the success of the first-generation ADAS surround view kit launched in October 2015, Renesas’ second-generation ADAS view solution kit supports up to eight cameras and realizes next-generation electronic mirrors, driver monitoring and surround view systems at the same time.
Sensor fusion—combining and processing the information collected from automotive cameras and radars so that vehicles can recognize their surroundings—has become standard in autonomous driving and ADAS applications. A 360-degree surround view is expected to become an essential feature available in all vehicle segments. Additionally, mirrors will be replaced by cameras, and driver monitoring features will be required for autonomous driving and to increase safety.
Groupe Renault announces strategic partnership with computer vision innovator Chronocam
Groupe Renault has entered into a strategic development agreement with Chronocam SA (earlier post), a developer of biologically-inspired vision sensors and computer vision solutions for automotive applications. This agreement will focus on further developing and applying Chronocam’s innovative approach to sensing and processing visual inputs to Renault’s Advanced Driver Assistance Systems (ADAS) and autonomous driving developments.
Renault previously announced an investment in Chronocam’s Series B round of funding, which raised $15 million for the Paris-based start-up from a group of international venture capital funds including Intel Capital, Robert Bosch Venture Capital, iBionext, 360 Capital and CEA Investissement.
Toshiba advances deep learning with extremely low-power neuromorphic processor; supporting IoT edge devices
November 07, 2016
Toshiba has developed what it calls the Time Domain Neural Network (TDNN)—a neural network using a time-domain analog and digital mixed-signal processing technique—based on a new, extremely low-power-consumption neuromorphic semiconductor circuit that performs processing for deep learning. (The acronym TDNN (time-delay neural network) is also used more broadly to describe the feed-forward neural networks first described in a 1989 paper by Waibel et al.)
Deep learning—as could be applied, for example, in autonomous driving—requires massive numbers of calculations, typically executed on high performance processors that consume a lot of power. However, bringing the power of deep learning to IoT edge devices such as sensors and smart phones requires highly energy-efficient ICs that can perform the large number of required operations while consuming extremely little energy.
Daimler and Valens partner to bring HDBaseT Automotive to vehicles in near future
At Electronica 2016 in Munich, Israeli HDBaseT chip maker Valens and Daimler announced their collaboration to bring HDBaseT Automotive into cars in the near future. Daimler has selected HDBaseT Automotive as the technology of choice to guarantee high performance of advanced infotainment, ADAS, and telematics systems.
Valens, as the inventor of HDBaseT and founder of the HDBaseT Alliance, brings the technology and expertise to accomplish the goal of commercializing HDBaseT-enabled vehicles in the near future. (Earlier post.)
New Telit autonomous navigation IoT module relies on internal sensors to deliver class-leading dead reckoning accuracy
November 06, 2016
Telit announced commercial availability of the SL869-3DR, a GNSS (global navigation satellite system) module for global use which leverages information from internal gyros, accelerometers and a barometric pressure sensor to perform dead reckoning (DR) navigation for application areas such as track & trace and in-vehicle systems.
The module delivers accurate position data either directly from its multi-constellation receiver or from a fully autonomous DR system, requiring no connections to external devices or components other than an antenna for satellite signal reception and power. The module allows integrators to design zero-installation, in-vehicle navigation and tracking devices for fleets and other commercial or consumer applications that operate simply perched on the dashboard, connected only to vehicle power.
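Dead reckoning itself is conceptually simple: integrate a gyro-derived heading and a speed estimate to propagate position between (or in the absence of) satellite fixes. A minimal planar sketch of the principle follows (an illustration only, not Telit firmware; the motion values are invented):

```python
import math

# Minimal planar dead-reckoning update: heading is integrated from the gyro
# yaw rate, then position is advanced along that heading using the current
# speed estimate (e.g. from accelerometers or wheel data).

def dr_step(x, y, heading, speed, yaw_rate, dt):
    """One dead-reckoning step; units: meters, rad, m/s, rad/s, seconds."""
    heading += yaw_rate * dt              # new heading from the gyro
    x += speed * math.cos(heading) * dt   # advance east position
    y += speed * math.sin(heading) * dt   # advance north position
    return x, y, heading

# Drive 10 s at 15 m/s while turning gently at 0.05 rad/s:
x = y = heading = 0.0
for _ in range(100):                      # 100 steps of 0.1 s
    x, y, heading = dr_step(x, y, heading, 15.0, 0.05, 0.1)
print(x, y, heading)  # position drifts slowly; DR bridges GNSS outages
```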
Chronocam raises $15M in Series B; high-performance bio-inspired vision technology for autos and other machines
October 27, 2016
France-based Chronocam SA, a developer of biologically-inspired vision sensors and computer vision solutions for automotive, IoT and other applications requiring vision processing, raised $15 million in Series B financing. The funding comes from lead investor Intel Capital, along with iBionext, Robert Bosch Venture Capital GmbH, 360 Capital, CEAi and Renault Group.
Chronocam will use the investment to accelerate product development and commercialize its computer vision sensing and processing technology. The funding will also allow the company to expand into key markets, including the US and Asia.
Intel introducing new processor series dedicated for automotive applications
October 26, 2016
Intel is developing a new processor series dedicated to automotive applications. The A3900 series will enable a complete software-defined cockpit solution that includes in-vehicle infotainment (IVI), digital instrument clusters and advanced driver assistance systems (ADAS)—all in a single, compact and cost-effective SoC.
Intel announced the new automotive processor family along with its introduction of the new Intel Atom processor E3900 series for the Internet of Things (IoT). The A3900 series will allow car makers to offer new levels of determinism for real-time decision-making required in next-generation cars. It is currently sampling with customers and will be available in Q1 2017.
Tesla putting hardware for full autonomy in all models; temporary loss of some Gen1 Autopilot functions
October 20, 2016
Tesla announced that effective immediately, new Tesla vehicles—including Model 3—will have the hardware needed to support full autonomous driving.
The required software for full autonomous driving is still under development and will need validation and regulatory approval. In fact, Teslas with the new hardware will temporarily lack certain features currently available on Teslas with first-generation Autopilot hardware, including some standard safety features such as automatic emergency braking, collision warning, lane holding and active cruise control.
Oryx Vision raises $17M to create novel depth-sensing solution for autonomous vehicles; LiDAR replacement
October 19, 2016
Oryx Vision has emerged from stealth with a veteran team from the Israeli high-tech industry to build a novel depth-sensing solution for autonomous vehicles that overcomes some of the limitations of current LiDAR systems. Oryx has raised $17 million in Series A funding led by Bessemer Venture Partners (BVP), with additional participation from Maniv Mobility and Trucks VC. BVP Partner Adam Fisher will join Oryx’s board of directors.
In order to drive accurately and safely, autonomous vehicles need a highly detailed 3D view of their environment. Existing depth-sensing solutions rely mostly on LiDAR devices, which send short laser pulses while rotating, receive the reflected light back with photo-electric sensors, and thus construct a 3D map of the car’s surroundings, pixel by pixel. However, current LiDAR is mechanically complicated, expensive and has a severe range limit due to eye-safety considerations, Oryx says.
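The underlying measurement in each of those pixels is time of flight: the sensor times how long a laser pulse takes to return and converts that round trip to a distance. A worked example:

```python
# Time-of-flight ranging: distance is half the round-trip time multiplied by
# the speed of light (the factor of 2 accounts for the out-and-back path).
C = 299_792_458.0                 # speed of light, m/s

def tof_distance(round_trip_seconds):
    return C * round_trip_seconds / 2.0

print(tof_distance(667e-9))       # a 667 ns echo -> ~100 m to the target
```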
DENSO & Toshiba partner on Deep Neural Network-IP for image recognition systems for ADAS & automated driving
October 17, 2016
DENSO Corporation and Toshiba Corporation have reached a basic agreement to jointly develop an artificial intelligence technology called Deep Neural Network-Intellectual Property (DNN-IP), for use in the image recognition systems the two companies have been independently developing to help achieve advanced driver assistance and automated driving technologies.
The partners expect DNN, an algorithm modeled on the neural networks of the human brain, to perform recognition processing as accurately as, or even better than, the human brain.
Infineon acquires Innoluce BV for high-performance solid-state LiDAR systems
October 11, 2016
Semiconductor company Infineon has acquired 100% of Innoluce BV, a fabless semiconductor company headquartered in Nijmegen, The Netherlands. Building on Innoluce’s know-how, Infineon will develop chip components for high-performance light detection and ranging (LiDAR) systems. The companies agreed to keep the terms confidential.
Innoluce was founded in 2010 as an entrepreneurial spin-off of Royal Philips. The company has strong expertise in micro-electro-mechanical systems (MEMS) and is a leading innovator in miniature laser scanning modules that integrate silicon-based solid-state MEMS micro-mirrors. Such micro-mirrors are necessary to steer the laser beams in automotive LiDAR systems.
HERE unveils next-generation open platform real-time data services for automotive industry
September 28, 2016
On the eve of the Paris Motor Show, HERE, the high-definition mapping and location services company acquired by Audi, BMW and Daimler (earlier post), announced next-generation vehicle-sourced data services. The HERE Open Location Platform will harness real-time data generated by the on-board sensors of connected vehicles—even from competing car brands—to create a live depiction of the road environment.
Drivers will be able to access this view of the road through four services that provide high-quality information on traffic conditions, potential road hazards, traffic signage and on-street parking. The goal is to ensure that drivers have more accurate and timely information with which to make better driving decisions. HERE plans to make the services commercially available to customers both within and outside the automotive industry from the first half of 2017.
Tesla leans on radar for Autopilot in Version 8 software
September 12, 2016
With the release of Version 8 of its software, Tesla has made many updates to its Autopilot function. The most significant change, however, is its new heavy reliance on the onboard radar. The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was originally meant to be only a supplementary sensor to the primary camera and image processing system.
Now, however, Tesla is applying more advanced signal processing to use the radar as a primary control sensor without requiring the camera to confirm visual image recognition. The changes come in the wake of the fatal crash (earlier post) in which Autopilot apparently did not detect the white side of the trailer against a brightly lit sky; nor did the driver. As a result, no brake was applied. In the aftermath of that incident, Tesla CEO Elon Musk tweeted that the company was “Working on using existing Tesla radar by itself (decoupled from camera) w temporal smoothing to create a coarse point cloud, like lidar”. (Earlier post.)
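One way to picture the “temporal smoothing” Musk describes is as an exponential moving average over successive radar scans, so that persistent reflectors accumulate into stable coarse points while one-off noise washes out. A toy sketch of that general idea (our illustration, not Tesla’s actual algorithm; the bin indices and ranges are invented):

```python
# Toy temporal smoothing over radar returns: accumulate noisy per-azimuth
# range detections across scans with an exponential moving average, so
# persistent reflectors firm up into a coarse, lidar-like point set.

ALPHA = 0.3  # smoothing factor: higher trusts the newest scan more

def smooth_scan(accumulated, new_scan):
    """accumulated / new_scan map azimuth-bin index -> range in meters."""
    for az, rng in new_scan.items():
        if az in accumulated:
            accumulated[az] = (1 - ALPHA) * accumulated[az] + ALPHA * rng
        else:
            accumulated[az] = rng
    return accumulated

cloud = {}
for scan in [{0: 52.1, 1: 48.9}, {0: 51.7, 1: 49.3}, {0: 51.9}]:
    cloud = smooth_scan(cloud, scan)
print(cloud)  # stable smoothed ranges per azimuth bin
```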
DENSO looks to increase holding in FUJITSU TEN, making it a group company
September 10, 2016
Auto parts supplier DENSO Corporation, Fujitsu Limited, and Toyota Motor Corporation have reached a basic agreement to start consideration of changing the capital structure of automotive electronics manufacturer FUJITSU TEN, in which the three companies have stakes. DENSO is part of the Toyota Group.
In the automotive field, the interface between the driver and vehicle is becoming increasingly important due to remarkable technological innovations. Against this backdrop, DENSO has agreed with Fujitsu and Toyota to review specific changes to make FUJITSU TEN a group company of DENSO and to enhance cooperation between the two companies in developing in-vehicle ECUs, millimeter-wave radar (earlier post), advanced driver assistance / automated driving technologies, and basic electronic technologies, among others.
LeddarTech launches LeddarVu, a new scalable platform towards high-resolution LiDAR; Vu8 solid-state LiDAR
September 07, 2016
LeddarTech, a developer of solid-state LiDAR technology (earlier post), introduced LeddarVu, a new platform for the next generation of its Leddar detection and ranging modules. The LeddarVu platform combines the benefits of a very compact, modular architecture with superior performance, robustness and cost efficiency towards high-resolution LiDAR applications, such as autonomous driving.
Leveraging LeddarTech’s advanced, patented signal processing and algorithms, LeddarVu sensors will evolve along with future generations of the LeddarCore ICs. As previously announced in the company’s development roadmap, upcoming iterations of LeddarCore ICs are expected to deliver ranges reaching 250 m, fields of view up to 140°, and up to 480,000 points per second (with a resolution down to 0.25° both horizontally and vertically), enabling the design of affordable LiDARs for all levels of autonomous driving, including the capability of mapping the environment over 360° around the vehicle.
Quanergy acquires Otus People Tracker software from Raytheon BBN for advanced autonomous driving and security LiDAR applications
August 29, 2016
Quanergy Systems, Inc., the provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), has acquired the Otus People Tracker software from Raytheon BBN Technologies. The software complements Quanergy’s existing software portfolio and, when used with Quanergy’s LiDAR sensors, creates an integrated hardware and software solution for advanced people detection and tracking applications within the security and autonomous driving markets.
Otus (named after a genus of owls) uses advanced algorithms to identify and track people for safety and security in crowded environments at ranges exceeding 100 meters when used with Quanergy LiDAR sensors. The system features segmentation techniques for identifying humans; background extraction; object clustering; sophisticated merge-and-split algorithms; persistent tracking algorithms; and other advanced features supporting robust crowd control. Support for multiple zones of interest is included, giving users fine control over active monitoring.
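The object-clustering stage of such a pipeline can be pictured as grouping nearby LiDAR returns into candidate objects. Here is a bare-bones single-linkage sketch (a generic illustration; Otus’s actual algorithms are proprietary):

```python
# Greedy single-linkage clustering of 2D ground-plane points: grow each
# cluster by pulling in any unvisited point within eps meters of a member,
# so each resulting cluster becomes a candidate person-sized object.

def cluster(points, eps=0.5):
    """Group (x, y) points whose chain of neighbors stays within eps."""
    clusters, unvisited = [], list(points)
    while unvisited:
        frontier = [unvisited.pop()]
        current = []
        while frontier:
            p = frontier.pop()
            current.append(p)
            near = [q for q in unvisited
                    if (p[0] - q[0])**2 + (p[1] - q[1])**2 <= eps**2]
            for q in near:
                unvisited.remove(q)
            frontier.extend(near)
        clusters.append(current)
    return clusters

pts = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.3, 5.1)]
print(cluster(pts))  # two clusters -> two candidate tracked objects
```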
Mobileye and Delphi to partner on SAE Level 4/5 automated driving solution for 2019
August 23, 2016
Mobileye and Delphi Automotive PLC are partnering to develop a complete SAE Level 4/5 automated driving solution. The program will result in an end-to-end production-intent fully automated vehicle solution, with the level of performance and functional safety required for rapid integration into diverse vehicle platforms for a range of customers worldwide.
The partners’ “Central Sensing Localization and Planning” (CSLP) platform will be demonstrated in combined urban and highway driving at the 2017 Consumer Electronics Show in Las Vegas and will be production-ready for 2019.
Solid-state LiDAR company Quanergy raises $90M in Series B; valuation passes $1B
Quanergy Systems, Inc., a leading provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), raised $90 million in Series B funding at a valuation well over $1 billion. Sensata Technologies, Delphi Automotive, Samsung Ventures, Motus Ventures and GP Capital participated in the round. This investment brings the company’s total funds raised to approximately $150 million.
Quanergy intends to use the investment and leverage its intellectual property to work with its partners in ramping up production of its solid-state LiDAR sensors. These sensors use standard semiconductor manufacturing processes and have no moving parts on either a macro or micro scale, offering significantly lower cost, higher reliability, superior performance, increased capability, smaller size and lower weight compared with traditional mechanical sensors (sometimes called hybrid solid-state sensors).
TU Graz team uses monocrystalline Si as Li-ion anode; integrated micro batteries for on-board sensors
August 21, 2016
Electrochemists at TU Graz have used single crystalline acceptor-doped Si—as ubiquitously used in the semiconductor industry—as anode material for rechargeable Li-ion batteries. In an open access paper in the journal Scientific Reports, the team suggests that the use of such patterned monocrystalline Si (m-Si) anodes directly shaped out of the Si wafer is a highly attractive route to realize miniaturized, on-board fully integrated, power supplies for Si-based chips.
The microchip not only houses the electronics, but is at the same time an important part of a mini battery providing electrical energy, e.g. for sending and receiving information.
ABI Research: highly automated driving to spark adoption of centralized advanced driver assistance systems
August 17, 2016
As vehicles become highly independent and begin to drive and react to traffic on their own, autonomous systems will aggregate and process data from a variety of on-board sensors and connected infrastructure. This will force the industry to hit a hard reset on advanced driver assistance systems (ADAS) architectures, currently dominated by distributed processing and smart sensors.
Automotive OEMs will need to adopt new platforms based on powerful, centralized processors and high-speed low latency networking (e.g., Audi zFAS, earlier post). ABI Research forecasts 13 million vehicles with centralized ADAS platforms will ship in 2025.
Ford and Baidu invest $150M in Velodyne LiDAR
August 16, 2016
Velodyne LiDAR, Inc., a global leader in LiDAR (Light Detection and Ranging) technology, announced the completion of a combined $150 million investment from co-investors Ford Motor Company and China’s leading search engine company Baidu, Inc. The investment will allow Velodyne to rapidly expand the design and production of high-performance, cost-effective automotive LiDAR sensors, accelerating mass adoption in autonomous vehicle and ADAS applications and thereby accelerating the critical, transformative benefits they provide.
Over the last decade, Velodyne developed four generations of hybrid solid-state LiDAR systems incorporating the company’s proprietary software and algorithms that interpret rich data gathered from the environment via highly accurate laser-based sensors to create high-resolution 3D digital images used for mapping, localization, object identification and collision avoidance.
ZF and ibeo to develop new 3D LiDAR technology; ZF takes 40% stake
August 02, 2016
ZF has acquired a 40% stake in ibeo Automotive Systems GmbH. The Hamburg-based company, which was founded in 2009, is a developer of LiDAR technology and environmental recognition software with a particular focus on applications for autonomous driving (earlier post). Ibeo’s customers include several major global vehicle manufacturers.
The LiDAR generation being developed by ibeo in cooperation with ZF will reproduce a three-dimensional image of the environment without the rotating mirrors contained in current LiDAR systems. Using solid state technology, LiDAR technology will become more compact and easier to integrate into the vehicle.
Valeo Cruise4U car sets off on 13,000-mile partially automated drive across US
Valeo’s Cruise4U partially automated car, previously demonstrated at CES in Las Vegas in January 2016, set off from San Francisco on a 13,000-mile road trip around the US that is scheduled to conclude back in San Francisco on 15 September.
In partially automated driving mode, the trip will include stops in Los Angeles, Las Vegas, Seattle, Chicago, Detroit, Boston, New York, Miami, San Antonio and San Diego. The vehicle will travel both day and night in real traffic conditions.
Audi AG developing automotive driver health as new business area; leveraging digitalization, connected vehicles
August 01, 2016
Audi AG has become a founding partner in Berlin’s “Flying Health Incubator”, a center supporting startups that develop digital innovations in the healthcare sector. The investment highlights Audi’s interest in developing “automotive health”—enhancing the customer’s health and fitness while driving—as a new business area. With the Audi Fit Driver offering, the brand is already testing innovative services and functionalities in this field.
In the Flying Health Incubator, Audi AG is entering into dialog with decision-makers from the startup scene and from the healthcare industry. Together, the partners will strive to identify trends, technical solutions and business models in the digital health market at an early stage.
Ford, MIT project uses LiDAR, cameras to measure pedestrian traffic & predict demand for new, on-demand electric shuttles
July 27, 2016
Ford Motor Company and MIT are collaborating on a new research project that measures how pedestrians move in urban areas to improve certain public transportation services, such as ride-hailing and point-to-point shuttles services.
The project will introduce a fleet of on-demand electric vehicle shuttles that operate on both city roads and campus walkways on the university’s Cambridge, Massachusetts, campus. The vehicles use LiDAR sensors and cameras to measure pedestrian flow, which ultimately helps predict demand for the shuttles. This, in turn, helps researchers and drivers route shuttles toward areas with the highest demand to better accommodate riders.
NTSB issues preliminary report for investigation into Tesla Autopilot fatal crash
The US National Transportation Safety Board issued its preliminary report for the investigation of the fatal 7 May 2016 highway crash in Florida involving the Tesla Model S and Autopilot. The preliminary report does not contain any analysis of data and does not state probable cause for the crash.
The preliminary report details the collision involving a 53-foot semitrailer in combination with a 2014 Freightliner Cascadia truck tractor and the 2015 Tesla Model S. According to system performance data downloaded from the car, the indicated vehicle speed was 74 mph (119 km/h) just prior to impact; the posted speed limit was 65 mph (105 km/h).
NIRA Dynamics, InfoCar expand availability of Road Surface Information software with OBD plug-in
July 25, 2016
Sweden-based NIRA Dynamics, a software company developing sensor-fusion-based systems for different vehicle applications, is rolling out its Road Surface Information (RSI) software more broadly, in partnership with InfoCar AB.
Road Surface Information (RSI) by NIRA continuously monitors the quality and tire grip level of the road surface—without stereo cameras, adaptive suspension or other expensive sensors. With sensor-fusion-based algorithms, RSI can determine the level of road roughness and road friction.
Oxbotica launches Selenium mobile autonomy software
July 23, 2016
Oxbotica, a spin-out from Oxford University’s Mobile Robotics Group, launched its new Selenium mobile autonomy software solution with a purpose-built concept vehicle named Geni.
Selenium can work in pedestrianized environments as well as roads and motorways, and is not reliant on GPS to operate—i.e., it can easily transition between indoor and outdoor settings, over ground or underground. The system has been developed to be “vehicle agnostic”—it can be applied to cars, self-driving pods (e.g. for campuses and airports), and warehouse truck fleets.
Ford takes stake in Civil Maps; 3D mapping technologies for fully autonomous vehicles; AI and voxel hashing
July 16, 2016
Civil Maps, a start-up developing 3D mapping technology for fully autonomous vehicles, raised a $6.6-million seed funding round, led by Motus Ventures and including investment from Ford Motor Company, Wicklow Capital, StartX Stanford and Yahoo cofounder Jerry Yang’s AME Cloud Ventures.
Civil Maps’ mission is to make it possible for fully autonomous vehicles (SAE Levels 4-5) to drive anywhere smoothly and safely. The company’s focus is on building continental-scale maps for autonomous vehicles and providing precise localization using voxel-hashing algorithms. Although GPS and IMU (inertial measurement unit) technologies can in theory determine both position and orientation of vehicles, the accuracy is limited by atmospheric distortion, the start-up notes.
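Voxel hashing, named above, is a general technique: quantize 3D points into fixed-size cells and key a hash map on the integer cell coordinates, so storage is spent only on occupied space. A minimal sketch follows (our illustration with an assumed 0.5 m voxel size, not Civil Maps’ code):

```python
# Voxel hashing in miniature: map each 3D point to the integer coordinates
# of the voxel containing it, and use that tuple as a hash-map key. Only
# voxels that actually contain data consume memory, which is what makes
# continental-scale maps tractable.

VOXEL = 0.5  # assumed voxel edge length in meters

def voxel_key(x, y, z):
    return (int(x // VOXEL), int(y // VOXEL), int(z // VOXEL))

grid = {}
for pt in [(10.04, 3.12, 0.9), (10.21, 3.33, 0.7), (250.0, 8.8, 1.4)]:
    grid.setdefault(voxel_key(*pt), []).append(pt)

print(len(grid))                    # 2 occupied voxels; empty space is free
print(voxel_key(10.04, 3.12, 0.9))  # (20, 6, 1): a compact lookup key
```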
TTTech using VectorCAST platform for development of Audi zFAS to ISO 26262 ASIL-D compliance; domain controller for piloted driving
July 15, 2016
TTTech Computertechnik AG (TTTech) has selected Vector Software’s VectorCAST software test automation platform for use within TTTech’s development of Audi’s zFAS (zentrales Fahrerassistenzsteuergerät)—the domain controller for Audi piloted driving systems. (Earlier post.) VectorCAST provides TTTech with the tools necessary to ensure ISO 26262 compliance up to ASIL D level on all microcontrollers used in zFAS.
Under the guidance of Audi AG, TTTech developed the zFAS electronic control unit (ECU) that integrates various functionalities of advanced driver assistance systems (ADAS). The ECU uses numerous technology components from TTTech for various automotive assistance functions, such as piloted parking or autonomous driving.
Jaguar Land Rover demonstrates all-terrain self-driving technology; off-road connected convoy
July 12, 2016
Jaguar Land Rover has demonstrated a range of innovative research technologies that would allow a future autonomous car to drive itself over any surface or terrain.
The multi-million pound autonomous all-terrain driving research project aims to make the self-driving car viable in the widest range of real life, on- and off-road driving environments and weather conditions. To enable this level of autonomous all-terrain capability, Jaguar Land Rover’s researchers are developing next-generation sensing technologies that will be the eyes of the future autonomous car.
Neos and Lockheed Martin to develop enhanced next-gen airborne gravity gradiometer to advance ability to find oil, gas & minerals
July 06, 2016
In partnership with Lockheed Martin, Neos Inc. will develop a new generation sensor to be used to find oil, gas and minerals beneath the earth’s surface from the air. The new Full Tensor Gradiometry (FTG) Plus technology has 20 times the sensitivity and 10 times greater bandwidth than current gravity gradiometers, according to Neos.
Gravity gradiometers have been commercially used for more than 20 years and militarily longer than that. The technology is based on the principle that earth’s gravity field varies with location, local topography and sub-surface geologic features. Measuring the gravity variation caused by items beneath the earth’s surface can help identify unique underground and undersea geologic structures. The new airborne FTG Plus sensor is so advanced it could find a 10-meter tall hill buried one kilometer below the earth’s surface.
BMW Group, Intel and Mobileye partner on open platform to bring fully autonomous driving to market by 2021
July 01, 2016
BMW Group, Intel, and Mobileye are collaborating to bring solutions for highly and fully automated driving into series production by 2021. The three are creating a standards-based open platform—from door locks to the datacenter—for the next generation of cars.
The goal of the collaboration is to develop future-proofed solutions that enable drivers not only to take their hands off the steering wheel, but to reach the “eyes off” (Level 3) and ultimately the “mind off” (Level 4) stage, transforming the driver’s in-car time into leisure or work time.
NHTSA begins preliminary evaluation of Tesla Model S Autopilot fatality
National Highway Traffic Safety Administration’s (NHTSA) Office of Defects Investigation (ODI) has begun a preliminary evaluation of a fatal highway crash involving a 2015 Tesla Model S operating with Autopilot activated. ODI is opening the preliminary evaluation (PE16007) to examine the design and performance of any automated driving systems in use at the time of the crash.
In a blog post, Tesla Motors was quick to point out that this is the first known fatality in more than 130 million miles driven with Autopilot activated. Tesla also pointed out that among all vehicles in the US, there is a fatality every 94 million miles; worldwide, there is a fatality approximately every 60 million miles.
Renesas Electronics develops two-port on-chip SRAM for improved video processing for autonomous vehicles
June 16, 2016
Renesas Electronics has developed a new two-port on-chip Static Random Access Memory (SRAM) for use in system-on-chips (SoCs) for in-vehicle infotainment systems. The new on-chip SRAM will be used as video processing buffer memory in high-performance SoCs that will play an important role in making the autonomous-driving vehicles of the future safer and more reliable.
The new SRAM is optimized for parallel processing of video data and will enable advanced video data processing such as obstacle recognition utilizing real-time processing of high-resolution vehicle camera videos and augmented reality (AR) display on the windshield.
Valeo to offer new low-cost solid-state LiDAR; co-developed with LeddarTech; mass production in 2018
June 01, 2016
Tier 1 supplier Valeo is adding a new low-cost solid-state LiDAR, developed together with LeddarTech, a specialist in advanced detection and ranging solutions, to its portfolio for driving and parking assistance. The new sensor will be ready for mass production in 2018.
The solid-state LiDAR will have no mechanical moving parts and will be the least expensive LiDAR sensor on the market, Valeo said. With a proprietary receiver ASIC with 16 discrete detection segments, the sensor will provide best-in-class sensing performance, able to detect pedestrians, bicycles, motorcycles or cars that are only partly in the same lane.
RoMulus project: developing intelligent multi-sensor systems for Industry 4.0
May 29, 2016
The RoMulus (Robust multi-sensor technology for status monitoring in Industry 4.0 applications) research project, sponsored by the German Federal Ministry of Education and Research (BMBF), is focused on simplifying and accelerating the development and use of intelligent multi-sensor systems for Industry 4.0—the digitalization of production processes based on devices autonomously communicating with each other along the value chain. (Earlier post.)
Multi-sensor systems are key components for the success of Industry 4.0 applications. They record, process, and transmit a number of measurement parameters, such as pressure, acceleration, and temperature, all in a highly compact space. Machines are not the only ones to receive such sensors; workpieces are also increasingly being fitted with the intelligent sensor systems so that each product can provide its blueprint and report its manufacturing status. Based on this information, production is largely able to organize and monitor itself.
Audi A7 piloted driving concept “Jack” now driving more naturally
May 13, 2016
Audi’s latest version of its piloted driving research car, the Audi A7 concept “Jack,” has not only learned how to autonomously perform all of its driving maneuvers on the expressway, it has also learned how to show consideration for other road users. Jack exhibits a driving style that is adaptive to the given situation, safe and especially interactive, Audi says.
“Jack”—the internal nickname for the Audi A7 piloted driving concept technology platform—now passes trucks with a slightly wider lateral gap. It also signals upcoming lane changes by activating the turn signal and moving closer to the lane marking first—just as human drivers would do to indicate their intentions.
Ford expands Smart Mobility pilot program to deliver improved access to healthcare in The Gambia; motorcycles with sensors
Pregnant women, children and those with medical conditions in The Gambia—one of Africa’s smallest, poorest countries—may have better access to healthcare through an expansion of a Ford Smart Mobility pilot program. Ford has equipped 50 motorcycles serving Riders for Health with sensor technology so the medical services group can collect a variety of data, including mapping coordinates, to improve the delivery of medical services and supplies—particularly in remote areas of the West African country.
The project uses Ford’s OpenXC (earlier post) sensor kits fitted to the motorcycles to gather information. OpenXC technology records every trip, and is accessed via an application on a mobile phone provided by Ford.
Bosch’s new electronic driver assistance system for trams adds collision warning with automatic braking; derived from automotive
May 03, 2016
Bosch has developed a new electronic driver assistance system for trams that not only warns tram drivers of any impending collision but will engage the brakes independently to stop the tram and avoid an accident if the driver reacts too late or not at all.
Bosch Engineering successfully adapted the company’s large-scale automotive production technology for its new and enhanced collision warning system for city rail transportation. The new collision warning system combines a video sensor, a radar sensor, and a high-performance rail control unit.
SwRI to showcase Ranger precision localization technology for automated driving; non-GPS system with 2cm precision
April 27, 2016
Southwest Research Institute (SwRI) will showcase its award-winning Ranger precision localization solution at the AUVSI XPONENTIAL 2016 conference and trade show in New Orleans 2-5 May.
Ranger is a patented approach to vehicle localization that enables precise navigation for automated vehicles using commercially available hardware in combination with SwRI algorithms. The latest Ranger kit can be used for automated driving, valet parking in garages and structures, freight distribution, and docking of buses and large trucks.
Velodyne LiDAR introduces 32-channel ULTRA Puck VLP-32A high definition real-time 3D LiDAR
April 13, 2016
Velodyne LiDAR introduced the ULTRA Puck VLP-32A, combining best-in-class 32-channel performance with a small form factor and high reliability, at the 2016 SAE World Congress in Detroit. The ULTRA Puck VLP-32A is the company’s most advanced LiDAR sensor to date, delivering high performance at a cost-effective price point of around $500 at automotive-scale production.
The ULTRA Puck doubles the range and resolution (via the number of laser channels) of its predecessor, to 200 meters and 32 channels, providing enhanced resolution to identify objects easily. The 32 channels are deployed over a vertical field of view of 28° and are configured in a unique pattern that concentrates resolution near the horizon, making the sensor even more useful for automotive applications. By contrast, the earlier unit used equidistant, 2˚ spacing of the channels.
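The effect of non-equidistant spacing is easy to see numerically: biasing the 32 elevation angles toward 0° yields fine steps at the horizon and coarse steps at the edges of the field of view. A sketch using an assumed power-law layout (our illustration of the general idea; Velodyne has not published this formula):

```python
import math

# Spread n beams over a fov-degree vertical span, compressed toward 0 deg
# elevation by an odd power law; k=1 reproduces old equidistant spacing.

def channel_elevations(n=32, fov=28.0, k=3.0):
    half = fov / 2.0
    out = []
    for i in range(n):
        t = 2 * i / (n - 1) - 1          # -1 .. +1 across the channels
        out.append(half * math.copysign(abs(t) ** k, t))
    return out

elevs = channel_elevations()
gaps = [b - a for a, b in zip(elevs, elevs[1:])]
print(round(gaps[0], 2), round(gaps[15], 4))
# ~2.54 deg between edge channels vs ~0.0009 deg at the horizon:
# resolution is spent where vehicles and pedestrians actually appear
```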
Ford tests Fusion Hybrid autonomous research vehicles driving in complete darkness
April 11, 2016
As part of its LiDAR sensor development, Ford has tested Fusion Hybrid autonomous research vehicles in complete darkness without headlights on desert roads, demonstrating the capability to perform beyond the limits of human drivers.
Driving in pitch black at Ford Arizona Proving Ground marks the next step in the company’s efforts to deliver fully autonomous vehicles. The development shows that even without cameras, which rely on light, Ford’s LiDAR (units from Velodyne), working with the car’s virtual driver software, is robust enough to steer the car flawlessly around winding roads. While it is ideal to have all three modes of sensors—radar, cameras and LiDAR—the latter can function independently on roads without stoplights.
DENSO invests in semiconductor laser technology startup TriLumina; speeding up LiDAR adoption for ADAS, autonomous driving
April 08, 2016
DENSO International America, Inc. has invested in TriLumina Corp., a semiconductor laser technology company that focuses on providing light sources for LiDAR and interior illumination products. DENSO is looking to speed up the adoption of LiDAR and driver monitoring technologies in advanced driver assistance systems (ADAS) and in autonomous vehicles. This strategic investment will enable TriLumina to gain broader access to the automotive market. The laser technology company also received an investment last year from Caterpillar Ventures.
TriLumina has developed eye-safe semiconductor lasers that are among the most versatile laser illuminator solutions available in the market. TriLumina is hoping to accelerate the automotive industry’s adoption of semi-autonomous and autonomous vehicles by providing lasers for 100% solid-state LiDAR products and advanced driver monitoring systems (DMS).
2017 Ford Fusion offers adaptive cruise control with automatic stop-and-go technology
April 05, 2016
The 2017 Ford Fusion offers a new stop-and-go technology—piggybacking on the existing adaptive cruise control feature—which automatically accelerates and brakes for the driver while maintaining a safe distance from the vehicle ahead.
Using dedicated steering wheel buttons, adaptive cruise control with stop-and-go allows drivers to set cruise control speed and following distance from the vehicle ahead. The semi-autonomous technology can automatically adjust the set speed for comfortable travel—much like a human driver would—bringing the car to a full stop when traffic halts.
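A minimal sketch of the control logic behind such a system appears below. The constant time-gap policy is a common textbook approach, and all gains and limits are illustrative assumptions rather than Ford’s implementation.

```python
# Minimal sketch of stop-and-go adaptive cruise control logic.
# Gains, limits, and the time-gap policy are illustrative assumptions,
# not Ford's actual implementation.

def acc_command(ego_speed, lead_speed, gap, set_speed,
                time_gap=1.8, min_gap=3.0, k_gap=0.4, k_speed=0.8):
    """Return a longitudinal acceleration command in m/s^2."""
    desired_gap = min_gap + time_gap * ego_speed   # constant time-gap policy
    gap_error = gap - desired_gap                  # >0 means too far back
    speed_error = lead_speed - ego_speed           # >0 means lead pulling away
    accel = k_gap * gap_error + k_speed * speed_error

    # Never exceed the driver-set cruise speed.
    if ego_speed >= set_speed and accel > 0:
        accel = 0.0
    # Clamp to comfortable acceleration / braking limits.
    return max(-3.5, min(1.5, accel))

# Example: traffic ahead has stopped 12 m away while we roll at 5 m/s.
print(acc_command(ego_speed=5.0, lead_speed=0.0, gap=12.0, set_speed=27.0))
```

When the lead vehicle stops, the speed-error term drives the command negative until the car holds at the minimum standstill gap, which is the stop-and-go behavior described above.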
Saarbrücken engineers developing networked self-analyzing electric motors
March 23, 2016
Engineers from Saarland University are developing intelligent motor systems that function without the need for additional sensors. By essentially transforming the motor itself into a sensor, the team led by Professor Matthias Nienhaus is creating smart motors that can tell whether they are still running smoothly, can communicate and interact with other motors and can be efficiently controlled.
By using data collected from the motor while it is operating, the researchers are able to calculate quantities that in other systems would need to be measured by additional sensors. Further, they are teaching the drive how to make use of this knowledge.
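As a rough illustration of this “motor as sensor” principle, the sketch below estimates the rotor speed of a simple DC motor from its terminal voltage and current via back-EMF. The Saarland team’s actual algorithms are not described in this article, and the parameters are assumed values.

```python
# Sketch of the "motor as sensor" idea: estimating rotor speed from
# terminal voltage and current via the back-EMF of a DC motor.
# Parameters are illustrative assumptions.

R = 0.5      # winding resistance, ohms (assumed)
K_E = 0.05   # back-EMF constant, V per rad/s (assumed)

def estimated_speed(voltage, current):
    """Estimate rotor speed (rad/s) without a dedicated speed sensor."""
    back_emf = voltage - current * R  # voltage not dropped across the winding
    return back_emf / K_E

# A drop in estimated speed at constant voltage and rising current can
# indicate increased mechanical load or a developing fault.
print(estimated_speed(voltage=12.0, current=4.0))  # -> 200.0 rad/s
```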
Daimler demonstrates autonomous truck platooning; Highway Pilot Connect delivers ~7% lower fuel consumption
March 22, 2016
Daimler Trucks demonstrated the new Highway Pilot Connect system for autonomous truck platooning on the A52 autobahn near Düsseldorf. Three WiFi-connected, autonomously driving trucks operated on the autobahn with authorization for public traffic in a platoon formation.
Such a combination can reduce fuel consumption by up to 7% and the road space requirement on motorways by almost half, while improving traffic safety at the same time, Daimler said. Based on the Daimler Trucks Highway Pilot system for autonomously driving heavy trucks (earlier post), the three trucks link up to form an aerodynamically optimized, fully automated platoon.
Munich Re America launches transit bus collision avoidance pilot in Washington with Mobileye Shield+ system
March 17, 2016
Munich Reinsurance America, one of the largest reinsurers in the US, in collaboration with the Washington State Transit Insurance Pool (WSTIP), has launched a pilot program equipping transit buses with the award-winning collision avoidance system Mobileye Shield+. Rosco Vision Systems is the official North American provider and driver-interface manufacturer of this system.
Mobileye is a technology leader in software algorithms, system-on-chips and customer applications based on processing visual information for the driver assistance systems (DAS) market. Shield+, designed for large vehicles operating in urban environments, enables early detection of cyclists and pedestrians using an array of strategically placed artificial-vision smart cameras.
20 automakers commit to make automatic emergency braking standard on new vehicles no later than 2022; faster than regulatory process
The US Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) and the Insurance Institute for Highway Safety (IIHS) announced the commitment by 20 automakers representing more than 99% of the US auto market to make automatic emergency braking (AEB) a standard feature on virtually all new cars in the US no later than NHTSA’s 2022 reporting year, which begins 1 September 2022.
Automakers making the commitment are Audi; BMW; FCA US LLC; Ford; General Motors; Honda; Hyundai; Jaguar Land Rover; Kia; Maserati; Mazda; Mercedes-Benz; Mitsubishi Motors; Nissan; Porsche; Subaru; Tesla Motors; Toyota; Volkswagen; and Volvo Car USA. The unprecedented commitment means that this important safety technology will be available to more consumers more quickly than would be possible through the regulatory process.
Honda R&D using IBM Watson IoT technology for real-time monitoring and data analysis in F1 racers
Honda R&D is monitoring and analyzing data from more than 160 sensors in Formula One (F1) cars using IBM Watson Internet of Things (IoT) technology. Drivers and crews can apply data and analytics in real time to streamline performance and improve fuel efficiency, and to make racing decisions such as speed adjustments and pit stops based on this data.
To help mark its return to Formula One racing and reach new milestones in efficiency for both race cars and future consumer models, Honda R&D developed a new system to analyze data from the hybrid power units quickly and efficiently to check residual fuel levels and estimate the possibility of mechanical problems. Honda is using the IBM IoT for Automotive solution, based on IBM Watson IoT technology, to deliver data generated from cars, including temperature, pressure and power levels, directly to the cloud for real-time analysis.
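The sketch below shows, in outline, how one telemetry snapshot might be packaged for cloud analysis. The field names and car identifier are hypothetical, and the IBM IoT for Automotive interfaces themselves are not detailed in this article.

```python
# Sketch of packaging one F1 telemetry snapshot for cloud analysis.
# Field names and the car identifier are hypothetical; the IBM IoT
# for Automotive interfaces are not described in the article.
import json
import time

def telemetry_message(car_id, sensors):
    """Bundle one snapshot of sensor readings into a JSON message."""
    return json.dumps({
        "carId": car_id,
        "timestamp": time.time(),
        "readings": sensors,  # e.g. temperatures, pressures, power levels
    })

sample = telemetry_message("car-33", {
    "mguK_temp_C": 94.2,
    "fuel_remaining_kg": 41.7,
    "deploy_power_kW": 118.0,
})
print(sample)  # in practice this would be published to the IoT platform
```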
New Buick LaCrosse upgrades computing power from 17 to 31 ECUs; new electronic control system
March 16, 2016
The all-new Buick LaCrosse, which launches this week in China, features significant upgrades in computing power and networking to advance connectivity and safety features.
There are 31 ECUs distributed throughout the all-new Buick LaCrosse—its predecessor used only 17. This 82% increase in the number of ECUs helps to optimize computing efficiency. To facilitate the handling of large quantities of data, a dedicated data bus connects the ECUs, each of which can process data independently.
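As an illustration of how independent ECUs share a common data bus, the sketch below exchanges a frame between two nodes on a Linux virtual CAN interface using the python-can library. The message ID and payload are hypothetical, and the LaCrosse’s actual bus architecture is not public here.

```python
# Illustrative sketch of two ECUs exchanging a frame over a shared data
# bus, using python-can on a Linux virtual CAN interface (create it first:
#   ip link add dev vcan0 type vcan && ip link set up vcan0).
# The message ID and payload are hypothetical, not GM's actual signals.
import can

sender = can.interface.Bus(channel="vcan0", bustype="socketcan")
receiver = can.interface.Bus(channel="vcan0", bustype="socketcan")

# One ECU publishes a (hypothetical) wheel-speed frame...
frame = can.Message(arbitration_id=0x1A0,
                    data=[0x03, 0xE8, 0x03, 0xE8],
                    is_extended_id=False)
sender.send(frame)

# ...and any other ECU on the same bus can process it independently.
received = receiver.recv(timeout=1.0)
if received is not None:
    print(f"id=0x{received.arbitration_id:X} data={bytes(received.data).hex()}")

sender.shutdown()
receiver.shutdown()
```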
Continental acquires Hi-Res 3D Flash LIDAR business from ASC; highly or fully automated driving
March 03, 2016
International automotive supplier Continental has acquired the Hi-Res 3D Flash LIDAR business from Advanced Scientific Concepts, Inc. (ASC) based in Santa Barbara, California. The technology will further enhance the company’s Advanced Driver Assistance Systems (ADAS) product portfolio with a future-oriented solution to add to the group of surrounding sensors needed to achieve highly and fully automated driving.
The Hi-Res 3D Flash LIDAR sensor technology provides both real-time machine vision as well as environmental mapping functions. This technology will help to enable a significantly more detailed and accurate field of vision around the entire vehicle, independent of day or night time and robust in adverse weather conditions.
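Flash LiDAR ranges by time of flight: a single laser pulse illuminates the whole scene, and every pixel of the detector array times the return, yielding a full depth image per pulse. The sketch below applies the range formula d = c·t/2 to a made-up array of return times; the timings are illustrative, not ASC data.

```python
# Sketch of the flash LiDAR ranging principle: one pulse, one depth image.
# Round-trip times below are made up for illustration.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

# Hypothetical 4x4 detector array of round-trip times in nanoseconds.
round_trip_ns = np.array([
    [200.1, 200.3, 333.6, 333.9],
    [200.2, 200.4, 333.7, 334.0],
    [667.1, 667.3, 333.8, 334.1],
    [667.2, 667.4, 133.4, 133.5],
])

# Range = c * t / 2 (the pulse travels out and back).
range_m = C * (round_trip_ns * 1e-9) / 2.0
print(np.round(range_m, 1))  # regions at roughly 30 m, 50 m, 100 m, 20 m
```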
New algorithm improves speed and accuracy of pedestrian detection; cascade detection + deep learning
February 08, 2016
Researchers at the University of California, San Diego have developed a pedestrian detection system that performs in near real-time (2-4 frames per second) and with higher accuracy (close to half the error) compared to existing systems. The technology, which incorporates deep learning models, could be used in “smart” vehicles, robotics and image and video search systems.
The new pedestrian detection algorithm developed by Nuno Vasconcelos, electrical engineering professor at the UC San Diego Jacobs School of Engineering, and his team combines a traditional computer vision classification architecture—cascade detection—with deep learning models.
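The sketch below illustrates the cascade structure described above: cheap stages discard most candidate windows, so the expensive deep model only evaluates the survivors. The stage functions are simple stand-ins, not the UC San Diego detector.

```python
# Sketch of cascade detection: cheap classifiers reject most candidate
# windows early; the expensive deep model runs only on the survivors.
# The stage models here are stand-ins, not the UC San Diego detector.

def cheap_stage(window):
    # e.g. simple edge/gradient features; fast but imprecise
    return window["edge_score"] > 0.3

def deep_stage(window):
    # e.g. a CNN classifier; accurate but slow
    return window["cnn_score"] > 0.9

def detect_pedestrians(windows):
    survivors = [w for w in windows if cheap_stage(w)]  # most windows die here
    return [w for w in survivors if deep_stage(w)]      # few reach the CNN

windows = [
    {"id": 1, "edge_score": 0.1, "cnn_score": 0.2},   # rejected early
    {"id": 2, "edge_score": 0.6, "cnn_score": 0.95},  # pedestrian
    {"id": 3, "edge_score": 0.5, "cnn_score": 0.4},   # rejected by CNN
]
print([w["id"] for w in detect_pedestrians(windows)])  # -> [2]
```

Because the deep models run on only a small fraction of the image, the combined system can approach real-time rates while keeping the accuracy of the learned stages.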
Renesas Electronics and TTTech collaborate on new ADAS-ECU development platform with high computing performance and advanced functional safety
Renesas Electronics Corporation and TTTech Computertechnik AG, a global leader in robust networking and safety controls, have agreed to collaborate on the development of a new automotive platform solution aimed at providing a future-proof, high-performance advanced electronic control unit (ECU) development platform for advanced driver assistance systems (ADAS) and automated driving functionality.
The automotive platform solution will integrate Renesas’ automotive control microcontroller (MCU), the RH850/P1x, and high-performance R-Car system-on-chips (SoCs) with TTTech’s TTIntegration software platform to enable highly complex automotive solutions, including highly automated driving. In addition to the physical integration, the development platform enables parallel, multi-vendor development and integration of individual software components.
Renesas camera video processing circuit block with low latency, high performance, and low power consumption for SOC for autonomous driving
At the International Solid-State Circuits Conference (ISSCC) held in San Francisco earlier this month, Renesas Electronics announced the development of a new video processing circuit block for use in automotive computing system-on-chips (SoCs) that will support autonomous vehicles.
The newly developed video processing circuit block enables automotive computing systems that integrate vehicle information systems and driving safety support systems, handling massive video workloads in real time with low power consumption and low latency, without imposing any additional load on the CPU or GPU. Renesas intends to incorporate the new video processing circuit block into its future automotive computing SoCs to contribute to a safer and more convenient driving experience.
Continental urea sensors for efficient SCR NOx aftertreatment in diesels; measuring level, quality and temperature
February 05, 2016
Continental has begun production of urea sensors for the first time to support more efficient exhaust-gas aftertreatment in diesel engines. The sensor measures the level, quality, and temperature of the aqueous urea solution in the “AdBlue” tank used in conjunction with selective catalytic reduction (SCR) for NOx reduction.
Sensor-aided denitrification supports compliance with legal requirements and reinforces drivers’ confidence that their cars emit no more than the maximum permissible level of nitrogen oxides.
New QNX software platform enables ADAS and automated driving
January 23, 2016
QNX Software Systems Limited, a subsidiary of BlackBerry Limited, earlier this month introduced the QNX Platform for ADAS (advanced driver assistance systems), expanding its portfolio of automotive software products. The QNX Platform for ADAS is scheduled for general release in Q2 2016.
Designed for scalability, the platform will enable automotive companies to build a full range of automated driving systems, from informational ADAS modules that provide a 360° surround view of the vehicle, to sensor fusion systems that combine data from multiple sources such as cameras and radar, to high-performance processors that make control decisions in fully autonomous vehicles.
Ricardo white paper outlines needed developments to realize autonomous driving
January 14, 2016
Engineering firm Ricardo has published a white paper—Key Enablers for the Fully Autonomous Vehicle—highlighting the technologies and development processes that are needed to develop commercially feasible self-driving cars that meet consumer expectations while also achieving compliance with likely future transport regulations.
According to the Boston Consulting Group, the projected size of the global autonomous vehicle market in 2025 will be $36 billion for partially autonomous vehicles (levels 1–3) and $6 billion for fully autonomous vehicles (level 4). This includes both passenger and commercial vehicle uses. The realization of fully autonomous vehicles will require further evolution in software, sensors, integration and efficient system testing beyond what is in place for current advanced driver assistance systems.
Ford, U Michigan collaborating on enablers for autonomous driving in the snow; high-resolution 3D mapping
January 12, 2016
Typical autonomous vehicle sensors are useless on snow-covered roads, but researchers at the University of Michigan and Ford are collaborating on a solution. In Michigan, including at Mcity, U-M’s 32-acre simulated urban environment, they have conducted what they believe are the industry’s first tests of autonomous vehicles in wintry conditions.
Fully autonomous driving cannot rely on GPS, which is accurate only to several yards—not precise enough to localize the vehicle. It is essential for an autonomous vehicle to know its exact location, not just within a city or on a road, but within its actual driving lane—a variation of a few inches makes a big difference.
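The sketch below illustrates the underlying idea of map-based localization: a coarse GPS fix is refined by sliding the vehicle’s live scan over a prior high-resolution map and keeping the best-matching offset. The Ford/U-M pipeline is far more sophisticated; this grid-search correlation, with all values invented, only demonstrates the principle.

```python
# Sketch of map-based localization: refine a coarse GPS fix by matching
# the live scan against a prior high-resolution map. All values are
# synthetic; the Ford/U-M pipeline is far more sophisticated.
import numpy as np

rng = np.random.default_rng(0)
prior_map = rng.random((200, 200))            # prior reflectivity/occupancy map

true_pos = (83, 117)                          # ground truth (unknown to the car)
scan = prior_map[true_pos[0]:true_pos[0]+20,
                 true_pos[1]:true_pos[1]+20]  # live scan = patch of the world

gps_guess = (80, 120)                         # GPS: off by a few cells

best, best_score = gps_guess, -np.inf
for dr in range(-5, 6):                       # search a small window around GPS
    for dc in range(-5, 6):
        r, c = gps_guess[0] + dr, gps_guess[1] + dc
        patch = prior_map[r:r+20, c:c+20]
        score = np.sum(patch * scan)          # correlation as match quality
        if score > best_score:
            best, best_score = (r, c), score

print(best)  # -> (83, 117): the map match recovers the true position
```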