Green Car Congress

Sensors

[Due to the increasing size of the archives, each topic page now contains only the prior 365 days of content. Access to older stories is now solely through the Monthly Archive pages or the site search function.]

Renesas Electronics, Dibotics deliver real-time, power-efficient LiDAR processing based on R-Car SoC for autonomous driving; SLAM on a chip

December 15, 2017

Renesas Electronics Corporation and Dibotics, a leader in real-time 3D LiDAR processing, have collaborated to develop an automotive-grade embedded solution for LiDAR processing used in advanced driver assistance systems (ADAS) and automated driving applications. The jointly developed solution will enable system manufacturers to build real-time 3D mapping systems with a high level of functional safety (FuSa) and low power consumption.

LiDAR processing today requires an efficient processing platform and advanced embedded software. By combining Renesas’ high-performance, low-power automotive R-Car image-processing system-on-chip (SoC) with Dibotics’ 3D simultaneous localization and mapping (SLAM) technology, the companies deliver a SLAM on Chip. (SLAM is a class of computational algorithms that generates and updates a map of an unknown environment while simultaneously keeping track of the vehicle’s location within it.)
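The parenthetical above describes the core loop of any SLAM system: predict the vehicle's pose from motion data, then correct both the pose and the map against new sensor readings. A deliberately tiny 1-D sketch of that loop (illustrative only, not Dibotics' algorithm, which matches full 3D point clouds):

```python
# Toy 1-D SLAM sketch: the vehicle dead-reckons its pose from odometry,
# then refines both the pose and the landmark map from range readings.

def slam_step(pose, landmarks, odom, observations, gain=0.5):
    """One predict/update cycle.

    pose:         current estimate of vehicle position (float)
    landmarks:    dict id -> estimated landmark position
    odom:         odometry-reported displacement since last step
    observations: dict id -> measured range to that landmark
    """
    pose = pose + odom                      # predict from odometry
    for lm_id, rng in observations.items():
        if lm_id not in landmarks:          # first sighting: add to map
            landmarks[lm_id] = pose + rng
        else:                               # re-sighting: the innovation
            innovation = (landmarks[lm_id] - pose) - rng
            pose += gain * innovation       # ...corrects the pose and
            landmarks[lm_id] -= (1 - gain) * innovation  # ...the map
    return pose, landmarks

pose, lm = 0.0, {}
pose, lm = slam_step(pose, lm, odom=1.0, observations={"A": 9.0})
pose, lm = slam_step(pose, lm, odom=1.0, observations={"A": 8.2})
```

After the second step the pose and the landmark estimate have each absorbed half of the measurement disagreement, so the map and the trajectory stay mutually consistent.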

More... | Comments (0)

Continental will introduce fifth-generation automotive radar in 2019

December 06, 2017

Continental is now developing its fifth generation of automotive short- and long-range radar sensors, which will enter series production in 2019. Not only does the new generation have greater capacity, it is also based on a scalable modular principle, which, with its graduated function scopes, flexibly supports vehicle manufacturers’ different requirements and electrical-electronic (E/E) architectures.

With the worldwide shift to 77 GHz technology, sensor resolution is increasing, facilitating more accurate detection of smaller objects such as a lost spare wheel or an exhaust that has fallen off. With long-range radar, a range of up to 300 m and an opening angle of ±60° are possible in the highest expansion stage, depending on the required performance.
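Using the figures quoted above (300 m maximum range, ±60° opening angle), a simple envelope check shows how such specifications translate into detection coverage. The flat, pie-slice envelope model here is an illustration, not Continental's detection logic:

```python
import math

# Whether a target falls inside a radar's detection envelope, using a
# 300 m maximum range and a +/-60 degree opening angle.

def in_envelope(x, y, max_range=300.0, half_angle_deg=60.0):
    """x: forward distance (m), y: lateral offset (m), sensor at origin."""
    rng = math.hypot(x, y)
    bearing = math.degrees(math.atan2(y, x))   # 0 deg = straight ahead
    return rng <= max_range and abs(bearing) <= half_angle_deg

print(in_envelope(250.0, 40.0))   # distant target near boresight -> True
print(in_envelope(10.0, 30.0))    # close target far off to the side -> False
```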

More... | Comments (0)

LeddarTech to showcase first 3D solid-state LiDAR IC for autonomous driving at CES 2018

LeddarTech will present the LeddarCore LCA2, the industry’s first 3D solid-state LiDAR (SSL) integrated circuit (IC) enabling mass production of automotive LiDARs, in the Leddar Ecosystem pavilion at CES 2018 in Las Vegas next month.

LCA2 IC engineering samples and LCA2-based 3D Flash LiDAR modules (A-samples) will be on display in the pavilion. Also on display will be discrete implementations of LiDAR systems showcasing the LeddarCore LCA3, which is currently under development with the first samples to be made available in 2018.

More... | Comments (0)

Audi displays new AI project at NIPS: mono camera with semantic segmenting and depth estimates creates precise 3D model of environment

December 04, 2017

Audi is exhibiting an innovative pre-development project to support autonomous driving at the NIPS 2017 conference in Long Beach, California, this week. A project team from the Audi subsidiary Audi Electronics Venture (AEV) developed a mono camera that uses artificial intelligence to generate an extremely precise 3D model of the environment. This technology makes it possible to capture the exact surroundings of the car.

A conventional front camera acts as the sensor. It captures the area in front of the car within an angle of about 120 degrees and delivers 15 images per second at a resolution of 1.3 megapixels. These images are then processed in a neural network. This is where semantic segmenting occurs, in which each pixel is classified into one of 13 object classes. This enables the system to identify and differentiate other cars, trucks, houses, road markings, people and traffic signs.
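The per-pixel classification step described above can be made concrete with a few lines of NumPy: the network produces a score for each of the 13 classes at every pixel, and each pixel takes the highest-scoring class. The class names below are hypothetical examples, not Audi's actual label set:

```python
import numpy as np

# Hypothetical 13-class label set for illustration only.
CLASSES = ["road", "lane marking", "car", "truck", "bus", "person",
           "cyclist", "building", "vegetation", "sky", "traffic sign",
           "traffic light", "other"]

rng = np.random.default_rng(0)
# Stand-in for the network's output: a score per class at each pixel
# of a toy 4x6 image (a real frame would be ~1280x960).
scores = rng.random((4, 6, len(CLASSES)))

# Semantic segmentation's final step: per-pixel argmax over the classes.
label_map = scores.argmax(axis=-1)   # (4, 6) array of class indices
print(label_map.shape)
```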

More... | Comments (1)

Newsight Imaging and LeiShen Intelligent partner on new solid-state V-LiDAR for automotive

November 15, 2017

Israel-based Newsight Imaging, a developer of advanced CMOS image sensors for laser LiDAR and spectral analysis, is partnering with China-based LeiShen Intelligent, a global provider of high-performance laser LiDAR (Light Detection and Ranging) systems, to deliver V-LiDAR: a solid-state, 3D pulse-based LiDAR for automotive applications in ADAS systems and autonomous vehicles.

The V-LiDAR (Vehicle-LiDAR)—to be based on LeiShen’s advanced 3D LiDAR and on Newsight’s NSI5000 CMOS image sensor using Newsight’s eTOF (Enhanced Time-of-Flight) technology—will become available in the first half of 2018.
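Pulse-based LiDAR such as the V-LiDAR works on the time-of-flight principle: the sensor times a laser pulse's round trip and halves the light-travel distance. This is a sketch of the principle only; Newsight's eTOF adds enhancements beyond this basic calculation:

```python
# Pulsed time-of-flight ranging: distance = c * round_trip_time / 2.

C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(dt_seconds):
    """Target distance in meters from the pulse's round-trip time."""
    return C * dt_seconds / 2.0

# A pulse returning after ~667 nanoseconds corresponds to ~100 m.
print(round(range_from_round_trip(667e-9), 1))
```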

More... | Comments (0)

Velodyne launches VLS-128, the world’s highest resolution LiDAR for autonomous vehicles; successor to the HDL-64

November 08, 2017

Velodyne LiDAR announced that it has produced a LiDAR sensor with the world’s highest resolution, longest range, and widest field of view, an advance the company expects to boost the autonomous vehicle and advanced safety markets.

The Velodyne VLS-128, with 128 laser beams, produces the highest resolution available and will replace the HDL-64 LiDAR as the industry standard for high performance. High-resolution LiDAR is critical for navigating autonomous cars and providing vehicle safety. Velodyne expects the VLS-128 to become the new standard for fully autonomous cars and cars equipped with advanced safety features because of the unmatched quantity of data it produces in real time, even at highway speeds.

More... | Comments (4)

ORNL, City of Oak Ridge partner on sensor project to capture trends in cities

October 30, 2017

Researchers at the Department of Energy’s Oak Ridge National Laboratory (ORNL) are partnering with the city of Oak Ridge, Tennessee, to develop UrbanSense, a comprehensive sensor network and real-time visualization platform that helps cities evaluate trends in urban activity.

The project, initiated by ORNL’s Urban Dynamics Institute (UDI), centers on addressing cities’ real-world challenges through applied urban science. The prototype designed for Oak Ridge monitors population density, traffic flow and environmental data including air and water quality, with a total of seven sensors to be installed in the city.

More... | Comments (0)

Magna using torsional welding process for thermoplastics in autos; sensors to thin-wall bumpers

October 27, 2017

Magna is using a torsional welding process for joining thermoplastic materials in order to help automakers cut weight and costs. The torsional welding process, developed by Magna for automotive applications at its exteriors plant in Liberec, Czech Republic, in collaboration with Telsonic Ultrasonics, features a high-speed twisting motion that creates enough friction-based heat to join a plastic bracket to a thermoplastic fascia.

The innovative technology achieves an approximate 1% weight reduction because it allows thinner materials to be joined, which in turn reduces material costs. Torsional welding is currently used to make the front fascia of the 2017 Skoda Octavia, and it has potential for other applications where materials of similar composition need to be joined.

More... | Comments (0)

ON Semiconductor introduces first highly scalable family of next-generation automotive image sensors

October 26, 2017

ON Semiconductor is introducing a CMOS image sensor platform that brings new levels of performance and image quality to automotive applications such as ADAS, mirror replacement, rear and surround view systems, and autonomous driving.

The Hayabusa platform features a ground-breaking 3.0-micron backside illuminated pixel design that delivers a charge capacity of 100,000 electrons—the highest in the industry—with other key automotive features such as simultaneous on-chip high dynamic range (HDR) with LED flicker mitigation (LFM), plus real-time functional safety and automotive grade qualification.
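On-chip HDR of the kind mentioned above is typically built from multiple exposures of different lengths. The merge below is a generic sketch of that idea, not ON Semiconductor's actual pipeline: keep the long exposure where it is unsaturated, and substitute the short exposure scaled by the exposure ratio where it is not:

```python
import numpy as np

# Two-exposure HDR merge (generic illustration): the long exposure
# resolves shadows; where it saturates, the short exposure, rescaled
# by the exposure ratio, recovers the highlights.

def merge_hdr(long_exp, short_exp, ratio=16.0, sat=0.95):
    """long_exp/short_exp: arrays of normalized pixel values in [0, 1];
    ratio: long/short exposure-time ratio; sat: saturation threshold."""
    long_exp = np.asarray(long_exp, dtype=float)
    short_exp = np.asarray(short_exp, dtype=float)
    return np.where(long_exp < sat, long_exp, short_exp * ratio)

# Pixel 0 is dim (kept from the long exposure); pixel 1 saturates the
# long exposure, so its short-exposure reading is rescaled.
merged = merge_hdr([0.2, 0.99], [0.0125, 0.5])
print(merged)
```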

More... | Comments (1)

Nissan ProPILOT Assist technology makes US debut on 2018 Rogue

October 20, 2017

Nissan’s 2018 Rogue crossover, which goes on sale 24 October at Nissan dealers in the US, will be the first model in the US to offer Nissan’s ProPILOT Assist technology. ProPILOT Assist provides assisted steering, braking and accelerating during single-lane highway driving. (Earlier post.)

ProPILOT Assist is the foundation for the fully autonomous vehicles of the future, helping drivers stay centered in the lane, navigate stop-and-go traffic, maintain a set vehicle speed and maintain a set distance to the vehicle ahead—all done with a simple two-button operation. ProPILOT Assist is available on the 2018 Rogue SL grade as part of the Platinum Package. Pricing starts at $24,680 for the Rogue S front-wheel-drive model; the SL trim starts at $31,060.

More... | Comments (0)

New MIPI Alliance group collaborating with automotive industry on interface specifications

October 13, 2017

The MIPI Alliance, an international organization that develops interface specifications for mobile and mobile-influenced industries, has formed an Automotive Birds of a Feather (BoF) Group to solicit industry input from original equipment manufacturers (OEMs) and their suppliers to enhance existing or develop new interface specifications for automotive applications. The group is open to both MIPI Alliance member and non-member companies to represent the broader automotive ecosystem.

Automobiles have become a new platform for innovation, and manufacturers are already using MIPI Alliance specifications as they develop and implement applications for passive and active safety, infotainment and advanced driver assistance systems (ADAS).

More... | Comments (0)

Fisker and solid-state LiDAR partner Quanergy to showcase Fisker EMotion at CES in January

October 11, 2017

Fisker Inc. will showcase the all-new EMotion luxury electric vehicle with Quanergy Systems at the upcoming 2018 Consumer Electronics Show (CES) in Las Vegas. Quanergy will integrate five of its S3 solid-state LiDAR sensors (earlier post) into the Fisker EMotion.

Fisker and Quanergy Systems are working closely in partnership to integrate autonomous hardware—specifically LiDAR—into the Fisker EMotion body design as a seamless part of the vehicle. According to Quanergy, its LiDAR sensors lead in the six key commercialization areas of price, performance, reliability, size, weight, and power efficiency, while meeting the mass-deployment requirements of durability and dependability.

More... | Comments (0)

KAIST team develops ultra-fast and ultra-sensitive hydrogen sensor based on Pd nanowire array coated with MOF

October 02, 2017

A research group under Professor Il-Doo Kim in the Department of Materials Science and Engineering at KAIST, in collaboration with Professor Reginald M. Penner of the University of California-Irvine, has developed an ultra-fast hydrogen sensor based on a palladium (Pd) nanowire array coated with a metal-organic framework (MOF) that can detect hydrogen gas levels under 1% in less than seven seconds.

The sensor also can detect hundreds of parts per million levels of hydrogen gas within 60 seconds at room temperature.

More... | Comments (0)

Toyota Research Institute outlines progress in automated driving; Platform 2.1; new Luminar LiDAR

September 27, 2017

Toyota Research Institute (TRI) outlined some of its progress in the development of automated driving technology and other project work, including the development of robots for in-home support of people.

Since unveiling its Platform 2.0 research vehicle in March 2017 (earlier post), TRI has quickly updated its automated driving technology to Platform 2.1. In parallel with the creation of this test platform, TRI said it has made strong advances in deep-learning computer perception models that allow the automated vehicle system to understand the vehicle’s surroundings more accurately, detect objects and roadways, and better predict a safe driving route.

More... | Comments (1)

Solid-state LiDAR company LeddarTech raises $101M in Series C

September 07, 2017

Solid-state LiDAR company LeddarTech (earlier post) completed a combined investment of US$101 million led by Osram and including Delphi, Magneti Marelli and Integrated Device Technology, Inc. as strategic investors, as well as Fonds de solidarité FTQ.

Representing the Company’s largest capital raise to date, this round of funding will allow LeddarTech to enhance its ASIC development efforts, expand its R&D team, and accelerate ongoing LiDAR development programs with select Tier-1 automotive customers for rapid market deployment.

More... | Comments (1)

Solid-state LiDAR company Innoviz raises $65M; automotive-grade LiDAR for Level 3-5 autonomy available in 2019

Innoviz Technologies, a leading provider of LiDAR sensing solutions designed for the mass commercialization of autonomous vehicles, raised $65 million in Series B funding as its solid-state LiDAR solution moves into mass production. (Earlier post.)

Strategic partners Delphi Automotive and Magna International participated in the round, along with additional new investors including 360 Capital Partners, Glory Ventures, Naver and others. All Series A investors, including Zohar Zisapel, Vertex Venture Capital, Magma Venture Partners, Amiti Ventures and Delek Motors, participated in this round. A second closing of this round is expected to be announced soon, introducing additional strategic partners.

More... | Comments (2)

Magna unveils MAX4 autonomous driving platform

September 01, 2017

Magna unveiled MAX4, a fully integrated, customizable and scalable autonomous driving sensing and compute platform that can enable up to Level 4 autonomous driving capabilities in both urban and highway environments.

MAX4 combines cameras, RADAR, LiDAR and ultrasonic sensors with a compute platform; the systems are designed for easy integration with any automaker’s existing and future platforms. Magna’s compute platform, scalable for high-volume production, is flexible, upgradeable and fully functional with a fraction of the power requirements of alternative solutions, according to the company.

More... | Comments (0)

Stanford, UCSD team develops “4D” camera; use in autonomous vehicles

August 07, 2017

Engineers at Stanford University and the University of California San Diego (UCSD) have developed a monocentric lens with multiple sensors using microlens arrays, allowing light field (LF) capture with an unprecedented field of view (FOV)—a camera that generates four-dimensional images and can capture 138 degrees of information.

The new camera—the first single-lens, wide field of view, light field (LF) camera—could generate information-rich images and video frames that will enable robots to better navigate the world and understand certain aspects of their environment, such as object distance and surface texture. The researchers also see this technology being used in autonomous vehicles and augmented and virtual reality technologies. Researchers presented their new technology at the computer vision conference CVPR 2017 in July.

More... | Comments (1)

NVIDIA & AutonomouStuff speed development of autonomous vehicles with DRIVE PX on Wheels; full kits in cars

July 29, 2017

NVIDIA has partnered with AutonomouStuff, a supplier of components used within autonomy systems, to deliver kits packaging the NVIDIA DRIVE PX in vehicles with sensors. DRIVE PX on Wheels is available in three versions: advanced, basic and custom.

Each comes with a vehicle configured by AutonomouStuff with sensors and the NVIDIA end-to-end autonomous driving platform, allowing developers to focus on creating their own self-driving solutions. The advanced kit begins with a Ford Fusion, which is loaded with an NVIDIA DRIVE PX AI car computer, as well as cameras, LiDAR, radar, navigation sensors and a drive-by-wire system, including:

More... | Comments (2)

New Audi A8 debuting Level 3 autonomous AI traffic jam pilot; parking and remote garage pilots; zFAS controller

July 12, 2017

Audi AI is the umbrella term for its new generation of high-end assistance technologies extending all the way up to highly automated driving. Three of them will be available for the first time in the new A8: the Audi AI traffic jam pilot (the first Level 3 autonomous traffic jam feature on the market); the Audi AI (remote) parking pilot and the Audi AI remote garage pilot. The core of the systems which Audi is developing for piloted driving is the central driver assistance system control unit (zFAS), which is also making its debut in the new Audi A8.

Audi AI traffic jam pilot. The Audi AI traffic jam pilot is a Level 3 system, whereby the car takes over the task of driving in certain situations. The driver no longer needs to monitor it permanently, as with a Level 2 system; the driver must merely be capable of taking back responsibility on the system’s prompting.
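The Level 3 handover logic described above, where the system drives until it prompts the driver to take back responsibility, can be sketched as a small state machine. The states, events and fallback behavior below are illustrative, not Audi's implementation:

```python
# Illustrative Level 3 handover state machine: the system drives, requests
# a takeover when it reaches its operating limits, and falls back to a
# minimal-risk stop if the driver does not respond in time.

TRANSITIONS = {
    ("SYSTEM_DRIVING", "limit_reached"): "TAKEOVER_REQUESTED",
    ("TAKEOVER_REQUESTED", "driver_takes_over"): "DRIVER_DRIVING",
    ("TAKEOVER_REQUESTED", "timeout"): "MINIMAL_RISK_STOP",
    ("DRIVER_DRIVING", "driver_activates"): "SYSTEM_DRIVING",
}

def step(state, event):
    # Unknown (state, event) pairs leave the state unchanged.
    return TRANSITIONS.get((state, event), state)

state = "SYSTEM_DRIVING"
state = step(state, "limit_reached")       # system prompts the driver
state = step(state, "driver_takes_over")   # driver resumes control
print(state)
```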

More... | Comments (0)

Autoliv to use Velodyne LiDAR technology for automotive-grade LiDAR product

July 03, 2017

Autoliv Inc., the world’s largest automotive safety company, has joined Velodyne LiDAR’s Tier-1 Program and will develop and manufacture an automotive-grade LiDAR product using Velodyne’s core 3D software technology and proprietary LiDAR ASIC engine. The first applications will be in the robotaxi segment.

Pursuant to the agreement, Autoliv will develop and market a scalable automotive-grade LiDAR sensor using Velodyne’s core 3D software technology and proprietary LiDAR ASIC engine coupled with Autoliv’s component development and verification capability.

More... | Comments (1)

Daimler introducing first active emergency braking assistant for buses to feature pedestrian recognition

June 30, 2017

Daimler Buses and its product brands Mercedes-Benz and Setra are introducing Active Brake Assist 4 (ABA 4) with pedestrian recognition—the first emergency braking assistance system in a bus to automatically brake for pedestrians. Active Brake Assist 3 already carries out maximum full-stop braking for vehicles ahead and for stationary obstacles.

The new Active Brake Assist 4 with pedestrian recognition warns the driver visually and audibly of any potential collision with pedestrians and at the same time automatically triggers partial braking.

More... | Comments (0)

Audi and Johannes Kepler University of Linz to establish center for artificial intelligence; autonomous driving

June 21, 2017

Audi and the Johannes Kepler University of Linz (JKU) are establishing the “Audi.JKU deep learning center” in Linz to conduct joint research into the intelligent car of the future. Through cooperation with the Institute for Bioinformatics headed by Prof. Sepp Hochreiter, Audi plans to promote the use of artificial intelligence in automobiles.

Prof. Sepp Hochreiter is one of Europe’s leading experts in the field of artificial intelligence (AI). He has made major contributions with fundamental research into deep‑learning technologies – a methodology based on the learning processes of the human brain. The long short‑term memory (LSTM) architecture that he developed is used for speech‑recognition software in smartphones all over the world. (LSTM is a recurrent neural network (RNN) architecture proposed by Hochreiter and his colleague Jürgen Schmidhuber in 1997.)
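To make the LSTM architecture named above concrete, here is a minimal single-cell implementation in NumPy: gates decide what the recurrent memory cell forgets, stores and exposes at each time step. A textbook sketch, not production code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, b):
    """One LSTM time step. x: input, h: hidden state, c: cell state.
    W has shape (4*n_hidden, n_input + n_hidden); b has 4*n_hidden."""
    z = W @ np.concatenate([x, h]) + b
    n = h.size
    f = sigmoid(z[0*n:1*n])      # forget gate: what memory to keep
    i = sigmoid(z[1*n:2*n])      # input gate: what new info to store
    g = np.tanh(z[2*n:3*n])      # candidate cell update
    o = sigmoid(z[3*n:4*n])      # output gate: what memory to expose
    c = f * c + i * g            # update the long-term cell state
    h = o * np.tanh(c)           # emit the new hidden state
    return h, c

n_in, n_hid = 3, 4
rng = np.random.default_rng(1)
W = rng.standard_normal((4 * n_hid, n_in + n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):   # run a 5-step input sequence
    h, c = lstm_step(x, h, c, W, b)
print(h.shape)
```

The separate cell state `c` is what lets an LSTM carry information across long sequences without the vanishing gradients that plague plain RNNs.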

More... | Comments (0)

HELLA and ZF form strategic partnership on sensors for ADAS and autonomous driving; first camera system due in 2020

ZF and HELLA are entering into a strategic partnership, with the focus on camera systems, imaging and radar sensor technology for advanced driver assistance systems (ADAS) and autonomous driving. Both automotive suppliers said they will each benefit from this cooperation on sensor technology, particularly for front camera systems, imaging and radar systems.

ZF will further strengthen its portfolio as a systems supplier offering both modern assistance systems and autonomous driving functions, while HELLA will drive technological development and benefits with a broader market access for its technologies. The first joint development project in camera technology will start immediately, with the objective of a market launch in 2020.

More... | Comments (0)

Honda targeting introduction of Level 4 automated driving capability by 2025; Level 3 by 2020

June 08, 2017

Honda is targeting the year 2025 for the introduction of vehicles with highly-automated driving capability in most driving situations (SAE Level 4). This new goal builds upon earlier-announced plans for Honda and Acura vehicles to have highly-automated freeway driving capability (SAE Level 3) by 2020.

Honda Motor Co., Ltd. President & CEO Takahiro Hachigo made the announcement at a media briefing held at Honda R&D Co., Ltd. in Japan, where media were able to test-drive Honda automated vehicle technologies, including systems with advanced artificial intelligence (AI), in several complex driving scenarios.

More... | Comments (0)

New low-voltage W-band millimeter-wave technology; applicable for cars, bikes, cellphones

June 05, 2017

Hiroshima University and Mie Fujitsu Semiconductor Limited (MIFS) have developed a low-power millimeter-wave amplifier that operates from a 0.5 V power supply and covers the frequency range from 80 GHz to 106 GHz. It was fabricated using MIFS’s Deeply Depleted Channel (DDC) technology.

This is the first W-band (75−110 GHz) amplifier that can operate even with such a low power-supply voltage. Details of the technology are being presented this week at the IEEE Radio Frequency Integrated Circuits Symposium (RFIC) 2017 in Honolulu, Hawaii.

More... | Comments (1)

BMW Group, Intel and Mobileye bringing Delphi in as partner in autonomous driving platform work

May 16, 2017

The BMW Group, Intel and Mobileye will bring Delphi onboard as a development partner and system integrator for the autonomous driving platform currently under development. The common platform is intended to address level 3 to level 5 automated driving and will be made available to multiple car vendors and other industries who could benefit from autonomous machines and deep machine learning. (Earlier post.)

The four partners intend to jointly deploy a cooperation model to deliver and scale the developed solutions to the broader OEM automotive industry and potentially other industries.

More... | Comments (0)

Sendyne introduces first isolation monitor for EVs and HEVs capable of detecting potential electrical hazards during dynamic operation

April 27, 2017

Sendyne, a provider of technologies for battery system management including current, voltage and temperature measurement ICs and modules (earlier post), has introduced the SIM100, a new type of automotive-rated isolation monitoring safety device capable of detecting potential electrical hazards during the dynamic operation of high-voltage unearthed systems—such as electric and hybrid vehicles.

The SIM100 module is the first device of its kind capable of unambiguously detecting the electrical isolation state of a high-voltage system while the system is active and operating, and experiencing large voltage variations. State-of-the-art technology today is limited to detecting only resistive leakages and only when the system voltage does not vary significantly. In another first, the SIM100 detects both resistive leakages and capacitively stored energy that could be harmful to human operators.

More... | Comments (0)

Waymo adding 500 more Chrysler Pacifica Hybrid Minivans to self-driving program; early rider program

April 25, 2017

Waymo (formerly the Google self-driving car project) will add an additional 500 Chrysler Pacifica Hybrid minivans (earlier post) to expand its self-driving program. FCA previously delivered 100 minivans, modified for self-driving, to Waymo during the second half of 2016. (Earlier post.) Production of the additional 500 minivans will ramp up beginning next month. Waymo will then outfit these vehicles with its self-driving technology.

Waymo also is inviting members of the public to use its fleet of self-driving vehicles for everyday travel. Waymo’s early rider program will give selected Phoenix residents the opportunity to experience the self-driving Chrysler Pacifica Hybrid minivans for the first time at no charge.

More... | Comments (6)

Velodyne introduces new low-cost fixed-laser solid-state LiDAR for autonomous driving and ADAS applications

April 20, 2017

Velodyne LiDAR announced its new fixed-laser, solid-state Velarray LiDAR (Light Detection and Ranging) sensor, a cost-effective yet high-performance and rugged automotive product in a small form factor. (Earlier post.) The Velarray sensor can be seamlessly embedded in both autonomous vehicles and advanced driver assistance systems (ADAS).

The new Velarray LiDAR sensor uses Velodyne’s proprietary ASICs (Application Specific Integrated Circuits) to achieve superior performance metrics in a small package size of 125mm x 50mm x 55mm that can be embedded into the front, sides, and corners of vehicles. It provides up to a 120-degree horizontal and 35-degree vertical field-of-view, with a 200-meter range even for low-reflectivity objects.

More... | Comments (4)

Cadillac Super Cruise hands-free system uses driver attention system and LiDAR map database

April 11, 2017

The 2018 Cadillac CT6 will feature Super Cruise (earlier post) autonomous driving technology for the highway. Unlike other driver assistance systems, Super Cruise utilizes two advanced technology systems—a driver attention system and precision LiDAR map data—to ensure safe and confident vehicle operation. These systems are added to the network of cameras and radar sensors in the CT6, providing a more data-rich approach to driver assistance.

Super Cruise will be offered as an option on the 2018 Cadillac CT6 prestige sedan, starting this fall in the US and Canadian markets.

More... | Comments (1)

Renesas introduces “autonomy” open platform for ADAS and automated driving

Renesas Electronics Corporation launched Renesas “autonomy”, a new advanced driver assistance systems (ADAS) and automated driving platform. As the first product under the new autonomy platform, Renesas released the R-Car V3M high-performance image recognition system-on-chip (SoC), optimized primarily for use in smart camera applications as well as in surround-view systems and LiDARs.

The new R-Car V3M SoC complies with the ISO 26262 functional safety standard, delivers low-power hardware acceleration for vision processing, and is equipped with a built-in image signal processor (ISP), freeing up board space and reducing system manufacturers’ system costs. Renesas is exhibiting its first Renesas autonomy demonstrator, developed based on the new R-Car V3M SoC, at DevCon Japan in Tokyo.

More... | Comments (0)

Continental developing Road Condition Observer for active safety

April 04, 2017

Continental is developing a new system called Road Condition Observer that uses vehicle sensors and cloud data to classify a road surface as dry, wet, covered with snow or icy and to assess the grip of the road surface.

This knowledge, in turn, allows the vehicle to adjust the functions of advanced driver assistance systems to the actual road conditions, said Bernd Hartmann, Head of the Enhanced ADAS (Advanced Driver Assistance Systems) & Tire Interactions project group within the Advanced Engineering department of Continental’s Chassis & Safety division.

More... | Comments (0)

Bosch and Daimler partner to develop Level 4, 5 autonomous driving systems

Bosch and Daimler are collaborating to advance the development of fully automated and driverless driving. The two companies have entered into a development agreement to bring fully automated (SAE Level 4) and driverless (SAE Level 5) driving to urban roads by the beginning of the next decade. The objective is to develop software and algorithms for an autonomous driving system.

The prime objective of the project is to achieve the production-ready development of a driving system which will allow cars to drive fully autonomously in the city. The idea behind it is that the vehicle should come to the driver rather than the other way round. Within a specified area of town, customers will be able to order an automated shared car via their smartphone. The vehicle will then make its way autonomously to the user and the onward journey can commence.

More... | Comments (0)

NASA’s hybrid computer enables Raven’s autonomous docking capability

March 22, 2017

A hybrid computing system developed at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, is the enabling technology behind an ambitious experiment testing a relative navigation and autonomous docking capability known as Raven.

Developed by the Satellite Servicing Projects Division (SSPD), the carry-on-luggage-sized module was launched 19 February aboard SpaceX’s Dragon spacecraft, along with other experiments deployed outside the International Space Station on an experiment pallet. Raven is testing and maturing visible, infrared and LiDAR sensors and machine-vision algorithms; the module will bring NASA one step closer to realizing a groundbreaking autopilot capability that can be applied to many NASA missions for decades to come.

More... | Comments (0)

Mitsubishi Electric develops technologies for automated mapping and extraction of transitions in mapping landscape for high-precision 3D maps

March 16, 2017

Mitsubishi Electric Corporation has developed two new technologies to support the creation and maintenance of high-precision 3D maps required for autonomous driving (earlier post): AI-based automated mapping; and the extraction of transitions in mapping landscapes.

The technologies are based on the company’s own Mobile Mapping System (MMS) (earlier post) for the creation of highly precise three-dimensional maps that provide static information of roads and surrounding objects. Both technologies will be exhibited for the first time at CeBIT 2017 in Hannover, Germany from 20-24 March 2017.

More... | Comments (0)

Intel to acquire Mobileye for $15.3 billion; targeting autonomous driving

March 13, 2017

Intel Corporation announced a definitive agreement under which Intel would acquire Mobileye, a global leader in the development of computer vision and machine learning, data analysis, localization and mapping for advanced driver assistance systems and autonomous driving. A subsidiary of Intel will commence a tender offer to acquire all of the issued and outstanding ordinary shares of Mobileye for $63.54 per share in cash, representing an equity value of approximately $15.3 billion and an enterprise value of $14.7 billion.
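The announced figures imply a share count and a net cash position via standard deal arithmetic (equity value = price per share × shares outstanding; enterprise value = equity value − net cash). The derived numbers below are back-of-envelope inferences, not figures from the announcement:

```python
# Deal arithmetic implied by the announced Intel/Mobileye terms.

price_per_share = 63.54          # USD, tender offer price
equity_value = 15.3e9            # USD, announced equity value
enterprise_value = 14.7e9        # USD, announced enterprise value

# equity value = price per share * shares outstanding
implied_shares = equity_value / price_per_share

# enterprise value = equity value - net cash
implied_net_cash = equity_value - enterprise_value

print(f"{implied_shares / 1e6:.0f}M shares")              # roughly 241M
print(f"${implied_net_cash / 1e9:.1f}B implied net cash")  # roughly $0.6B
```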

The combination is expected to accelerate innovation for the automotive industry and position Intel as a leading technology provider in the fast-growing market for highly and fully autonomous vehicles.

More... | Comments (10)

Renault and Heudiasyc create shared research facility for perception and localization systems for autonomous vehicles

March 03, 2017

Renault and Heudiasyc (Heuristics and Diagnostics for Complex Systems), a joint research unit formed by UTC (Université de Technologie de Compiègne) and the CNRS, have created SIVALab, a laboratory specializing in localization and perception systems for autonomous vehicles. SIVALab (a French acronym for Integrated Systems for Autonomous Vehicles Lab) is based in Compiègne, north of Paris, France.

This scientific and technological partnership has been set up for an initial, extendable period of four years. It is founded on an existing association that began more than 10 years ago and will use the Renault ZOE-based autonomous vehicle platforms developed by Heudiasyc. SIVALab is being created to provide a structure geared to long-term scientific developments and major programs.

More... | Comments (0)

Toyota Research Institute displays Gen 2 autonomous test vehicle; machine vision and machine learning

The Toyota Research Institute (TRI) displayed its 2.0 generation advanced safety research vehicle at the company’s Prius Challenge event in Sonoma, California. (Earlier post.) The all-new test vehicle will be used to explore a full range of autonomous driving capabilities.

The new advanced safety research vehicle is the first autonomous testing platform developed entirely by TRI, and reflects the rapid progress of its autonomous driving program, said TRI CEO Gill Pratt. The system is computationally rich and perception-rich, focusing heavily on machine vision and machine learning. The layered and overlapping LiDAR, radar and camera sensor array reduces the need to depend too heavily on high-definition maps—especially for near-term systems which will be designed for use in areas where such maps don’t yet exist.

More... | Comments (1)

Qualcomm and TomTom partner on crowdsourcing high-definition mapping data for autonomous driving

February 27, 2017

Qualcomm Technologies is working with TomTom on using the Qualcomm Drive Data Platform for high-definition (HD) map crowdsourcing, to accelerate the future of autonomous driving. The Qualcomm Drive Data Platform intelligently collects and analyzes data from different vehicle sensors, enabling smarter vehicles to determine their location, monitor and learn driving patterns, perceive their surroundings, and share this perception with the rest of the world reliably and accurately.

TomTom’s HD Map, including RoadDNA, is a highly accurate, digital map-based product, which assists automated vehicles to precisely locate themselves on the road and help determine which way to maneuver, even when traveling at high speeds.

More... | Comments (0)

New ultrafast camera for self-driving vehicles and drones

February 17, 2017

Scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed an ultrafast high-contrast camera that could help self-driving cars and drones see better in extreme road conditions and in bad weather.

Unlike typical optical cameras, which can be blinded by bright light and unable to make out details in the dark, NTU’s new smart camera can record the slightest movements and objects in real time. The new camera records the changes in light intensity between scenes at nanosecond intervals, much faster than conventional video, and it stores the images in a data format that is many times smaller as well.
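The compactness claim follows from the camera reporting only per-pixel changes rather than full frames. A minimal illustration of that idea (not NTU’s actual data format, and using hypothetical names) is to emit timestamped events only where intensity crosses a threshold, so static pixels cost nothing:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    """One change-driven sample: pixel location, timestamp, and sign of change."""
    x: int
    y: int
    t_ns: int      # timestamp, nanoseconds
    polarity: int  # +1 = brighter, -1 = darker

def events_from_frames(prev, curr, t_ns, threshold=10) -> List[Event]:
    """Emit events only where pixel intensity changed beyond a threshold.

    `prev` and `curr` are 2-D lists of intensities; unchanged pixels
    produce no output, which is why an event stream stays compact.
    """
    events: List[Event] = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) >= threshold:
                events.append(Event(x, y, t_ns, 1 if c > p else -1))
    return events

# A mostly static 3x3 scene with one pixel brightening:
prev = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
curr = [[0, 0, 0], [0, 200, 0], [0, 0, 0]]
evts = events_from_frames(prev, curr, t_ns=1_000)
# One event describes the change; a dense frame would store all nine pixels.
```

Real event cameras perform this thresholding in analog circuitry at each pixel, which is what allows the nanosecond-scale response the NTU team describes.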

More... | Comments (7)

Volkswagen and Mobileye partner on autonomous driving; REM in VWs in 2018

February 14, 2017

Volkswagen and Mobileye will implement a new navigation standard for autonomous driving starting in 2018. Future Volkswagen models will use the camera-based map and localization technology Road Experience Management (REM) from Mobileye. (Earlier post.)

REM uses crowd-sourcing (data from many cars – the swarm) to generate real-time data for precise localization and acquisition of high-definition track data. Volkswagen cars, which are equipped with front cameras, will acquire lane markings and road information via optical sensor systems from Mobileye; this information flows in compressed format into a cloud. This fleet data is used for continuous improvement of high-definition navigation maps with highly precise localization capability.

More... | Comments (3)

LeddarTech selects IDT to develop new LeddarCore IC for mass-market solid-state LiDARs; ADAS and autonomous driving

February 10, 2017

LeddarTech Inc. and Integrated Device Technology, Inc. (IDT) have entered into a partnership agreement to jointly develop and supply the LeddarCore LCA2 integrated circuits. (Earlier post.) The LeddarCore is a receiver IC which is a key element within an automotive LiDAR system. This newest generation of LeddarCore IC enables solid-state implementations of high-performance, low-cost automotive LiDARs, which are required for the mass-market deployment of semi-autonomous and autonomous vehicles.

As part of the agreement, IDT, a developer of complete mixed-signal solutions for automotive, communications, computing, consumer, and industrial markets, will leverage its advanced expertise for component requirements analysis, architecture, design, development, characterization, qualification and transfer to manufacturing of the LCA2.

More... | Comments (2)

New high-resolution time-to-digital converter from ams offers better object detection and avoidance for LiDAR

January 24, 2017

ams AG, a leading provider of high performance sensor solutions and analog ICs, has launched a new version of its market-leading time-to-digital converter (TDC) offering improved speed and precision together with low power consumption. The new TDC-GPX2 also features standard low-voltage differential signaling (LVDS) and serial peripheral interface (SPI) connectivity, and a new, smaller 9 mm × 9 mm QFN64 package.

TDCs from ams, which can measure short time intervals with great precision, are widely used in light detection and ranging (LiDAR) and laser-ranging devices, in positron emission tomography (PET) medical scanners, and in automated test equipment (ATE). The introduction of the TDC-GPX2 means that these applications can benefit from a timing resolution of up to 10 ps and a new high sampling rate of up to 70 Msamples/s.
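For context, the timing resolution of a TDC maps directly onto the range resolution of a time-of-flight LiDAR: the pulse travels out and back, so distance is c·t/2. A quick back-of-the-envelope sketch (ours, not vendor code):

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(tdc_resolution_s: float) -> float:
    """Smallest resolvable change in target distance for a time-of-flight
    measurement: light covers the path twice, so distance = c * t / 2."""
    return C * tdc_resolution_s / 2.0

# A 10 ps timing step, as quoted for the TDC-GPX2, corresponds to
# roughly 1.5 mm of range resolution:
res = range_resolution_m(10e-12)
print(f"{res * 1e3:.2f} mm")  # prints 1.50 mm
```

That millimeter-scale figure is why picosecond-class TDCs matter for automotive LiDAR, where small objects must be resolved at long range.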

More... | Comments (3)

Koito and Quanergy collaborate to design automotive headlight concept with built–in solid-state LiDAR

January 07, 2017

Koito Manufacturing Co., Ltd., the largest global maker of automotive headlights, and Quanergy Systems, Inc., a leading provider of LiDAR sensors and smart sensing solutions, are collaborating to design an automotive headlight concept with built-in Quanergy S3 solid state LiDAR sensors (earlier post). The Koito headlight with built-in sensors is on display at CES 2017.

The Koito headlights, which will be located on the corners of a vehicle, each incorporate two compact Quanergy S3 solid-state LiDARs that sense forward and to the side, providing real-time long-range 3D views of the environment around the vehicle and the ability to recognize and track objects.

More... | Comments (9)

Next-gen Audi A8 to feature MIB2+, series debut of zFAS domain controller, Mobileye image recognition with deep learning; Traffic Jam Pilot

January 05, 2017

Audi’s next-generation A8, premiering this year, will feature the first implementation of the MIB2+ (Modular Infotainment Platform). The key element in this new implementation of the MIB is NVIDIA’s Tegra K1 processor (earlier post), which makes new functions possible and has the computing power needed to support several high-resolution displays—including the second-generation Audi virtual cockpit. Onboard and online information will merge, making the car part of the cloud to a greater degree than ever.

The A8 also marks the series debut of the central driver assistance controller (zFAS), which also features the K1; in the future, the X1 processor (earlier post) will be applied in this domain controller. The zFAS, developed in collaboration with TTTech, Mobileye, NVIDIA and Delphi, also integrates a Mobileye image processing chip. (Earlier post.)

More... | Comments (1)

Renesas Electronics and TTTech deliver highly automated driving platform

Renesas Electronics and TTTech Computertechnik AG, a global leader in robust networking and safety controls, have developed a highly automated driving platform (HADP). The new HADP is a prototype electronic control unit (ECU) for mass production vehicles with integrated software and tools, which demonstrates how to use Renesas and TTTech technologies combined in a true automotive environment for autonomous driving. The HADP accelerates the path to mass production for Tier 1s and OEMs.

The newly released HADP is the first outcome of the collaboration between TTTech and Renesas announced in January 2016 (earlier post), and is an extended version of the HAD solution kit released in October 2016. It is based on dual R-Car H3 system-on-chips (SoCs) (earlier post) and the RH850/P1H-C microcontroller (MCU).

More... | Comments (0)

Audi & NVIDIA partner to deliver fully automated driving with AI starting in 2020; piloted Q7 w/ neural network CES demo

Audi announced a partnership with NVIDIA to use artificial intelligence in delivering highly automated vehicles starting in 2020. Deep learning technology will enable skilled handling of real-road complexities, delivering safer automated vehicles earlier. The first phase of this expanded collaboration between the nearly decade-long partners focuses on NVIDIA DRIVE PX, which uses trained AI neural networks to understand the surrounding environment, and to determine a safe path forward. (Earlier post.)

Audi and NVIDIA have combined their engineering and visual computing technologies in the past on Audi innovations such as Audi MMI navigation and the Audi virtual cockpit. Later this year Audi will introduce the next-generation Audi A8 featuring Traffic Jam Pilot—the world’s first Level 3 automated vehicle (as defined by SAE International) equipped with a first-generation central driver assistance domain controller (zFAS) that integrates NVIDIA computing hardware and software. (Earlier post.)

More... | Comments (6)

BMW, Intel, Mobileye: 40 autonomous BMWs to be on road by 2H 2017; standards-based open platform for autonomy

BMW Group, Intel and Mobileye announced that a fleet of approximately 40 autonomous BMW vehicles will be on the roads by the second half of 2017, demonstrating the significant advancements made by the three companies towards fully autonomous driving. Revealing this at a podium discussion held during a joint press conference at CES, the companies further explained that the BMW 7 Series will employ advanced Intel and Mobileye technologies during global trials starting in the US and Europe.

In July 2016, BMW Group, Intel and Mobileye announced a collaboration to bring solutions for highly and fully automated driving into series production by 2021. The three said they would create a standards-based open platform—from door locks to the datacenter—for the next generation of cars. (Earlier post.) The companies have since developed a scalable architecture that can be adopted by other automotive developers and carmakers to pursue state-of-the-art designs and create differentiated brands. The offerings scale from individual key integrated modules to a complete end-to-end solution providing a wide range of differentiated consumer experiences.

More... | Comments (0)

Renesas Electronics unveils RH850/V1R-M automotive radar solution for ADAS and autonomous driving vehicles

January 04, 2017

Advanced semiconductor supplier Renesas Electronics Corporation introduced the RH850/V1R—its first product from the new RH850-based, 32-bit, automotive radar microcontroller (MCU) series—that will deliver the high performance and features required for enabling future advanced driver assistance systems (ADAS) and autonomous driving vehicles. The RH850/V1R-M includes a digital signal processor (DSP) and high speed serial interfaces and is specifically designed for middle- to long-range radars.

Vehicles are being equipped with a broad spectrum of sensors such as cameras, LiDAR and ultrasonic sensors to support expanded advanced driver assistance (ADAS) and emerging autonomous driving functionality. Radar sensors are needed for ADAS applications—including advanced emergency braking and adaptive cruise control—because, unlike other sensors, radar is largely unaffected by environmental conditions such as rain, fog or changing light levels.

More... | Comments (0)

Qualcomm introducing Drive Data platform for sensor fusion

Qualcomm is introducing the Qualcomm Drive Data Platform to intelligently collect and analyze information from a vehicle’s sensors. Cars will be able to determine their location with lane-level accuracy, to monitor and to learn driving patterns, to perceive their surroundings, and to share this reliable and accurate data with the rest of the world.

These capabilities will be key for many connected car applications, from shared mobility and fleet management to 3D high-definition mapping and automated driving. Qualcomm Drive Data platform is built on three pillars: heterogeneous connectivity; precise positioning; and on-device machine learning, all integrated into the Qualcomm Snapdragon solution.

More... | Comments (1)

Mitsubishi Electric showcasing 3D Advanced Mobile Mapping System at CES 2017

January 03, 2017

Mitsubishi Electric Corporation, along with Mitsubishi Electric US, Inc., will display a future concept of the recently released new model of its Mitsubishi Mobile Mapping System, the MMS-G220, at CES 2017. The MMS-G220 is a highly accurate measuring system using car-mounted GPS antennas, laser scanners and cameras. (Earlier post.)

The system gathers 3D positioning data of road surfaces and roadside features to an absolute accuracy of 4 inches (10 cm), allowing the creation of comprehensive 3D maps to the level of accuracy needed to support autonomous driving.

More... | Comments (3)

Lucid Motors chooses Mobileye as partner for autonomous vehicle technology

December 30, 2016

Luxury EV developer Lucid Motors and Mobileye N.V. announced a collaboration to enable autonomous driving capability on Lucid vehicles. Lucid will launch its first car, the Lucid Air (earlier post), with a complete sensor set for autonomous driving from day one, including camera, radar and LiDAR sensors.

Lucid chose Mobileye to provide the primary compute platform, full 8-camera surround view processing, sensor fusion software, Road Experience Management (REM) crowd-based localization capability, and reinforcement learning algorithms for Driving Policy. These technologies will enable a full Advanced Driver Assistance System (ADAS) suite at launch, and then enable a logical and safe transition to autonomous driving functionality through over-the-air software updates.

More... | Comments (5)

HERE and Mobileye to partner on crowd-sourced HD mapping for automated driving

December 29, 2016

High-definition (HD) mapping company HERE and Mobileye, developer of computer vision and machine learning, data analysis, localization and mapping for Advanced Driver Assistance Systems and autonomous driving, plan to enter a strategic partnership that links their respective autonomous driving technologies into an enhanced industry-leading offering for automakers.

Under the partnership, Mobileye’s Roadbook—a detailed, cloud-based map of localized drivable paths and visual landmarks constantly updated in real-time—will be integrated as a data layer in HERE HD Live Map, HERE’s real-time cloud service for partially, highly and fully automated vehicles. Roadbook information will provide an important additional layer of real-time contextual awareness by gathering landmark and roadway information to assist in making a vehicle more aware of—and better able to react to—its surroundings, as well as to allow for more accurate vehicle positioning on the road.

More... | Comments (0)

Ford introducing next-gen Fusion Hybrid autonomous development vehicle at CES and NAIAS in January

December 28, 2016

Ford Motor Company is introducing its next-generation Fusion Hybrid autonomous development vehicle; the car will first appear at CES 2017 and the North American International Auto Show in January. The new vehicle uses the current Ford autonomous vehicle platform, but ups the processing power with new computer hardware.

Electrical controls are closer to production-ready, and adjustments to the sensor technology, including placement, allow the car to better see what’s around it. New LiDAR sensors have a sleeker design and more targeted field of vision, which enables the car to now use just two sensors rather than four, while still getting just as much data.

More... | Comments (4)

TriLumina to demo 256-pixel 3D solid-state LiDAR and ADAS systems for autonomous driving at CES 2017

December 27, 2016

At CES 2017, TriLumina (earlier post)—a spin-out from Sandia National Laboratories—will demonstrate, in collaboration with LeddarTech (earlier post), an innovative 256-pixel, 3D LiDAR solution for autonomous driving applications powered by TriLumina’s breakthrough laser illumination module and LeddarTech’s LeddarCore ICs.

TriLumina has developed eye-safe, vertical-cavity surface-emitting lasers (VCSELs). TriLumina’s illumination modules replace the expensive, bulky scanning LiDARs used in current autonomous vehicle demonstration programs, providing high-resolution, long-range sensing in a small, robust and cost-effective package.

More... | Comments (3)

U of Waterloo Autonomoose autonomous vehicle on the road in Canada

December 23, 2016

Researchers from the University of Waterloo Centre for Automotive Research (WatCAR) in Canada are modifying a Lincoln MKZ Hybrid to autonomous drive-by-wire operation. The research platform, dubbed “Autonomoose,” is equipped with a full suite of radar, sonar, LiDAR, inertial and vision sensors; the NVIDIA DRIVE PX 2 AI platform (earlier post) to run a complete autonomous driving system, integrating sensor fusion, path planning, and motion control software; and a custom autonomy software stack being developed at Waterloo as part of the research.

Recently, the Autonomoose autonomously drove a crew of Ontario Ministry of Transportation officials to the podium of a launch event to introduce the first car approved to hit the roads under the province’s automated vehicle pilot program.

More... | Comments (0)

LeddarTech showcasing 2D and 3D solid-state LiDARs for mass-market autonomous driving deployments; Leddar Ecosystem

December 16, 2016

At CES 2017, LeddarTech will be showcasing 2D and 3D high-resolution LiDAR solutions for autonomous driving applications based on its next-generation LeddarCore ICs and developed with the collaboration of leading-edge suppliers and partners from the newly-established Leddar Ecosystem. (Earlier post.)

Presented publicly for the first time, these systems demonstrate the scalability of Leddar technology and its ability to meet the high levels of performance, resolution, and cost-effectiveness required by Tier 1 suppliers and OEMs for mass-market autonomous driving applications. Production versions of these LiDAR systems will offer resolutions of up to 512×64 over a field of view of 120×20 degrees, and detection ranges that exceed 200 m for pedestrians and over 300 m for vehicles.
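To put those resolution figures in perspective, a rough calculation (ours, assuming uniform angular spacing across the stated field of view) converts pixel counts into angular resolution and point spacing at range:

```python
import math

def angular_resolution_deg(fov_deg: float, pixels: int) -> float:
    """Approximate per-pixel angular step, assuming uniform spacing."""
    return fov_deg / pixels

def lateral_spacing_m(fov_deg: float, pixels: int, range_m: float) -> float:
    """Approximate gap between adjacent measurement points at a given range."""
    step_rad = math.radians(angular_resolution_deg(fov_deg, pixels))
    return range_m * math.tan(step_rad)

# 512 horizontal pixels across 120 degrees gives ~0.23 deg per pixel:
h_step = angular_resolution_deg(120, 512)
# At the quoted 200 m pedestrian range, adjacent points are ~0.8 m apart:
spacing = lateral_spacing_m(120, 512, 200)
```

Under these assumptions, a pedestrian at 200 m spans only a point or two horizontally, which illustrates why resolution, range, and field of view have to be traded off together in mass-market LiDAR designs.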

More... | Comments (2)

Green Car Congress © 2017 BioAge Group, LLC. All Rights Reserved. | Home | BioAge Group