Green Car Congress

Sensors

[Due to the increasing size of the archives, each topic page now contains only the prior 365 days of content. Access to older stories is now solely through the Monthly Archive pages or the site search function.]

Stanford, UCSD team develops “4D” camera for use in autonomous vehicles

August 07, 2017

Engineers at Stanford University and the University of California San Diego (UCSD) have developed a monocentric lens with multiple sensors using microlens arrays, allowing light field (LF) capture with an unprecedented field of view (FOV)—a camera that generates four-dimensional images and can capture 138 degrees of information.

The new camera—the first single-lens, wide field of view, light field (LF) camera—could generate information-rich images and video frames that will enable robots to better navigate the world and understand certain aspects of their environment, such as object distance and surface texture. The researchers also see this technology being used in autonomous vehicles and augmented and virtual reality technologies. Researchers presented their new technology at the computer vision conference CVPR 2017 in July.
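
To illustrate what a “4D” light field image is, the following sketch (plain Python/NumPy, illustrative only and not the Stanford/UCSD pipeline) decodes a synthetic microlens-array image into a four-dimensional array L(u, v, s, t) and extracts sub-aperture views; the array sizes are assumptions, not the camera’s actual parameters.

    # Minimal sketch (not the Stanford/UCSD pipeline): decoding a microlens-array
    # image into a 4D light field L(u, v, s, t) and extracting sub-aperture views.
    # Array sizes are illustrative assumptions, not the camera's real parameters.
    import numpy as np

    U = V = 9          # assumed angular samples per microlens
    S, T = 200, 300    # assumed number of microlenses (spatial samples)

    # A raw sensor image behind a microlens array: each (U x V) tile belongs to one lenslet.
    raw = np.random.rand(S * U, T * V)

    # Reshape into the 4D light field: axes (s, u, t, v) -> (u, v, s, t)
    lf = raw.reshape(S, U, T, V).transpose(1, 3, 0, 2)

    # A sub-aperture image is the slice with fixed angular coordinates (u, v);
    # shifting and summing these views refocuses the image at different depths.
    center_view = lf[U // 2, V // 2]        # shape (S, T)
    refocused = lf.mean(axis=(0, 1))        # naive refocus at the nominal focal plane
    print(center_view.shape, refocused.shape)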

More... | Comments (1)

NVIDIA & AutonomouStuff speed development of autonomous vehicles with DRIVE PX on Wheels; full kits in cars

July 29, 2017

NVIDIA has partnered with AutonomouStuff, a supplier of components used within autonomy systems, to deliver kits packaging the NVIDIA DRIVE PX in vehicles with sensors. DRIVE PX on Wheels is available in three versions: advanced, basic and custom.

Each comes with a vehicle configured by AutonomouStuff with sensors and the NVIDIA end-to-end autonomous driving platform, allowing developers to focus on creating their own self-driving solutions. The advanced kit begins with a Ford Fusion, which is loaded with an NVIDIA DRIVE PX AI car computer, as well as cameras, LiDAR, radar, navigation sensors and a drive-by-wire system, including:

More... | Comments (2)

New Audi A8 debuting Level 3 autonomous AI traffic jam pilot; parking and remote garage pilots; zFAS controller

July 12, 2017

Audi AI is the umbrella term for Audi’s new generation of high-end assistance technologies, extending all the way up to highly automated driving. Three of them will be available for the first time in the new A8: the Audi AI traffic jam pilot (the first Level 3 autonomous traffic jam feature on the market), the Audi AI (remote) parking pilot and the Audi AI remote garage pilot. The core of the systems Audi is developing for piloted driving is the central driver assistance system control unit (zFAS), which is also making its debut in the new Audi A8.

Audi AI traffic jam pilot. The Audi AI traffic jam pilot is a Level 3 system, whereby the car takes over the task of driving in certain situations. Unlike with a Level 2 system, the driver no longer needs to monitor the car permanently; the driver must merely be capable of taking back responsibility when the system prompts.

More... | Comments (0)

Autoliv to use Velodyne LiDAR technology for automotive-grade LiDAR product

July 03, 2017

Autoliv Inc., the world’s largest automotive safety company, has joined Velodyne LiDAR’s Tier-1 Program and will develop and manufacture an automotive-grade LiDAR product using Velodyne’s core 3D software technology and proprietary LiDAR ASIC engine. The first applications will be in the robotaxi segment.

Pursuant to the agreement, Autoliv will develop and market a scalable automotive-grade LiDAR sensor using Velodyne’s core 3D software technology and proprietary LiDAR ASIC engine coupled with Autoliv’s component development and verification capability.

More... | Comments (1)

Daimler introducing first active emergency braking assistant for buses to feature pedestrian recognition

June 30, 2017

Daimler Buses and its product brands Mercedes-Benz and Setra are introducing Active Brake Assist 4 (ABA 4) with pedestrian recognition—the first emergency braking assistance system in a bus to automatically brake for pedestrians. Active Brake Assist 3 already carries out maximum full-stop braking for vehicles ahead and for stationary obstacles.

The new Active Brake Assist 4 with pedestrian recognition warns the driver visually and audibly of any potential collision with pedestrians and at the same time automatically triggers partial braking.

More... | Comments (0)

Audi and Johannes Kepler University of Linz to establish center for artificial intelligence; autonomous driving

June 21, 2017

Audi and the Johannes Kepler University of Linz (JKU) are establishing the “Audi.JKU deep learning center” in Linz to conduct joint research into the intelligent car of the future. Through cooperation with the Institute for Bioinformatics headed by Prof. Sepp Hochreiter, Audi plans to promote the use of artificial intelligence in automobiles.

Prof. Sepp Hochreiter is one of Europe’s leading experts in the field of artificial intelligence (AI). He has made major contributions with fundamental research into deep-learning technologies – a methodology based on the learning processes of the human brain. The long short-term memory (LSTM) architecture he developed is used for speech-recognition software in smartphones worldwide. (LSTM is a recurrent neural network (RNN) architecture proposed by Hochreiter and his colleague Jürgen Schmidhuber in 1996.)
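
For readers unfamiliar with the architecture, the following minimal sketch (plain Python/NumPy) steps a single LSTM cell using the common modern formulation with a forget gate; the weights are random placeholders, and this illustrates the gating idea rather than reproducing Hochreiter’s original 1990s formulation.

    # Minimal sketch of a single LSTM cell step (the common modern variant with a
    # forget gate), illustrating the gating that lets the cell retain information
    # over long time lags. Weights here are random placeholders.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h_prev, c_prev, W, b):
        # W maps the concatenated [h_prev, x] to the four gate pre-activations.
        z = W @ np.concatenate([h_prev, x]) + b
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c = f * c_prev + i * g      # cell state: gated memory
        h = o * np.tanh(c)          # hidden state / output
        return h, c

    n_in, n_hid = 3, 5
    W = np.random.randn(4 * n_hid, n_hid + n_in) * 0.1
    b = np.zeros(4 * n_hid)
    h = c = np.zeros(n_hid)
    for x in np.random.randn(10, n_in):     # a toy sequence of 10 inputs
        h, c = lstm_step(x, h, c, W, b)
    print(h)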

More... | Comments (0)

HELLA and ZF form strategic partnership on sensors for ADAS and autonomous driving; first camera system due in 2020

ZF and HELLA are entering into a strategic partnership, with the focus on camera systems, imaging and radar sensor technology for advanced driver assistance systems (ADAS) and autonomous driving. Both automotive suppliers said they will each benefit from this cooperation on sensor technology, particularly for front camera systems, imaging and radar systems.

ZF will further strengthen its portfolio as a systems supplier offering both modern assistance systems and autonomous driving functions, while HELLA will drive technological development and benefit from broader market access for its technologies. The first joint development project in camera technology will start immediately, with the objective of a market launch in 2020.

More... | Comments (0)

Honda targeting introduction of Level 4 automated driving capability by 2025; Level 3 by 2020

June 08, 2017

Honda is targeting the year 2025 for the introduction of vehicles with highly-automated driving capability in most driving situations (SAE Level 4). This new goal builds upon earlier-announced plans for Honda and Acura vehicles to have highly-automated freeway driving capability (SAE Level 3) by 2020.

Honda Motor Co., Ltd. President & CEO Takahiro Hachigo made the announcement at a media briefing held at Honda R&D Co., Ltd. in Japan, where media were able to test-drive Honda automated vehicle technologies, including systems with advanced artificial intelligence (AI), in several complex driving scenarios.

More... | Comments (0)

New low-voltage W-band millimeter-wave technology; applicable for cars, bikes, cellphones

June 05, 2017

Hiroshima University and Mie Fujitsu Semiconductor Limited (MIFS) have developed a low-power millimeter-wave amplifier that operates from a 0.5 V power supply and covers the frequency range from 80 GHz to 106 GHz. It was fabricated using MIFS’s Deeply Depleted Channel (DDC) technology.

This is the first W-band (75−110 GHz) amplifier that can operate at such a low power-supply voltage. Details of the technology are being presented this week at the IEEE Radio Frequency Integrated Circuits Symposium (RFIC) 2017 in Honolulu, Hawaii.

More... | Comments (1)

BMW Group, Intel and Mobileye bringing Delphi in as partner in autonomous driving platform work

May 16, 2017

The BMW Group, Intel and Mobileye will bring Delphi onboard as a development partner and system integrator for the autonomous driving platform currently under development. The common platform is intended to address Level 3 to Level 5 automated driving and will be made available to multiple car vendors and other industries that could benefit from autonomous machines and deep machine learning. (Earlier post.)

The four partners intend to jointly deploy a cooperation model to deliver and scale the developed solutions to the broader OEM automotive industry and potentially other industries.

More... | Comments (0)

Sendyne introduces first isolation monitor for EVs and HEVs capable of detecting potential electrical hazards during dynamic operation

April 27, 2017

Sendyne, a provider of technologies for battery system management including current, voltage and temperature measurement ICs and modules (earlier post), has introduced the SIM100, a new type of automotive-rated isolation monitoring safety device that is capable of detecting potential electrical hazards during the dynamic operation of high-voltage unearthed systems—such as electric and hybrid vehicles.

The SIM100 module is the first device of its kind capable of unambiguously detecting the electrical isolation state of a high-voltage system while the system is active and operating, and experiencing large voltage variations. State-of-the-art technology today is limited to detecting only resistive leakages and only when the system voltage does not vary significantly. In another first, the SIM100 detects both resistive leakages and capacitively stored energy that could be harmful to human operators.

More... | Comments (0)

Waymo adding 500 more Chrysler Pacifica Hybrid Minivans to self-driving program; early rider program

April 25, 2017

Waymo (formerly the Google self-driving car project) will add an additional 500 Chrysler Pacifica Hybrid minivans (earlier post) to expand its self-driving program. FCA previously delivered 100 minivans, modified for self-driving, to Waymo during the second half of 2016. (Earlier post.) Production of the additional 500 minivans will ramp up beginning next month. Waymo will then outfit these vehicles with its self-driving technology.

Waymo also is inviting members of the public to use its fleet of self-driving vehicles for everyday travel. Waymo’s early rider program will give selected Phoenix residents the opportunity to experience the self-driving Chrysler Pacifica Hybrid minivans for the first time at no charge.

More... | Comments (6)

Velodyne introduces new low-cost fixed-laser solid-state LiDAR for autonomous driving and ADAS applications

April 20, 2017

Velodyne LiDAR announced its new fixed-laser, solid-state Velarray LiDAR (Light Detection and Ranging) sensor, a cost-effective yet high-performance and rugged automotive product in a small form factor. (Earlier post.) The Velarray sensor can be seamlessly embedded in both autonomous vehicles and advanced driver-assist safety (ADAS) systems.

The new Velarray LiDAR sensor uses Velodyne’s proprietary ASICs (Application Specific Integrated Circuits) to achieve superior performance metrics in a small package size of 125mm x 50mm x 55mm that can be embedded into the front, sides, and corners of vehicles. It provides up to a 120-degree horizontal and 35-degree vertical field-of-view, with a 200-meter range even for low-reflectivity objects.

More... | Comments (4)

Cadillac Super Cruise hands-free system uses driver attention system and LiDAR map database

April 11, 2017

The 2018 Cadillac CT6 will feature Super Cruise (earlier post) autonomous driving technology for the highway. Unlike other driver assistance systems, Super Cruise utilizes two advanced technology systems—a driver attention system and precision LiDAR map data—to ensure safe and confident vehicle operation. These systems are added to the network of cameras and radar sensors in the CT6, providing a more data-rich approach to driver assistance.

Super Cruise will be offered as an option on the 2018 Cadillac CT6 prestige sedan, starting this fall in the US and Canadian markets.

More... | Comments (1)

Renesas introduces “autonomy” open platform for ADAS and automated driving

Renesas Electronics Corporation launched Renesas “autonomy”, a new platform for advanced driver assistance systems (ADAS) and automated driving. As the first product under the new autonomy platform, Renesas released the R-Car V3M high-performance image recognition system-on-chip (SoC) optimized primarily for use in smart camera applications, as well as surround view systems or even LiDARs.

The new R-Car V3M SoC complies with the ISO 26262 functional safety standard, delivers low-power hardware acceleration for vision processing, and is equipped with a built-in image signal processor (ISP), freeing up board space and reducing system manufacturers’ system costs. Renesas is exhibiting its first Renesas autonomy demonstrator, developed based on the new R-Car V3M SoC, at DevCon Japan in Tokyo.

More... | Comments (0)

Continental developing Road Condition Observer for active safety

April 04, 2017

Continental is developing a new system called Road Condition Observer that uses vehicle sensors and cloud data to classify a road surface as dry, wet, covered with snow or icy and to assess the grip of the road surface.

This knowledge, in turn, allows the vehicle to adjust the functions of advanced driver assistance systems to the actual road conditions, said Bernd Hartmann, Head of the Enhanced ADAS (Advanced Driver Assistance Systems) & Tire Interactions project group within the Advanced Engineering department of Continental’s Chassis & Safety division.

More... | Comments (0)

Bosch and Daimler partner to develop Level 4, 5 autonomous driving systems

Bosch and Daimler are collaborating to advance the development of fully automated and driverless driving. The two companies have entered into a development agreement to bring fully automated (SAE Level 4) and driverless (SAE Level 5) driving to urban roads by the beginning of the next decade. The objective is to develop software and algorithms for an autonomous driving system.

The prime objective of the project is to achieve the production-ready development of a driving system which will allow cars to drive fully autonomously in the city. The idea behind it is that the vehicle should come to the driver rather than the other way round. Within a specified area of town, customers will be able to order an automated shared car via their smartphone. The vehicle will then make its way autonomously to the user and the onward journey can commence.

More... | Comments (0)

NASA’s hybrid computer enables Raven’s autonomous docking capability

March 22, 2017

A hybrid computing system developed at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, is the enabling technology behind an ambitious experiment testing a relative navigation and autonomous docking capability known as Raven.

Developed by the Satellite Servicing Projects Division (SSPD), the carry-on luggage-sized module was launched 19 February aboard SpaceX’s Dragon spacecraft, along with other experiments deployed outside the International Space Station on an experiment pallet. Raven is testing and maturing visible, infrared and LiDAR sensors and machine-vision algorithms; the module will bring NASA one step closer to realizing a groundbreaking autopilot capability that can be applied to many NASA missions for decades to come.

More... | Comments (0)

Mitsubishi Electric develops technologies for automated mapping and extraction of transitions in mapping landscape for high-precision 3D maps

March 16, 2017

Mitsubishi Electric Corporation has developed two new technologies to support the creation and maintenance of high-precision 3D maps required for autonomous driving (earlier post): AI-based automated mapping; and the extraction of transitions in mapping landscapes.

The technologies are based on the company’s own Mobile Mapping System (MMS) (earlier post) for the creation of highly precise three-dimensional maps that provide static information of roads and surrounding objects. Both technologies will be exhibited for the first time at CeBIT 2017 in Hannover, Germany from 20-24 March 2017.

More... | Comments (0)

Intel to acquire Mobileye for $15.3 billion; targeting autonomous driving

March 13, 2017

Intel Corporation announced a definitive agreement under which Intel would acquire Mobileye, a global leader in the development of computer vision and machine learning, data analysis, localization and mapping for advanced driver assistance systems and autonomous driving. A subsidiary of Intel will commence a tender offer to acquire all of the issued and outstanding ordinary shares of Mobileye for $63.54 per share in cash, representing an equity value of approximately $15.3 billion and an enterprise value of $14.7 billion.

The combination is expected to accelerate innovation for the automotive industry and position Intel as a leading technology provider in the fast-growing market for highly and fully autonomous vehicles.

More... | Comments (10)

Renault and Heudiasyc create shared research facility for perception and localization systems for autonomous vehicles

March 03, 2017

Renault and Heudiasyc (Heuristics and Diagnostics for Complex Systems), a joint research unit formed by UTC (Université de Technologie de Compiègne) and the CNRS, have created SIVALab, a laboratory specializing in localization and perception systems for autonomous vehicles. SIVALab (a French acronym for Integrated Systems for Autonomous Vehicles Lab) is based in Compiègne, north of Paris, France.

This scientific and technological partnership has been set up for an initial, extendable period of four years. It is founded on an existing association that began more than 10 years ago and will use the Renault ZOE-based autonomous vehicle platforms developed by Heudiasyc. SIVALab is being created to provide a structure geared to long-term scientific developments and major programs.

More... | Comments (0)

Toyota Research Institute displays Gen 2 autonomous test vehicle; machine vision and machine learning

The Toyota Research Institute (TRI) displayed its 2.0 generation advanced safety research vehicle at the company’s Prius Challenge event in Sonoma, California. (Earlier post.) The all-new test vehicle will be used to explore a full range of autonomous driving capabilities.

The new advanced safety research vehicle is the first autonomous testing platform developed entirely by TRI, and reflects the rapid progress of its autonomous driving program, said TRI CEO Gill Pratt. The system is computationally rich and perception-rich, focusing heavily on machine vision and machine learning. The layered and overlapping LiDAR, radar and camera sensor array reduces the need to depend too heavily on high-definition maps—especially for near-term systems which will be designed for use in areas where such maps don’t yet exist.

More... | Comments (1)

Qualcomm and TomTom partner on crowdsourcing high-definition mapping data for autonomous driving

February 27, 2017

Qualcomm Technologies is working with TomTom on using the Qualcomm Drive Data Platform for high-definition (HD) map crowdsourcing, to accelerate the future of autonomous driving. Qualcomm Drive Data Platform intelligently collects and analyzes data from different vehicle sensors, supporting smarter vehicles to determine their location, monitor and learn driving patterns, perceive their surroundings and share this perception with the rest of the world reliably and accurately.

TomTom’s HD Map, including RoadDNA, is a highly accurate, digital map-based product, which assists automated vehicles in precisely locating themselves on the road and helps determine which way to maneuver, even when traveling at high speeds.

More... | Comments (0)

New ultrafast camera for self-driving vehicles and drones

February 17, 2017

Scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed an ultrafast high-contrast camera that could help self-driving cars and drones see better in extreme road conditions and in bad weather.

Unlike typical optical cameras, which can be blinded by bright light and are unable to make out details in the dark, NTU’s new smart camera can record the slightest movements and objects in real time. The new camera records the changes in light intensity between scenes at nanosecond intervals, much faster than conventional video, and it stores the images in a data format that is many times smaller as well.
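
The sketch below (Python/NumPy, illustrative only and not NTU’s actual sensor interface) shows the general idea behind such event-based output: per-pixel brightness-change events with fine-grained timestamps that can be accumulated into a frame on demand; the resolution, event values and timestamps are all synthetic assumptions.

    # Illustrative sketch of event-camera-style data: per-pixel brightness-change
    # events (t, x, y, polarity) accumulated into a frame for visualization.
    # This shows the general idea only; it is not NTU's actual sensor interface.
    import numpy as np

    H, W = 240, 320
    rng = np.random.default_rng(0)
    n = 10_000
    events = np.stack([
        rng.uniform(0, 1e-3, n),              # timestamps in seconds (assumed)
        rng.integers(0, W, n),                # x
        rng.integers(0, H, n),                # y
        rng.choice([-1, 1], n),               # polarity: brighter (+1) or darker (-1)
    ], axis=1)

    def accumulate(events, t0, t1):
        """Sum event polarities per pixel inside a time window [t0, t1)."""
        frame = np.zeros((H, W))
        sel = (events[:, 0] >= t0) & (events[:, 0] < t1)
        for t, x, y, p in events[sel]:
            frame[int(y), int(x)] += p
        return frame

    print(accumulate(events, 0.0, 5e-4).sum())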

More... | Comments (7)

Volkswagen and Mobileye partner on autonomous driving; REM in VWs in 2018

February 14, 2017

Volkswagen and Mobileye will implement a new navigation standard for autonomous driving starting in 2018. Future Volkswagen models will use the camera-based map and localization technology Road Experience Management (REM) from Mobileye. (Earlier post.)

REM uses crowd-sourcing (data from many cars – the swarm) to generate real-time data for precise localization and acquisition of high-definition track data. Volkswagen cars, which are equipped with front cameras, will acquire lane markings and road information via optical sensor systems from Mobileye; this information flows in compressed format into a cloud. This fleet data is used for continuous improvement of high-definition navigation maps with highly precise localization capability.

More... | Comments (3)

LeddarTech selects IDT to develop new LeddarCore IC for mass-market solid-state LiDARs; ADAS and autonomous driving

February 10, 2017

LeddarTech Inc. and Integrated Device Technology, Inc. (IDT) have entered into a partnership agreement jointly to develop and supply the LeddarCore LCA2 integrated circuits. (Earlier post.) The LeddarCore is a receiver IC which is a key element within an automotive LiDAR system. This newest generation of LeddarCore IC enables solid-state implementations of high-performance, low-cost automotive LiDARs, which are required for the mass-market deployment of semi-autonomous and autonomous vehicles.

As part of the agreement, IDT, a developer of complete mixed-signal solutions for automotive, communications, computing, consumer, and industrial markets, will leverage its advanced expertise for component requirements analysis, architecture, design, development, characterization, qualification and transfer to manufacturing of the LCA2.

More... | Comments (2)

New high-resolution time-to-digital converter from ams offers better object detection and avoidance for LiDAR

January 24, 2017

ams AG, a leading provider of high performance sensor solutions and analog ICs, has launched a new version of its market-leading time-to-digital converter (TDC) offering improved speed and precision together with low power consumption. The new TDC-GPX2 also features standard low-voltage differential signaling (LVDS) and serial peripheral interface (SPI) connections, and a new, smaller 9mm x 9mm QFN64 package.

TDCs from ams, which can measure short time intervals with great precision, are widely used in light detection and ranging (LiDAR) and laser-ranging devices, in positron emission tomography (PET) medical scanners, and in automated test equipment (ATE). The introduction of the TDC-GPX2 means that these applications can benefit from resolution as fine as 10 ps and a new high sampling rate of up to 70 Msamples/s.
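
As a rough illustration of why picosecond timing matters for LiDAR and laser ranging, the back-of-the-envelope sketch below (Python) converts time-of-flight into range: at 10 ps per TDC step, the range quantization works out to roughly 1.5 mm. The round-trip time in the example is an arbitrary assumption.

    # Back-of-the-envelope sketch: a time-to-digital converter's timing resolution
    # sets the range quantization of a pulsed LiDAR, since range = c * t_flight / 2.
    C = 299_792_458.0          # speed of light, m/s

    def range_from_tof(t_flight_s):
        return C * t_flight_s / 2.0

    tdc_resolution = 10e-12    # 10 ps, as quoted for the TDC-GPX2
    print(range_from_tof(tdc_resolution) * 1e3, "mm per TDC step")   # ~1.5 mm
    print(range_from_tof(667e-9), "m for an assumed 667 ns round trip")  # ~100 m target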

More... | Comments (3)

Koito and Quanergy collaborate to design automotive headlight concept with built–in solid-state LiDAR

January 07, 2017

Koito Manufacturing Co., Ltd., the largest global maker of automotive headlights, and Quanergy Systems, Inc., a leading provider of LiDAR sensors and smart sensing solutions, are collaborating to design an automotive headlight concept with built-in Quanergy S3 solid state LiDAR sensors (earlier post). The Koito headlight with built-in sensors is on display at CES 2017.

The Koito headlights, which will be located on the corners of a vehicle, each incorporate two compact Quanergy S3 solid-state LiDARs that sense forward and to the side, providing real-time, long-range 3D views of the environment around the vehicle and the ability to recognize and track objects.

More... | Comments (9)

Next-gen Audi A8 to feature MIB2+, series debut of zFAS domain controller, Mobileye image recognition with deep learning; Traffic Jam Pilot

January 05, 2017

Audi’s next-generation A8, premiering this year, will feature the first implementation of the MIB2+ (Modular Infotainment Platform). The key element in this new implementation of the MIB is NVIDIA’s Tegra K1 processor (earlier post), which makes new functions possible and has the computing power needed to support several high-resolution displays—including the second-generation Audi virtual cockpit. Onboard and online information will merge, making the car part of the cloud to a greater degree than ever.

The A8 also marks the series debut of the central driver assistance controller (zFAS), which also features the K1; in the future, the X1 processor (earlier post) will be applied in this domain controller. The zFAS, developed in collaboration with TTTech, Mobileye, NVIDIA and Delphi, also integrates a Mobileye image processing chip. (Earlier post.)

More... | Comments (1)

Renesas Electronics and TTTech deliver highly automated driving platform

Renesas Electronics and TTTech Computertechnik AG, a global leader in robust networking and safety controls, have developed a highly automated driving platform (HADP). The new HADP is a prototype electronic control unit (ECU) for mass production vehicles with integrated software and tools, which demonstrates how to use Renesas and TTTech technologies combined in a true automotive environment for autonomous driving. The HADP accelerates the path to mass production for Tier 1s and OEMs.

The newly released HADP is the first outcome of the collaboration between TTTech and Renesas announced in January 2016 (earlier post), and is an extended version of the HAD solution kit released in October 2016. It is based on dual R-Car H3 system-on-chips (SoCs) (earlier post) and the RH850/P1H-C microcontroller (MCU).

More... | Comments (0)

Audi & NVIDIA partner to deliver fully automated driving with AI starting in 2020; piloted Q7 w/ neural network CES demo

Audi announced a partnership with NVIDIA to use artificial intelligence in delivering highly automated vehicles starting in 2020. Deep learning technology will enable skilled handling of real-road complexities, delivering safer automated vehicles earlier. The first phase of this expanded collaboration between the nearly decade-long partners focuses on NVIDIA DRIVE PX, which uses trained AI neural networks to understand the surrounding environment, and to determine a safe path forward. (Earlier post.)

Audi and NVIDIA have combined their engineering and visual computing technologies in the past on Audi innovations such as Audi MMI navigation and the Audi virtual cockpit. Later this year Audi will introduce the next-generation Audi A8 featuring Traffic Jam Pilot—the world’s first Level 3 automated vehicle (as defined by SAE International) equipped with a first-generation central driver assistance domain controller (zFAS) that integrates NVIDIA computing hardware and software. (Earlier post.)

More... | Comments (6)

BMW, Intel, Mobileye: 40 autonomous BMWs to be on road by 2H 2017; standards-based open platform for autonomy

BMW Group, Intel and Mobileye announced that a fleet of approximately 40 autonomous BMW vehicles will be on the roads by the second half of 2017, demonstrating the significant advancements made by the three companies towards fully autonomous driving. Revealing this at a podium discussion held during a joint press conference at CES, the companies further explained that the BMW 7 Series will employ advanced Intel and Mobileye technologies during global trials starting in the US and Europe.

In July 2016, BMW Group, Intel and Mobileye announced a collaboration to bring solutions for highly and fully automated driving into series production by 2021. The three said they would create a standards-based open platform—from door locks to the datacenter—for the next generation of cars. (Earlier post.) The companies have since developed a scalable architecture that can be adopted by other automotive developers and carmakers to pursue state of the art designs and create differentiated brands. The offerings scale from individual key integrated modules to a complete end-to-end solution providing a wide range of differentiated consumer experiences.

More... | Comments (0)

Renesas Electronics unveils RH850/V1R-M automotive radar solution for ADAS and autonomous driving vehicles

January 04, 2017

Advanced semiconductor supplier Renesas Electronics Corporation introduced the RH850/V1R—its first product from the new RH850-based, 32-bit, automotive radar microcontroller (MCU) series—that will deliver the high performance and features required for enabling future advanced driver assistance systems (ADAS) and autonomous driving vehicles. The RH850/V1R-M includes a digital signal processor (DSP) and high speed serial interfaces and is specifically designed for middle- to long-range radars.

Vehicles are being equipped with a broad spectrum of sensors such as cameras, LiDAR and ultrasonic sensors to support expanded advanced driver assistance (ADAS) and emerging autonomous driving functionality. Radar sensors are needed for ADAS applications—including advanced emergency braking and adaptive cruise control—because, unlike other sensors, radar is not degraded by environmental conditions such as rain, fog or changing light.

More... | Comments (0)

Qualcomm introducing Drive Data platform for sensor fusion

Qualcomm is introducing the Qualcomm Drive Data Platform to intelligently collect and analyze information from a vehicle’s sensors. Cars will be able to determine their location with lane-level accuracy, monitor and learn driving patterns, perceive their surroundings, and share this reliable and accurate data with the rest of the world.

These capabilities will be key for many connected car applications, from shared mobility and fleet management to 3D high-definition mapping and automated driving. Qualcomm Drive Data platform is built on three pillars: heterogeneous connectivity; precise positioning; and on-device machine learning, all integrated into the Qualcomm Snapdragon solution.

More... | Comments (1)

Mitsubishi Electric showcasing 3D Advanced Mobile Mapping System at CES 2017

January 03, 2017

Mitsubishi Electric Corporation, along with Mitsubishi Electric US, Inc., will display a future concept of the recently released new model of its Mitsubishi Mobile Mapping System, the MMS-G220, at CES 2017. The MMS-G220 is a highly accurate measuring system using car-mounted GPS antennas, laser scanners and cameras. (Earlier post.)

The system gathers 3-D positioning data of road surfaces and roadside features to an absolute accuracy of 4 inches (10 cm), allowing the creation of comprehensive 3D maps to the level of accuracy needed to support autonomous driving.

More... | Comments (3)

Lucid Motors chooses Mobileye as partner for autonomous vehicle technology

December 30, 2016

Luxury EV developer Lucid Motors and Mobileye N.V. announced a collaboration to enable autonomous driving capability on Lucid vehicles. Lucid will launch its first car, the Lucid Air (earlier post), with a complete sensor set for autonomous driving from day one, including camera, radar and lidar sensors.

Lucid chose Mobileye to provide the primary compute platform, full 8-camera surround view processing, sensor fusion software, Road Experience Management (REM) crowd-based localization capability, and reinforcement learning algorithms for Driving Policy. These technologies will enable a full Advanced Driver Assistance System (ADAS) suite at launch, and then enable a logical and safe transition to autonomous driving functionality through over-the-air software updates.

More... | Comments (5)

HERE and Mobileye to partner on crowd-sourced HD mapping for automated driving

December 29, 2016

High-definition (HD) mapping company HERE and Mobileye, developer of computer vision and machine learning, data analysis, localization and mapping for Advanced Driver Assistance Systems and autonomous driving, plan to enter a strategic partnership that links their respective autonomous driving technologies into an enhanced industry-leading offering for automakers.

Under the partnership, Mobileye’s Roadbook—a detailed, cloud-based map of localized drivable paths and visual landmarks constantly updated in real-time—will be integrated as a data layer in HERE HD Live Map, HERE’s real-time cloud service for partially, highly and fully automated vehicles. Roadbook information will provide an important additional layer of real-time contextual awareness by gathering landmark and roadway information to assist in making a vehicle more aware of—and better able to react to—its surroundings, as well as to allow for more accurate vehicle positioning on the road.

More... | Comments (0)

Ford introducing next-gen Fusion Hybrid autonomous development vehicle at CES and NAIAS in January

December 28, 2016

Ford Motor Company is introducing its next-generation Fusion Hybrid autonomous development vehicle; the car will first appear at CES 2017 and the North American International Auto Show in January. The new vehicle uses the current Ford autonomous vehicle platform, but ups the processing power with new computer hardware.

Electrical controls are closer to production-ready, and adjustments to the sensor technology, including placement, allow the car to better see what’s around it. New LiDAR sensors have a sleeker design and more targeted field of vision, which enables the car to now use just two sensors rather than four, while still getting just as much data.

More... | Comments (4)

TriLumina to demo 256-pixel 3D solid-state LiDAR and ADAS systems for autonomous driving at CES 2017

December 27, 2016

At CES 2017, TriLumina (earlier post)—a spin-out from Sandia National Laboratories—will demonstrate, in collaboration with LeddarTech (earlier post), an innovative 256-pixel, 3D LiDAR solution for autonomous driving applications powered by TriLumina’s breakthrough laser illumination module and LeddarTech’s LeddarCore ICs.

TriLumina has developed eye-safe, vertical-cavity surface-emitting lasers (VCSELs). The TriLumina illumination modules replace the expensive, bulky scanning LiDARs being used in current autonomous vehicle demonstration programs with high resolution and long-range sensing in a small, robust and cost-effective package.

More... | Comments (3)

U of Waterloo Autonomoose autonomous vehicle on the road in Canada

December 23, 2016

Researchers from the University of Waterloo Centre for Automotive Research (WatCAR) in Canada are modifying a Lincoln MKZ Hybrid to autonomous drive-by-wire operation. The research platform, dubbed “Autonomoose”, is equipped with a full suite of radar, sonar, lidar, inertial and vision sensors; an NVIDIA DRIVE PX 2 AI platform (earlier post) to run a complete autonomous driving system, integrating sensor fusion, path planning, and motion control software; and a custom autonomy software stack being developed at Waterloo as part of the research.

Recently, the Autonomoose autonomously drove a crew of Ontario Ministry of Transportation officials to the podium of a launch event to introduce the first car approved to hit the roads under the province’s automated vehicle pilot program.

More... | Comments (0)

LeddarTech showcasing 2D and 3D solid-state LiDARs for mass-market autonomous driving deployments; Leddar Ecosystem

December 16, 2016

At CES 2017, LeddarTech will be showcasing 2D and 3D high-resolution LiDAR solutions for autonomous driving applications based on its next-generation LeddarCore ICs and developed with the collaboration of leading-edge suppliers and partners from the newly-established Leddar Ecosystem. (Earlier post.)

Presented publicly for the first time, these systems demonstrate the scalability of Leddar technology and its ability to meet the high levels of performance, resolution, and cost-effectiveness required by Tier 1 suppliers and OEMs for mass-market autonomous driving applications. The production versions of these LiDAR systems will offer resolutions of up to 512×64 over a field of view of 120×20 degrees, and detection ranges that exceed 200 m for pedestrians and 300 m for vehicles.

More... | Comments (2)

Velodyne LiDAR announces new design for miniaturized, low-cost solid-state LiDAR sensors using GaN technology

December 13, 2016

Velodyne LiDAR announced a new design for a solid-state LiDAR sensor that can deliver a subsystem cost of less than US$50 when sold in high-volume manufacturing scale. The technology will impact the proliferation of LiDAR sensors in multiple industry sectors, including autonomous vehicles, ridesharing, 3D mapping, and drones.

LiDAR sensors that leverage this new design will be less expensive, easier to integrate due to their smaller size, and more reliable as a result of fewer moving parts. The technology can also be integrated in Velodyne LiDAR’s existing Puck form factors.

More... | Comments (2)

Daimler joining MIT CSAIL Alliance Program for AI work; cognitive vehicles

December 07, 2016

Daimler is becoming a new member of the MIT CSAIL Alliance Program. MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is the largest research laboratory at MIT and one of the world’s most important centers of information technology research. With 1,000 members and more than 100 principal investigators coming from eight departments, CSAIL includes approximately 50 research groups organized into three focus areas: artificial intelligence, systems and theory.

Key CSAIL initiatives currently underway include tackling the challenges of big data, developing new models for wireless and mobile systems, securing computers and the cloud against cyber attacks, rethinking the field of artificial intelligence, and developing the next generation of robots. CSAIL Alliances is a gateway into the lab for organizations seeking a closer connection to the work, researchers and students of CSAIL.

More... | Comments (9)

Delphi & Mobileye to showcase Centralized Sensing Localization and Planning (CSLP) autonomous driving system in public demo at CES 2017

November 30, 2016

Delphi Automotive PLC and Mobileye will showcase their Centralized Sensing Localization and Planning (CSLP) automated driving system—which will be ready for production by 2019—on a 6.3-mile urban and highway combined public route in Las Vegas for CES 2017. (Earlier post.)

The partners said that CSLP is the first turnkey, fully integrated automated driving solution with an industry-leading perception system and computing platform. (Intel will provide the system-on-a-chip (SOC) for the systems.) The Las Vegas drive will tackle everyday driving challenges such as highway merges, congested city streets with pedestrians and cyclists and a tunnel.

More... | Comments (0)

Hyliion developing hybrid system for semi-trailers

November 27, 2016

Hyliion, a start-up founded by a group of graduate students at Carnegie Mellon, has developed an add-on hybrid system for semi-trailers. Combining regenerative braking and power boost for the trailer to reduce on-the-road fuel consumption, the system also functions as an auxiliary power unit to reduce engine-on idling. Their patent application describing the basics of the system was published earlier this month.

The Hyliion System—comprising motor, battery and control electronics—can power a truck cab for 20 hours, out-performing an industry-standard, idle-free all-electric APU. Overall fuel savings are upwards of 30%, according to the company.

More... | Comments (9)

Volkswagen’s 10-year evolution of Park Assist; heading toward trained parking and higher levels of autonomy

November 26, 2016

Volkswagen first introduced a parking assistance system based on ultrasonic sensors in the early 1990s. However, it was the “Park Assist” Gen 1 system presented in the Touran in 2007 that marked a foundational point in the commercial development of the technology. Once activated, Park Assist used special, side-oriented ultrasonic sensors to detect parallel parking spaces on the left and right sides of the road as the car passed them, enabling semi-automatic parking for the first time.
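
As a rough illustration of the idea (not Volkswagen’s algorithm), the sketch below (Python) scans a series of side-facing ultrasonic range readings, sampled as the car drives past parked vehicles, for a gap long enough for parallel parking; the thresholds, sample spacing and readings are all assumptions.

    # Hedged sketch (not Volkswagen's algorithm): scanning side-facing ultrasonic
    # range readings, sampled while driving past parked cars, for a gap long
    # enough for parallel parking. Thresholds are illustrative assumptions.
    def find_parking_gap(ranges_m, step_m, gap_threshold_m=1.5, min_gap_len_m=5.0):
        """ranges_m: lateral distance readings taken every step_m metres of travel."""
        gap_start = None
        for i, r in enumerate(ranges_m):
            if r > gap_threshold_m:                 # no obstacle beside the car
                if gap_start is None:
                    gap_start = i
            else:                                   # obstacle: close any open gap
                if gap_start is not None and (i - gap_start) * step_m >= min_gap_len_m:
                    return gap_start * step_m, i * step_m
                gap_start = None
        return None

    readings = [0.6]*20 + [2.5]*60 + [0.7]*20       # parked car, 6 m gap, parked car
    print(find_parking_gap(readings, step_m=0.1))   # -> (2.0, 8.0)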

Volkswagen engineers have continued to enhance the functionality, leading to the release of Gen 3 Park Assist in 2014, with a clear roadmap to the deployment of higher levels of autonomy, including trained parking: fully automated parking with a one-off training process. At a recent visit to Volkswagen’s Ehra proving ground (Prüfgelände Ehra), Green Car Congress had the opportunity to see a prototype of trained parking in action.

More... | Comments (3)

nuTonomy to test its self-driving cars on specific public roads in Boston

November 21, 2016

nuTonomy, developer of software for self-driving cars, has signed a Memorandum of Understanding (MOU) with the City of Boston and the Massachusetts Department of Transportation that authorizes nuTonomy to begin testing its growing fleet of self-driving cars on specific public streets in a designated area of Boston.

nuTonomy will begin testing its self-driving Renault Zoe electric vehicle before the end of the year in the Raymond L. Flynn Marine Park in the Seaport section of the city. nuTonomy outfits its vehicles with a software system which has been integrated with high-performance sensing and computing components to enable safe operation without a driver. The company’s autonomous and robotics technology system grew out of research conducted in MIT labs run by nuTonomy co-founders Karl Iagnemma and Emilio Frazzoli.

More... | Comments (2)

Hyundai introduces new autonomous IONIQ concept at AutoMobility LA

November 16, 2016

Hyundai Motor Company introduced the Autonomous IONIQ concept during its press conference at AutoMobility LA (Los Angeles Auto Show). With a design resembling the rest of the IONIQ lineup (earlier post), the vehicle is one of the few self-driving cars in development to have a LiDAR system hidden in its front bumper instead of installed on the roof, enabling it to look like any other car on the road and not a high school science project.

Hyundai’s goal for the autonomous IONIQ concept was to keep the self-driving systems as simple as possible. This was accomplished by using the production car’s forward-facing Smart Cruise Control radar and Lane Keep Assist cameras and integrating them with LiDAR technology.

More... | Comments (0)

Intel to invest more than $250M over next two years in autonomous driving; “Data is the new oil”

November 15, 2016

In a keynote address at the AutoMobility LA conference, Intel CEO Brian Krzanich announced that Intel Capital is targeting more than $250 million of additional new investments over the next two years to make fully autonomous driving a reality. This is the first time Intel is keynoting at an automotive conference, signifying how critical the automotive market has become for the company.

These investments will drive the development of technologies that push the boundaries on next-generation connectivity, communication, context awareness, deep learning, security, safety and more. Drilling down into the areas that will be fueled by the fresh investments, Krzanich highlighted technologies that will drive global Internet of Things (IoT) innovation in transportation; areas where technology can directly mitigate risks while improving safety, mobility, and efficiency at a reduced cost; and companies that harness the value of the data to improve reliability of automated driving systems.

More... | Comments (21)

Renesas Electronics delivers 2nd-gen ADAS view solution kit for surround view, electronic mirrors and driver monitoring for autonomous driving

November 08, 2016

Renesas Electronics Corporation has introduced a new all-in-one Advanced Driver Assistance Systems (ADAS) view solution kit. Expanding on the success of the first-generation ADAS surround view kit launched in October 2015, Renesas’ second-generation ADAS view solution kit with up to eight cameras realizes next-generation electronic mirrors, driver monitoring and surround view systems at the same time.

Sensor fusion, which combines and processes information collected from automotive cameras and radars so that vehicles can recognize their surroundings, has become standard in autonomous driving and ADAS applications. A 360-degree surround view is expected to become an essential feature available in all vehicle segments. Additionally, mirrors will be replaced by cameras, and driver monitoring features will be required for autonomous driving and to increase safety.

More... | Comments (0)

Groupe Renault announces strategic partnership with computer vision innovator Chronocam

Groupe Renault has entered into a strategic development agreement with Chronocam SA (earlier post), a developer of biologically-inspired vision sensors and computer vision solutions for automotive applications. This agreement will focus on further developing and applying Chronocam’s innovative approach to sensing and processing visual inputs to Renault’s Advanced Driver Assistance Systems (ADAS) and autonomous driving developments.

Renault previously announced an investment in Chronocam’s Series B round of funding, which raised $15 million for the Paris-based start-up from a group of international venture capital investors including Intel Capital, Robert Bosch Venture Capital, iBionext, 360 Capital and CEA investissement.

More... | Comments (0)

Toshiba advances deep learning with extremely low-power neuromorphic processor; supporting IoT edge devices

November 07, 2016

Toshiba has developed what it calls Time Domain Neural Network (TDNN)—a neural network using a time-domain analog and digital mixed-signal processing technique—based on a new, extremely low-power-consumption neuromorphic semiconductor circuit to perform processing for deep learning. (The acronym TDNN (time-delay neural network) is also used more broadly to describe the feed-forward neural networks first described in a 1989 paper by Waibel et al.)

Deep learning—as could be applied, for example, in autonomous driving—requires massive numbers of calculations, typically executed on high performance processors that consume a lot of power. However, bringing the power of deep learning to IoT edge devices such as sensors and smart phones requires highly energy-efficient ICs that can perform the large number of required operations while consuming extremely little energy.

More... | Comments (1)

Daimler and Valens partner to bring HDBaseT Automotive to vehicles in the near future

At Electronica 2016 in Munich, Israeli HDBaseT chip maker Valens and Daimler announced their collaboration to bring HDBaseT Automotive into cars in the near future. Daimler has selected HDBaseT Automotive as the technology of choice to guarantee high performance of advanced infotainment, ADAS, and telematics systems.

Valens, as the inventor of HDBaseT and founder of the HDBaseT Alliance, brings the technology and expertise to accomplish the goal of commercializing HDBaseT-enabled vehicles in the near future. (Earlier post.)

More... | Comments (0)

New Telit autonomous navigation IoT module relies on internal sensors to deliver class-leading dead reckoning accuracy

November 06, 2016

Telit announced commercial availability of the SL869-3DR, a GNSS (global navigation satellite system) module for global use which leverages information from internal gyros, accelerometers and a barometric pressure sensor to perform dead reckoning (DR) navigation for application areas such as track & trace and in-vehicle systems.

The module delivers accurate position data either directly from its multi-constellation receiver or from a fully autonomous DR system, requiring no connections to external devices or components other than an antenna for satellite signal reception and power. The module allows integrators to design zero-installation, in-vehicle navigation and tracking devices for fleets and other commercial or consumer applications that operate simply perched on the dashboard, connected only to vehicle power.
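
As a simple illustration of the dead-reckoning principle (not the SL869-3DR’s actual filter), the sketch below (Python) propagates a planar position and heading from a gyro yaw rate and vehicle speed while no satellite fix is available; the speed, yaw rate and time step are arbitrary assumptions.

    # Minimal planar dead-reckoning sketch: propagate position and heading from a
    # gyro yaw rate and vehicle speed when no satellite fix is available. This
    # illustrates the basic idea only, not the SL869-3DR's actual filter.
    import math

    def dr_step(x, y, heading, speed_mps, yaw_rate_rps, dt):
        heading += yaw_rate_rps * dt
        x += speed_mps * math.cos(heading) * dt
        y += speed_mps * math.sin(heading) * dt
        return x, y, heading

    x = y = heading = 0.0
    for _ in range(100):                     # 10 s of driving at 10 m/s (assumed)
        x, y, heading = dr_step(x, y, heading, speed_mps=10.0,
                                yaw_rate_rps=math.radians(3.0), dt=0.1)
    print(round(x, 1), round(y, 1), round(math.degrees(heading), 1))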

More... | Comments (1)

Chronocam raises $15M in Series B; high-performance bio-inspired vision technology for autos and other machines

October 27, 2016

France-based Chronocam SA, a developer of biologically-inspired vision sensors and computer vision solutions for automotive, IoT and other applications requiring vision processing, raised $15 million in Series B financing. The funding comes from lead investor Intel Capital, along with iBionext, Robert Bosch Venture Capital GmbH, 360 Capital, CEAi and Renault Group.

Chronocam will use the investment to accelerate product development and commercialize its computer vision sensing and processing technology. The funding will also allow the company to expand into key markets, including the US and Asia.

More... | Comments (0)

Intel introducing new processor series dedicated for automotive applications

October 26, 2016

Intel is developing a new processor series dedicated to automotive applications. The A3900 series will enable a complete software-defined cockpit solution that includes in-vehicle infotainment (IVI), digital instrument clusters and advanced driver assistance systems (ADAS)—all in a single, compact and cost-effective SoC.

Intel announced the new automotive processor family along with its introduction of the new Intel Atom processor E3900 series for the Internet of Things (IoT). The A3900 series will allow car makers to offer new levels of determinism for real-time decision-making required in next-generation cars. It is currently sampling with customers and will be available in Q1 2017.

More... | Comments (1)

Tesla putting hardware for full autonomy in all models; temporary loss of some Gen1 Autopilot functions

October 20, 2016

Tesla announced that effective immediately, new Tesla vehicles—including Model 3—will have the hardware needed to support full autonomous driving.

The required software for full autonomous driving is still under development and will need validation and regulatory approval. In fact, Teslas with the new hardware will temporarily lack certain features currently available on Teslas with first-generation Autopilot hardware, including some standard safety features such as automatic emergency braking, collision warning, lane holding and active cruise control.

More... | Comments (31)

Oryx Vision raises $17M to create novel depth-sensing solution for autonomous vehicles; LiDAR replacement

October 19, 2016

Oryx Vision has emerged from stealth with a veteran team from the Israeli high-tech industry to build a novel depth-sensing solution for autonomous vehicles that overcomes some of the limitations of current LiDAR systems. Oryx has raised $17 million in Series A funding led by Bessemer Venture Partners (BVP), with additional participation from Maniv Mobility and Trucks VC. BVP Partner Adam Fisher will join Oryx’s board of directors.

In order to drive accurately and safely, autonomous vehicles need a highly detailed 3D view of their environment. Existing depth-sensing solutions rely mostly on LiDAR devices, which send short laser pulses while rotating, receive the reflected light back with photo-electric sensors, and thus construct a 3D map of the car’s surroundings, pixel by pixel. However, current LiDAR is mechanically complicated, expensive and has a severe range limit due to eye-safety considerations, Oryx says.
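
For context, the sketch below (Python/NumPy) shows how such pulse-by-pulse returns become a 3D point cloud: each return’s range at a known azimuth and elevation is converted to Cartesian coordinates. The beam angles and ranges are illustrative assumptions, not any specific sensor’s.

    # Sketch of how a rotating LiDAR's returns become a 3D point cloud: each pulse
    # yields a range at a known azimuth/elevation, converted to Cartesian x, y, z.
    # Beam angles and ranges below are illustrative, not any specific sensor's.
    import numpy as np

    azimuth = np.radians(np.arange(0, 360, 0.2))          # scan around the vehicle
    elevation = np.radians(np.linspace(-15, 15, 16))      # 16 assumed beam layers
    az, el = np.meshgrid(azimuth, elevation)
    ranges_m = np.full(az.shape, 20.0)                    # pretend every return is 20 m

    x = ranges_m * np.cos(el) * np.cos(az)
    y = ranges_m * np.cos(el) * np.sin(az)
    z = ranges_m * np.sin(el)
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    print(points.shape)                                   # (16 * 1800, 3) points per scan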

More... | Comments (0)

DENSO & Toshiba partner on Deep Neural Network-IP for image recognition systems for ADAS & automated driving

October 17, 2016

DENSO Corporation and Toshiba Corporation have reached a basic agreement to jointly develop an artificial intelligence technology called Deep Neural Network-Intellectual Property (DNN-IP), which will be used in the image recognition systems that the two companies have independently developed, to help achieve advanced driver assistance and automated driving technologies.

The partners expect DNN, an algorithm modeled after the neural networks of the human brain, to perform recognition processing as accurately as, or even better than, the human brain.

More... | Comments (0)

Infineon acquires Innoluce BV for high-performance solid-state LiDAR systems

October 11, 2016

Semiconductor company Infineon has acquired 100% of Innoluce BV, a fabless semiconductor company headquartered in Nijmegen. Based on the know-how of Innoluce, Infineon will develop chip components for high-performance light detection and ranging (LiDAR) systems. Both companies agreed to keep the terms confidential.

Founded in 2010 as an entrepreneurial spin-off of Royal Philips, Innoluce is headquartered in Nijmegen, The Netherlands, near the Dutch-German border. The company has strong expertise in micro-electro-mechanical systems (MEMS) and is a leading innovator of miniature laser scanning modules that integrate silicon-based solid-state MEMS micro-mirrors. Such micro-mirrors are necessary to steer the laser beams in automotive LiDAR systems.

More... | Comments (2)

HERE unveils next-generation open platform real-time data services for automotive industry

September 28, 2016

On the eve of the Paris Motor Show, HERE, the high-definition mapping and location services business acquired by Audi, BMW and Daimler (earlier post), announced next-generation vehicle-sourced data services. The HERE Open Location Platform will harness real-time data generated by the on-board sensors of connected vehicles—even from competing car brands—to create a live depiction of the road environment.

Drivers will be able to access this view of the road through four services that provide information on traffic conditions, potential road hazards, traffic signage and on-street parking at high quality. The goal is to ensure that drivers have more accurate and timely information with which they can make better driving decisions. HERE plans to make the services commercially available to any customers both within and outside the automotive industry from the first half of 2017.

More... | Comments (0)

Tesla leans on radar for Autopilot in Version 8 software

September 12, 2016

With the release of Version 8 of its software, Tesla has made many updates to its Autopilot function. The most significant change, however, is its new heavy reliance on the onboard radar. The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but originally was only meant to be a supplementary sensor to the primary camera and image processing system.

Now, however, Tesla is using more advanced signal processing to use the radar as a primary control sensor without requiring the camera to confirm visual image recognition. The changes come in the wake of the fatal crash (earlier post) in which Autopilot apparently did not detect the white side of the trailer against a brightly lit sky; nor did the driver. As a result, no brake was applied. In the aftermath of that incident, Tesla CEO Elon Musk tweeted that the company was “Working on using existing Tesla radar by itself (decoupled from camera) w temporal smoothing to create a coarse point cloud, like lidar”. (Earlier post.)
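
As a loose illustration of the temporal-smoothing idea Musk described (and not Tesla’s implementation), the sketch below (Python/NumPy) exponentially averages successive frames of radar detections on a 2D grid, so persistent obstacles reinforce over time while transient noise fades; the grid size, smoothing factor and detections are assumptions.

    # Hedged sketch of "temporal smoothing" of radar returns into a coarse point
    # cloud: exponentially average successive frames of detections on a 2D grid so
    # persistent obstacles reinforce while noise fades. Not Tesla's implementation.
    import numpy as np

    GRID = (100, 100)            # assumed 2D occupancy grid (range x bearing bins)
    alpha = 0.2                  # smoothing factor: higher = trust new frame more

    def smooth(prev_grid, detections):
        """detections: list of (row, col) radar hits for the current frame."""
        frame = np.zeros(GRID)
        for r, c in detections:
            frame[r, c] = 1.0
        return (1 - alpha) * prev_grid + alpha * frame

    grid = np.zeros(GRID)
    for _ in range(30):                          # a static obstacle seen every frame
        grid = smooth(grid, [(40, 50)])
    print(round(grid[40, 50], 3))                # approaches 1.0; isolated noise stays low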

More... | Comments (7)

DENSO looks to increase holding in FUJITSU TEN, making it a group company

September 10, 2016

Auto parts supplier DENSO Corporation, Fujitsu Limited, and Toyota Motor Corporation have reached a basic agreement to start consideration of changing the capital structure of automotive electronics manufacturer FUJITSU TEN, in which the three companies have stakes. DENSO is part of the Toyota Group.

In the automotive field, the interface between the driver and vehicle is becoming increasingly important due to remarkable technological innovations. Against this backdrop, DENSO has agreed with Fujitsu and Toyota to review specific changes to make FUJITSU TEN a group company of DENSO and to enhance cooperation between the two companies in developing in-vehicle ECUs, millimeter-wave radar (earlier post), advanced driver assistance / automated driving technologies, and basic electronic technologies, among others.

More... | Comments (0)

LeddarTech launches LeddarVu, a new scalable platform towards high-resolution LiDAR; Vu8 solid-state LiDAR

September 07, 2016

LeddarTech, a developer of solid-state LiDAR technology (earlier post), introduced LeddarVu, a new platform for the next generation of its Leddar detection and ranging modules. The LeddarVu platform combines the benefits of a very compact, modular architecture with superior performance, robustness and cost efficiency towards high-resolution LiDAR applications, such as autonomous driving.

Leveraging LeddarTech’s advanced, patented signal processing and algorithms, LeddarVu sensors will evolve along with the future generations of the LeddarCore ICs. As previously announced with the company’s development roadmap, upcoming iterations of LeddarCore ICs are expected to deliver ranges reaching 250 m, fields of view up to 140°, and up to 480,000 points per second (with a resolution down to 0.25° both horizontal and vertical), enabling the design of affordable LiDARs for all levels of autonomous driving, including the capability of mapping the environment over 360° around the vehicle.

More... | Comments (1)

Quanergy acquires Otus People Tracker software from Raytheon BBN for advanced autonomous driving and security LiDAR applications

August 29, 2016

Quanergy Systems, Inc., the provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), has acquired the Otus People Tracker software from Raytheon BBN Technologies. The software complements Quanergy’s existing software portfolio and, when used with Quanergy’s LiDAR sensors, creates an integrated hardware and software solution for advanced people detection and tracking applications within the security and autonomous driving markets.

Otus (named after a genus of owls) uses advanced algorithms to identify and to track people for safety and security in crowded environments at ranges exceeding 100 meters when used with Quanergy LiDAR sensors. The system features segmentation techniques identifying humans; background extraction; object clustering; sophisticated merge and split algorithms; persistent tracking algorithms; and other advanced features supporting robust crowd control. Support for multiple zones of interest is included, allowing users fine control over active monitoring.

More... | Comments (0)

Mobileye and Delphi to partner on SAE Level 4/5 automated driving solution for 2019

August 23, 2016

Mobileye and Delphi Automotive PLC are partnering to develop a complete SAE Level 4/5 automated driving solution. The program will result in an end-to-end production-intent fully automated vehicle solution, with the level of performance and functional safety required for rapid integration into diverse vehicle platforms for a range of customers worldwide.

The partners’ “Central Sensing Localization and Planning” (CSLP) platform will be demonstrated in combined urban and highway driving at the 2017 Consumer Electronics Show in Las Vegas and will be production-ready for 2019.

More... | Comments (7)

Solid-state LiDAR company Quanergy raises $90M in Series B; valuation passes $1B

Quanergy Systems, Inc., a leading provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), raised $90 million in Series B funding at a valuation well over $1 billion. Sensata Technologies, Delphi Automotive, Samsung Ventures, Motus Ventures and GP Capital participated in the round. This investment brings the company’s total funds raised to approximately $150 million.

Quanergy intends to use the investment and leverage its intellectual property to work with its partners in ramping up the production of its solid-state LiDAR sensors. These sensors use standard semiconductor manufacturing processes and have no moving parts on either a macro or a micro scale, offering significantly lower cost, higher reliability, superior performance, increased capability, smaller size and lower weight when compared to traditional mechanical sensors, which are sometimes called hybrid solid-state sensors.

More... | Comments (4)

TU Graz team uses monocrystalline Si as Li-ion anode; integrated micro batteries for on-board sensors

August 21, 2016

Electrochemists at TU Graz have used single crystalline acceptor-doped Si—as ubiquitously used in the semiconductor industry—as anode material for rechargeable Li-ion batteries. In an open access paper in the journal Scientific Reports, the team suggests that the use of such patterned monocrystalline Si (m-Si) anodes directly shaped out of the Si wafer is a highly attractive route to realize miniaturized, on-board fully integrated, power supplies for Si-based chips.

The microchip not only houses the electronics, but is at the same time an important part of a mini battery providing electrical energy, e.g. for sending and receiving information.

More... | Comments (0)

ABI Research: highly automated driving to spark adoption of centralized advanced driver assistance systems

August 17, 2016

As vehicles become highly independent and begin to drive and react to traffic on their own, autonomous systems will aggregate and process data from a variety of on-board sensors and connected infrastructure. This will force the industry to hit a hard reset on advanced driver assistance systems (ADAS) architectures, currently dominated by distributed processing and smart sensors.

Automotive OEMs will need to adopt new platforms based on powerful, centralized processors and high-speed low latency networking (e.g., Audi zFAS, earlier post). ABI Research forecasts 13 million vehicles with centralized ADAS platforms will ship in 2025.

More... | Comments (2)

Ford and Baidu invest $150M in Velodyne LiDAR

August 16, 2016

Velodyne LiDAR, Inc., a global leader in LiDAR (Light Detection and Ranging) technology, announced the completion of a combined $150 million investment from co-investors Ford Motor Company and China’s leading search engine company Baidu, Inc. The investment will allow Velodyne to rapidly expand the design and production of high-performance, cost-effective automotive LiDAR sensors, accelerating mass adoption in autonomous vehicle and ADAS applications and, with it, the critical, transformative benefits they provide.

Over the last decade, Velodyne developed four generations of hybrid solid-state LiDAR systems incorporating the company’s proprietary software and algorithms that interpret rich data gathered from the environment via highly accurate laser-based sensors to create high-resolution 3D digital images used for mapping, localization, object identification and collision avoidance.

More... | Comments (3)

Green Car Congress © 2017 BioAge Group, LLC. All Rights Reserved.