Jaguar introduces Gen 2, aluminum-intensive XF; up to 60 mpg US with diesel Ingenium engine
April 02, 2015
Jaguar has unveiled the second-generation, all-new Jaguar XF at the New York International Auto Show. The coupé-like design uses Jaguar’s aluminum-intensive (75% aluminum) architecture (earlier post) to enable weight savings of up to 190 kg (419 lbs)—making the 163PS diesel model 80 kg (176 lbs) lighter than the nearest competitor—plus an increase in torsional stiffness of up to 28%. The architecture also delivers improvements in packaging and proportions.
The powertrain range will consist of: 163PS (161 hp, 120 kW) and 180PS (178 hp, 132 kW) 2.0-liter diesels, manual and automatic; RWD 240PS (237 hp, 177 kW) 2.0-liter gasoline automatic; RWD 300PS (296 hp, 221 kW) 3.0-liter diesel automatic; and RWD and AWD 340PS (335 hp, 250 kW) and 380PS (375 hp, 279 kW) 3.0-liter gasoline automatics. With fuel economy as high as 60 mpg US (3.94 l/100 km; 104 g/km CO2) on the European combined cycle from the Ingenium diesel (earlier post), the all-new XF delivers an improvement of almost 20% compared to its predecessor.
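The power and fuel-economy figures quoted above follow from standard unit conversions; a quick sketch (the conversion factors are standard physical constants, not anything published by Jaguar):

```python
# Conversions behind the spec figures: metric horsepower (PS) to hp/kW,
# and European l/100 km consumption to US miles per gallon.
PS_TO_KW = 0.7355            # 1 PS in kilowatts
PS_TO_HP = 0.98632           # 1 PS in mechanical horsepower
MPG_US_PER_L100KM = 235.215  # mpg US = 235.215 / (l per 100 km)

def ps_to_kw(ps): return ps * PS_TO_KW
def ps_to_hp(ps): return ps * PS_TO_HP
def l100km_to_mpg_us(l100km): return MPG_US_PER_L100KM / l100km

print(round(ps_to_hp(163)), round(ps_to_kw(163)))  # -> 161 120
print(round(l100km_to_mpg_us(3.94)))               # -> 60
```

The same three factors reproduce every parenthetical figure in the powertrain list above.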
Toyota and Lexus roll out low-cost automated braking safety packages on RAV4 Hybrid and RX crossover
March 30, 2015
At separate press conferences Wednesday and Thursday at the New York Auto Show, Toyota will reveal the RAV4 Hybrid SUV, while Lexus unveils its all-new fourth-generation RX luxury crossover SUV. Both debuts will mark the arrival of new, multi-feature, integrated safety packages, each anchored by automated pre-collision braking and offered at a price below comparable systems across the auto industry.
Toyota Safety Sense (TSS) and Lexus Safety System+ (LSS+) are designed to support the driver’s awareness, decision making and vehicle operation over a wide range of speeds. Packaged together in an integrated system, their features help address three key areas of accident protection: preventing or mitigating rear collisions; keeping drivers within their lane; and enhancing road safety during night time driving. The systems are intended to address commonly occurring crash types according to traffic accident statistical analyses.
Delphi Automotive launching coast-to-coast automated drive
March 15, 2015
Delphi Automotive is launching a US coast-to-coast automated drive—the longest automated drive ever attempted in North America—to showcase its technology capabilities, gather data, and further advance the company’s active safety technology development in this rapidly growing segment of the auto industry. The coast-to-coast trip will launch near the Golden Gate Bridge in San Francisco on 22 March and will cover approximately 3,500 miles.
Recently demonstrated on the streets of Las Vegas at CES 2015, Delphi’s Audi SQ5 automated driving vehicle leverages a full suite of technologies and features to make this trip possible, including:
Mobileye unveils Gen4 system-on-chip EyeQ4; visual processing for ADAS and automated driving; design win for 2018
March 05, 2015
Mobileye N.V., a designer and developer of camera-based Advanced Driver Assistance Systems (ADAS) for the automotive industry, introduced its 4th-generation system-on-chip, the EyeQ4. Leveraging the company’s more than 15 years of expertise in designing computer-vision specific cores, the EyeQ4 consists of 14 computing cores out of which 10 are specialized vector accelerators with extremely high utilization for visual processing and understanding.
The first design win for EyeQ4 in series production has been secured with a global premium European car manufacturer, with production to start in early 2018. The EyeQ4 would be part of a scalable camera system ranging from monocular processing for collision-avoidance applications (in compliance with EU NCAP, US NHTSA and other regulatory requirements) up to a trifocal camera configuration supporting high-end customer functions including semi-autonomous driving. For those high-end functions, the EyeQ4 would support fusion with radars and scanning-beam lasers.
Freescale introduces new vision microprocessor targeting autonomous driving
March 02, 2015
Freescale Semiconductor has introduced the S32V vision microprocessor—the first automotive vision system-on-chip (SoC) with the requisite reliability, safety and security measures to automate and ‘co-pilot’ a self-aware car, the company said.
Leveraging a number of automotive-grade technologies, the S32V moves beyond the current, convenience-centric “assist” paradigm and toward an era where cars can capture data, process it and share control with drivers in critical situations. This capability establishes a bridge from the current “assist” era toward the fully autonomous vehicles of tomorrow, Freescale suggested.
Siemens researching use of infrastructure radar sensor networks to support parking management
February 26, 2015
Siemens is researching the use of infrastructure sensor networks in an advanced parking management solution intended to counter the increasing parking space crisis in cities. The system, analogous to vehicle-based research underway by Ford Motor on the same problem (earlier post), will be used this spring for the first time in a pilot project in Berlin.
In the Siemens concept, networks of stationary sensors collect information about the parking situation in cities. The information is forwarded to the drivers in order to make it easier for them to find unoccupied parking spaces. In addition, the data is transmitted to a parking management center so that cities can intelligently manage their parking spaces.
Volvo Cars announces production-viable autonomous driving system; targeting limited pilot customer rollout by 2017
February 19, 2015
Volvo Cars has developed what it says is a unique, complete system solution that makes it possible to integrate self-driving cars into real traffic. Based on its analysis of potential technical faults, Volvo Cars designed a complete production-viable autonomous driving system. The key is a complex network of sensors, cloud-based positioning systems and intelligent braking and steering technologies.
Volvo Cars’ Autopilot system is designed to be reliable enough to allow the car to take over every aspect of driving in autonomous mode. The technology advances a crucial step beyond the automotive systems demonstrated so far since it includes fault-tolerant systems. With the Drive Me project entering its second year (earlier post), Volvo Cars is thus moving rapidly towards its goal of placing 100 self-driving cars in the hands of customers on selected roads around Gothenburg by 2017.
Ford targeting smart parking as element of Smart Mobility; vehicle sensors, connectivity and big data
February 18, 2015
At CES 2015 in January, Ford CEO Mark Fields announced “Ford Smart Mobility”—a plan to use innovation to take Ford to the next level in connectivity, mobility, autonomous vehicles, the customer experience and big data. (Earlier post.) Three of the initial projects announced as part of the plan address the ongoing problem of urban parking.
Although there are “dozens” of apps available that can help a driver find a parking space, said Dave McCreadie, Ford’s manager of electric vehicle infrastructure and smart grid technology and part of the cross-functional group working on the mobility projects, “the data behind the apps seems a bit thin.” Ford’s idea is to leverage the cloud, in-vehicle sensors, connectivity and advanced data analysis including probability algorithms to provide a much more comprehensive picture of available parking without incurring a cost in infrastructure build-out.
Infineon and Hella develop new compact, lower-cost 24 GHz blind spot radar sensor
February 05, 2015
Semiconductor manufacturer Infineon Technologies, together with the German automotive supplier Hella, has developed innovative radio-frequency components for a 24 GHz radar sensor which reliably monitors the blind spot in the car’s rear section (Blind Spot Detection).
The module saves space and cost by integrating multiple, formerly separate components into one transceiver, and combines low power consumption with improved performance. This efficiency gain makes series production of the driver assistance system possible for vehicles outside the premium segment.
Jaguar Land Rover “Bike Sense” uses color, sound and vibration to help prevent accidents involving bicycles and motorbikes
January 21, 2015
At its Advanced Research Center in the UK, Jaguar Land Rover is developing a range of new technologies it calls “Bike Sense” that uses colors, sounds and touch (vibration) inside the car to alert drivers to potential hazards and prevent accidents involving bicycles and motorbikes. Nearly 19,000 cyclists are killed or injured on UK roads every year.
Sensors on the car will detect when another road user is approaching and identify it as bicycle or motorbike. Bike Sense will then make the driver aware of the potential hazard before the driver sees it. Rather than using a generic warning icon or sound, which takes time for the driver’s brain to process, Bike Sense uses lights and sounds that the driver will instinctively associate with the potential danger.
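As a rough illustration of the idea, the mapping from a detected two-wheeler to a direction-specific cue might look like the following sketch. The specific sounds, light zones and haptic cues here are illustrative assumptions in the spirit of the description, not JLR’s implementation:

```python
# Hypothetical Bike Sense-style alerting: pick a cue the driver will
# instinctively associate with the hazard, localized to the hazard's side,
# instead of a generic warning icon or tone.
def bike_sense_alert(hazard_type, bearing_deg):
    """hazard_type: 'bicycle' or 'motorbike'.
    bearing_deg: 0 = dead ahead, positive = right, negative = left
    (an illustrative convention)."""
    sound = {"bicycle": "bicycle bell", "motorbike": "motorbike horn"}[hazard_type]
    side = "right" if bearing_deg > 0 else "left"
    return {
        "sound": sound,                             # hazard-specific, not generic
        "light_zone": f"{side} door/window sill",   # glow on the hazard's side
        "haptic": f"tap on driver's {side} shoulder (seat actuator)",
    }

print(bike_sense_alert("bicycle", -35))
```

The point of the design, as the article notes, is that a hazard-matched, spatially localized cue is processed faster by the driver than a generic icon.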
New system uses monocular camera instead of expensive laser scanners for automated vehicle navigation with comparable performance
January 20, 2015
A doctoral candidate in computer science and engineering at the University of Michigan has developed a new software system that could reduce the high cost of self-driving and automated cars: it enables a vehicle to navigate using a single monocular camera with the same level of accuracy as laser scanners, at a fraction of the cost. His paper detailing the system was recently named best student paper at the Conference on Intelligent Robots and Systems in Chicago.
Ryan Wolcott’s system builds on the navigation systems used in other self-driving cars that are currently in development, including Google’s vehicle. These use three-dimensional laser scanning technology to create a real-time map of their environment, then compare that real-time map to a pre-drawn map stored in the system. By making thousands of comparisons per second, they’re able to determine the vehicle's location within a few centimeters.
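The compare-against-a-prior-map idea can be sketched in miniature: score candidate poses by how well the live sensor view matches the prior map as seen from each pose, and keep the best-scoring pose. This toy version uses a 1D intensity profile and sum-of-squared-differences scoring; the real systems described above make thousands of such comparisons per second against detailed pre-drawn maps.

```python
# Toy map-based localization: pick the candidate pose whose rendered view
# of the prior map best matches the live view.
import numpy as np

def localize(live_view, prior_map_render, candidate_poses):
    """live_view: array from the sensor; prior_map_render(pose) -> array.
    Returns the candidate pose whose rendered view best matches live_view."""
    def score(pose):
        rendered = prior_map_render(pose)
        # Negative sum of squared differences: higher means a better match.
        return -np.sum((live_view - rendered) ** 2)
    return max(candidate_poses, key=score)

# Demo: the "map" is a 1D intensity profile; a pose is a horizontal shift.
world = np.zeros(20)
world[8:12] = 1.0
def render(shift): return np.roll(world, shift)

live = np.roll(world, 3)  # the vehicle is actually shifted by 3
print(localize(live, render, candidate_poses=range(-5, 6)))  # -> 3
```

Replacing the 1D profile with camera imagery and the shift with a full 2D/3D pose gives the general shape of the approach, though the production systems use far more robust matching metrics than SSD.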
Honda introducing first predictive cruise control system in CR-V in Europe
January 09, 2015
Honda will introduce in Europe the first predictive cruise control system, which it calls Intelligent Adaptive Cruise Control (i-ACC); the system can foresee and automatically react to other vehicles cutting in to the equipped vehicle’s lane.
Based on extensive real-world research of typical European driving styles, Honda’s Intelligent Adaptive Cruise Control (i-ACC) uses a camera and radar to sense the position of other vehicles on the road. It then applies an algorithm to predict the likelihood of vehicles in neighboring lanes cutting in by evaluating the relationships among multiple vehicles, enabling the i-ACC-equipped vehicle to react quickly, safely and comfortably.
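To make the idea concrete, a cut-in predictor of this general shape could combine a vehicle’s lateral drift toward the host lane with the size of the gap it would merge into. This is an illustrative sketch with hypothetical features and weights, not Honda’s algorithm:

```python
# Illustrative cut-in prediction: estimate the probability that a
# neighboring-lane vehicle will cut in, from its drift toward our lane
# and the longitudinal gap available for it to merge into.
import math

def cut_in_probability(lateral_offset_m, lateral_speed_ms, gap_m):
    """lateral_offset_m: distance from our lane edge (smaller = closer);
    lateral_speed_ms: speed toward our lane (positive = drifting in);
    gap_m: free space ahead of us the vehicle could merge into."""
    drift = lateral_speed_ms / max(lateral_offset_m, 0.1)  # closing rate on lane edge
    room = min(gap_m / 30.0, 1.0)                          # more room -> more inviting
    x = 4.0 * drift + 2.0 * room - 3.0                     # hypothetical weights
    return 1.0 / (1.0 + math.exp(-x))                      # squash to [0, 1]

# A vehicle drifting quickly toward a large gap scores high; a vehicle
# holding its lane next to a small gap scores low.
print(cut_in_probability(lateral_offset_m=0.5, lateral_speed_ms=0.6, gap_m=40))
print(cut_in_probability(lateral_offset_m=1.5, lateral_speed_ms=0.0, gap_m=10))
```

A production system would learn such weights from the kind of real-world driving-style data the article describes, rather than hand-tuning them.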
Ford announces Smart Mobility plan; 25 initial projects
January 06, 2015
At CES, Ford CEO Mark Fields announced “Ford Smart Mobility”—a plan to use innovation to take Ford to the next level in connectivity, mobility, autonomous vehicles, the customer experience and big data. The initial step is the creation of 25 mobility experiments across the globe designed to help change the way the world moves.
Smart Mobility builds upon Ford’s Blueprint for Mobility (earlier post). As outlined by Ford Motor Company Executive Chairman Bill Ford in his keynote at the 2012 Mobile World Congress in Barcelona, the Blueprint for Mobility defines the start of Ford’s thinking on what transportation will look like in 2025 and beyond, and the technologies, business models and partnerships needed to get there.
NVIDIA introduces DRIVE automotive computers at CES; teraflops of processing for autonomous driving and cockpit visualization
January 05, 2015
At CES in Las Vegas, NVIDIA introduced its DRIVE line of automotive computers, equipped with powerful capabilities for computer vision, deep learning and advanced cockpit visualization. NVIDIA will offer two car computers: NVIDIA DRIVE PX, for developing auto-pilot capabilities, and NVIDIA DRIVE CX, for creating the most advanced digital cockpit systems.
The NVIDIA DRIVE PX auto-pilot development platform provides the technical foundation for cars with completely new features that draw heavily on recent developments in computer vision and deep learning. DRIVE PX leverages the new NVIDIA Tegra X1 mobile super chip, which is built on NVIDIA’s latest Maxwell GPU architecture and delivers more than one teraflop of processing power, giving it more horsepower than the world’s fastest supercomputer of 15 years ago.
BMW to show 360-degree collision avoidance and fully-automated remote parking in multi-story garages at CES
December 15, 2014
[Image: The driver has the i3 park itself in a multi-story garage using a smartwatch.]
BMW will demonstrate new driver assistance and automated control functions including 360-degree collision avoidance and fully-automated parking in multi-story parking garages at the upcoming Consumer Electronics Show (CES) 2015 in January.
360-degree collision avoidance is built on precise position and environment recognition; the research vehicle is a BMW i3. Four advanced laser scanners record the surroundings and reliably identify obstacles such as the columns of a multi-story parking garage. If the vehicle approaches a wall or a column too quickly, the system brakes automatically to avert the threatened collision, bringing the vehicle to a standstill very precisely, with centimeters to spare.
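A minimal sketch of that brake-before-the-obstacle logic, assuming a simple constant-deceleration stopping model (the trigger condition, deceleration limit and safety margin here are illustrative assumptions, not BMW’s implementation):

```python
# Trigger automatic braking when the stopping distance at maximum
# deceleration would eat into the safety margin before the obstacle.
def should_brake(distance_m, speed_ms, max_decel_ms2=8.0, margin_m=0.3):
    """distance_m: range to the obstacle; speed_ms: closing speed.
    Uses d_stop = v^2 / (2a) from constant-deceleration kinematics."""
    stopping_distance = speed_ms ** 2 / (2.0 * max_decel_ms2)
    return stopping_distance >= distance_m - margin_m

print(should_brake(distance_m=3.0, speed_ms=7.0))   # 3.06 m needed -> True
print(should_brake(distance_m=10.0, speed_ms=5.0))  # 1.56 m needed -> False
```

The small margin term is what yields the behavior the article describes: the car stops just short of the obstacle, "with centimeters to spare," rather than at the first sign of it.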
IBM Research and ASELSAN to collaborate on metal-air battery technology, focusing on EVs; mm-wave ICs
November 25, 2014
IBM Research and Turkish defense industry technology company ASELSAN (Askerî Elektronik Sanayii, Military Electronic Industries) have signed collaborative development agreements concerning research and development of metal-air battery technologies and millimeter wave integrated circuits. The companies will work together on these projects, and through these efforts ASELSAN will enhance its in-house research and development activities.
In 2009, IBM and its partners launched a multi-year research initiative specifically exploring rechargeable Li-air systems (one type of metal-air battery): “The Battery 500 Project”. (Earlier post.) The “500” stands for a target range of 500 miles/800 km per charge, which translates into a battery capacity of about 125 kWh at an average use of 250 Wh/mile for a standard family car.
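The 125 kWh capacity figure works out directly from the stated assumptions:

```python
# Battery 500 target arithmetic: range x consumption = required capacity.
target_range_miles = 500
consumption_wh_per_mile = 250  # stated average for a standard family car

capacity_kwh = target_range_miles * consumption_wh_per_mile / 1000
print(capacity_kwh)  # -> 125.0
```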
BMW i Ventures makes strategic investment in smartphone driving analytics company Zendrive
The BMW Group’s venture capital company, BMW i Ventures, has made an investment in the startup Zendrive, a company that uses data gathered from smartphone sensors, rather than OBD hardware, to improve driving through analytics. Related BMW mobility services investments currently include JustPark, Chargepoint, Life360, Chargemaster, and MyCityWay. The Zendrive investment is the first in a series of investment announcements that will be made in the months ahead, said Ulrich Quay, Managing Director of BMW i Ventures, LLC.
Zendrive uses the sensors on a smartphone to measure a driver’s behaviors; Zendrive’s Driver-Centric Analytics process the data from the phone’s sensors into a Zendrive score that factors in cell phone use, speed, swerves, hard stops, fast accelerations, fatigue, as well as weather, trip duration, time of day, and more.
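A score that "factors in" such events could take the shape of a severity-weighted penalty per mile driven. The event names, weights and scaling below are illustrative assumptions, not Zendrive’s actual model:

```python
# Hypothetical smartphone-based driving score: start from 100 and subtract
# severity-weighted risky events, normalized by miles driven.
EVENT_WEIGHTS = {
    "phone_use": 5.0, "speeding": 3.0, "hard_stop": 2.0,
    "fast_accel": 2.0, "swerve": 2.5, "fatigue": 4.0,
}

def driving_score(event_counts, miles):
    """event_counts: {event_name: count} observed on a trip or period."""
    penalty = sum(EVENT_WEIGHTS[e] * n for e, n in event_counts.items())
    return max(0.0, 100.0 - 10.0 * penalty / max(miles, 1.0))

# One phone-use event and two hard stops over 30 miles:
print(driving_score({"phone_use": 1, "hard_stop": 2}, miles=30))  # -> 97.0
```

A real model would also condition on the contextual factors the article lists (weather, trip duration, time of day) rather than treating events in isolation.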
New Toshiba image-recognition processors for ADAS; night-time pedestrian detection and 3D reconstruction
November 14, 2014
Toshiba Corporation will expand its line-up of image-recognition processors for automotive applications with the launch of the TMPV760 series. Sample shipments of the first device, TMPV7608XBG, will start in January 2015, with mass production scheduled for December 2016 onwards.
With TMPV7608XBG, Toshiba is supporting the realization of next-generation Advanced Driver Assistance Systems (ADAS). Support for standard ADAS features includes AEB (Autonomous Emergency Braking); TSR (Traffic Sign Recognition); LDW (Lane Departure Warning) and LKA (Lane Keeping Assist); HBA (High Beam Assistance); FCW (Forward Collision Warning); plus new applications that include TLR (Traffic Light Recognition) and AEB pedestrian (during both day and night), which will become part of the Euro NCAP testing program in 2018.
ONR developing offensive autonomous swarming capability for unmanned surface vehicles; adapting JPL’s CARACaS
October 05, 2014
The Office of Naval Research (ONR) is developing an autonomous offensive swarming capability for unmanned surface vehicles (USVs) not only to protect Navy ships, but also, for the first time, to attack hostile vessels.
The technology under development—based on the Control Architecture for Robotic Agent Command and Sensing (CARACaS) developed by NASA’s Jet Propulsion Laboratory (JPL)—can be put into a transportable kit and installed on almost any boat. It allows boats to operate autonomously, without a Sailor physically needing to be at the controls. Capabilities include operating in sync with other unmanned vessels; choosing their own routes; swarming to interdict enemy vessels; and escorting/protecting naval assets.
Audi to demonstrate automated driving technology in Florida
July 26, 2014
Audi will be the first to test its automated driving technology on the Lee Roy Selmon Expressway in Tampa, Florida—which recently was designated as an automated driving and connected car test bed—using an Audi A7 equipped to handle piloted driving functions in freeway conditions at up to 40 mph (64 km/h).
Audi believes this initial version of piloted driving—Traffic Jam Pilot—could be available to consumers within five years. As Audi outlined this type of piloted driving functionality at CES in 2013 (earlier post), the system is based on the functionality of Audi adaptive cruise control with Stop & Go, extended by adding the component of lateral guidance.
IHS: continued legislative focus on pollutants to drive sensor market for internal combustion engines
July 24, 2014
The global market for sensors used in internal combustion engines (ICE) is set for steady growth over the next few years, propelled by increasing utilization in engine management and exhaust aftertreatment, according to a new report from IHS Technology. IHS projects that sensor shipments for ICEs will top 1.34 billion units in 2019, up from about 1.08 billion in 2013. Overall, IHS expects a six-year compound annual growth rate (CAGR) from 2013 to 2019 of 3.6%.
The report—“Powertrain Sensor Market Tracker – H1 2014”—is part of the Semiconductors & Components service of IHS Technology. The report examines more than 20 sensors attached to the engine, fuel and exhaust systems of passenger vehicles. The list includes pressure sensors, devices to monitor flow and temperature, ceramic sensors for the gases nitrogen oxide (NOx) and oxygen, in addition to knock sensing, position and speed.
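As a check, the reported CAGR follows from the shipment figures:

```python
# CAGR = (end / start)^(1/years) - 1, from the IHS shipment projections.
units_2013 = 1.08e9
units_2019 = 1.34e9
years = 6

cagr = (units_2019 / units_2013) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~3.7% from the rounded unit figures; IHS reports 3.6%,
                      # presumably computed from unrounded data
```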
Non-intrusive bio-monitoring system anticipates driver fatigue in the vehicle to prevent accidents
July 23, 2014
The Instituto de Biomecánica de Valencia (Biomechanics Institute - IBV) and its consortium partners in the European project HARKEN have developed a non-intrusive system integrated into smart materials which is capable of monitoring cardiac and respiratory rhythms in order to prevent drivers from falling asleep. The two-year project had its final meeting in June.
The system is based on three main components: the seat sensor, the seat belt sensor and the signal-processing unit (SPU), which processes the sensor data in real time. All are invisible to the user.
Pisa, Deutsche Telekom and Kiunsys launch smart city pilot project to optimize inner city parking as part of ITS; POSSE
June 26, 2014
The Italian city of Pisa and Deutsche Telekom have launched a smart city pilot project to test an intelligent parking system and to analyze historical traffic data via a “big data” service. The system, which will integrate into Pisa’s intelligent transport system (ITS), will help motorists in Pisa find a free parking space more easily and quickly, as well as pay for it via their smart phone.
The city of Pisa worked with Deutsche Telekom and its partner firm Kiunsys to install the new smart city service on Piazza Carrara, located directly on the banks of the river Arno. Wireless Parking Spot Sensors (PSS) embedded in the surface of each parking spot detect whether the space is free or occupied. Several data units collect the information and send it over the mobile network to the city’s server infrastructure. The information is then displayed on indication panels that guide drivers to a free space. The solution is also integrated into Pisa’s existing Tap&Park app, which drivers can download to be guided directly to a free parking space and even pay for it via the app.
CMU demo’ing Autonomous SRX in Washington this week
June 25, 2014
Researchers from Carnegie Mellon University (CMU) this week will demonstrate CMU’s advanced autonomous Cadillac SRX in Washington, DC. The car was brought to Washington at the request of Congressman Bill Shuster of Pennsylvania, who participated in a 33-mile drive in the autonomous vehicle between a Pittsburgh suburb and the city’s airport last September. Scheduled over two days, the demonstration will show how autonomous technology will eventually be fully integrated into vehicles that are currently on the market.
Developed with support from the National Science Foundation (NSF), the US Department of Transportation, DARPA and General Motors, the car is the result of more than a decade of research and development by scientists and engineers at CMU and elsewhere. Their work has advanced the underlying technologies—sensors, software, wireless communications and network integration—required to make sure a vehicle on the road is as safe—and ultimately safer—without a driver than with one. (In the case of the Washington, DC, demonstration, an engineer will be on hand to take the wheel if required.)
Google focusing autonomous driving development on mastering city street driving; patents piling up
April 29, 2014
Over the past year, Google has shifted the focus of its autonomous vehicle project onto mastering city street driving, according to Chris Urmson, Director, Google Self-Driving Car Project. (Urmson was the technical team leader of the CMU team that won the DARPA 2007 Urban Challenge, an autonomous vehicle race.)
Google’s autonomous cars use video cameras; 4 radar sensors (front, back, left, right); a laser range finder (Velodyne HDL-64E LiDAR) to “see” other traffic; and GPS, a wheel encoder and very detailed road maps to determine the precise location of the vehicle. Google says that its autonomous vehicles have logged nearly 700,000 autonomous miles (1.13 million km) over the four years they have been on the road. Since its last public update in 2012, Google has logged thousands of miles on the streets of Mountain View, California, the company’s home town.