Consumer Reports calls on Tesla to disable and update auto steering function, remove “Autopilot” name

Consumer Reports is calling on Tesla to disable the automatic steering function in the Autopilot driving-assist system available in its Model S vehicles until the company updates the function to confirm that the driver’s hands remain on the steering wheel at all times.

The consumer organization, which has owned and tested three Teslas (2013 Model S 85, 2014 Model S P85D, and 2016 Model X 90D), said that Tesla should also change the name of the Autopilot feature because it promotes a potentially dangerous assumption that the Model S is capable of driving on its own.

Consumer Reports is far from a Tesla-basher. The organization ranked the all-wheel-drive Tesla Model S P85D as the best performing car it has ever tested, calling it “an automotive milepost. It’s a remarkable car that paves a new, unorthodox course, and it’s a powerful statement of American startup ingenuity.” The organization has also begun to note reliability issues with the Teslas, however.

Tesla is now under scrutiny for how it deployed and marketed the Autopilot system after a series of crashes. Federal safety officials are investigating a fatal crash involving a Tesla and a tractor-trailer in Florida. (Earlier post.)

By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security. In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we’re deeply concerned that consumers are being sold a pile of promises about unproven technology. ‘Autopilot’ can't actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver’s hands are kept on the wheel.

—Laura MacCleery, Vice President of Consumer Policy and Mobilization for Consumer Reports

Specifically, Consumer Reports is calling for Tesla to do each of the following:

  • Disable the Autosteer function of the Autopilot system until it can be reprogrammed to require drivers to keep their hands on the steering wheel.

  • Stop referring to the system as “Autopilot” as it is misleading and potentially dangerous.

  • Issue clearer guidance to owners on how the system should be used and its limitations.

  • Test all safety-critical systems fully before public deployment; no more beta releases. Consumer Reports has seen first-hand how such “beta” software is transmitted wirelessly into the Teslas. When software in a desktop computer or handheld electronic device is labeled “beta,” it typically means that functionality is not fully developed and is still being fine-tuned, the organization noted.

Consumer Reports contacted Tesla about these concerns, and the company sent this response via email:

Tesla is constantly introducing enhancements, proven over millions of miles of internal testing, to ensure that drivers supported by Autopilot remain safer than those operating without assistance. We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.

Tesla also defended the safety record of the system, writing that “130 million miles have been driven on Autopilot, with one confirmed fatality.” The company underscored that its beta software development process includes “significant internal validation.”

Last week, the National Highway Traffic Safety Administration (NHTSA) sent a letter to Tesla requesting detailed information about Autopilot, including any design changes and updates to the system, as well as detailed logs of when the system has prompted drivers to take over steering. The Securities and Exchange Commission is reportedly investigating whether Tesla failed to tell investors about the crash in a timely fashion.

MacCleery said automakers must commit immediately to name automated features with descriptive, not exaggerated, titles, noting that these companies should roll out new features only when they are certain they are safe.

Consumers should never be guinea pigs for vehicle safety ‘beta’ programs. At the same time, regulators urgently need to step up their oversight of cars with these active safety features. NHTSA should insist on expert, independent third-party testing and certification for these features, and issue mandatory safety standards to ensure that they operate safely.

—Laura MacCleery

In the aftermath of the Autopilot fatality, other consumer organizations have begun calling for a general slow-down in the rollout of autonomous driving technologies.

A coalition of auto safety advocates earlier this week called on President Obama to stop his “administration’s undue haste to get autonomous vehicle technology to the road” until enforceable safety standards are in place. They said the administration’s autonomous vehicle “guidance”—expected next week—should not be issued.

“The error in rushing autonomous vehicle technology into cars and onto public highways without enforceable safety rules was underscored by the recent tragic fatal crash of a Tesla Model S in Florida while autopilot was engaged,” the coalition’s letter to Obama said.

The letter to Obama was signed by Joan Claybrook, President Emeritus of Public Citizen and Former NHTSA Administrator; Clarence Ditlow, executive director of the Center for Auto Safety; Rosemary Shahan, president of Consumers for Auto Safety and Reliability; and John M. Simpson, Privacy Project Director for Consumer Watchdog.

The advocates also said Tesla should disable Autopilot until it is proven safe. Noting that both Volvo and Mercedes have said they will accept liability when their self-driving technology is responsible for a crash, the safety advocates called on Tesla to make the same pledge if Autopilot is offered in the future. They called for the manufacturers of all self-driving cars to take responsibility for crashes caused by their autonomous technologies.


Keeping your hands resting on the steering wheel is not enough. An AP "pilot in command" must also maintain situational awareness, staying alert and ready to respond in an instant to a situation that cannot be handled by the autopilot, in many cases before the autopilot has detected a problem (like a semi tractor-trailer crossing the road in front of the car).

How will that be monitored?


The trick is to add an anti-collision system where you drive yourself but the car helps you avoid an accident, e.g. in fog.


My view is that you won't get to an AV without a few scores of deaths, and they might as well be caused by Tesla as by anyone, as long as whoever it is that has a "licence to kill" shares all their collision data (including video, lidar, radar, etc.) from, say, 30 seconds before the crash up to the crash.
The same rules should apply to everyone, and it should apply to any collision, fatal or not.
If you are worried about privacy, blank out the faces and number plates.
I'll come back to this as I have to go now.
- JM


Anyone with any knowledge at all of statistics will realise that Tesla is grossly abusing them by using a single incident to claim a level of safety, and Musk has gone further with absolutely rubbish claims that you are twice as safe using autopilot as not using it.

To make those claims they rely on a remarkable use of figures, for instance data which includes every motorcyclist and bicyclist on the road, as well as the entire car fleet, including every old banger.

A comparable car, like some Lexus models, runs at around 1.3 billion miles per fatality, if one must bandy statistics based on an inadequate data set.
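Taking the two figures quoted in this thread at face value (Tesla's "130 million miles, one fatality" and the commenter's 1.3 billion miles per fatality for a comparable luxury car; neither is an official statistic), a quick back-of-the-envelope sketch shows both the size of the gap and why a single event proves very little either way:

```python
import math

# Figures as quoted in this thread (assumed, not official statistics):
tesla_miles_per_fatality = 130e6   # Tesla: "130 million miles ... one confirmed fatality"
lexus_miles_per_fatality = 1.3e9   # commenter's figure for a comparable luxury car

# The quoted comparison car is 10x better than Tesla's single-event rate.
ratio = lexus_miles_per_fatality / tesla_miles_per_fatality
print(ratio)  # 10.0

# But one event is a tiny sample. If the true rate really were 1 fatality
# per 1.3 billion miles, the fatality count over 130 million miles would be
# Poisson with mean 0.1, so at least one fatality would still be observed
# about 9.5% of the time:
lam = tesla_miles_per_fatality / lexus_miles_per_fatality  # expected fatalities = 0.1
p_at_least_one = 1 - math.exp(-lam)
print(round(p_at_least_one, 3))  # 0.095
```

In other words, even under the pessimistic reading, one fatality in 130 million miles is statistically consistent with a true rate ten times better; no firm safety claim can be made from it in either direction.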

They also claim that if the driver does not intervene when the AP crashes the car, they must not have been paying attention, and if they do intervene but don't manage to avoid the danger the AP had created, then the crash was not under AP.

So don't brake or take evasive action and it's your fault.

Brake or take evasive action and it's still your fault.

Every demo, including those by Musk and his wife, shows hands-off driving.

Volvo's system will not allow the driver to take their hands off the wheel; if they do, it stops.


The Tesla autopilot system does limited-access highway driving. It does not handle traffic lights, stop signs, or, essentially, intersections. People who misuse the system by driving at high speed through uncontrolled intersections are stupid.

If you drive 85 mph through an intersection with the green light in your favor, and someone runs the red light right in front of you, would you be able to avoid an accident? Would any autonomous car?

There are situations (caused by human negligence/recklessness) where an "accident" cannot physically be avoided, by man or machine. Nobody has claimed that autonomous cars can ever eliminate all accidents. Why would anyone assert that even early versions should be able to do that? (Though I'm not surprised the self-righteous Consumer Reports would.)

Juan Valdez

About 100 people die every day in the US because of automobile accidents. Yet after one accident with a Tesla, Consumer Reports is crying foul. Consumer Reports is nuts.

With each upgrade the Tesla system will get better and better. If Tesla has to wait for regulatory approval for every upgrade, thousands more people will die before the feds allow this type of advanced technology to save us.

This is one story where I vote for private enterprise and fast moving upgrades.

Judging from the videos posted of people "showing off" with Autopilot, personal interviews with Tesla drivers at Superchargers, and now accident reports, it appears that Tesla has not done enough to educate people on AP use.

In addition, it may be that people have difficulty staying engaged in driving under AP even if they are properly briefed. That is Google's conclusion. How is Tesla monitoring this?

How are they communicating (feedback loop) to drivers that appear not to have the capability of properly operating under AP?


The problem with "autopilot" is that you can never take the idiot out of the equation; in other words, the human. These systems are not ready to actually be an "autopilot" and should not be called such, because idiots who have to reboot their phones/computers/tablets every day somehow think their car is magic and can deal with every real-world situation at 80 mph.

I'm sorry, but this is really a bit of a Darwinian situation; sadly, though, they can take others out with them. And Tesla should NOT call it autopilot, as it misleads the idiots.


Musk and his wife both 'showed off' by demonstrating hands-off driving, as has every other demo I have ever seen.

Tesla have named, promoted, and demoed the car far above its limited abilities, and released it with a wholly inadequate set of sensors (compare it with Volvo's sensor suite, for instance; and Volvo's technology will not operate if you take your hands off the wheel).

If there are court cases arising, they will find that a disclaimer in the manual does not cover their liability.

So far looking at various aspects of their AP and their business, they are under investigation by the NHTSA, the NTSB, and the SEC.

That is aside from Norwegian fines for false and misleading advertising.

This is a company on the edge, and I am not thinking of innovation.

Account Deleted

Tesla already has scientific proof that their autopilot increases driver safety. I quote: “…the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.”

It is therefore totally irresponsible, and disrespectful to the victims of traffic accidents, to ask Tesla not to use their Autopilot and to cripple its usefulness by requiring people to hold the steering wheel at all times without a break. Tesla has implemented a system that triggers an alarm when the driver has not touched the steering wheel for some seconds, and it even slows down and stops the car should you keep ignoring the alarm. That is enough. Sure, there will be idiots who, despite crystal-clear warnings from Tesla, still use the Autopilot in the wrong manner and pay less attention to driving than they should. A name change is not going to change that behavior. However, these drivers are a minority, because the net effect across all Tesla drivers who use Autopilot is fewer accidents with AP on than with it off, as shown by the already extensive driving data gathered by Tesla’s system.

Some people tend to believe that driver distraction and inattention is a brand-new phenomenon introduced by Tesla’s Autopilot. However, the phenomenon is of course not new and has been part of human behavior since the very first autos arrived some 100 years ago. Tesla’s Autopilot addresses that problem by steering the car safely in the many situations where people do not pay attention as they should, situations that would have resulted in accidents had it not been for the Autopilot. This is why the Autopilot in general reduces the number of accidents, despite also inducing some idiots to pay less attention because it performs so well in most situations. The net effect is more safety with Autopilot on.

Apart from Tesla, no other automaker in the world gathers data on accidents and driving behavior in real time, for every car it makes, every minute of the day, all year round. If we should do something good here, it would be to require all new cars to be equipped with the data-gathering system that Tesla has in all of its cars, and to require all automakers to update and publish these data every month on their web pages, so that the public has free access to the data and can investigate the safety of their cars independently.

Bob Niland

re: …said that Tesla should also change the name of the Autopilot feature…

Speaking as a pilot, the Tesla system is actually more capable than I would have expected for an "autopilot" feature in a car.

But sure, I can see where non-pilots might have unrealistic expectations. Perhaps if they renamed it to "Ottopilot"…


It is not good practice to fit devices that require special training to ordinary consumer products if non-compliance can result in injury.
That suggests that such features should be barred to unqualified users.
If (as we all do) I borrow someone's rig or loan mine, I cannot give the induction talk.
They cannot give me a qualified induction.
Then I must prove compliance before anyone should assume I'm safe.

Unless, as in the Volvo case, non-compliance disarms the system.

It is not good enough to offer dangerous products to consumers, as no matter how negligent or ignorant the operator, there is always a responsibility to others, and the current system is nowhere near foolproof.


If Consumer Reports really wanted to save many thousands of lives, it would promote a ban on all diesel and gasoline ICEVs.


Although Tesla's use of statistics is more than dubious, its marketing flamboyant, its current sensor suite inadequate, etc., etc., I think the best answer will ultimately lie in the system [across all brands] individually training/evaluating each driver on a continuing basis for fitness to use the AP function to its full extent, as opposed to braking the technology now, when it is improving rapidly.

To hasten progress towards being 10x safer than human driving across all scenarios, manufacturers offering AP should be obligated [as part of the licensing requirements] to pool their raw accident data into a government-administered open source repository freely available to all developers and the general public. This industry-wide 'Transparency & Freedom of Safety Information' should accelerate bug elimination/innovation and thus produce more robust systems all round in the shortest possible time.
