With the release of Version 8 of its software, Tesla has made many updates to its Autopilot function. The most significant change, however, is a new, much heavier reliance on the onboard radar. The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was originally intended only as a supplementary sensor to the primary camera and image-processing system.
Now, however, Tesla is applying more advanced signal processing so that the radar can serve as a primary control sensor, without requiring the camera to confirm visual image recognition. The changes come in the wake of the fatal crash (earlier post) in which Autopilot apparently did not detect the white side of the trailer against a brightly lit sky; nor did the driver. As a result, no brake was applied. In the aftermath of that incident, Tesla CEO Elon Musk tweeted that the company was “Working on using existing Tesla radar by itself (decoupled from camera) w temporal smoothing to create a coarse point cloud, like lidar”. (Earlier post.)
Radar is typically unaffected by contrast issues (light and dark), as it detects and tracks objects using reflected radio waves at about 76 to 77 GHz rather than visible light.
However, using radar as the primary control sensor is “a non-trivial and counter-intuitive problem,” Tesla explains. To radar, anything metallic looks like a mirror. The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar. Any metal surface with a dish shape, on the other hand, is not only reflective but also amplifies the reflected signal to many times its actual size. Thus, a big problem in using radar to stop the car is avoiding false alarms, Tesla said.
The first part of solving that problem is having a more detailed point cloud.
Slamming on the brakes is critical if you are about to hit something large and solid, but not if you are merely about to run over a soda can. Having lots of unnecessary braking events would at best be very annoying and at worst cause injury.—Tesla Motors
Software 8.0 delivers a more detailed point cloud, unlocking access to six times as many radar objects with the same hardware with a lot more information per object.
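To make that concrete, the kind of per-object information a radar frame might carry can be sketched as a simple record. The field names and units below are illustrative assumptions for discussion, not Tesla's actual data format:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    """One radar-detected object in a single frame (illustrative fields only)."""
    range_m: float          # radial distance to the object, in meters
    azimuth_rad: float      # bearing relative to the car's heading
    range_rate_mps: float   # Doppler-measured range rate (negative = closing)
    rcs_dbsm: float         # radar cross-section: how strongly the object reflects
    timestamp_s: float      # time the frame was captured
```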
Tesla assembles these into a 3D “picture” of the world. From a single frame, it is hard to tell whether an object is moving or stationary, or whether a reflection is spurious. By comparing several contiguous frames against the vehicle's velocity and expected path, the car can tell whether an object is real and assess the probability of collision.
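In rough terms, that frame-to-frame check might look like the sketch below, built on the illustrative RadarReturn record above. The gating thresholds and the reliance on Doppler range rate are assumptions, not Tesla's algorithm:

```python
import math

def is_persistent_obstacle(frames, dt_s, gate_m=1.5, min_hits=4):
    """Treat an object as real only if its measured range changes from frame
    to frame by roughly the amount its Doppler range rate predicts; spurious
    reflections tend not to track that pattern.

    `frames` is a list of RadarReturn objects for one tracked object, oldest first.
    """
    hits = 0
    for prev, curr in zip(frames, frames[1:]):
        expected_delta = prev.range_rate_mps * dt_s   # predicted change in range
        actual_delta = curr.range_m - prev.range_m
        if abs(actual_delta - expected_delta) < gate_m:
            hits += 1
    return hits >= min_hits

def is_stationary(obj, ego_speed_mps, tol_mps=2.0):
    """A stationary object directly ahead appears to close at roughly the
    car's own speed, so its range rate should be about -ego speed."""
    return abs(obj.range_rate_mps + ego_speed_mps) < tol_mps

def time_to_collision_s(obj):
    """Crude time-to-collision from range and closing speed; infinite if not closing."""
    closing_mps = -obj.range_rate_mps
    return obj.range_m / closing_mps if closing_mps > 0 else math.inf
```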
An overhead highway road sign positioned on a rise in the road, or a bridge where the road dips underneath, often looks like an object on a collision course. The navigation data and the height accuracy of the GPS are not enough to know whether the car will pass under the object. By the time the car is close enough for the road pitch to change, it is too late to brake.
Tesla says it will address this problem through fleet learning. Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car's computer will then compare when it would have braked with what the driver actually did, and upload that comparison to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, that object is added to a geo-coded list.
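That whitelisting step lends itself to a simple sketch: key each stationary radar object by a coarse GPS grid cell, count how many cars drive past it without incident, and only then exempt it from braking. The grid size, pass threshold and function names below are assumptions for illustration, not Tesla's scheme:

```python
from collections import defaultdict

# Hypothetical fleet-side tally: safe passes per geo-coded radar object.
safe_passes = defaultdict(int)

def grid_key(lat, lon, cell_deg=0.0001):
    """Quantize a GPS position onto a coarse grid (cells on the order of 10 m)
    so repeated detections of the same sign or bridge share one key."""
    return (round(lat / cell_deg), round(lon / cell_deg))

def record_safe_pass(lat, lon):
    """Log that a car drove past this radar object without needing to brake,
    whether or not Autopilot was engaged."""
    safe_passes[grid_key(lat, lon)] += 1

def is_whitelisted(lat, lon, min_passes=50):
    """Treat the object as an overhead sign or bridge (no braking) once enough
    cars have passed it safely."""
    return safe_passes[grid_key(lat, lon)] >= min_passes
```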
When the data shows that false braking events would be rare, the car will begin mild braking using radar, even if the camera doesn’t notice the object ahead. As the system confidence level rises, the braking force will gradually increase to full strength when it is approximately 99.99% certain of a collision.
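The graduated rollout of braking force can be pictured as a mapping from collision confidence to brake command. Tesla only states that force rises with confidence and reaches full strength near 99.99% certainty; the floor threshold and the linear ramp below are illustrative assumptions:

```python
def braking_fraction(collision_probability,
                     start_threshold=0.90, full_threshold=0.9999):
    """Fraction of maximum braking force for a given collision probability:
    nothing below the floor, full braking at ~99.99% certainty, and a
    linear ramp in between (the ramp shape is an assumption)."""
    if collision_probability < start_threshold:
        return 0.0
    if collision_probability >= full_threshold:
        return 1.0
    return (collision_probability - start_threshold) / (full_threshold - start_threshold)
```

Under these assumed thresholds, for example, a 99.5% collision probability would map to roughly 95% of maximum braking force.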
This may not always prevent a collision entirely, but the impact speed will be dramatically reduced to the point where there are unlikely to be serious injuries to the vehicle occupants.—Tesla Motors
A Tesla will also be able to bounce the radar signal under the vehicle in front to detect an obstacle beyond it (Tesla used a UFO landing on the road in dense fog as an example), using the radar pulse signature and photon time of flight to distinguish the signal, and still brake even when trailing a car that is opaque to both vision and radar.
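One crude way to picture the effect, leaving the pulse-signature details aside, is that a return bounced under the lead vehicle arrives from a range well beyond it. The sketch below, again using the illustrative RadarReturn record, flags such a return; the range margin and closing-speed threshold are invented for the example:

```python
def hidden_obstacle_ahead(returns, lead_range_m, margin_m=5.0, closing_mps=3.0):
    """Flag a closing radar return measured well beyond the directly tracked
    lead vehicle: it must have travelled under or around that car, suggesting
    an obstacle the camera (and the direct radar view) cannot see."""
    for r in returns:
        beyond_lead = r.range_m > lead_range_m + margin_m
        closing_fast = r.range_rate_mps < -closing_mps
        if beyond_lead and closing_fast:
            return True
    return False
```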
Other notable enhancements to Autopilot include:
- TACC (Traffic Aware Cruise Control) braking max ramp rate increased and latency reduced by a factor of five
- Now controls for two cars ahead using radar echo, improving cut-out response and reaction time to otherwise-invisible heavy braking events (a rough sketch follows this list)
- Will take highway exit if indicator on (8.0) or if nav system active (8.1). Available in the United States initially
- Car offsets in lane when overtaking a slower vehicle driving close to its lane edge
- Interface alerts are much more prominent, including flashing white border on instrument panel
- Improved cut-in detection using blinker on vehicle ahead
- Reduced likelihood of overtaking in right lane in Europe
- Improved auto lane change availability
- Car will not allow reengagement of Autosteer until parked if user ignores repeated warnings
- Automatic braking will now amplify user braking in emergencies
- In manual mode, alerts driver if about to leave the road and no torque on steering wheel has been detected since Autosteer was deactivated
- With further data gathering, car will activate Autosteer to avoid collision when probability ~100%
- Curve speed adaptation now uses fleet-learned roadway curvature
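For the “two cars ahead” item above, the behavior can be approximated as the cruise controller honoring whichever of two following-distance constraints is tighter. The constant-time-gap policy, the gain and the doubled gap for the second vehicle are all illustrative assumptions, not Tesla's tuning:

```python
def tacc_target_speed(lead1_range_m, lead1_speed_mps,
                      lead2_range_m, lead2_speed_mps,
                      time_gap_s=1.5, gain=0.5):
    """Pick a target speed that respects the gap to the car directly ahead
    and to the car in front of it (seen via radar echo), so slowing can
    begin before the lead car reacts to hard braking further up the road."""
    def speed_for(range_m, speed_mps, gap_s):
        desired_gap_m = speed_mps * gap_s          # constant-time-gap following
        return speed_mps + gain * (range_m - desired_gap_m)

    # The second vehicle is two gaps away, so allow it twice the time gap.
    return min(speed_for(lead1_range_m, lead1_speed_mps, time_gap_s),
               speed_for(lead2_range_m, lead2_speed_mps, 2 * time_gap_s))
```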