Massachusetts Senator Edward Markey suggests Tesla’s Autopilot driver assistance system needs rebranding, remarketing, and backup monitoring to increase safety. In a statement released January 24, 2020, Markey, a Democrat who serves on the Senate Commerce, Science and Transportation Committee, said the Autopilot feature is flawed. However, if Tesla implements the changes he recommends, it will do more to “protect drivers, passengers, pedestrians and all other users of the road.”
Concerns About Tesla’s Autopilot System
Markey’s comments arose from concerns about at least three fatalities in the United States since 2016 in crashes in which Tesla’s Autopilot system was operating. The most recent was a fatal collision in Gardena, California, on Dec. 29, 2019, in which a Tesla Model S exited the 91 Freeway, sped through a red light and struck a 2006 Honda Civic. The two Honda occupants died at the scene; the Tesla driver and passenger suffered non-life-threatening injuries. The Southern California incident is under investigation by the U.S. National Highway Traffic Safety Administration (NHTSA), but the agency has not said whether the Autopilot system was involved.
See Autopilot Crashes and Causes for more.
As evidence that Tesla’s system is flawed, the senator cites videos of drivers who appear to be asleep at the wheel, many of which are believed to be fake. Other videos show drivers claiming they can fool the system by wedging a water bottle or a banana into the steering wheel to simulate hands-on control of the vehicle. Additionally, the system appears to have trouble detecting stationary objects and has been involved in several collisions with parked emergency vehicles, including fire trucks. NHTSA is also investigating a December 2019 crash in Connecticut in which a Tesla using its driver-assistance system rear-ended a parked police car.
What Sen. Markey Wants Tesla to Do
Markey believes Tesla can overcome its safety problems by making two specific adjustments.
- Rename Autopilot – The senator wants the system renamed because he believes the term ‘autopilot’ misleads users into thinking the system is fully autonomous.
- Improve Driver Monitoring – Markey suggests that Tesla improve its Driver Monitoring System (DMS) to reduce driver misuse and install “backup driver monitoring tools that will make sure no one falls asleep at the wheel.”
Tesla’s Response
Tesla responded with a letter saying that it had taken numerous steps to ensure consumers use Autopilot as safely as possible, including:
- Tesla said its website clearly states the Autopilot feature requires active driver supervision, and the vehicle is not autonomous.
- New warning systems that recognize red lights and stop signs “to minimize the potential risk of red light or stop sign running as a result of temporary driver inattention”
- Revisions to its steering wheel monitoring system mean that in most cases “a limp hand on the wheel from a sleepy driver will not work, nor will the coarse hand pressure of a person with impaired motor controls, such as a drunk driver.” It also said that devices “marketed to trick Autopilot, may be able to trick the system for a short time, but generally not for an entire trip before Autopilot disengages.”
- Tesla added that “a few bad actors who are grossly abusing Autopilot” represent “a very small percentage of our customer base.”
Our Take
While Tesla clearly pushes the envelope in offering semi-automated driver assistance systems such as Autopilot to consumers, the Senator’s two suggestions are a bit off the mark.
Autopilot Naming
On the naming front, we don’t see the use of the Autopilot name as a big issue. “Autopilot” is a long-standing term in the aviation industry, and the FAA’s own handbook states:
While the autopilot relieves you from manually manipulating the flight controls, you must maintain vigilance over the system to ensure that it performs the intended functions
[…]
An autopilot can be capable of many very time intensive tasks, helping the pilot focus on the overall status of the aircraft and flight.
[…]
Be ready to fly the aircraft manually to ensure proper course/clearance tracking in case of autopilot failure or misprogramming.
To us, the far bigger issue is the “Full Self-Driving” option, whose name leaves no doubt about its promise. It should frankly be renamed to something like “Autopilot Premium,” since we doubt true full self-driving, where Tesla assumes liability, will be achieved anytime soon despite Tesla’s promises.
Driver Monitoring
While Tesla does have a very basic Driver Monitoring System (DMS) in the form of steering wheel touch/torque monitoring, as many other manufacturers do, we agree that a more robust DMS would be beneficial, though not critical.
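To illustrate why torque-only monitoring is easy to defeat, here is a minimal sketch of how such a check might work in principle. The thresholds, timing values, and sensor/alert interfaces are hypothetical assumptions for illustration, not Tesla’s actual implementation.

```python
import time

# Hypothetical values -- not Tesla's actual thresholds.
TORQUE_THRESHOLD_NM = 0.3   # minimum steering torque counted as "hands on"
NAG_AFTER_SECONDS = 30      # warn the driver after this much hands-off time


def hands_on_wheel(torque_nm: float) -> bool:
    """Torque-only check: any force above the threshold passes, whether it
    comes from an attentive hand, a limp wrist, or a wedged water bottle."""
    return abs(torque_nm) >= TORQUE_THRESHOLD_NM


def monitor(read_torque, warn, disengage):
    """Naive hands-off nag loop. read_torque(), warn() and disengage() are
    placeholders for the vehicle's sensor and alert interfaces."""
    last_hands_on = time.monotonic()
    while True:
        if hands_on_wheel(read_torque()):
            last_hands_on = time.monotonic()
        elif time.monotonic() - last_hands_on > NAG_AFTER_SECONDS:
            warn()        # escalate: visual nag, then audible alert...
            disengage()   # ...and eventually drop out of the assistance mode
            break
        time.sleep(0.1)
```

The point of the sketch is that any source of steering torque satisfies the check, which is exactly the weakness the defeat-device videos exploit.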
A DMS that can better understand the cognitive state of the driver and ensure they’re actually attentive and not looking away is likely where most driver assistance systems are headed. For example, even Comma.ai’s open-source openpilot hardware provides face and eye tracking to make sure the driver is paying attention. We’re hopeful Tesla will provide something similar in the future.
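As a rough illustration of what camera-based attention monitoring involves, here is a minimal sketch that uses OpenCV’s stock Haar cascades to check whether a forward-facing face with visible eyes appears in an in-cabin camera frame. It is an assumption-laden toy (the camera index and the face-plus-eyes heuristic are ours), not how openpilot or any production DMS actually works; real systems estimate head pose and gaze.

```python
import cv2

# OpenCV ships these cascade files with the opencv-python package.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")


def driver_attentive(frame) -> bool:
    """Very crude proxy for attention: a frontal face with at least one
    detected eye in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
        if len(eyes) > 0:
            return True
    return False


# Example usage: grab one frame from a cabin camera (index 0 is an assumption).
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok and not driver_attentive(frame):
    print("Driver attention check failed -- escalate warnings.")
cap.release()
```

Even this crude camera check is harder to fool with a banana than a torque sensor is, which is why vision-based monitoring is the likely direction for these systems.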
Training
Finally, better training on how to use the Autopilot system would be helpful. For example, new drivers could be required to walk through a training sequence on the vehicle’s built-in touchscreen. Again, openpilot does something similar with a step-by-step tutorial.