The National Transportation Safety Board has filed comments blasting the National Highway Traffic Safety Administration for its permissive regulation of driver-assistance systems. The letter was dated February 1 but was only spotted by CNBC’s Lora Kolodny on Friday. The letter repeatedly calls out Tesla’s lax safety practices around Autopilot and calls on NHTSA to establish minimum standards for the industry.
The dispute between federal agencies is the result of Congress dividing responsibility for transportation safety among multiple agencies. NHTSA is the main regulator for highway safety: every car and light truck must comply with rules established by NHTSA. NTSB is a separate agency that just does safety investigations. When there’s a high-profile highway crash, NTSB investigators travel to the scene to figure out what happened and how to prevent it from happening again. NTSB also does plane crashes and train wrecks, allowing it to apply lessons from one mode of transportation to others.
This separation of responsibilities has contributed to a culture gap between the agencies. As the agency responsible for writing regulations, NHTSA has to trade safety off against other considerations like economic costs, the lobbying clout of automakers, and the risk of consumer backlash. In contrast, NTSB’s rulings are purely advisory, which frees the agency to doggedly advocate strong safety measures.
Under then-President Donald Trump, NHTSA largely let automakers do what they liked when it came to advanced driver-assistance systems (ADAS) and prototype driverless vehicles. NHTSA has generally waited until safety problems cropped up with ADAS systems and dealt with them after the fact. NTSB argues NHTSA should be more proactive, and it put Tesla and Autopilot at the center of its argument.
NTSB thinks minimum ADAS safety standards are overdue
NTSB thinks NHTSA has been too slow to develop safety standards for driver-assistance systems and too slow to mandate their use in every vehicle. A growing number of cars have automatic emergency braking systems, but these systems are not yet mandatory, and different AEB systems have different capabilities.
“It is important that the agency prioritize the development of minimum performance standards for collision-avoidance technologies and require the systems as standard equipment in all new vehicles,” the NTSB wrote.
The NTSB also calls for NHTSA to require driver-monitoring systems to ensure drivers are paying attention to the road while driver-assistance systems are active.
“Because driver attention is an integral component of lower-level automation systems, a driver-monitoring system must be able to assess whether and to what degree the driver is performing the role of automation supervisor,” NTSB argued. “No minimum performance standards exist for the appropriate timing of alerts, the type of alert, or the use of redundant monitoring sensors to ensure driver engagement.”
A lot of driver-assistance systems on the road today use steering wheel torque sensors as a crude way to tell if drivers have their hands on the wheel. More recently, some manufacturers have used eye-tracking cameras to monitor driver attention. Cameras are a more effective way to make sure drivers are actually looking at the road—though some drivers may find them intrusive or annoying.
Finally, NTSB argues that NHTSA should require automakers to restrict driver-assistance systems to the types of roads they’re designed for. Some ADAS systems, for example, are intended to work only on limited-access freeways. Yet few cars actually enforce such limitations, and many systems can be activated on roads they weren’t designed to handle.
NTSB repeatedly singles out Tesla
The NTSB mentions Tesla 16 times in the letter—far more than any other automaker. This is partly because Tesla vehicles have figured so prominently in the NTSB’s work. NTSB says it investigated six crashes involving driver-assistance or self-driving systems between May 2016 and March 2019. Four of those were fatal. One of the four was the 2018 death of Elaine Herzberg, who was struck by an Uber self-driving prototype. The other three victims were Tesla owners who relied too heavily on Autopilot, and it cost them their lives.
In one section, NTSB points to the 2016 death of Tesla owner Josh Brown. The Autopilot software on Brown’s car failed to recognize a semi trailer crossing in front of the vehicle. Brown’s Model S slid under the trailer, shearing off the top of the car and killing Brown instantly.
In its report on the crash, NTSB noted that, at the time of the crash, Autopilot software was only designed for use on controlled-access freeways—not rural highways where cars and trucks can enter the highway directly from driveways and side streets. NTSB pointed out that its report on the Brown crash “recommended that NHTSA develop a method to verify” that companies selling driver-assistance systems like Autopilot have safeguards to prevent customers from using the systems on roads they aren’t designed for. Such a system might have prevented Brown from activating Autopilot on the day of his death.
NHTSA didn’t follow the NTSB’s suggestion. In its February letter, NTSB doesn’t let NHTSA forget it: NTSB suggests that this policy choice may have led to another deadly crash.
“In March 2019, because of Tesla’s lack of appropriate safeguards and NHTSA’s inaction, another fatal crash occurred in Delray Beach, Florida, under circumstances very similar” to Brown’s death, the agency wrote. And NTSB worries that lax rules could lead to more deaths in the future.
“The NTSB remains concerned about NHTSA’s continued failure to recognize the importance of ensuring that acceptable safeguards are in place so the vehicles do not operate outside of their operational design domains and beyond the capabilities of their system designs,” the agency wrote. “Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV control system’s limitations.”
NTSB then called out Tesla again, specifically criticizing the company’s decision to release its “full self-driving beta” software to a few dozen customers.
“Tesla recently released a beta version of its Level 2 Autopilot system, described as having full self-driving capability,” NTSB wrote. “By releasing the system, Tesla is testing on public roads a highly automated AV technology but with limited oversight and reporting requirements.”
Since the NTSB letter, Elon Musk has announced plans to expand the FSD beta to more customers.
The NTSB letter came at a crucial time—just as President Joe Biden was staffing the senior positions at NHTSA and the broader Department of Transportation. Under Donald Trump, NHTSA took a strongly hands-off posture toward regulation of driver-assistance systems and self-driving technology. It seems likely that the Biden team will do more in this area, but it remains to be seen how aggressive they will be.