How Automakers Can Stop Humans from Over-Relying on Automated Safety Tech

Automakers aren't doing enough to ensure that drivers are ready to take over if their vehicles' self-driving features make mistakes, an insurance industry safety group argues, re-igniting a debate about who should be held accountable when the drivers of partially automated cars kill people on U.S. roads.

Last week, the Insurance Institute for Highway Safety announced the criteria for a new set of rankings that will evaluate how well automakers are combatting "automation complacency" among their drivers, as well as the intentional misuse of advanced driver assistance systems that are becoming increasingly common on new cars.

First used to describe the phenomenon of pilots mentally checking out in the cockpit when their planes are in "autopilot" mode, "automation complacency" has since been flagged by watchdogs like the National Transportation Safety Board as a major contributing factor in several crashes involving partially automated vehicles — and for the foreseeable future, every AV on the road will be only partially automated.

In 2020, the Board advised federal regulators to develop "performance standards for driver monitoring systems that will minimize driver disengagement, prevent automation complacency, and account for foreseeable misuse of the automation." Experts point out that many of those safeguards could be implemented with inexpensive software updates, such as programming advanced features not to activate if a driver's seatbelt isn't buckled, or by installing simple in-cabin cameras to monitor motorists' behavior.
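To make the idea concrete, here is a minimal, hypothetical sketch (in Python) of the kind of software-only interlock the NTSB is describing: one that refuses to engage a driver-assistance feature unless basic readiness signals check out. All signal and function names here are invented for illustration and do not reflect any automaker's actual implementation.

```python
# Hypothetical sketch of a software-only safeguard: refuse to engage a
# partial-automation feature unless basic driver-readiness signals check out.
# All signal names are invented for illustration.

from dataclasses import dataclass


@dataclass
class DriverState:
    seatbelt_buckled: bool   # from the existing seatbelt sensor
    eyes_on_road: bool       # from an in-cabin driver-monitoring camera
    hands_detected: bool     # from steering-wheel torque/capacitive sensors


def may_engage_assistance(state: DriverState) -> bool:
    """Allow the assistance feature to activate only if the driver
    appears ready to take over at any moment."""
    return state.seatbelt_buckled and state.eyes_on_road and state.hands_detected


# Example: an unbuckled driver is refused activation.
print(may_engage_assistance(DriverState(seatbelt_buckled=False,
                                        eyes_on_road=True,
                                        hands_detected=True)))  # False
```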

But the National Highway Traffic Safety Administration has yet to mandate either.

"There's a regulatory void," said David Harkey, president of the Insurance Institute, explaining his group's new rankings. "Automakers need to ensure that this tech is implemented so it does not lead to either intentional or unintentional misuse on the part of the driver. And consumers need to have a better understanding that these remain driver assistance systems, and not driver replacement systems. No one should purchase a vehicle thinking it’s capable of more than it is, and then learn later, sometimes through tragic consequences, that they were wrong."

In the absence of strong federal safety standards that would require companies to put better safeguards on advanced driver assistance systems, some automakers have sold customers their own narratives about what partial automation is really capable of.

In 2020, Tesla began selling an untested "Full Self Driving" mode that was not, as the name suggested, actually self-driving, paired with an owner's manual whose fine print warned customers not to trust the tech too much; the feature is still on the roads today. GMC, meanwhile, is currently marketing a "Super Cruise" feature with commercials that encourage drivers to operate their cars "hands free," even as a microscopic disclaimer encourages those drivers to stay attentive.

Some pundits have partly blamed that dangerously optimistic advertising for encouraging bad driving behaviors that lead to real-world crashes — and the criminal charges that sometimes follow.

Just before the Institute announced its new rankings criteria, a California driver was charged with vehicular manslaughter after he failed to take control when his Tesla, which was using the company's other misleadingly named driving mode known as "Autopilot," ran a red light at high speed and killed two occupants of a Honda Civic.

The driver may be the first in U.S. history to be charged with a felony for failing to take control of his car when the popular advanced driver assistance system failed. (A test driver for Uber was charged with negligent homicide following a similar 2018 collision in Arizona, but many of the safety features on her car weren't yet being sold to consumers; Tesla, by contrast, has sold Autopilot and the beta version of Full Self Driving software to tens of thousands of untrained motorists).

The Insurance Institute hopes its rankings will help encourage customers to look beyond the automaker hype, if only for their own legal protection — and someday, inspire NHTSA to enforce stronger requirements on semi-automated vehicles overall. To earn a top ranking from the organization, an advanced driver assistance-equipped car will "need to ensure that the driver’s eyes are directed at the road and their hands are either on the wheel or ready to grab it at all times," and issue "escalating alerts and appropriate emergency procedures when the driver does not meet those conditions," including automatically bringing the car to a safe stop.
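As a rough illustration of what "escalating alerts" could look like in practice, here is a hypothetical sketch of an escalation sequence in Python. The timing thresholds and the final safe-stop step are assumptions for illustration, not the actual IIHS criteria or any automaker's logic.

```python
# Hypothetical sketch of an escalating driver-attention alert sequence of the
# kind the IIHS rankings describe. Thresholds and responses are illustrative
# assumptions, not the real IIHS specification.

def attention_response(seconds_inattentive: float) -> str:
    """Map how long the driver has been inattentive to an escalating response."""
    if seconds_inattentive < 4:
        return "no action"
    if seconds_inattentive < 8:
        return "visual warning on the dashboard"
    if seconds_inattentive < 12:
        return "audible chime and seat vibration"
    if seconds_inattentive < 20:
        return "disengage assistance and require the driver to take over"
    # Last resort: slow the vehicle to a safe stop.
    return "bring the car to a safe stop and alert emergency services"


for t in (2, 6, 10, 15, 25):
    print(t, "seconds ->", attention_response(t))
```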

Those alerts, of course, would help ensure that any driver behaves more safely behind the wheel. Even drivers of fully manual cars are prone to "complacency" and distraction, if only because the tacit design cues of the autocentric roads they drive on often signal that it's okay to tune out.

But experts argue those safety features become even more critical when it comes to semi-automated cars, which have been linked to particularly egregious driver behaviors like sleeping, gaming, or even having sex instead of watching the road.

"We’re helping consumers understand what this tech can and cannot do," said Harkey. "Automakers are advertising these technology in a way that makes it seems like it can do more than it actually can."

Source: IIHS

At least so far, no automaker can accurately say that their advanced driver safety systems are foolproof — which is why it's so critical that every automaker take steps to make sure that drivers don't misuse them. (Disturbingly, the Insurance Institute notes that "while most partial automation systems have some safeguards in place to help ensure drivers are focused and ready, none of them [currently] meets all the pending IIHS criteria.")

Doing that, though, would require the seemingly paradoxical acknowledgment that partially automated cars might require even more safety backstops than cars designed to be driven by human beings alone — at least as long as the public agrees that the point of advanced driver assistance systems is to protect all road users, rather than to make it easier for drivers to take a nap.

"At some point, we have to ask ourselves: are these, indeed, driver safety systems?" said Harkey. "Or are they driver convenience systems?"
