Driverless Cars Are Coming to Silicon Valley
DMV regulations on them are sparse
Note: GJEL Accident Attorneys regularly sponsors coverage on Streetsblog San Francisco and Streetsblog California. Unless noted in the story, GJEL Accident Attorneys is not consulted for the content or editorial direction of the sponsored content.
The California Department of Motor Vehicles just granted its first permit to a company testing driverless cars. Waymo plans to begin deploying its vehicles on the streets of Mountain View, its hometown, and a few nearby cities. These will be the state’s first tests under the DMV’s new requirements (released last spring) for cars that can be “driven” without a human being at the wheel.
Among other things, the new rules mandate that driverless vehicles be monitored at all times, and that the engineer monitoring them be properly trained, licensed, and able to communicate with any passengers in the vehicle.
What do the rules mean by “properly trained”? That’s left up to the developers of the technology: the DMV requires them to create their own training programs, focused both on their technology and on “defensive driver training.” That’s as specific as the regulations get.
Unfortunately, that lackluster approach is more or less consistent with the weak rules that apply to human driver training.
While the DMV’s driverless car rules also require “continuous monitoring” of both the vehicle and the communication link to passengers, what form that monitoring takes is also left up to the company. Does that mean one “properly trained” engineer can sit in front of a bank of screens and keep an eye on ten, fifteen, or fifty cars at the same time? The rules have nothing to say about that.
The DMV is being asked to do a lot here: to encourage the development of a new technology that many believe will solve road safety issues, while keeping track of who is testing what kinds of driverless vehicles where. Safety doesn’t seem to be at the top of its priority list. And the regulations don’t go anywhere near the larger moral questions raised, such as who should die when an autonomous vehicle gets into an inescapable situation.
That’s an ugly question, but a necessary one. Whether stated outright or not, the calculations inherent in that moral dilemma were an underlying contributor to Uber’s autonomous vehicle crash that killed a woman in Arizona. Bicyclists and other vulnerable road users are right to be concerned about who asks the question and how it gets settled.
A few things are already known: driving takes a high level of attention, one that bored or distracted drivers consistently underestimate. Even with the assistance of multiple cameras and sensors, autonomous cars and their robo-drivers make serious mistakes. And monitoring anything from a computer screen, even a single vehicle, is both taxing and tedious, a difficult job for even the most diligent engineer.
Unfortunately, autonomous vehicle testing, both with and without drivers present, is showing that practice isn’t making perfect. Self-driving cars may not be improving their safety records even after millions of miles of testing.
For now, Waymo is restricting its test rides to streets in Mountain View, Sunnyvale, Los Altos, and Palo Alto, and the only passengers will be its paid employees.
The state should do both: raise the bar for human drivers and define clear rules for robo-drivers. Aside from immediately improving safety, another big benefit of requiring higher competence from humans is that human driving behavior is being used as the de facto standard for what is “good enough” for robo-drivers. And as you note, the competence required of human drivers is really, really low.
AV companies would prefer that the bar be set as low as possible so they can get their product on the streets sooner. This past decade’s epidemic of collisions caused by digital distraction has lowered that bar, which in turn lowers the standard AVs will be held to.
I’d like the DMV to spend a lot more time making rules for meat drivers and a little less time making rules for computers. For humans, there are almost no rules at all. Anything you want to do is fine. Even if you demonstrably do not know how to drive, you can just retake the test immediately until you guess the right answers, and then you’re good for five years. Did you go blind? No problem, as long as you renew by mail! Seriously, it is ridiculous to focus on self-driving cars.