Uber’s Self-Driving Cars Are Already Endangering Pedestrians, Bicyclists

Uber broke the law and bullied its way onto the streets of San Francisco with its self-driving vehicles

Uber's self-driving cars don't follow traffic rules when turning right. Image: San Francisco Bicycle Coalition

Uber is in such a hurry to dump the most expensive part of its operations—the drivers who make a paltry living, if that much—that it rushed into testing self-driving cars in San Francisco without getting permission.

The DMV sent a statement to media outlets on Tuesday that said:

The California DMV encourages the responsible exploration of self-driving cars. We have a permitting process in place to ensure public safety as this technology is being tested. Twenty manufacturers have already obtained permits to test hundreds of cars on California roads. Uber shall do the same.

Nevertheless, on Wednesday Uber’s self-driving cars were being tested on the streets of San Francisco. Uber published a blog post with a convoluted argument about how the DMV’s permitting process doesn’t apply to its vehicles because, for now, they have a human backup driver behind the wheel at all times.

Then one of its self-driving cars was filmed running a red light in front of a pedestrian in San Francisco. The company blamed the backup driver.

Meanwhile, Brian Wiedenmeier, executive director of the San Francisco Bicycle Coalition, took a demonstration ride in one of Uber’s self-driving cars, and he reports that Uber is right about one thing: these cars are not ready to operate as self-driving vehicles. The car he was in made several dangerous right turns, cutting across bike lanes. It’s the kind of turn that puts bicyclists and pedestrians in grave danger.

Uber told Wiedenmeier that it is requiring drivers to disengage from self-driving mode when approaching a right turn on a street with bike lanes. Meanwhile, the DMV told Uber to stop testing its vehicles on the streets.

It’s hard to tell whether a car is in self-driving mode, and given Uber’s arrogance, do we have any guarantee that the company is complying with the DMV order to stop?

Have you seen any self-driving Ubers? Have you ridden in one?

49 thoughts on “Uber’s Self-Driving Cars Are Already Endangering Pedestrians, Bicyclists”

  1. Isn’t the general solution to people breaking the law to charge them with a crime…?

  2. “The second piece of the puzzle is to ban human driving once self-driving cars are at least as good and self-drive can be retrofitted to every vehicle.”

    I seriously doubt that a nation that won’t even guarantee access to basic healthcare for its citizens will leap into the breach to make it happen. More like “You’re on your own, peons!”

  3. That’s exactly why the regulations need to push us in the direction of “no human driver, no steering wheel”. So long as we have self-drive optional, with the ability of the human to take control, we’ll have the worst of both worlds. You’ll still have the poorly trained human driver, only now you may expect them to take control in a situation where they’re doing something else, and hence not really aware of what’s going on. “Monitoring” an automated task with the expectation to take control if the automatic system acts improperly is just about the most error-prone thing a human being can do. Humans often fall asleep when doing that. Either they do the task all the time, or the task is completely automated with no possibility for the human to take control, other than maybe some sort of emergency stop button.

  4. The Teslas only crashed because human drivers also made errors. The second piece of the puzzle is to ban human driving once self-driving cars are at least as good and self-drive can be retrofitted to every vehicle. Self-driving cars have crashed thanks to the unpredictability of human drivers. Removing human drivers from the equation removes the source of that unpredictability. You obviously still have pedestrians and cyclists, but those are easy for a self-driving car to avoid hitting given how slowly both move, and how much less space they take up.

  5. There will be a disposable, minimum wage “driver” in the front to take the blame for any accidents.

  6. But what is actual practice? Do people merge into the bike lane, slow down, then turn, or do they just swoop around the corner on the most convenient arc available at the time?

  7. I’d be more comfortable with testing on public streets if it was done under some kind of regulated system to ensure safety.
    Uber is definitely trying to make sure it gets its cars out there before there are any rules for it to follow. That worries me a lot.

  8. I’m aware, it was a basic point.

    To an extent but never enough to be 100% the same. Public testing has been going on for years now but at least we can agree that Uber is being too aggressive here.

  9. Huh? Once self-driving vehicles are at least as good as human drivers (an argument can be made that they’re there now), they should be allowed on the road, since we know they’ll only get better over time, while people won’t.

  10. Autonomous cars will crash, they will kill people. It’s going to happen.

    Teslas have crashed with their “autopilot” system engaged, and it shows how manufacturers are likely to respond.

    F#ck trump.

  11. That’s a good point.
    I wonder how much that will change if people are “drivers” of self-driving cars. Will they view it differently? If they still don’t walk or bike, will they see it differently?

  12. It’s not just real world “testing” – it’s real world “learning”. The cars aren’t programmed – they *learn*

  13. Saying “Teslas already have” in the context of this discussion is Trumpism at its best. You are comparing something that isn’t relevant.

  14. If the manufacturer can place blame on someone else, they can still sell plenty of cars.

    A fully autonomous vehicle with no steering wheel leaves a lot less ambiguity about whose fault a crash is. Don’t look for those to come out anytime soon. Look for crashes where the company can still blame the driver for not using the autonomous system correctly.

  15. It’s pedestrians and cyclists vs car when the word “car” is a stand-in for “driver.” As soon as a critical mass of Americans are no longer driving, then “car” will become a stand-in for “company” or “machine.”

    Our culture minimizes vehicle deaths due to driver error because a lot of us see a DRIVER who kills a pedestrian and think “there but for the grace of god go I.” When we aren’t all drivers, we might start thinking differently: when we see a pedestrian killed by a company’s machine we may look at that PEDESTRIAN and think “there but for the grace of god go I.” Then our courts and governments may start thinking quite a bit differently about who is responsible for deaths on the road… at least until Uber comes up with the self-operated voting machine.

  16. You are probably right about not hitting and running.
    You are more likely to see hitting and hiring lawyers to place blame on everyone but the vehicle manufacturer.

  17. But they will kill people. Teslas already have.
    When it happens you will see every possible excuse given as to why it’s not the car’s fault. There is a huge financial incentive for them to not be blamed when the car does crash.

  18. Flying is a bit more complex than that.

    Testing can easily be done without using public streets.

    What Uber is doing here is not even testing. It’s just releasing without testing first.

  19. What?

    Aside from the fact that every SDC out there is programmed to halt in any collision already…

    If Uber or anyone wants to succeed in the autonomous car game, they have every incentive to be as good as possible.
    The reward for not hitting and running is hundreds of billions of dollars.

    The penalty for a human hitting and running is “might get away with this”.

    Why do you ascribe specifically evil motives like “They want to kill people” to these businesses? They want to make money, and killing people is antithetical to that goal.

  20. There is great potential for autonomous cars to be safer, but only if we hold them to a higher standard than we hold human drivers.
    That’s not happening. People are already pushing for autonomous cars to be “meh, as good as human drivers.”

  21. And that involves flying in the sky, not hard to do. Cars don’t have the luxury of an empty sky and what would be the point anyways since the roads they will be driving on won’t be empty. They have to test in public. Having said that, I do not trust Uber thus far.

    Eventually, yes, I think they should be held to a higher standard too but they can’t get there without real world testing and it seems very clear that they already are safer.

  22. Video means nothing in a culture that is predisposed to place blame on pedestrians and cyclists vs cars.

  23. Human drivers should be held to a higher standard as well, but nobody seems willing to do that.

    This is exactly the reason I came around to self-driving cars. At this point people will constantly push on autonomous cars to be better, whereas they push for standards on human drivers to be *lower*. The quickest path to safety is autonomous cars, and the quicker the better: the largest demographic in the country, the one that brought the auto to the forefront, is losing its capability to drive but not its will to.

  24. Unless they are programmed to hit and run. If we don’t regulate them, they could very easily do that.

    Uber will use those deep pockets to hire lawyers. If they run you over, you’re screwed.

  25. This is backwards. Given the choice between equivalent human drivers and self-driving cars, I would choose the self-driving car.

    1) Self-driving cars will not hit and run
    2) Uber has deeper pockets than the typical human driver

  26. How do airplanes get to start hauling passengers without being allowed?

    They undergo extensive testing and regulatory approval before ever being put into use where they put the general public at risk.

    Autonomous vehicles should be held to a higher standard because we need to stop killing people with cars.
    Human drivers should be held to a higher standard as well, but nobody seems willing to do that.

  27. Why would we continue to allow the amount of death and carnage caused by human drivers to continue when there is potential to improve it?

    Zero reason, huh? How about, we don’t want to continue killing 40,000 people every year. That’s a good reason.

  28. You changed it back to your previous illogical statement. There’s zero reason to require that they be significantly better, only that they not be worse.

  29. Here in Denver it appears that it depends on the intersection: some have a dashed line, implying that drivers should merge over, and others have a solid one, implying that they shouldn’t.

  30. Why should they be significantly better before being allowed? How do they get better without being allowed?

  31. “Unless and until they are significantly better, they should not be allowed.” Fixed it back for you.

  32. “Unless and until they are at least as good, they should not be allowed.”

    Fixed that for you.

  33. I wonder if they will be programmed to say “I didn’t see the bicyclist” after the fatal accident?

    They’ll have the footage to prove it.

  34. For the record, a good portion of human drivers definitely do not merge into the bike lane before making their turn, and merging into the bike lane is actually illegal in Oregon.

  35. Yes. Also the Uber CEO is apparently the worst of the worst, almost a caricature of entitled-yet-utterly-clueless douchebro tech CEOs…

  36. The question is whether self-driving vehicles will be better or worse. Unless and until they are significantly better, they should not be allowed.

    I wonder if they will be programmed to say “I didn’t see the bicyclist” after the fatal accident?

  37. I had an inside source at Uber, and they have a horrible culture for employees, drivers, and riders alike.

