Uber’s Self-Driving Cars Are Already Endangering Pedestrians, Bicyclists

Uber broke the law and bullied its way onto the streets of San Francisco with its self-driving vehicles

Uber's self-driving cars don't follow traffic rules when turning right. Image: San Francisco Bicycle Coalition

Uber is in such a hurry to dump the most expensive part of its operations—the drivers who make a paltry living, if that much—that it rushed into testing self-driving cars in San Francisco without getting permission.

The DMV sent a statement to media outlets on Tuesday that said:

The California DMV encourages the responsible exploration of self-driving cars. We have a permitting process in place to ensure public safety as this technology is being tested. Twenty manufacturers have already obtained permits to test hundreds of cars on California roads. Uber shall do the same.

Nevertheless, on Wednesday Uber’s self-driving cars were being tested on the streets of San Francisco. Uber published a blog post with a convoluted argument that the DMV’s permitting process doesn’t apply to its vehicles because, for now, they have a human backup driver behind the wheel at all times.

Then one of its self-driving cars was filmed running a red light in front of a pedestrian in San Francisco. The company blamed the backup driver.

Meanwhile Brian Wiedenmeier, executive director of the San Francisco Bicycle Coalition, took a demonstration ride in one of the Uber self-driving cars, and he reports that the cars are indeed not ready to operate as self-driving vehicles. The car he rode in made several dangerous right turns, cutting across bike lanes instead of merging into them first. It’s the kind of turn that puts bicyclists and pedestrians in grave danger.

Uber told Wiedenmeier that it is requiring drivers to disengage from self-driving mode when approaching a right turn on a street with bike lanes. Meanwhile, the DMV told Uber to stop testing its vehicles on the streets.

It’s hard to tell whether a car is in self-driving mode, and given Uber’s arrogance, do we have any guarantee that the company is complying with the DMV order to stop?

Have you seen any self-driving Ubers? Have you ridden in one?


  • An ominous indication that Uber thinks it can do as it pleases:

    “Uber had already been ordered not to deploy the self-driving cars, but defied the order in a move that the company called an ‘important issue of principle.'”

    https://www.democracynow.org/2016/12/22/headlines/california_uber_self_driving_cars_pulled_amid_traffic_violations

    • Miles Bader

      Isn’t the general solution to people breaking the law to charge them with a crime…?

  • For the record, a good portion of human drivers definitely do not merge into the bike lane before making their turn; in Oregon, merging into the bike lane is actually illegal.

    • mckillio

      Here in Denver it seems to depend on the intersection: some have a dashed line, implying that drivers should merge over, and others have a solid one, implying that they shouldn’t.

      • But what is actual practice? Do people merge into the bike lane, slow down, then turn, or do they just swoop around the corner on the most convenient arc available at the time?

  • Meanwhile, meat-based drivers continue to not merely endanger but actually kill pedestrians and bicyclists, just like they have for the past century.

    • Larry Littlefield

      The question is whether self-driving vehicles will be better or worse. Unless and until they are significantly better, they should not be allowed.

      I wonder if they will be programmed to say “I didn’t see the bicyclist” after the fatal accident?

      • ahwr

        I wonder if they will be programmed to say “I didn’t see the bicyclist” after the fatal accident?

        They’ll have the footage to prove it.

        • MT

          Video means nothing in a culture that is predisposed to place blame on pedestrians and cyclists vs cars.

          • Steven H

            It’s pedestrians and cyclists vs car when the word “car” is a stand-in for “driver.” As soon as a critical mass of Americans are no longer driving, then “car” will become a stand-in for “company” or “machine.”

            Our culture minimizes vehicle deaths due to driver error because a lot of us see a DRIVER who kills a pedestrian and think “there but for the grace of god go I.” When we aren’t all drivers, we might start thinking differently: when we see a pedestrian killed by a company’s machine we may look at that PEDESTRIAN and think “there but for the grace of god go I.” Then our courts and governments may start thinking quite a bit differently about who is responsible for deaths on the road… at least until Uber comes up with the self-operated voting machine.

          • MT

            That’s a good point.
            I wonder how much that will change if people are “drivers” of self-driving cars. Will they view it differently? If they still don’t walk or bike, will they see it differently?

      • BubbaJoe123

        “Unless and until they are at least as good, they should not be allowed.”

        Fixed that for you.

        • MT

          “Unless and until they are significantly better, they should not be allowed.” Fixed it back for you.

          • mckillio

            Why should they be significantly better before being allowed? How do they get better without being allowed?

          • MT

            How do airplanes get to start hauling passengers without being allowed?

            They undergo extensive testing and regulatory approval before ever being put into use where they put the general public at risk.

            Autonomous vehicles should be held to a higher standard because we need to stop killing people with cars.
            Human drivers should be held to a higher standard as well, but nobody seems willing to do that.

          • murphstahoe

            Human drivers should be held to a higher standard as well, but nobody seems willing to do that.

            This is exactly the reason I came around to self-driving cars. At this point people will constantly push for autonomous cars to be better, whereas they push for standards on human drivers to be *lower*. The quickest path to safety is autonomous cars, and the quicker the better: the largest demographic in the country, the one that brought the auto to the forefront, is losing its capability to drive but not its will to.

          • MT

            There is great potential for autonomous cars to be safer, but only if we hold them to a higher standard than we have drivers.
            That’s not happening. People are already pushing for autonomous cars to be “meh, as good as human drivers.”

          • murphstahoe

            [citation needed].

            I have many billion$ of reasons that counter your claim.

          • MT

            If the manufacturer can place blame on someone else, they can still sell plenty of cars.

            A fully autonomous vehicle with no steering wheel leaves a lot less ambiguity about whose fault a crash is. Don’t look for those to come out anytime soon. Look for crashes where the company can still blame the driver for not using the autonomous system correctly.

          • Joe R.

            That’s exactly why the regulations need to push us in the direction of “no human driver, no steering wheel”. So long as we have self-drive optional, with the ability of the human to take control, we’ll have the worst of both worlds. You’ll still have the poorly trained human driver, only now you may expect them to take control in a situation where they’re doing something else, and hence not really aware of what’s going on. “Monitoring” an automated task with the expectation to take control if the automatic system acts improperly is just about the most error prone thing a human being can do. Humans often fall asleep when doing that. Either they do the task all the time, or the task is completely automated with no possibility for the human to take control, other than maybe some sort of emergency stop button.

          • mckillio

            And that involves flying in the sky, where there is little to hit. Cars don’t have the luxury of an empty sky, and what would be the point anyway, since the roads they will be driving on won’t be empty? They have to test in public. Having said that, I do not trust Uber thus far.

            Eventually, yes, I think they should be held to a higher standard too but they can’t get there without real world testing and it seems very clear that they already are safer.

          • MT

            Flying is a bit more complex than that.

            Testing can easily be done without using public streets.

            What Uber is doing here is not even testing. It’s just releasing without testing first.

          • mckillio

            I’m aware; it was a basic point.

            To an extent, but never enough to be 100 percent the same. Public testing has been going on for years now, but at least we can agree that Uber is being too aggressive here.

          • MT

            I’d be more comfortable with testing on public streets if it was done under some kind of regulated system to ensure safety.
            Uber is definitely trying to make sure it gets its cars out there before there are any rules for it to follow. That worries me a lot.

          • murphstahoe

            It’s not just real-world “testing”; it’s real-world “learning.” The cars aren’t programmed; they *learn*.

          • mckillio

            Yes.

          • BubbaJoe123

            You changed it back to your previous illogical statement. There’s zero reason to require that they be significantly better, only that they not be worse.

          • MT

            Why would we allow the death and carnage caused by human drivers to continue when there is the potential to improve on it?

            Zero reason, huh? How about this: we don’t want to continue killing 40,000 people every year. That’s a good reason.

          • BubbaJoe123

            Huh? Once self-driving vehicles are at least as good as human drivers (an argument can be made that they’re there now), they should be allowed on the road, since we know they’ll only get better over time, while people won’t.

          • MT

            People could be safer drivers if we bothered to regulate them.

          • murphstahoe

            This is backwards, Given the choice between equivalent human drivers and self driving cars, I would choose the self driving car.

            1) Self driving cars will not hit and run
            2) Uber has deeper pockets than the typical human driver

          • MT

            Unless they are programmed to hit and run. If we don’t regulate them, they could very easily do that.

            Uber will use those deep pockets to hire lawyers. If they run you over, you’re screwed.

          • murphstahoe

            What?

            Aside from the fact that every SDC out there is already programmed to halt in any collision…

            If Uber or anyone wants to succeed in the autonomous car game, they have every incentive to be as good as possible.
            The reward for not hitting and running is hundreds of billions of dollars.

            The penalty for a human hitting and running is “might get away with this”.

            Why do you ascribe specifically evil motives like “they want to kill people” to these businesses? They want to make money, and killing people is antithetical to that goal.

          • MT

            But they will kill people. Teslas already have.
            When it happens you will see every possible excuse given as to why it’s not the car’s fault. There is a huge financial incentive for them to not be blamed when the car does crash.

          • murphstahoe

            Saying “Teslas already have” in the context of this discussion is Trumpism at its best. You are comparing something that isn’t relevant.

          • MT

            Autonomous cars will crash, they will kill people. It’s going to happen.

            Teslas have crashed with their “autopilot” system engaged, and the aftermath shows how manufacturers are likely to respond.

            F#ck trump.

          • Joe R.

            The Teslas only crashed because human drivers also made errors. The second piece of the puzzle is to ban human driving once self-driving cars are at least as good and self-drive can be retrofitted to every vehicle. Self-driving cars have crashed thanks to the unpredictability of human drivers. Removing human drivers from the equation removes the source of that unpredictability. You obviously still have pedestrians and cyclists, but those are easy for a self-driving car to avoid hitting given how slowly both move, and how much less space they take up.

          • Judas Peckerwood

            “The second piece of the puzzle is to ban human driving once self-driving cars are at least as good and self-drive can be retrofitted to every vehicle.”

            I seriously doubt that a nation that won’t even guarantee access to basic healthcare for its citizens will leap into the breach to make it happen. More like “You’re on your own, peons!”

          • MT

            You are probably right about not hitting and running.
            You are more likely to see hitting and hiring lawyers to place blame on everyone but the vehicle manufacturer.

          • laughtiger

            There will be a disposable, minimum wage “driver” in the front to take the blame for any accidents.

          • Vooch

            SD cars will never speed

          • MsInformed

            Uber says it doesn’t have any pockets.

  • Nicholas Littlejohn

    I had an undercover contact at Uber, and they have a horrible culture for employees, drivers, and riders alike.

    • Miles Bader

      Yes. Also the Uber CEO is apparently the worst of the worst, almost a caricature of entitled-yet-utterly-clueless douchebro tech CEOs…
