Why do we assume everyone can drive competently?

A Mountain View resident and reader of the Emerging Technologies Blog writes in about the Google self-driving cars they see on the road regularly. It's well worth a read.

Anyway, here we go: Other drivers don't even blink when they see one. Neither do pedestrians - there's no "fear" from the general public about crashing or getting run over, at least not as far as I can tell.

Google cars drive like your grandma - they're never the first off the line at a stop light, they don't accelerate quickly, they don't speed, and they never take any chances with lane changes (cut people off, etc.).

During the years I was commuting down to Palo Alto or Menlo Park from San Francisco, I often took the 280 instead of the 101. It wasn't necessarily faster, but it was more scenic, and traffic tended to move more quickly even if my route ended up longer. On the 280 I often saw the Google Lexus self-driving cars on the road. After the novelty wore off, I didn't think of them any differently than other cars on the road. If you had told me ten years ago that I'd one day be driving on the freeway alongside a car driven by a computer, I would've thought you were describing something out of Blade Runner. Now we speak of the technology as an eventuality. Such is life in technology, where we speak with the certainty of Moore's Law.

I cycle a lot in San Francisco, and this weekend, on the way back from a ride across the Golden Gate Bridge, I was cruising in the bike lane along the Embarcadero towards SOMA next to deadlocked traffic. Without any warning or a turn signal, a car in the right lane suddenly cut into the bike lane in front of me, apparently deciding to pull over and wait for a curbside parking spot. Road bikes have terrible brakes, and regardless, I had no room to stop in time. I screamed reflexively, adrenaline spiking, and leaned my bike right, just barely missing the right front fender of the car, then leaned hard left and managed to angle my bike and body around the left rear fender of the parked car along the curb.

For the next two blocks, I replayed the near collision in my head on loop like a Vine, both angry at the driver's reckless maneuver and relieved as I tallied up the likely severity of the injuries I had just escaped by less than a foot of clearance. This is not an unusual occurrence, unfortunately. When I bike, I just assume that drivers will suddenly make rights in front of me without turning on their turn signal or looking back to see if I'm coming up in the bike lane to their right. It happens all the time. It's not just a question of skill but of mental obliviousness. American drivers have had the road to themselves for so long that they feel no need to consider that anyone else might be laying claim to any piece of it. Though the roads in Europe are often narrower, I feel a hundred times safer biking there than I do in the U.S.

All that's to say I agree wholeheartedly with the writer quoted above that self-driving cars are much less threatening than cars driven by humans. As an avid cyclist especially, I can think of nothing that would ease my mind more when biking through the city than replacing every car on the road with a self-driving one.

We have years and years of data on humans' ability to drive cars, and if we've learned anything, it's that humans are lousy drivers. For some reason (I don't know the history behind it), we've assumed that just about every person is qualified to drive a car. I can't remember the last time I met someone who had failed their driving test or who didn't have a driver's license. I can think of plenty of times I've met people I'd be scared to see behind the wheel.

Humans drink and drive. Humans text on their phones when they should be looking at the road. Humans, especially in America, where drivers feel a great sense of entitlement, pull aggressive maneuvers in fits of road rage or drive at unsafe racetrack speeds on public roads. They run red lights, cut other drivers off, tailgate, drag race, and so on. Cars have killed more people in the U.S. than just about anything else for the better part of a century now. The less people drive, as happens when gasoline prices rise, the fewer people die.

Why do we assume driving is something everyone is not only capable of but skilled enough to execute at a level that doesn't endanger others? I'd venture that it's no easier to learn to drive a car than to cut hair, yet the test to get a driver's license is far easier, even though the worst a bad stylist can do is give you a bad haircut while the worst a lousy driver can do is kill other human beings. Perhaps our romance with the American road is so deep, our conception of American freedom so intimately tied to hopping in a car and taking ourselves anywhere, that the thirty to forty thousand fatal crashes each year are seen as an acceptable error rate.

Whatever the reason, and whether you agree or disagree, it's unlikely to change in the U.S. unless something comes along to force a serious reevaluation of our assumptions about driving quality. That something just might be self-driving cars, which will be held to a much higher standard of driving safety than we've held humans to all these years. You might say that self-driving cars will be held to the standard we should've held ourselves to from the beginning.

Autonomous cars may be so well-behaved that they need protection from more ruthless and unscrupulous bad actors on the road. That's right, I refer to those monsters we call humans.

It's safe to cut off a Google car. I ride a motorcycle to work and in California motorcycles are allowed to split lanes (i.e., drive in the gap between lanes of cars at a stoplight, slow traffic, etc.). Obviously I do this at every opportunity because it cuts my commute time in 1/3.

Once, I got a little caught out as the traffic transitioned from slow moving back to normal speed. I was in a lane between a Google car and some random truck and, partially out of experiment and partially out of impatience, I gunned it and cut off the Google car sort of harder than maybe I needed to... The car handled it perfectly (maybe too perfectly). It slowed down and let me in. However, it left a fairly significant gap between me and it. If I had been behind it, I probably would have found this gap excessive and the lengthy slowdown annoying. Honestly, I don't think it will take long for other drivers to realize that self-driving cars are "easy targets" in traffic.

Overall, I would say that I'm impressed with how these things operate. I actually do feel safer around a self-driving car than most other California drivers.