Have I already mentioned somewhere that we humans love our mobility? In addition to playing a key role in primate evolution, the mobility goal of ‘more, better, faster’ was a big theme during my two decades at the Regional Transportation Commission. Many of us live for the fantasy of more earthly space for our own preferred mobility mode.
But earthly space is limited, and worsening automobile traffic is stressful for humans nearly everywhere in the urbanized world. The lure of driverless cars is that they offer a ‘solution’ to this problem (‘look ma, no hands!’) and, they’re supposedly safer to boot.
Tesla revealed yesterday (June 30, 2016) on their blog that one of their cars, while being driven in ‘auto-pilot’ mode, collided with a truck, killing its driver. This was disclosed nearly two months after the fatal accident occurred on May 7, 2016. The nature of the accident was apparently only revealed because a federal investigation into the crash had been initiated.
The Tesla blogpost informs us that the auto-pilot mode is still in ‘public beta phase’ & that the driver should have been ‘prepared to take over at any time’. Obvious questions arise: Isn’t the attraction of self-driving cars (SDCs) that we’d no longer have to pay attention to the road? Why is a car still in beta allowed to be driven on public roads? Who’s overseeing the development of self-driving cars?
What’s the evidence behind the safety hype of driverless cars, anyway?
Unfortunately, as we’re all going to better understand as time goes on, the driverless car safety hype is just that: hype…of the kind increasingly being doled out by Big Tech.
Worried about accidents between pedestrians & self-driving cars? Here’s Google’s patent to address the problem (granted 5/17/16): “The front of the vehicle may be coated with a specialized adhesive that adheres to a pedestrian and thus holds the pedestrian on the vehicle in the unfortunate event that…the vehicle comes into contact with the pedestrian.”
Well, I’m not buying it.
And neither are experts in automation & transportation safety. In her testimony at the March 15 Senate Commerce Committee hearing on self-driving car regulation, Mary Cummings, PhD, Director of Robotics and the Humans & Autonomy Laboratory at Duke University, highlighted the lack of transparency in the safety-testing methodologies & verifiable results of Google and other SDC developers:
“In my opinion, the self-driving car community is woefully deficient in its testing and evaluation programs,” Cummings said. She held up aircraft software certification as the model for what a standard federal process for certifying self-driving car safety should look like: ‘evidence-based tests and evaluations’, conducted in a ‘principled and rigorous manner’, are made public in order to enable expert peer review and validation.
This is not what’s occurring now. In fact, Google et al. are pressing for a slew of exceptions & permission to fast-track the normal federal transportation safety rule-making process, complaining that state & federal rules are impeding the deployment of driverless cars. In response to industry pressure, Transportation Secretary Anthony Foxx promised in January that preliminary guidelines for self-driving cars would be available this month.
But what’s the hurry?
Well, it’s just that, as noted by the Guardian last year: “…never shy of hubris, Google wants not only to reinvent the car but to replace the whole idea of driving…’We want to fundamentally change the world with this’, Sergey Brin, the co-founder of Google, likes to say.” Hmmm – we actually do need to change our dreary & dirty driving habits, but are driverless cars really going to be our salvation?
There is a long list of other, non-safety issues with self-driving cars: privacy (Cummings calls the SDC ‘one, big data-gathering machine’); hacking; computer failure (how much time do you already spend daily dealing with computer problems?); questions about legal liability; erosion of human driving skills through automation; human susceptibility to distraction (uh yeah, that scary close call in cruise-control mode, not to mention the iPhone-on-the-lap syndrome); increased risk tolerance; increased social isolation; worsening congestion; the challenges of mixing SDC & non-SDC vehicles on roads & highways; health risks of physical inactivity; more inefficient land use & increasing urban sprawl; vehicle interactions with bicyclists & pedestrians – just to mention a few. Many of these concerns are already present with human drivers & non-automated vehicles; adding SDCs to the mix will dramatically increase the complexity of our driving & urban environment.
I also know, as the 19th-century saying goes, that this train has already left the station. As dismayed as some of us (who probably have a bad habit of imagining the future) are regarding where life on earth appears to be headed, there’s no doubt that (barring a planet-wide calamity, which is, of course, entirely possible on many fronts) self-driving cars are going to be a part of urban life in the 21st century.
Let’s hope, though, that this Tesla tragedy will provide a much-needed pause to Big Tech’s relentless driverless car juggernaut.