Saturday, September 22, 2018 -
Print Edition

An Uber mistake

A self-driving vehicle killed a woman. So much for perfection. Not to mention the liability.

A self-driving car operated by Uber Technologies killed a pedestrian on March 18 in Tempe, Arizona. Only by stretching the word “accident” can this be called an accident. The National Transportation Safety Board issued a preliminary report on the incident. It is chilling. The bottom line is clear: Self-driving cars are not ready for prime time.

The report shows that the emergency braking system on the car did not work. Based on experts’ analysis of how long it takes to apply the brakes, and of how far in advance the car detected the pedestrian, the system should have worked. The pedestrian should not have been struck. There’s that telltale cliché of computerized systems: should have. Everything was planned out to a T, except for one thing: It didn’t work. That is because the company disabled the emergency braking system and because the test driver was not paying attention.

Not a vote of confidence in Uber Technologies.

The report revealed many aspects of self-driving vehicles that we do not think are widely known. For example, these vehicles cannot definitively distinguish between objects on the road — a plastic grocery bag, a human being, a kids’ ball — in enough time to, for instance, save a human life. The vehicle’s computer systems are designed to deliver a smooth, direct ride, and if the vehicle stopped or slowed every time it detected a plastic grocery bag (or some other harmless object) on the road, the ride would be neither smooth nor direct. Therefore, the vehicle’s systems make decisions as to when to slow down, or stop, and when not to. Because these systems cannot detect definitively — and quickly — what requires a vehicle to slow or stop and what does not, the prospects for self-driving vehicles obviously depend on further development.

Meanwhile, the death of this woman raised another issue: Who is liable when a self-driving vehicle causes damage? With a perfect mixture of naiveté and hubris, Uber hadn’t considered this because, you know, an Uber self-driving vehicle will never make a mistake. It will never cause damage. It will never be in an accident, let alone a fatal one. Well, scratch that. Uber may in fact be on the hook for untold sums, especially since it intentionally disabled its emergency braking system last March in Tempe, Arizona.

Add to all this the fact that Uber did not alert either the car manufacturer or the vehicle’s operator that it had disabled the emergency braking system.

Here’s a case where we’re not sure which is worse: The reason the vehicle’s test operator did not stop in time is that she was looking down toward the center console, either at her phone or at the self-driving system interface.

If it was her phone, we have the universal element in this tragedy, transcending Uber and self-driving vehicles. Don’t text and drive.

Or, the test driver was doing her best to figure out what to do with the vehicle at that moment. Woe to the technology that requires the driver to take her eyes off the road — and woe to all of us if these vehicles, as currently developed, are released onto the roads.

Copyright © 2018 by the Intermountain Jewish News
