(Image: CC_Firefighters/Twitter)

This Man Is Probably the First to Blame a Drunken Crash on His Self-Driving Tesla

You can't use autopilot as a designated driver.

CLEVE R. WOOTSON JR., WASHINGTON POST
23 JAN 2018

The driver had a blood alcohol content nearly double the legal limit and a tenuous relationship with consciousness when his car slammed into the back of a parked firetruck on Interstate 405 in Culver City.


Still, he may be the first to add a technologically advanced new entry to the list of drunken-driving excuses.

He wasn't driving, the man told the highway patrolman Monday morning. The car was.

According to Culver City firefighters, the driver explained that his Tesla electric vehicle "was on autopilot," obviating the need for him to be in control of the vehicle or, well, sober.

He was wrong, of course, and was ultimately jailed on suspicion of driving under the influence.

But as word of another Tesla autopilot crash spread, the case of car as designated driver became an interesting thought exercise for anyone with more than a passing interest in vehicles that drive themselves.

If Elon Musk and other forward-thinking automakers have their way, there will soon be a time when there is no more drunken driving, because cars never have to wonder whether they've had one too many vodka martinis.

But until we all have our own computer-controlled, two-ton chauffeurs, we're left with a growing number of cars offering a raft of features that make them semiautonomous – vehicles that are safer and smarter, if not exactly geniuses.

Carmakers are transparent about the caveat-emptor quality of their vehicles.


Tesla, for example, warns that its autopilot system is not fully autonomous. The company instructs drivers to be alert because they are ultimately responsible for their vehicle and whatever it smacks into.

"Autopilot is intended for use only with a fully attentive driver," a Tesla spokesperson told The Washington Post.

But humans can slip into complacency when the car is doing most or all of the work.

For example, a fatal Tesla crash involving the autopilot system drew international scrutiny in spring 2016. The Model S had been set on autopilot and neither the vehicle nor the driver recognised that a tractor-trailer hauling blueberries had turned onto the divided highway.

In its report, the National Transportation Safety Board cited Joshua Brown's overreliance on the autopilot.

He had set the speed at 10 miles (16 kilometres) per hour over the posted speed limit and in the final 37 minutes of his drive, he had his hands on the wheel for just 25 seconds. He also ignored seven dashboard warnings and six audible warnings.

For Brown, those mistakes were fatal. But as technology advances, automakers say, they won't be mistakes at all.


"We aimed for a very simple, clean design, because in the future – really, the future being now – the cars will be increasingly autonomous," Musk said in July, according to The Post's Peter Holley.

"So you won't really need to look at an instrument panel all that often. You'll be able to do whatever you want: You'll be able to watch a movie, talk to friends, go to sleep."

And Musk and other autonomous vehicle proponents have disseminated videos and other media that show autopilot at its best, protecting drivers, passengers and even pedestrians from crashes.

Authorities have not identified the driver of the Tesla that crashed into the firetruck. They say no one is allowed to be drunk behind the wheel of a car, no matter how advanced its safety features.

No one was seriously injured in the Culver City wreck; the firefighters were parked in the emergency lane and car pool lane, responding to a crash on the other side of their truck, according to the San Jose Mercury News.

Tesla can check the car's data to see whether the car was indeed using autopilot before the crash, but the company has not released that information.

2018 © The Washington Post

This article was originally published by The Washington Post.