Tesla 'Autopilot' Crash Raises Concerns About Self-Driving Cars

ROBERT SIEGEL, HOST:

A fatal crash with a Tesla in autopilot mode is raising questions about self-driving car technology. We learned about the crash yesterday, but it happened back in May on a highway in Florida. The car was on a divided highway when a tractor trailer turned across the lanes up ahead. The car's autopilot system and the driver apparently failed to see the vehicle. According to Tesla, the brake was never applied before the crash.

Alex Davies is with Wired magazine, and he's been writing about this. Welcome to the program, Alex.

ALEX DAVIES: Thanks for having me.

SIEGEL: And please describe this autopilot feature to us. How does it work?

DAVIES: Sure. So Tesla's autopilot isn't actually all that different from an advanced cruise control you can buy in a new Mercedes or Audi or even cheaper cars. When you're in the car and you're on the highway, as long as you're going at least 18 miles an hour, you hit a little button on one of the stalks coming off the steering wheel.

And from that point on, the car uses cameras in the front to pick up lane lines, and it uses radars to look for other cars. And its basic mission is to keep you in the middle of your lane and a safe distance from other cars. So if the semi in front of it slows down to 55 miles an hour, it will also slow down.

SIEGEL: Well, based on what we've heard about this crash, it sounds like there's a blind spot in its capabilities.

DAVIES: There certainly seems to have been some sort of glitch. Now, Tesla is saying that when the tractor trailer turned in front of this Tesla, because the trailer was all white, the car didn't see it against a similarly white sky. That makes sense, or at least could make sense, for the cameras, but it's harder to understand why the radar wouldn't pick up the tractor trailer, because the radar doesn't see color. It sees objects.

Tesla hasn't given a really clear explanation of that, but they are quick to say, hey, this is a beta system. You know, it's still technically in testing, even though we've given it out to the public.

SIEGEL: I should add, by the way, that there are a great many white vehicles on the road, so that alone is not a small problem. This is one fatal car crash. It's estimated there were 35,000 motor vehicle deaths last year.

But because it was a self-driving car, or at least a car in autopilot mode, it raises some very interesting questions. I mean, does Tesla say, we don't guarantee that this car will miss a car turning in front of it, it's up to you, you'd better be sure to back up the machine at all times?

DAVIES: Oh, they definitely think that. When you get a car with autopilot, autopilot is turned off by default. You have to turn it on, and in doing so you effectively sign a box that says, hey, I know this is a beta feature. I know that I, the driver, am ultimately responsible for the behavior of the car. I'm not going to take my eyes off the road. I'm not going to take my hands off the wheel. And this is purely a convenience feature and a safety aid.

SIEGEL: Is there a possibility here, though, of a gap between what drivers formally sign on to and say they understand, and what they're actually expecting the technology to do for them when they're at the wheel?

DAVIES: Oh, absolutely. And I think you saw that within a week of this system being released, which was in mid-October of last year. There's a whole new genre of YouTube videos of Tesla drivers playing around with autopilot. There's one video of a guy filming his car driving itself from the backseat. So I think, as with anything, there's a difference between signing a user agreement and actually reading it, let alone obeying the entire thing.

SIEGEL: Does that agreement that the driver makes, does that absolve Tesla of any liability here? I mean, is it all user error after that?

DAVIES: You know, that's not entirely clear. And I spoke with one legal expert who told me that while, you know, it's good for Tesla that they did have people sign that agreement, NHTSA, the National Highway Traffic Safety Administration, which is investigating this accident, could look and say, well, hey, Tesla, you knew people were misusing this. You knew there was a serious risk that this was a feature your customers were not using safely. And if Tesla didn't take adequate steps to remedy that, then maybe they're not quite as absolved as they would hope.

SIEGEL: Can Tesla argue that its cars have prevented more accidents than they have caused?

DAVIES: They certainly can argue that, and, in fact, they're already arguing it. That was the central argument of the blog post they put up yesterday in response to the crash. And I think it's a worthy argument. It's just, you know, it's still hard for safety regulators to say, well, you prevented crashes. But if they don't deem this a safe feature that owners should have access to, then all of that's called into question.

SIEGEL: And if I have a member of my family who's trusting in an autopilot mode in a Tesla and dies in a car crash, it's small solace that five other people didn't die in car crashes that week.

DAVIES: Exactly. Small solace especially on the legal side if someone decides to press charges against Tesla.

SIEGEL: Alex Davies of Wired magazine, thanks for talking with us.

DAVIES: Thank you.

Transcript provided by NPR, Copyright NPR.