Tech Experts Warn Of Artificial Intelligence Arms Race In Open Letter

AUDIE CORNISH, HOST:

The image of artificial intelligence in pop culture is almost cliche at this point. Man creates a machine that can think on its own. Machine turns on man. Well, more than a thousand industry researchers and specialists have signed an open letter warning that there's a much more likely problem ahead - an AI arms race, countries weaponizing the technology for high-tech combat. Stephen Hawking, Noam Chomsky and Apple co-founder Steve Wozniak are among the signers. A leading force behind the letter is Stuart Russell. He's one of the most prominent researchers in the field of artificial intelligence. I asked him to explain how the autonomous weapons the letter warns against are different from drones.

STUART RUSSELL: So a drone is a remotely piloted vehicle. There's a human who is steering it and looking through its camera. The human decides what's a target and whether to release the missile. So an autonomous weapon would be something that's doing all that itself. It's deciding where to go. It's deciding what's a target, and it's deciding who to kill.

CORNISH: And in this letter, it says this technology is not decades away. It's years away. In fact, can you give us an example of where it's being used right now?

RUSSELL: So there are sentry robots in Korea in the demilitarized zone, and those sentry robots can spot and track a human being for a distance of two miles and can very accurately kill that person with a high-powered rifle. Right now, that machine has two modes. In one, it has to first get human permission to go ahead and kill the person. But if you flip a switch, then it's in automatic mode, and it'll do it by itself.

CORNISH: There's an argument here. People have said that essentially, this is a way of reducing casualties or having more precise efforts in combat. And it sounds like, from your letter, that you scientists don't believe that will be the case.

RUSSELL: So I think there's really two arguments. One is an engineering argument. Can we make AI weapons that are better than human soldiers at deciding what's a legitimate target? The other point is that even if we do succeed in making these systems more accurate, the problem is that once you have an arms race, then you have millions or even billions of these devices available at very low cost to anyone who wants to buy them.

CORNISH: You write in the letter that just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons. Is that true? I think of science being hand-in-hand with defense.

RUSSELL: Clearly, there's a very large number of signatories to the letter. It doesn't take many videos of a flying robot chasing down a human being and killing them for people to find this technology absolutely repulsive and for there to be a very serious backlash against artificial intelligence and robots in general. So I think that the field would much rather focus on the positive uses of artificial intelligence. We can save lives with self-driving cars. We can make people's lives better with personal assistants that are a bit more intelligent than Siri or Cortana. So there are many things we can do other than making better ways to kill people.

CORNISH: Is there any way to turn back from this path once we're on it? I mean, is there any way to avoid an AI arms race?

RUSSELL: So the way to avoid it is to have a treaty that would ban autonomous weapons. And the United Nations is already working on such a treaty. And countries need to trust each other enough to go ahead with the treaty. And that way, they don't have to worry about other countries getting a strategic advantage over them.

CORNISH: You've also said that researchers don't want artificial intelligence to be linked in the minds of the public with violence and with war. And I mention that kind of pop culture cliche. Does it feel like maybe it's a little too late for that?

RUSSELL: The movies have done a very good job of connecting robots with violence. It seems like almost every time I talk to a journalist, they publish an article with pictures of "Terminator" robots no matter what I talk about, so it's hard to get away from that. But I think people understand the difference between science fiction and reality. And what we want to avoid is that the reality catches up with the science fiction.

CORNISH: Stuart Russell - he's director of the Center for Intelligent Systems at UC Berkeley. Thank you so much for speaking with us.

RUSSELL: Thank you very much.

Transcript provided by NPR, Copyright NPR.