Former Facebook Insider Says Company Cannot Be Trusted To Regulate Itself

AILSA CHANG, HOST:

Lawmakers and regulators are demanding answers from Facebook. The company has acknowledged that the personal data of 50 million Facebook users was harvested by an outside group for political purposes. We'll hear more about that firm, Cambridge Analytica, in a few minutes. But first we're going to hear from a former Facebook employee who saw firsthand how the company treated user data.

SANDY PARAKILAS: I would characterize it as insufficient and sometimes negligent.

CHANG: That's Sandy Parakilas. Between 2011 and 2012, he was responsible for protecting the user data of people who used Facebook apps. Those are outside programs like "FarmVille" that plug directly into the platform. It was through a Facebook app two years after Parakilas left the company that Cambridge Analytica was able to get the data of those 50 million users.

PARAKILAS: So the apps that use Facebook platform are only held in compliance by policies that Facebook writes. And there was no way for Facebook to see what was happening to user data once it passed to the developers of those apps - people like Aleksandr Kogan, who built the app that passed all this data to Cambridge Analytica. And I thought that was a big problem.

CHANG: OK, so just to be clear, the way Cambridge Analytica was able to get this data is that 270,000 users signed up for a personality quiz designed by Aleksandr Kogan, and then the app designed by Kogan was able to collect data on the friends of those 270,000 users, correct?

PARAKILAS: That's correct.

CHANG: And when you were there, did you see any kind of safeguards in place at Facebook to make sure the data wasn't being abused? Or was it literally a case of turning the other way?

PARAKILAS: They really didn't want to know, to a certain extent, what was happening with the data once it left Facebook. And I pushed repeatedly for more audits, more protection. And I didn't get much traction. They didn't seem to prioritize protecting users over the growth of Facebook apps.

CHANG: Give me a...

PARAKILAS: And in fact...

CHANG: ...Specific example of how you pushed for a better response to privacy.

PARAKILAS: So in mid-2012, I created a PowerPoint deck that outlined all the ways that Facebook data was vulnerable on Facebook platform, and I included a list of the kinds of bad actors who I had either seen trying to do bad things with Facebook data or who I hypothesized might do bad things with Facebook data. And some of the actors on the list included foreign state actors and data brokers. And I sent this PowerPoint deck to senior executives at the company. And I - you know, I didn't really see much change happen as a result.

And what's been really frustrating to me is that what happened with Cambridge Analytica was something that I warned about years before it happened. And their response was exactly the same as the response that I saw them take when I was at the company. And I thought it was insufficient then, and I think it's insufficient now.

CHANG: Are these people you're describing inside Facebook who are unresponsive - are they still with the company?

PARAKILAS: Yes.

CHANG: From what I understand, policies at Facebook have changed, right? So the data breach we saw with Cambridge Analytica wouldn't happen under current policies, right?

PARAKILAS: Well, yes and no. The first issue is that Aleksandr Kogan had all of this data - both from the people who had authorized the app and from the people who hadn't - the friends. And he passed all of that information to Cambridge Analytica. Passing any of that information was a huge policy violation. What has changed is not Facebook's ability to enforce its policies against developers like Kogan who pass data along. What's changed is that they have restricted the amount of data that you can get on friends. So the entirety of the breach would not be possible today, but some elements of it still would be.

CHANG: You left the company six years ago. Do you have any reason to believe that attitudes inside the company have changed?

PARAKILAS: I think what happened with Cambridge Analytica shows that nothing has changed, and I think it's really important to understand the timeline of what happened. In 2014, Aleksandr Kogan made this app, and he misrepresented it to users as a psychological profile app.

CHANG: Right.

PARAKILAS: And he then harvested all of this data for commercial purposes not just from the users of the app but from their friends who had no idea this app even existed. Then he passed that data to Cambridge Analytica, which broke Facebook policy. Facebook discovered this in December 2015 when there was an article in The Guardian that described what was going on. And they evidently reached out to Kogan and Cambridge Analytica and said, we need you to delete the data. But they just took the word of those two parties that they had deleted the data. And in fact, they had not. They lied.

And the problem is Facebook had a right to audit. They had the ability to go in and demand a physical audit of the disk drives and storage and the code of the application itself. They didn't use that, nor did they sue either of these parties at that point despite the fact that it was a serious violation. And they waited, and they did nothing until last Friday when they finally started to take real action against Cambridge Analytica and banned them from Facebook's platform.

CHANG: And why do you think they chose not to act more aggressively back in 2015? What incentives did they have to hold back?

PARAKILAS: I believe that their position is that they are in a better legal position if they don't uncover abuse - that they can say, well, we didn't know. And to me, that is negligent. If you can reasonably know something simply by reading the press and then by investigating but you choose not to because you believe that it protects you legally, I think that is negligence.

CHANG: What do you want to happen inside the company to help address these privacy concerns?

PARAKILAS: Well, I mean, the first thing that absolutely needs to happen is Mark Zuckerberg needs to testify before the U.S. Congress and before parliaments and other government bodies in other countries that have serious concerns about this. And they must be dramatically more transparent.

CHANG: Sandy Parakilas is a former manager at Facebook. Thanks very much for joining us.

PARAKILAS: Thank you, Ailsa.

CHANG: NPR did invite a Facebook executive to come on the air, but the company says it is not doing interviews right now. Transcript provided by NPR, Copyright NPR.