Data Mining Is Out of Control at Facebook, Former Employee Sandy Parakilas Reveals
If you're just catching the news about Facebook and Cambridge Analytica, here's the short version: Several years ago, a consultancy firm called Cambridge Analytica used Facebook's lax terms of service to create an app that asked for access to its users' personal data, as well as the data of their friends. This 'Friends Permission' access allowed the company to collect an estimated 50 million users' worth of data, including information that was classified as "private and personally identifiable."
With this information, Cambridge Analytica allegedly created specially targeted campaigns that spread false information and fake news stories to influence the 2016 presidential election.
An undercover investigation revealed admissions from Cambridge that they were willing to lie and mislead the public.
When speaking about some of the potential information they could disseminate, Cambridge Analytica CEO Alexander Nix reportedly said, "It sounds a dreadful thing to say, but these are things that don't necessarily need to be true as long as they're believed."
The company also expressed a willingness to engineer scandals and honeypot schemes to discredit targets.
The newest development in this scandal comes from Sandy Parakilas, a former Facebook platform manager who left the company in 2012 after his supervisors continually dismissed his warnings that the company's lax guidelines for data collection, and its lack of enforcement, were posing a major security risk to Facebook's users.
Parakilas says he was even told not to investigate third parties' suspicious or potentially illegal use of data: one executive asked him, "Do you really want to see what you'll find?"
One of Parakilas' major concerns was over the exact policy that allowed Cambridge Analytica to access millions of people's data—the 'Friends Permission.'
According to Parakilas, "It was well understood in the company that that presented a risk. Facebook was giving data of people who had not authorized the app themselves, and was relying on terms of service and settings that people didn't read or understand."
Though Facebook had the power to audit third parties to make sure they were using Facebook user data correctly, Parakilas says it almost never happened.
"My concerns were that all of the data that left Facebook servers to developers could not be monitored by Facebook, so we had no idea what developers were doing with the data," he says.
Despite repeated warnings that this unmonitored data harvesting was out of control, it seems Facebook shut down the Friends Permission option not out of security concerns, but out of competitive ones: "They were worried that the large app developers were building their own social graphs, meaning they could see all the connections between these people," Parakilas says. "They were worried that they were going to build their own social networks."
Now Parakilas' warnings have come true.
Cambridge Analytica, a third-party developer, apparently culled massive amounts of data from millions of Facebook users through its app and abused that data to run its business, which specializes in pseudo-legal services designed to influence public opinion. Your move, Facebook.