
Facebook was always too busy selling ads to care about your personal data

 

(in)Secure is a weekly column that dives into the rapidly escalating topic of cybersecurity.


Last year, Facebook collected over nine billion dollars in ad revenue in a single quarter. That’s a lot of ads. As a trade-off for using a free service, people on Facebook put up with the proliferation of these ads in their newsfeeds. But what if the trade-off involved more than that? What if it involved your personal data being sold off without your consent?


Facebook’s latest scandal involves a data analysis firm called Cambridge Analytica, which was supplied with the personal data of 50 million Facebook profiles without those people’s consent, data that then found its way into the election campaign of a certain presidential candidate. On its own, the scandal is more than a little troubling, and it provides a startling look at how little the world’s biggest social media platform is concerned about personal data.

Let’s be clear. This doesn’t involve an actual data breach. It’s merely a policy no one at Facebook cared about.

Under the guise of academic research

Using personal data for the sake of academic research has been a weak point in Facebook’s privacy policy for years now — and it’s the first vulnerability the collaborators involved with the Cambridge Analytica scandal exploited.

Despite the name, Cambridge Analytica has no official connection to academia. It’s a research organization founded with the specific purpose of influencing the electoral process, and was run by former Trump aide Steve Bannon and hedge fund billionaire Robert Mercer.

Cambridge Analytica Facebook breach
Bryan Bedder/Getty Images

The facade of academic research was used as an entry point for an important figure in the crew: Aleksandr Kogan, a researcher who worked for Cambridge University and, briefly, St. Petersburg State University. According to a report by the New York Times, when doing work for Cambridge Analytica, Kogan told Facebook that he was collecting data for academic purposes rather than political ones.

The description for the app said, word for word, “This app is part of a research program in the Department of Psychology at the University of Cambridge.” Apparently, Facebook did nothing to verify that claim. To make things worse, Kogan said he later changed his stated reason for using the data, and Facebook never bothered to inquire further.


Facebook has been giving the data of its users to academic researchers for years now — and not in secret. Facebook freely provided personal data from its users to Harvard University for an academic study back in 2007. Others since then include a partnership with Cornell University on influencing the mood of Facebook users, and yet another in 2017 which studied how AI could guess a person’s sexual orientation from only a photograph.

These studies were all met with public outrage, but Facebook emphasized that they weren’t the result of data breaches or significant holes in the company’s research protocols. It saw them as only “minor oversights.”

There’s little reason to believe a platform that views massive misuse of data without consent as “minor oversights” cares about your privacy. And that’s not where it ends.

Under the guise of a personality quiz

The other weak point in Facebook’s data policies lies in something we all know too well: personality quizzes. They’re prominent on Facebook, and Kogan used this vulnerable pinch point to collect the data that Cambridge Analytica purchased from him.

Through Global Science Research (GSR), a separate company he created, Kogan developed a Facebook app called thisisyourdigitallife and paid a group of 270,000 people to download it and take the quiz. That might not sound like much, but the app was also allowed to collect data from each of those people’s friends. The result was data on 50 million profiles, now in the hands of Cambridge Analytica. That’s a lot of data.

Christopher Wylie, one of the founders of Cambridge Analytica, blew the whistle on how the data firm harvested data from millions of Facebook users. Photo: Jake Naughton for The Washington Post via Getty Images

Never did Facebook inform its users that their data was being used without their consent. That alone may run afoul of British data protection law.

According to The Guardian, Facebook learned this trick was used to mine massive amounts of data in 2015, which was then used by the Ted Cruz presidential campaign. Facebook’s response was to send Cambridge Analytica an official letter, obtained by the Times, stating the following: “Because this data was obtained and used without permission, and because GSR was not authorized to share or sell it to you, it cannot be used legitimately in the future and must be deleted immediately.”


Over two years passed before Facebook would even follow up on its request. “If this data still exists, it would be a grave violation of Facebook’s policies and an unacceptable violation of trust and the commitments these groups made,” a blog post from Facebook stated. The company did eventually get around to it, but the episode shows that Facebook’s problem isn’t a lack of policies. It’s that they aren’t enforced.

Cambridge Analytica wasn’t the only organization bending Facebook’s privacy policies. A former Facebook employee told The Guardian: “My concerns were that all of the data that left Facebook servers to developers could not be monitored by Facebook, so we had no idea what developers were doing with the data.”

That’s from Sandy Parakilas, who was the platform operations manager in 2011 and 2012. “Once the data left Facebook servers there was not any control, and there was no insight into what was going on.”

Who could be bothered to care?

As reported by the Times, research director Jonathan Albright at Columbia University summarized the problem well: “Unethical people will always do bad things when we make it easy for them and there are few — if any — lasting repercussions.”

https://www.facebook.com/zuck/posts/10104712037900071

Facebook will make sure it takes care of this specific problem, sure. After remaining silent for several days after the story broke, Facebook CEO Mark Zuckerberg did finally make an official statement, in which he took some responsibility for what happened: “We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you.”

He also vowed to take other steps, such as auditing suspicious apps and limiting the amount of data developers can access from applications. These policies will all help prevent a very similar scenario from unfolding again, but cybersecurity is all about prevention. It requires a proactive approach to closing holes in the system before they’re exploited.

Mark Zuckerberg: “I’m really sorry that this happened”

For a company that lives and dies on the trust people place in it with their personal information, you’d think it would take these issues a little more seriously across the breadth of its platform. If it doesn’t make massive changes to the way it handles privacy and security at every level, #deleteFacebook could grow into far more than just a hashtag.

Luke Larsen
Senior Editor, Computing
Luke Larsen is the Senior Editor of Computing, managing all content covering laptops, monitors, PC hardware, Macs, and more.