
All You Need to Know About the Cambridge Analytica Privacy Scandal

April 9, 2018 by Pierluigi Paganini

Introduction

The commercial data analytics company Cambridge Analytica is at the center of one of the biggest privacy scandals of recent years: the firm used data harvested from Facebook to target US voters in the 2016 presidential election.

The data was collected by a group of academics who then shared it with Cambridge Analytica; Facebook later confirmed the news.


The researchers used an app developed by University of Cambridge psychology lecturer Dr. Aleksandr Kogan to collect user data.

The app, named "thisisyourdigitallife," had been available to users since 2014. It was provided by Global Science Research (GSR) and asked users to take an online survey for $1 or $2. The app also requested access to the users' profile information; with this trick, the researchers obtained information belonging to over 270,000 users who gave the app permission to use their personal details for academic research.

Facebook confirmed that it has "suspended" all business with Cambridge Analytica (CA) and its holding company.

"Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent," says the official statement released by Facebook.

"Like all app developers, Kogan requested and gained access to information from people after they chose to download his app. His app, 'thisisyourdigitallife,' offered a personality prediction, and billed itself on Facebook as 'a research app used by psychologists,' Approximately 270,000 people downloaded the app. In so doing, they gave their consent for Kogan to access information such as the city they set on their profile, or content they had liked, as well as more limited information about friends who had their privacy settings set to allow it."

The app was a powerful tool for profiling users because it harvested information about their networks of contacts; its code allowed the collection of data from over 50 million users.
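To make that mechanism concrete, the sketch below approximates, in Python, how an app holding a single consenting user's access token could have pulled both that user's profile fields and the friends exposed under the pre-2015 Graph API permission model. The endpoint paths, field names and token handling are illustrative assumptions about the v1.0-era API, not the actual "thisisyourdigitallife" code.

```python
# Illustrative sketch only: approximates how a pre-2015 Graph API app could
# expand one consenting user's access token into data about that user and
# the friends visible under the old permission model. Endpoint paths and
# field names are assumptions, not the actual "thisisyourdigitallife" code.
import requests

GRAPH = "https://graph.facebook.com"   # Graph API base URL (version omitted)
ACCESS_TOKEN = "USER_ACCESS_TOKEN"     # token granted by one survey taker


def get(path, **params):
    """Perform a single authenticated Graph API GET request."""
    params["access_token"] = ACCESS_TOKEN
    resp = requests.get(f"{GRAPH}/{path}", params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()


# 1. Profile fields the survey taker explicitly consented to share.
profile = get("me", fields="id,name,location,likes")

# 2. Friends list: under the v1.0-era model, friends who had not restricted
#    platform sharing could also be reached through this single token.
friends = get("me/friends").get("data", [])

print(f"One consent, {1 + len(friends)} profiles reachable")
```

This one-token-to-many-profiles pattern is what let roughly 270,000 survey takers translate into data on tens of millions of accounts.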

Cambridge Analytica initially attempted to downplay the problem, declaring it had deleted all data received from GSR once it discovered how the data had been obtained.

"When it subsequently became clear that the data had not been obtained by GSR in line with Facebook's terms of service, Cambridge Analytica deleted all data received from GSR," CA said in a statement.

"No data from GSR was used by Cambridge Analytica as part of the services it provided to the Donald Trump 2016 presidential campaign."

Figure 1. Cambridge Analytica message published after the disclosure of the news

What's new? And who is Christopher Wylie?

According to a report published by The Intercept one year earlier, Kogan operated on behalf of Strategic Communication Laboratories (SCL), the military contractor that owns Cambridge Analytica.

Facebook discovered the activity back in 2015 thanks to reports from its users and took the necessary measures to force the parties involved to delete the data from their servers.

"Although Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time, he did not subsequently abide by our rules. By passing information on to a third party, including SCL/Cambridge Analytica and Christopher Wylie of Eunoia Technologies, he violated our platform policies," continues the Facebook statement. "When we learned of this violation in 2015, we removed his app from Facebook and demanded certifications from Kogan and all parties he had given data to that the information had been destroyed. Cambridge Analytica, Kogan and Wylie all certified to us that they destroyed the data."

Christopher Wylie, one of Kogan's collaborators, confirmed that the data had been used in the US presidential election to profile individuals and influence the final vote. Wylie provided evidence to The New York Times and The Guardian that the harvested data had not been destroyed.

After the story was publicly revealed, Facebook suspended Wylie's account as confirmed by the whistleblower via Twitter on Sunday.

A few days later, Facebook revealed that 87 million users had been affected by the Cambridge Analytica case, far more than the 50 million initially thought.

The social network giant also unveiled more explicit terms of service to ensure transparency to its users about data sharing.

Facebook's chief technology officer Mike Schroepfer provided further details on the case, including new estimations for the number of affected users.

"In total, we believe the Facebook information of up to 87 million people — mostly in the US — may have been improperly shared with Cambridge Analytica," Schroepfer said.

The CTO also provided further details on how Facebook is implementing new privacy tools for its users.

"People will also be able to remove apps that they no longer want. As part of this process we will also tell people if their information may have been improperly shared with Cambridge Analytica," he added.

"Overall, we believe these changes will better protect people's information while still enabling developers to create useful experiences."

If you are interested in the various actors involved in the Cambridge Analytica scandal and their roles, take a look at the following graph:

Figure 2. Cambridge Analytica (credit: The Guardian; source: Spillednews.com)

Mark Zuckerberg: "we made mistakes"

Zuckerberg declared that it would take "a few years" to fix the problems uncovered by the revelations of data misuse. The Facebook co-founder maintains that one of the biggest errors he made is that Facebook was too "idealistic."

"Well, I don't think it's going to take 20 years. I think the basic point that you're getting at is that we're really idealistic. When we started, we thought about how good it would be if people could connect, if everyone had a voice. Frankly, we didn't spend enough time investing in, or thinking through, some of the downside uses of the tools. So for the first 10 years of the company, everyone was just focused on the positive," Zuckerberg toldVox.com

"I think now people are appropriately focused on some of the risks and downsides as well. And I think we were too slow in investing enough in that. It's not like we did nothing. I mean, at the beginning of last year, I think we had 10,000 people working on security. But by the end of this year, we're going to have 20,000 people working on security,"

In response to the Cambridge Analytica case, Facebook deleted dozens of accounts linked to Russia that were used to spread propaganda.

Facebook removed 70 Facebook accounts, 65 Instagram accounts and 138 Facebook pages controlled by the Russia-based Internet Research Agency (IRA), also known as the Russian troll farm due to its misinformation campaigns.

The unit "has repeatedly used complex networks of inauthentic accounts to deceive and manipulate people who use Facebook, including before, during and after the 2016 US presidential elections," explained Facebook chief security officer Alex Stamos.

Zuckerberg added that the Russian agency "has been using complex networks of fake accounts to deceive people."

"While we respect people and governments sharing political views on Facebook, we do not allow them to set up fake accounts to do this. When an organization does this repeatedly, we take down all of their pages, including ones that may not be fake themselves."

Finally, Mark Zuckerberg admitted that his company had failed to protect its users, but he pointed out that it has already adopted the necessary measures to prevent future abuses.

"We made mistakes," he said. "We have a responsibility to protect your data, and if we can't then we don't deserve to serve you," reads a statement published by Zuckerberg on Facebook.

"I've been working to understand exactly what happened and how to make sure this doesn't happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there's more to do, and we need to step up and do it."

Zuckerberg says he was not aware of Cambridge Analytica; Sheryl Sandberg said something different

Zuckerberg stressed that he was not aware of the activities conducted by Cambridge Analytica and that his company promptly cut off business with the firm once it discovered the firm had not deleted the collected data.

"Last week, we learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica may not have deleted the data as they had certified. We immediately banned them from using any of our services."

But many experts believe that Facebook was aware of Cambridge Analytica's activity, and Facebook Chief Operating Officer Sheryl Sandberg offered another point of view on the privacy scandal.

Sandberg gave two interviews in recent weeks, to National Public Radio and NBC's Today Show, during which she pointed out that Facebook was not able to prevent third parties from abusing its platform; she also added that the company should have taken further steps to protect the privacy of its users.

"We know that we did not do enough to protect people's data," Sandberg told NPR. "I'm really sorry for that. Mark is really sorry for that, and what we're doing now is taking really firm action."

"Safety and security is never done, it's an arms race," she said. "You build something, someone tries to abuse it."

"But the bigger is, 'Should we have taken these steps years ago anyway?'" Sandberg
said. "And the answer to that is yes."

"We really believed in social experiences, we really believed in protecting privacy, but we were way too idealistic," she added.

"We did not think enough about the abuse cases and now we're taking really firm steps across the board."

Sandberg confirmed that Facebook first became aware two and a half years ago that Cambridge Analytica had illegally obtained user data, which is not what Zuckerberg initially said.

"When we received word that this researcher gave the data to Cambridge Analytica, they assured us it was deleted," she said. "We did not follow up and confirm, and that's on us — and particularly once they were active in the election, we should have done that."

Sandberg admitted that Facebook should have detected the Russian interference in the 2016 presidential election, but said it was a lesson for the company, which will not allow it to happen again in the future.

"That was something we should have caught, we should have known about," she told NPR. "We didn't. Now we've learned."

"We're going after fake accounts…A lot of it is politically motivated but even more is economically motivated."

The Incident Response

After the Cambridge Analytica case, Facebook announced security improvements to prevent future interference with elections. Mark Zuckerberg added that he would take several measures to prevent threat actors from abusing Facebook users' data.

Facebook announced that it will assess all apps that had access to vast amounts of information before 2014, when the social network giant took its most important steps to prevent bad actors from accessing people's data.

Facebook will restrict developers' data access even further to prevent this kind of situation, and it will show users a tool at the top of their News Feed that lists the apps they have used and lets them revoke those apps' permissions to access their data.

The company will ban any developer that does not agree to a thorough audit.

"We'll require developers to not only get approval but also sign a contract in order to ask anyone for access to their posts or other private data. And we'll have more changes to share in the next few days," continues Zuckerberg while announcing more changes.

Facebook also plans to improve the security of elections in four main areas: combating foreign interference, removing fake accounts, increasing ads transparency, and reducing the spread of false news.

Alex Stamos, Facebook's Chief Security Officer, declared that the company is committed to fighting "fake news." He also explained that the term is used to describe many malicious activities, including:

Fake identities – this is when an actor conceals their identity or takes on the identity of another group or individual;

Fake audiences – this is using tricks to artificially expand the audience or the perception of support for a particular message;

False facts – the assertion of false information; and

False narratives – which are intentionally divisive headlines and language that exploit disagreements and sow conflict. This is the most difficult area for us, as different news outlets and consumers can have completely different views on what an appropriate narrative is even if they agree on the facts.

"When you tease apart the overall digital misinformation problem, you find multiple types of bad content and many bad actors with different motivations," said Alex Stamos.

"Once we have an understanding of the various kinds of 'fake' we need to deal with, we then need to distinguish between motivations for spreading misinformation. Because our ability to combat different actors is based upon preventing their ability to reach these goals," said Stamos.

"Each country we operate in and election we are working to support will have a different range of actors with techniques are customized for that specific audience. We are looking ahead, by studying each upcoming election and working with external experts to understand the actors involved and the specific risks in each country."

According to Stamos, it is crucial to profile the attackers, determine their motivation and adopt the proper countermeasures. The Facebook CSO distinguished profit-motivated organized groups, ideologically motivated groups, state-sponsored actors, people who enjoy causing chaos and disruption, and groups with multiple motivations.

Samidh Chakrabarti, Facebook Product Manager, explained that the social media giant is currently blocking millions of fake accounts each day with the involvement of machine learning systems.

Chakrabarti explained that the number of pages and domains used to share fake news is increasing; in response, Facebook is doubling the number of people working on safety issues from 10,000 to 20,000.

"Over the past year, we've gotten increasingly better at finding and disabling fake accounts. We're now at the point that we block millions of fake accounts each day at the point of creation before they can do any harm," said Chakrabarti.

"Rather than wait for reports from our community, we now proactively look for potentially harmful types of election-related activity, such as Pages of foreign origin that are distributing inauthentic civic content. If we find any, we then send these suspicious accounts to be manually reviewed by our security team to see if they violate our Community Standards or our Terms of Service. And if they do, we can quickly remove them from Facebook. "

Facebook also announced a new transparency feature, dubbed View Ads, for its advertising initiatives. View Ads is currently being tested in Canada; it allows anyone to view all the ads that a Facebook Page is running on the platform.

"You can click on any Facebook Page, and select About, and scroll to View Ads," explained Rob Leathern, Product Management Director.

"Next we'll build on our ads review process and begin authorizing US advertisers placing political ads. This spring, in the run-up to the US midterm elections, advertisers will have to verify and confirm who they are and where they are located in the US."

This summer, Facebook will launch a public archive with all the ads that ran with a political label.

Sandberg announced that starting next week, the News Feed will include a feature that will allow users to see all the apps they have shared their data with.

"A place where you can see all the apps you've shared your data with and a really easy way to delete them."

The company has announced a series of improvements to its platform to implement tighter user privacy controls and limit scraping activities.

One of the most significant changes Facebook made toward improved user privacy was to prevent applications from "seeing" an individual in a user's friends list unless both users have decided to share their friend lists with the app.

"In order for a person to show up in one person's friend list, both people must have decided to share their list of friends with your app and not disabled that permission during login. Also both friends must have been asked for user_friends during the login process," Facebook explains.

Facebook also plans to expand its bug bounty program to discover any vulnerability that could be abused to gather information related to its users.

What about the future?

Although Facebook announced the above improvements, when asked by journalists on the "Today Show" whether other cases of misuse of user data could be expected, Sandberg did not rule it out.

"We're doing an investigation, we're going to do audits and yes, we think it's possible, that's why we're doing the audit," she told NPR.

"That's why this week we shut down a number of use cases in other areas — in groups, in pages, in events — because those are other places where we haven't necessarily found problems, but we think that we should be more protective of people's data."

Next week, on April 11, Facebook founder Mark Zuckerberg will appear before Congress to address privacy issues.

The hearing will "be an important opportunity to shed light on critical consumer data privacy issues and help all Americans better understand what happens to their personal information online," said the committee's Republican chairman Greg Walden and ranking Democrat Frank Pallone in a statement.

"We appreciate Mr. Zuckerberg's willingness to testify before the committee, and we look forward to him answering our questions."

References

https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html

https://securityaffairs.co/wordpress/70443/social-networks/cambridge-analytica-data-harvesting.html

https://securityaffairs.co/wordpress/70542/social-networks/zuckerberg-comments-cambridge-analytica.html

https://securityaffairs.co/wordpress/71147/social-networks/cambridge-analytica-misuse.html

https://newsroom.fb.com/news/2018/03/suspending-cambridge-analytica/

https://securityaffairs.co/wordpress/70956/social-networks/facebook-election-improvements.html

https://www.securityweek.com/facebooks-sandberg-says-other-cases-data-misuse-possible

https://theintercept.com/2017/03/30/facebook-failed-to-protect-30-million-users-from-having-their-data-harvested-by-trump-campaign-affiliate/


https://www.spillednews.com/2018/03/50million-facebook-profiles-hijacked-breach.html

Pierluigi Paganini

Pierluigi is a member of the ENISA (European Union Agency for Network and Information Security) Threat Landscape Stakeholder Group, a member of the Cyber G7 Workgroup of the Italian Ministry of Foreign Affairs and International Cooperation, and Professor and Director of the Master in Cyber Security at Link Campus University. He is also a Security Evangelist, Security Analyst and Freelance Writer.

Editor-in-Chief at "Cyber Defense Magazine," Pierluigi is a cyber security expert with over 20 years of experience in the field; he is a Certified Ethical Hacker (EC-Council, London). A passion for writing and a strong belief that security is founded on sharing and awareness led Pierluigi to found the security blog "Security Affairs," recently named a Top National Security Resource for the US.

Pierluigi is a member of "The Hacker News" team, and he writes for major publications in the field such as Cyber War Zone, ICTTF, Infosec Island, Infosec Institute, The Hacker News Magazine and many other security magazines.