TNW had a conversation with Cambridge University scholar Aleksandr Kogan, one of the architects of Cambridge Analytica’s Facebook targeting model, to learn how exactly the statistical model processed Facebook data for use in targeting and influencing voters:

In 2013, Cambridge University researchers Michal Kosinski, David Stillwell and Thore Graepel published an article on the predictive power of Facebook data, using information gathered through an online personality test. Their initial analysis was nearly identical to that used on the Netflix Prize, using SVD to categorize both users and things they “liked” into the top 100 factors.

The paper showed that a factor model made with users’ Facebook “likes” alone was 95 percent accurate at distinguishing between black and white respondents, 93 percent accurate at distinguishing men from women, and 88 percent accurate at distinguishing people who identified as gay men from men who identified as straight. It could even correctly distinguish Republicans from Democrats 85 percent of the time.

It was also useful, though not as accurate, for predicting users’ scores on the “Big Five” personality test.
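
To make the factor-model approach concrete, here is a minimal sketch of the technique in Python. This is not the researchers' actual pipeline; the matrix, the labels, and the model choices below are illustrative stand-ins.

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression

# Toy stand-in for the real data: a binary matrix where rows are users,
# columns are pages, and a 1 means the user "liked" that page.
rng = np.random.default_rng(0)
likes = (rng.random((1000, 5000)) < 0.01).astype(float)

# Compress users and likes into a shared set of latent factors, as in the
# Kosinski/Stillwell/Graepel paper (they kept the top 100).
svd = TruncatedSVD(n_components=100, random_state=0)
user_factors = svd.fit_transform(likes)   # shape: (n_users, 100)

# Given labels for the users (random here; in the study, quiz answers and
# profile fields), a simple classifier on the factor scores predicts the
# same attribute for anyone with enough likes.
gender = rng.integers(0, 2, size=1000)
model = LogisticRegression(max_iter=1000).fit(user_factors, gender)
print(model.predict(user_factors[:5]))
```

The striking result in the paper is just how much signal those hundred factors carry.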

This is exactly why allowing nefarious companies like Cambridge Analytica to use personal data provided to Facebook is extremely dangerous. It becomes rather easy to profile and subsequently target people with scary accuracy. The average Facebook user never stops to consider the potential unintended consequences of handing data to the platform; they merely look at the immediate benefit rather than the long-term effect.

Knowing how the model is built helps explain Cambridge Analytica’s apparently contradictory statements about the role – or lack thereof – that personality profiling and psychographics played in its modeling. They’re all technically consistent with what Kogan describes.

A model like Kogan’s would give estimates for every variable available on any group of users. That means it would automatically estimate the Big Five personality scores for every voter. But these personality scores are the output of the model, not the input. All the model knows is that certain Facebook likes, and certain users, tend to be grouped together.

With this model, Cambridge Analytica could say that it was identifying people with low openness to experience and high neuroticism. But the same model, with the exact same predictions for every user, could just as accurately claim to be identifying less educated older Republican men.
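
A self-contained toy illustration of that point: the latent factors below are fixed, and only the training labels change, so the very same representation can be sold as psychographics or as plain demographics.

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression, Ridge

rng = np.random.default_rng(1)
likes = (rng.random((1000, 5000)) < 0.01).astype(float)  # toy user x like matrix
factors = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)

# Same inputs, same latent representation; only the training labels differ.
neuroticism = rng.random(1000)               # quiz-derived trait scores (toy)
republican = rng.integers(0, 2, size=1000)   # voter-file party flag (toy)

trait_model = Ridge().fit(factors, neuroticism)
party_model = LogisticRegression(max_iter=1000).fit(factors, republican)
# Whether the output is described as "high neuroticism" or "older Republican
# men" is a property of the labels attached, not of the data the model sees.
```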

In other words, Cambridge Analytica used statistics to frame its responses so the data use appeared less evil than it actually was.

The New York Times has a detailed report on how a single defense-contracting idea turned into the huge Cambridge Analytica data-stealing scandal the entire globe is aware of today:

As a start-up called Cambridge Analytica sought to harvest the Facebook data of tens of millions of Americans in summer 2014, the company received help from at least one employee at Palantir Technologies, a top Silicon Valley contractor to American spy agencies and the Pentagon.

It was a Palantir employee in London, working closely with the data scientists building Cambridge’s psychological profiling technology, who suggested the scientists create their own app — a mobile-phone-based personality quiz — to gain access to Facebook users’ friend networks, according to documents obtained by The New York Times.

Cambridge ultimately took a similar approach. By early summer, the company found a university researcher to harvest data using a personality questionnaire and Facebook app. The researcher scraped private data from over 50 million Facebook users — and Cambridge Analytica went into business selling so-called psychometric profiles of American voters, setting itself on a collision course with regulators and lawmakers in the United States and Britain.

It should come as no surprise that Palantir is somehow peripherally involved in this operation. The company was founded by a number of extremely intelligent and influential Silicon Valley figures, funded by the CIA’s In-Q-Tel venture capital arm, and primarily focuses on business with the United States Intelligence Community and other global intelligence agencies like the UK’s Government Communications Headquarters (GCHQ).

Palantir’s expertise is data science, sometimes paired with shady tactics, and the Cambridge Analytica operation was definitely both. Leveraging the access provided by Facebook was a smart technique for collecting data, analyzing it, and then generating psychological profiles of various political leanings. This ultimately resulted in what we know today: targeted advertising and propaganda with the intent of poisoning one candidate, in the hopes of increasing the viability of another. And it worked. Very well.

The revelations pulled Palantir — co-founded by the wealthy libertarian Peter Thiel — into the furor surrounding Cambridge, which improperly obtained Facebook data to build analytical tools it deployed on behalf of Donald J. Trump and other Republican candidates in 2016. Mr. Thiel, a supporter of President Trump, serves on the board at Facebook.

“There were senior Palantir employees that were also working on the Facebook data,” said Christopher Wylie, a data expert and Cambridge Analytica co-founder, in testimony before British lawmakers on Tuesday.

Peter Thiel being on the board of both companies, and a co-founder of Palantir, seems extremely shady and problematic. The optics suggest he may have had some inside knowledge of Facebook’s data platform deficiencies, and then potentially shared that information with Cambridge Analytica through Palantir. This would have been one method for CA to learn about techniques for exploiting Facebook user data. But I have not seen any strong evidence to back up this claim, only circumstantial discussion.

This story is not even close to being fully exposed. I suspect there is a lot more we will learn in the coming weeks as new details are revealed.

CNET reports on Facebook finally centralizing all of its privacy tools, making them easier for the average, non-tech-savvy user to locate and use:

In an emailed statement, Facebook Chief Privacy Officer Erin Egan and Deputy General Counsel Ashlie Beringer spelled out the steps the company is taking, saying these will “put people in more control over their privacy.”

“We’ve heard loud and clear that privacy settings and other important tools are too hard to find, and that we must do more to keep people informed,” they said.

Here’s the basic rundown:

  • Privacy settings in one place: Simplifies settings from being “spread across nearly 20 different screens.”
  • Privacy Shortcuts menu: Brings together two-factor authentication, ad controls, tools to manage who sees your posts and controls for reviewing what you’ve shared.
  • Access Your Information tool: Lets you access, manage and delete information from your profile or timeline (including posts, reactions, comments and search history).
  • Secure download of all Facebook data: Including photos, contacts and posts (and the ability to move it to another service).

It’s a big change. While many of us treat our Facebook posts as social ephemera that slip away into the ether, Facebook has long stored all of this personal data to serve brands and its own ad-targeting tools.

Facebook started out very private, then slowly migrated towards hiding its privacy controls in favor of increased public posting and removing anonymity. Now in the wake of the Cambridge Analytica story, Facebook is moving back to where it should have stayed all along: easy-to-use privacy controls.

Facebook never should have made those ill-advised changes to begin with, so it is good to see them making this move.

Mozilla has just launched a new extension for its Firefox web browser to basically de-creepify Facebook usage by containing the site’s activity within a sandbox of sorts, making it much harder for your browsing elsewhere on the web to be tied back to your Facebook identity:

The pages you visit on the web can say a lot about you. They can infer where you live, the hobbies you have, and your political persuasion. There’s enormous value in tying this data to your social profile, and Facebook has a network of trackers on various websites. This code tracks you invisibly and it is often impossible to determine when this data is being shared.

Facebook Container isolates your Facebook identity from the rest of your web activity. When you install it, you will continue to be able to use Facebook normally. Facebook can continue to deliver their service to you and send you advertising. The difference is that it will be much harder for Facebook to use your activity collected off Facebook to send you ads and other targeted messages.

This Add-On offers a solution that doesn’t tell users to simply stop using a service that they get value from. Instead, it gives users tools that help them protect themselves from the unexpected side effects of their usage. The type of data in the recent Cambridge Analytica incident would not have been prevented by Facebook Container. But troves of data are being collected on your behavior on the internet, and so giving users a choice to limit what they share in a way that is under their control is important.

In light of the recent Facebook scandal it is time to rethink the type of data we share with unscrupulous companies like Facebook. While there is value in the services Facebook provides, it all comes at a cost to your privacy. You are not a Facebook customer, but one of its products. All the web surfing you do, and the associated data Facebook can collect around that activity, is valuable and monetizable. This is how the company provides a free service to you, Mr. and Mrs. Product.

Mozilla creating this add-on is just a band-aid, but a valuable one nonetheless. If you use Firefox, I strongly suggest you install this add-on. If you use Chrome, I strongly suggest you switch to Firefox. The latest iterations are lightning fast, just like Chrome, but far less privacy-invasive than the Google-developed browser. It is well worth making the switch.

Personally, I use Safari on macOS and have done everything I can to limit my exposure and decrease any unnecessary risk. Safari makes it easy but there are improvements that could be made. Hopefully a similar extension will be developed for Safari at some point.

In the interim, switch to Firefox and start using this add-on. It is an outstanding step toward freedom and independence from companies that couldn’t care less about your privacy; they merely want to turn a strong profit more than anything else.

TechCrunch reports the Cambridge Analytica story may have just taken a turn for the worse, with Chris Wylie, the whistle-blower responsible for these powerful allegations, stating the 50M figure was merely a safe number to share with the media:

Giving evidence today, to a UK parliamentary select committee that’s investigating the use of disinformation in political campaigning, Wylie said: “The 50 million number is what the media has felt safest to report — because of the documentation that they can rely on — but my recollection is that it was substantially higher than that. So my own view is it was much more than 50M.”

Somehow I am unsurprised the number will ultimately turn out to be much larger than Facebook is willing to admit. The company is in damage control, especially after having lost $60B in value since the shocking revelations were unveiled almost ten days ago.

Facebook has previously confirmed 270,000 people downloaded Kogan’s app — a data harvesting route which, thanks to the lax structure of Facebook’s APIs at the time, enabled the foreign political consultancy firm to acquire information on more than 50 million Facebook users, according to the Observer, the vast majority of whom would have had no idea their data had been passed to CA because they were never personally asked to consent to it.

Instead, their friends were ‘consenting’ on their behalf — likely also without realizing.
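
For context on the mechanism, the sketch below shows roughly what the long-retired v1.0-era Graph API made possible: one consenting quiz-taker exposed data about every friend. This is a reconstruction for illustration only; the endpoint, fields, and permission behavior are from memory of the deprecated API and should be treated as approximate, and the token is a placeholder.

```python
import requests

# Token obtained from ONE consenting quiz-taker (placeholder value).
ACCESS_TOKEN = "EAAB...placeholder"
GRAPH = "https://graph.facebook.com/v1.0"  # retired API version, illustrative only

# Under the old friends_likes-style permissions, that single consent exposed
# data about every friend, none of whom were ever asked.
resp = requests.get(
    f"{GRAPH}/me/friends",
    params={"fields": "id,name,likes", "access_token": ACCESS_TOKEN},
)
for friend in resp.json().get("data", []):
    friend_likes = friend.get("likes", {}).get("data", [])
    print(friend.get("name"), len(friend_likes))
```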

In my own anecdotal testing, I have found that while most people are conscious that Facebook is not necessarily to be trusted, they never realized these applications operated the way they do. That is to say, nobody I have spoken with understood that their friends’ or friends-of-friends’ data would be shared with third-party applications they interacted with on Facebook. That these applications were knowingly surveilling Facebook accounts is complete news to most of the people I talked to.

This whole story keeps getting worse as the days pass. I wonder how long it will take, and what else will be revealed, before it hits rock bottom.

Lifehacker has, in light of recent revelations, an awesome HOWTO explaining how to delete your phone contact data from both Facebook proper and the Facebook Messenger application:

Facebook’s terrible, horrible, no good, very bad week continues. Though the social network’s “contact import” feature has been around for a very, very long time, you’ve probably forgotten about it. And if you want to keep Facebook from filling in the gaps by collecting data about your friends from you—or worse, records of your call data—it’s easy to shut your devices up.

To get a look at the contacts you’ve already uploaded to Facebook, you’ll want to visit Facebook’s Manage Invites and Imported Contacts page. You might be slightly shocked to see open invites from many, many years ago still active—just a little head-nod to the viral aspects of the social network back when it was still getting off the ground (and more of a collegiate gathering ground than anything else). Feel free to delete these; who needs an invite to Facebook nowadays anyway?

This will remove unwanted data from Facebook and will likely lead to fewer of those creepy “do you know Johnny” type friend suggestions. The entire article discusses a few different places on Facebook to visit to ensure specific personal data is deleted. If you have not yet done so, I strongly suggest visiting the article and following the simple outlined steps.

Ars Technica reports on something not all that surprising considering the Facebook news stories lately. This time, it appears that for years Facebook has been surreptitiously scraping call and text message data from Android phones:

If you granted permission to read contacts during Facebook’s installation on Android a few versions ago—specifically before Android 4.1 (Jelly Bean)—that permission also granted Facebook access to call and message logs by default. The permission structure was changed in the Android API in version 16. But Android applications could bypass this change if they were written to earlier versions of the API, so Facebook could continue to gain access to call and SMS data by specifying an earlier Android SDK version. Google deprecated version 4.0 of the Android API in October 2017—the point at which the latest call metadata in Facebook users’ data was found. Apple iOS has never allowed silent access to call data.

Facebook provides a way for users to purge collected contact data from their accounts, but it’s not clear if this deletes just contacts or if it also purges call and SMS metadata. After purging my contact data, my contacts and calls were still in the archive I downloaded the next day—likely because the archive was not regenerated for my new request.

As always, if you’re really concerned about privacy, you should not share address book and call-log data with any mobile application. And you may want to examine the rest of what can be found in the downloadable Facebook archive, as it includes all the advertisers that Facebook has shared your contact information with, among other things.

Utterly shameful yet entirely unsurprising for one of the most unscrupulous companies on the internet.
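
To make the bypass described above concrete, here is a hypothetical AndroidManifest.xml fragment for an app written against the pre-Jelly Bean API. Declaring a target SDK below 16 kept the app on the old permission model, under which the contacts permission implicitly carried call and SMS log access; the version numbers are illustrative.

```xml
<!-- Hypothetical manifest fragment, for illustration only. -->
<!-- Targeting an SDK below 16 kept the app on the pre-Jelly Bean permission -->
<!-- model, under which READ_CONTACTS also implied call and SMS log access. -->
<uses-sdk android:minSdkVersion="8" android:targetSdkVersion="15" />
<uses-permission android:name="android.permission.READ_CONTACTS" />
```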

Everyone should know the following truism by now: if you are receiving a web-based service for free, you are not the customer but the product. Your data is being monetized, and likely collected in ways you are unaware of, therefore you should be very careful with what data you provide to the platform.

WIRED sat down with Facebook CEO Mark Zuckerberg for a Q&A about the recent Cambridge Analytica scandal and other problems related to both the company and the huge amount of personal data it collects on people:

Nicholas Thompson: You learned about the Cambridge Analytica breach in late 2015, and you got them to sign a legal document saying the Facebook data they had misappropriated had been deleted. But in the two years since, there were all kinds of stories in the press that could have made one doubt and mistrust them. Why didn’t you dig deeper to see if they had misused Facebook data?

Mark Zuckerberg: So in 2015, when we heard from journalists at The Guardian that Aleksandr Kogan seemed to have shared data with Cambridge Analytica and a few other parties, the immediate actions that we took were to ban Kogan’s app and to demand a legal certification from Kogan and all the other folks who he shared it with. We got those certifications, and Cambridge Analytica had actually told us that they actually hadn’t received raw Facebook data at all. It was some kind of derivative data, but they had deleted it and weren’t [making] any use of it.

In retrospect, though, I think that what you’re pointing out here is one of the biggest mistakes that we made. And that’s why the first action that we now need to go take is to not just rely on certifications that we’ve gotten from developers, but [we] actually need to go and do a full investigation of every single app that was operating before we had the more restrictive platform policies—that had access to a lot of data—and for any app that has any suspicious activity, we’re going to go in and do a full forensic audit. And any developer who won’t sign up for that we’re going to kick off the platform. So, yes, I think the short answer to this is that’s the step that I think we should have done for Cambridge Analytica, and we’re now going to go do it for every developer who is on the platform who had access to a large amount of data before we locked things down in 2014.

Based on my experience running web sites, I suspect Zuckerberg and Facebook had no idea data was being siphoned. They likely implemented some rate control mechanisms, but had – have – zero situational awareness of how that data is being downloaded and by what companies. They merely provide access and that is where things end.

Even if there were some rate controls put into place, just like with traditional network breaches, if the actor’s data exfiltration technique was to slowly trickle it out, that will be difficult to detect unless the analysts are really paying close attention. I am not saying this is what happened with Cambridge Analytica, but it is a plausible scenario for some form of Facebook corporate deniability.
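
As a toy illustration of that trickle-out point, consider the kind of crude daily request cap I am speculating about here: a client that simply stays under the threshold never raises an alert, no matter how long it keeps pulling data.

```python
from collections import defaultdict

DAILY_CAP = 10_000          # hypothetical per-app request budget
calls = defaultdict(int)    # (app_id, day) -> request count

def allow(app_id: str, day: int) -> bool:
    """Crude rate control: reject an app only once it exceeds the daily cap."""
    calls[(app_id, day)] += 1
    return calls[(app_id, day)] <= DAILY_CAP

# A client staying just under the cap is indistinguishable from normal
# traffic: 9,999 requests a day, every day, and this control never fires.
ok = all(allow("quiz_app", day) for day in range(30) for _ in range(9_999))
print(ok)  # True: roughly 300,000 records trickled out with zero alerts
```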

If that is the case, then it is just terrible platform design. It boils down to too much of a release-fast mentality, without properly thinking through the implications of deploying features and capabilities. Unintended consequences are hard to fully understand in advance, but still, Facebook has an extremely talented workforce, and I find it hard to believe that, had they slowed down and thoroughly considered their approach, they could not have envisioned this type of scenario.

No matter how the data left Facebook, the company is complicit. It is their platform and they need to be more cognizant about how third-party access is being used, and to eradicate actors using it maliciously.

The Guardian has an exceptional in-depth article every Facebook user – basically any human with a smart phone or computer – should read. It details Cambridge Analytica, the firm employed to psychologically profile people for political purposes by leveraging their Facebook data through tools such as surveys and other third-party applications:

Starting in 2007, Stillwell, while a student, had devised various apps for Facebook, one of which, a personality quiz called myPersonality, had gone viral. Users were scored on “big five” personality traits – Openness, Conscientiousness, Extroversion, Agreeableness and Neuroticism – and in exchange, 40% of them consented to give him access to their Facebook profiles. Suddenly, there was a way of measuring personality traits across the population and correlating scores against Facebook “likes” across millions of people.

The research was original, groundbreaking and had obvious possibilities. “They had a lot of approaches from the security services,” a member of the centre told me. “There was one called You Are What You Like and it was demonstrated to the intelligence services. And it showed these odd patterns; that, for example, people who liked ‘I hate Israel’ on Facebook also tended to like Nike shoes and KitKats.

“There are agencies that fund research on behalf of the intelligence services. And they were all over this research. That one was nicknamed Operation KitKat.”

The defence and military establishment were the first to see the potential of the research. Boeing, a major US defence contractor, funded Kosinski’s PhD and Darpa, the US government’s secretive Defense Advanced Research Projects Agency, is cited in at least two academic papers supporting Kosinski’s work.

This is why I never play Facebook games or use any of the third-party applications on the platform. There is just no granular control of the data you have knowingly provided to Facebook, and these applications can access just about all of it. Most people merely click-through the permissions page when installing a new game or survey, and remain completely oblivious to what they are giving away for free.

Over the past year I have begun to feel as if Facebook is almost like a cancer. It continues to grow and grow, and is at a point where it cannot stop metastasizing unless a drastic change occurs. What is that change?

People simply stop using Facebook. Cold turkey.

At this juncture, Facebook is too big for its own good. I really distrust Google, but have far less trust for Facebook. They collect a lot of data on people, and there are no controls in place to ensure they are safeguarding it appropriately. This article demonstrates their complete and utter disregard for personal information.

Do yourself a favor and take a step back from Facebook, stop using it for a week, and compare how you felt prior to and after this little experiment. I can almost guarantee you will feel better, even without that dopamine hit Facebook provides.

CSO Online reports on a unique Facebook-based delivery method for Locky ransomware:

The attack leverages a downloader called Nemucod, which is delivered via Facebook Messenger as a .svg file.

The usage of SVG (Scalable Vector Graphics) files is important. SVG is XML-based, meaning a criminal can embed any type of content they want – such as JavaScript. In this case, JavaScript is exactly what the attackers embedded.

If accessed, the malicious image will direct the victim to a website that appears to be YouTube in design only, as it’s hosted on a completely different URL.

Once the page is loaded, the victim is asked to install a codec in order to play the video that’s shown on the page.

If the codec (presented as a Chrome extension) is installed, the attack is then spread further via Facebook Messenger. Sometimes the malicious Chrome extension installs the Nemucod downloader, which ultimately delivers Locky.
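
For readers unfamiliar with the format, here is a benign, minimal illustration of why the SVG detail matters: because SVG is XML, a script element rides along inside what looks like an image and runs when the file is opened directly in a browser. The URL is a placeholder.

```xml
<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
  <rect width="100" height="100" fill="steelblue"/>
  <!-- Runs when the .svg is opened as a document (not when embedded via <img>). -->
  <!-- An attacker would redirect to a fake video page pushing a "codec" install. -->
  <script type="text/javascript"><![CDATA[
    window.location = "https://example.com/fake-youtube";
  ]]></script>
</svg>
```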

There are a lot of moving parts to delivering Locky in this manner. In addition, anecdotally anyhow, I believe most people use Facebook Messenger on their mobile devices rather than via the web, so I wonder about the effectiveness of this attack. Unfortunately, there are a lot of folks who do not pay close enough attention and will allow the codec to install with nary a second thought, thus allowing this exploit to succeed.

This is a must read on why Signal is the clear secure messaging choice over WhatsApp:

In short, if a government demands that Open Whisper Systems hand over the content or metadata of a Signal message or a user’s contact list, it has nothing to hand over. And that government will have just as little luck requesting backups of Signal messages from Google or Apple.

From a user privacy perspective, Signal is the clear winner, but it’s not without its downsides.

Compared to WhatsApp’s 1 billion users, Signal’s user base is minuscule. Marlinspike said that they don’t publish statistics about how many users they have, but Android’s Google Play store reports that Signal has been downloaded between 1 and 5 million times. The iPhone App Store does not publish this data.

Too bad almost nobody uses Signal. Convincing people to switch their text-messaging app is almost like trying to get people to convert religions.

Well this sure is interesting: Facebook is offering a service capable of hunting down Hacking Team malware solely on Apple Mac OS X:

Facebook announced today it was pushing out some “query packs” on its code page that would enable IT folk to quickly look for signs of Hacking Team infection. These query packs form part of Facebook’s “osquery”, a free and open source framework that can be used to gather network data and quickly ask questions to uncover potential security threats. It’s part of the social network’s own security defences and was updated recently to protect against some critical Apple Mac and iPhone vulnerabilities.

Whilst query packs can be created to bunch specific, commonly-used sets of questions for datasets, Facebook has released a handful of its own, including ones related specifically to Apple Mac OS X machines. “The OS X-attacks pack has queries which identify known variants of malware, ranging from advanced persistent threats (APT) to adware and spyware. If a query in this pack produces results, a host in your Mac fleet is compromised with malware. This pack is high signal and should result in near-zero false positives,” said Javier Marcos, security engineer at Facebook, in a blog post, before noting that the query pack includes commands that seek out signs of Hacking Team infiltration.

Sounds quite useful!
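
For the curious, osquery exposes system state as SQL tables, so a detection query pack is literally a bundle of SQL statements run against each host. Below is a hypothetical query in that style; the launchd table is real, but the indicator string is a placeholder rather than an actual Hacking Team artifact name.

```sql
-- Hypothetical example in the osquery pack style; 'ht_implant' is a
-- placeholder indicator, not an actual Hacking Team artifact name.
SELECT name, path
FROM launchd
WHERE name LIKE '%ht_implant%';
```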

Ars Technica on the NSA or FBI intercepting WhatsApp messages, ultimately leading to two people being arrested for plotting a terror attack in Belgium:

In an article in German magazine C’T, editor Fabian A. Scherschel dove into the encryption scheme in WhatsApp and contended that it did not vary the key used to encrypt information in transit—instead, it used a key derived from the user’s password and encryption code based on the RC4 algorithm for both inbound and outbound communication. The insinuation was that intercepted and collected messages could theoretically be broken much more easily since the key seeds could be more easily found because it reduced the number of possible keys. But in a response to the article posted to Reddit, Moxie Marlinspike said, “This article should be retitled ‘Breaking News: WhatsApp E2E Deployment Process Exactly As Advertised.’  We announced a partnership, not a finished deployment. In the blog post announcing that partnership, we publicly outlined the WhatsApp E2E deployment process, and it describes exactly what has been ‘discovered’ here. As I said in the blog post, deploying across this many users (hundreds of millions) and this many platforms (seven, of which they checked two) takes time, and is being done incrementally. I also point out that we will be surfacing information in the UI once that is complete.”

TechCrunch on Facebook supporting PGP for sending encrypted notification emails and also allowing users to post their public keys on their profile:

Facebook uses the well-established PGP scheme (the GNU Privacy Guard implementation of OpenPGP, to be precise) to encrypt messages and tools like Mailvelope for Gmail users now make it a bit more straightforward to generate and manage keys in order to read and write encrypted emails. It’s still by no means a completely trivial procedure, and you still need to have a basic understanding of what you are doing.

Facebook acknowledges as much and points potential users to the Electronic Frontier Foundation’s introduction to PGP. Sadly, Facebook made no attempt at hiding the complexity of using PGP, so it’s unlikely that many regular users will actually sign up for it.

The company says it’s rolling out this new feature slowly, but it is now available globally. If you want to see if it’s available for your account, head to your Facebook settings, look for the contact info section and you should see the option to add a PGP public key.
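
For those comfortable with a bit of tooling, here is a sketch of generating a key pair and exporting the ASCII-armored public key you would paste into your profile, using the python-gnupg wrapper. It assumes GnuPG is installed locally, and the identity details and passphrase are placeholders.

```python
import gnupg  # python-gnupg wrapper; requires a local GnuPG installation

gpg = gnupg.GPG()  # uses the default GnuPG home directory

# Generate a key pair; identity and passphrase are placeholders.
key_input = gpg.gen_key_input(
    name_real="Jane Doe",
    name_email="jane@example.com",
    key_length=2048,
    passphrase="correct horse battery staple",
)
key = gpg.gen_key(key_input)

# Export the ASCII-armored public key; this is the block you would paste
# into the PGP public key field under Facebook's contact info settings.
print(gpg.export_keys(key.fingerprint))
```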

It is great to see a large web-based company like Facebook support encryption, but unfortunate they did not dumb this down enough for the lowest common denominator. It would have been nice to see Facebook offer a tutorial of some sort, and help instruct the average user on how to use PGP to secure their communications.

At the very least, this is a nice start.