The Korea Herald


[Leonid Bershidsky] Facebook is the real problem, not Cambridge Analytica

By Bloomberg

Published : March 21, 2018 - 17:35


Facebook is being hammered for allowing data firm Cambridge Analytica to acquire 50 million user profiles in the US, which it may or may not have used to help the Trump campaign. But the outrage misses the target: There’s nothing Cambridge Analytica could have done that Facebook itself doesn’t offer political clients.

Here, in a nutshell, is the scandal. In 2014, Aleksandr Kogan, an academic of Russian origin at Cambridge University in the UK, built a Facebook app that paid hundreds of thousands of users to take a psychological test. Apart from their test results, the users also shared the data of their Facebook friends with the app. Kogan sold the resulting database to Cambridge Analytica, which Facebook considers a violation of its policies: The app was not allowed to use the data for commercial purposes. Carole Cadwalladr and Emma Graham-Harrison, writing for the UK’s Observer, quoted former Cambridge Analytica employee Christopher Wylie as saying the firm “broke Facebook” on behalf of Stephen Bannon, the ideologue and manager behind the Trump campaign.

It didn’t escape keen observers that if the Trump campaign used Facebook user data harvested through an app, it did no more than Barack Obama’s data-heavy 2012 re-election campaign did. It’s not documented exactly how Obama’s team gathered oodles of data on potential supporters, but a deep dive into the tech side of that campaign by Sasha Issenberg mentioned how “‘targeted sharing’ protocols mined an Obama backer’s Facebook network in search of friends the campaign wanted to register, mobilize, or persuade.” To do this, the protocols would have needed the same feature of the Facebook platform for developers, discontinued in 2015, that gave apps access to a user’s friends’ profiles -- with the user’s consent, as Facebook invariably points out.

Let’s face it: Users are routinely tricked into giving such consent. Tech companies make granting it, or agreeing to complex terms of service, look like a low-engagement decision.

“Is it okay if we look at your friends’ info?” they ask.

“Sure, why not? I want to take this nifty psychological test,” we answer.

Afterward, only Facebook itself is interested in the legal minutiae of what permissions it gave to which developers. As far as everyone else is concerned, it doesn’t matter whether an app gets the data for research purposes or for straight-up political ones. Average users worry more about convenience than privacy.

The relevant question, however, is what a campaign can actually do with the data. Cambridge Analytica’s supposedly sinister skill is that it can use Facebook profile information to build psychological profiles that reveal a person’s propensity to vote for a certain party or candidate. When those profiles are matched against electoral registers, targeted appeals become possible.

But no one should take the psychological profile stuff at face value. No academic work exists to link personality traits, especially those gleaned from the sketchy and often false information on Facebook profiles, definitively to political choices. There is, however, research showing that values or even genetic factors trump traits. It’s not even clear how traits affect political behavior, such as the tendency to vote and donate to campaigns: Some researchers, for example, have found a negative relationship between emotional stability and these measures; others have found a positive one.

This is not to say Facebook data, including data on a user’s friends, can’t be useful to campaigns. The Obama campaign actually asked its active supporters to contact six specific friends suggested by the algorithm. So 600,000 people reached 5 million others, and, according to data from the campaign, 20 percent of the 5 million actually did something -- like registering to vote.

But did the Trump campaign need Cambridge Analytica and the data it acquired from Kogan to do this kind of outreach in 2016? Likely not. Facebook cut off the friends functionality for app developers because it wanted to control its own offering to clients interested in microtargeting.

There’s plenty of evidence that Brad Parscale, who ran the digital side of Trump’s campaign, worked closely with Facebook. Using the platform’s “Lookalike Audiences,” he could find people who resemble known Trump supporters. Facebook also has the capacity to target ads to the friends of people who have “liked” a page -- a Trump campaign page, for example.

Targeting messages to millions of specific people without going directly through Facebook is messier and probably more expensive than using the social platform’s own tools. All Facebook requires for access to its data trove is a reasonable fee.

Whether Cambridge Analytica could add anything meaningful to Facebook’s effort is unclear. Its previous client, the unsuccessful presidential campaign of Sen. Ted Cruz, has said the firm didn’t deliver on all its promises.

Some studies have shown that Facebook ads can work quite well for businesses. If they also worked for Trump, the Cambridge Analytica story is a red herring: It’s Facebook’s own data collection and the tools it makes available to clients that should be the target of scrutiny and perhaps regulation, both from a privacy perspective and for the sake of political transparency.


Leonid Bershidsky is a Bloomberg View columnist. -- Ed.

(Bloomberg)