It sounded like the stuff of spy novels. A secretive company owned by a reclusive genius billionaire taps into sensitive data gathered by a University of Cambridge researcher. The company then works to help elect an ultra-nationalist presidential candidate who admires Russian president Vladimir Putin. Oh, and that Cambridge researcher, Aleksandr Kogan, worked briefly for St Petersburg State University. And his research was designed to develop ways to psychologically profile and manipulate voters.

Before we go too deep down the rabbit hole, let’s recognize that the data Cambridge Analytica gathered to try to target more than 87 million Facebook users in the United States was not stolen from Facebook or extracted through some security flaw or “data breach.” The real story is far less dramatic but much more important. It’s such an old story that the US Federal Trade Commission investigated and punished Facebook back in 2011.

It’s such a deep story that social media researchers have been warning about such exploitative practices since at least 2010, and many of us complained when the Obama campaign in 2012 used the same kinds of data that Cambridge Analytica coveted.

Obama targeted voters and potential supporters using software that ran outside of Facebook. It was a problem then. It’s a problem now. But back in 2012, the Obama story was one of hope continued, and his campaign’s tech-savvy ways were the subject of “gee whiz” admiration. So academic critics’ concerns fell on deaf ears. Just as important, Facebook in 2012 was coming off a peak reputational moment. Facebook usage kept growing globally, as did the glowing if misleading accounts of its potential to improve the world after the 2011 revolution in Egypt.

Between about 2010 and 2015, Facebook was a data-exporting machine. Facebook gave data – profiles of users who agreed to take one of those annoying quizzes that proliferated around the site, but also records of those who were Facebook Friends with those users – to application developers who built cute and clever functions into Facebook. These included games like Mafia Wars, Words with Friends, and Farmville. You might have played, and thus unwittingly permitted the export of data about you and your Friends to other companies.

Until 2015 it was Facebook policy and practice to let application developers tap into sensitive user data as long as users consented to let those applications use their data. Facebook users were never clearly informed that their Friends’ data might also flow out of Facebook or that subsequent parties, like Cambridge Analytica, might get hold of the data and use it however they wished.


The Federal Trade Commission saw this as a problem. In 2011 the agency released a report after an investigation revealed that Facebook had deceived its users over how personal data was being shared and used. Among other violations of user trust, the commission found that Facebook had promised users that third-party apps like Farmville would have access only to the information they needed to operate. In fact, the apps could access nearly all of users’ personal data – data the apps didn’t need. While Facebook had long told users they could restrict sharing of data to limited audiences like “Friends Only,” selecting “Friends Only” did not prevent third-party applications from vacuuming up records of interactions with Friends.

The conclusions were damning. They should have alarmed Americans – and Congress – that this once huggable company had lied to them and exploited them. Through a consent decree with the commission, Facebook was barred from making misrepresentations about the privacy or security of consumers’ personal information. It was required to obtain consumers’ affirmative express consent before overriding their privacy preferences. And Facebook was required to prevent anyone from accessing a user’s material more than thirty days after the user had deleted his or her account. Most important, Facebook had to proactively police its application partners and its own products to put user privacy first. The consent decree put the burden on Facebook to police third parties like Kogan, the Obama campaign, and the makers of Farmville. Facebook was also responsible for making sure fourth parties, like Cambridge Analytica, did not get and use people’s information.

We now know how well Facebook lived up to that responsibility. Facebook shut down this “Friends” data-sharing practice in 2015, long after it got in trouble for misleading users but before the 2016 election got into high gear. Not coincidentally, Facebook began embedding consultants inside major campaigns around the world.

For 2016 Facebook would do the voter targeting itself. Facebook is the hot new political consultant because it controls all the valuable data about voter preferences and behavior.

No one needs Cambridge Analytica or the Obama 2012 app if Facebook will do all the targeting work and do it better. This is the main reason we should stay steady at the rim of the Cambridge Analytica rabbit hole. Cambridge Analytica sells snake oil. No campaign has vouched for its effectiveness. Cambridge Analytica CEO Alexander Nix even admitted that the Trump campaign did not deploy psychometric profiling. Why would it? It had Facebook to do the dirty work for it. Cambridge Analytica tries to come off as a band of data wizards. But its staff are simple street magicians, hoping to fool another mark and cash another check.

We should be wary of the practice of data-driven voter targeting in general – whether done for the Trumps of the world or for the Obamas of the world. The industry devoted to rich data targeting and voter manipulation is far bigger than Cambridge Analytica and its parent company, SCL Group. It’s growing on every continent. And it’s undermining democracy everywhere. Facebook is doing the data analysis internally and working directly with campaigns – many of which support authoritarian and nationalist candidates. You don’t need Cambridge Analytica if you have Facebook. The impact of Facebook on democracy is corrosive.


By segmenting an electorate into distinct sets, candidates move resources toward efforts to pander to small issues with high emotional appeal instead of those that can affect broad swaths of the electorate and perhaps bridge presumed rifts among voters. It’s not necessary – and may be counterproductive – for a campaign to articulate a unifying vision of government or society. It’s still done, but it’s not the essence of the game anymore. Voter targeting, even without the powerful black magic of psychographics, encourages narrow-gauge interventions that can operate out of sight of journalists and regulators. A campaign like Trump’s can issue small, cheap advertisements via platforms like Facebook and Instagram that disappear after a day or get locked forever in Facebook’s servers. That’s bad for transparency. That’s exactly what happened. That story has not echoed as far as the one about Cambridge Analytica and psychographics. But it’s the real story.

Excerpted with permission from Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, Siva Vaidhyanathan, Oxford University Press.