The Cambridge Analytica scandal has put data harvesting in the spotlight, but the problem goes far beyond Facebook.
“You don’t get to 500 million friends without making a few enemies,” as The Social Network, the film about the founding of Facebook, put it. Everyone the social media giant has ever crossed has joined the pile-on of late, as Facebook CEO Mark Zuckerberg has been forced to explain himself over the company’s role in data harvesting.
But “Zuck” is just the smirking face of a much wider issue: the way the web has been captured by corporate profiteers who make their money from selling a simple product: you. Or more precisely, your data.
‘Data is the new oil’
The biggest technology firms, responsible for an ever-growing share of the world’s billionaires, follow the mantra that “data is the new oil”. The apps and websites you use every day are the extraction method. They are not monitoring you for state-style social control, but for profit: surveillance capitalism.
What does data extraction mean in practice? Let’s return to Facebook. Behind the photo slideshows, cartoon smiles and birthday wishes, downloading your Facebook data shows that its app has been slurping every bit of data it can get its hands on, from your phone’s location to who you’ve been calling and for how long.
But it’s not just about Facebook – and not just about targeted advertising, either. Google knows everything you search for, and when you stay signed in to Gmail, it knows who made the searches.
That’s not so unexpected, but I was surprised to find recently that Google had been keeping track of everywhere I’ve been for the past five years. I must have said yes once when prompted by its Maps app with some explanation about “improving the experience”, and that was enough for it to graciously keep track of my every footstep from then on. (See your own location history.)
Twitter, meanwhile, decided to make a list of every app I have installed on my phone, to show me “more relevant content”. (See yours here.)
Netflix builds preference profiles based on what you watch, and then uses this data in aggregate to create entire new original TV shows, right down to the cover images, micro-targeted at sections of its audience. Amazon keeps track not only of what you buy, but everything you search for and look at – it’s all grist to its marketing mill.
Every company can use what it learns from millions or even billions of people not just to target ads but to make decisions that will let it grow faster than the competition — so over time, the most data-driven firms come to dominate.
As computer security expert Bruce Schneier writes, the smartphone is “probably the most intimate surveillance device ever invented. It tracks our location continuously, so it knows where we live, where we work, and where we spend our time. It’s the first and last thing we check in a day, so it knows when we wake up and when we go to sleep. We all have one, so it knows who we sleep with.”
These ubiquitous devices arrived not as a state-enforced requirement, barking orders at us in the manner of an outwardly oppressive apparatus, but as our ever-present assistants. They are always keen to help us — and help themselves to a little more of our data so that they might give us “better recommendations”.
In the rapidly approaching future, when you have an automated home, a self-driving car and a city full of internet-connected sensors, their makers will be watching you too, unless we can change the path we’re on.
People have heard of “algorithms”, those annoying things that mean your social media news feed does its best never to appear in chronological order. But algorithm is a soft term for what is really going on: machine learning — cutting-edge artificial intelligence — is being trained on all that data being extracted from huge populations every day.
Every time you scroll on Facebook, hit the heart button on Instagram or watch a video on YouTube, you are taking part in the latest round of a never-ending global experiment.
Never-ending experiment
You are like a rat in a maze, with a machine showing you a stimulus, noting your response, and then showing you another.
The machine is not trying to make you happy — though, to be fair, it is not trying to make you sad either. Its aim is what is called “engagement”: in other words, to keep you running around inside the maze for as long as possible each day, every day, for the rest of your life.
Why bamboozle billions of people like so many rodents? Because every minute you spend “engaged” racks up another few fractions of a cent in corporate profit.
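The maze described above maps onto a textbook machine-learning setup: an explore-and-exploit loop that keeps whichever version of the feed holds attention longest. A minimal sketch of that kind of loop, with every name and number invented for illustration (no real platform’s code is being quoted here):

```python
import random

def engagement_bandit(variants, pulls=10000, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit: repeatedly pick a feed variant, observe
    how long the simulated user stays 'engaged', and shift traffic
    toward whichever variant has the highest average engagement."""
    rng = random.Random(seed)
    totals = {v: 0.0 for v in variants}   # total engagement per variant
    counts = {v: 0 for v in variants}     # times each variant was shown

    def simulated_user(mean_minutes):
        # Stand-in for a real person: engagement drawn around a mean.
        return max(0.0, rng.gauss(mean_minutes, 1.0))

    for _ in range(pulls):
        if rng.random() < epsilon:        # occasionally explore...
            chosen = rng.choice(list(variants))
        else:                             # ...otherwise exploit the best so far
            chosen = max(variants, key=lambda v: totals[v] / max(counts[v], 1))
        reward = simulated_user(variants[chosen])   # "minutes engaged"
        totals[chosen] += reward
        counts[chosen] += 1
    return counts

# Hypothetical feed variants with their true mean engagement (minutes).
shown = engagement_bandit({"chronological": 3.0, "outrage-ranked": 5.0})
```

Run this and the loop quickly learns to show the higher-engagement variant almost all the time. Nothing in the loop asks whether that variant is good for you; the reward is simply minutes captured.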
Tristan Harris, a former Google employee and founder of the Center for Humane Technology, argues that apps’ feeds hook into the same parts of human psychology as gambling does: “When we pull our phone out of our pocket, we’re playing a slot machine to see what notifications we got…
“If you want to maximise addictiveness, all tech designers need to do is link a user’s action (like pulling a lever) with a variable reward.”
The attention rat-race is what causes all of the presumably unintended consequences that are more visible on the surface, from Facebook fake news farms to the creepy, often auto-generated kids’ videos clogging up YouTube.
The tech giants’ money-bots spray out audience traffic and ad cash in the direction of anyone who produces content that captures attention. What that content might be is, at best, a secondary concern.
What “your feeds” show you, and in what order, is no longer under your control. That doesn’t make them some kind of mind-control machine, but it does mean that, over time, the decisions of a large population could be influenced in subtle ways — as Facebook found in a 2014 study where it was able to influence users’ emotions.
Cambridge Analytica, according to whistleblower Chris Wylie, used psychological profiling to “know what kinds of messaging you would be susceptible to”. So, for example, “a person who’s more prone towards conspiratorial thinking” might be shown ads that play to that mindset. This can help an idea to spread by starting with a “group of people who are more prone to adopting that idea”.
Without blaming Facebook for all political ills, it is not inconceivable that a large enough advertising campaign with that kind of targeting could influence an election by a crucial few percent. As the whistleblowers’ evidence highlights, such methods have been quietly in use in the global South for several years.
A people’s algorithm
Brexit and Trump — and the Cambridge Analytica affair’s contribution to the ever-growing web of connections between the two — have catalysed new interest in how our data is being collected and sifted. But any potential solution has to start much further back, before the data was originally gathered.
These data-mongers are no geniuses: they stumbled across Facebook’s data goldmine and filled their boots. The point is that such a motherlode should never have existed in the first place.
At its best, social media has provided an important platform for alternatives to the mainstream media, allowing people to spread the word about protests and grassroots events, giving a voice to people previously marginalised and ignored. This is one factor behind the emergence of “Corbynism” in Britain — the rise of veteran socialist MP Jeremy Corbyn to Labour Party leader on a left-wing platform in the face of hostility from the traditional media and political establishment.
The question is: how can we decommodify our everyday interactions — and even our resistance?
For the moment, deleting your Facebook account might feel like a start, but individuals opting out does little to change the overall societal problem. Facebook is also known to keep “shadow profiles” of non-members, so it might not even make much difference to your own privacy.
Collectively, we need to demand regulation. The European Union’s new General Data Protection Regulation (GDPR) is a good start. Just as importantly, we have to build alternatives that explicitly reject the data-extraction model.
There is nothing inherent about social media that requires it to plunder our personal data, other than the companies’ surveillance-capitalist business models. Open source, non-profit tools, in the original spirit of the web, could let us communicate freely and easily. They could give us the positive aspects of social media without taking the mercenary-spies along for the ride.
Mastodon is one existing open source effort that has begun to reach an audience beyond just techies.
Going one step further, can we imagine machine learning put to social use? It is possible in principle — though to my knowledge it hasn’t yet been tried — to flip the switch on those mass experiments so that they don’t aim to produce engagement, profit or propaganda, but happiness instead.
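To make the “flip the switch” idea concrete: the feedback loop itself is indifferent to what it optimises; only the reward signal defines its goal. A hypothetical sketch, with invented names and numbers, of how the same update step could credit self-reported wellbeing instead of minutes of attention:

```python
# Minimal sketch: the only difference between an "engagement" recommender
# and a hypothetical "people's algorithm" is the reward it is fed.
def update(scores, counts, item, reward):
    """One step of the feedback loop: credit the reward to the item shown."""
    scores[item] = scores.get(item, 0.0) + reward
    counts[item] = counts.get(item, 0) + 1

def promote_next(scores, counts):
    """What the feed promotes next: the highest average reward so far."""
    return max(scores, key=lambda i: scores[i] / counts[i])

scores, counts = {}, {}
# An engagement optimiser would feed in minutes-watched here; a
# people's algorithm might feed in a wellbeing score the user reports.
update(scores, counts, "rage-bait", reward=0.2)         # low wellbeing
update(scores, counts, "mutual-aid event", reward=0.9)  # high wellbeing
promoted = promote_next(scores, counts)  # → "mutual-aid event"
```

The machinery is identical to the engagement case; change what counts as reward, and the loop promotes entirely different content.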
A “people’s algorithm” could help us reject the rule of commerce and promote ideas and actions that challenge corporate power.
The current spotlight on Facebook will soon fade, but every data scandal — and there will be many more to come — increases the relevance and urgency of technological alternatives that let us take back our online lives from the corporations’ clutches.
[Abridged from Red Pepper.]