Surveillance capitalism

– by Shoshana Zuboff –

art by Anthony Freda, anthonyfreda.com

When we think about capitalism in the 19th and 20th centuries, the titanic struggle of that industrial era was the struggle between capital and labor. The power of capital bore down on us as workers, as employees, in our economic roles, in the economic domain, in factories, in offices. Something different is happening now. The titanic struggles of capital today have washed over the walls of the factories and the offices, at least those that are left. They’ve surpassed the economic domain. They have flooded the whole space that we think of as the social – society. They bear down on each one of us, not because of a specific economic role that we play but simply because we are here, living our lives; this is our time. These forces bear down on our bodies, on our homes, in our cars, in our cities. They know our tears, they know our bloodstream, they know our pancreas, they know our conversations, they know our emotions, they know our personalities. They know our futures.

And yet what are we called? We don’t have a name. We are not workers, we are not laborers. We don’t have a name. The only name we have is the name that they gave us – users. This is not okay. What I’m learning – and maybe you will learn this with me, maybe you disagree – is that what these words, in different countries and different cities and across different generations, are expressing are the social, political, psychological, and economic interests that are emerging for us in our new experience of a new era of capital that is no longer confined to the economic domain.

Let me give you an analogy. In the 1830s, the first third of the 19th century, in Britain, where industrial capitalism was slowly taking form, slowly emerging as a new kind of capitalism, there were two words for the social classes. If you looked at the whole social hierarchy, there were only two labels: one was aristocracy and the other was the lower classes. Everyone who wasn’t aristocratic was grouped into this big mélange, the lower classes. That included everyone from bankers and merchants to paupers. Everyone was just part of the lower classes.

There’s a very specific and interesting history of how the conditions of industrial capitalism took hold and the factories emerged, along with the new forms of work and the new forms of economic oppression. Under the pressure of those new forms a new consciousness was born, and the idea of the laborer emerged. It wasn’t already given; it emerged out of a felt recognition of shared interests that were new in the world. That felt recognition, that new consciousness, became the basis for the new forms of collective action that eventually mobilized the emergence of democracies in our societies. It eventually formed the basis of power that tethered industrial capitalism to the interests of society, to the requirements of democracy, and to democratic values and principles – such that over the decades, and indeed well into the 20th century, we could talk about something like a market democracy and experience some kind of equilibrium in which capitalism and its raw excesses were tethered to the needs of people, society, and democracy.

I wonder if this is our time now to emerge from this amorphous nonentity of “users,” which is a word that says we don’t matter and we have no interests, as we through these words – anxiety, manipulation, control, freedom, democracy, resistance, rebellion, solution – begin to identify our true interests, and through that, discover the new forms of collective action. It won’t be the solutions of the 20th or the 19th century, but there will be new forms of collective action and collaborative action that bind us and that allow us to compel and harness the resources of our democratic institutions to restrain, interrupt, and even outlaw the raw excesses of a rogue mutation of capitalism that I call “surveillance capitalism”, which is now illegitimately claiming a dominant role in our democracy, our society, and indeed in our lives.

I’m going to say a couple of words just sort of introducing the basics of surveillance capitalism, a little Surveillance Capitalism 101. I define surveillance capitalism this way. Surveillance capitalism departs in many respects from the history of market capitalism, but in this respect it mirrors that history. People have long discussed the way in which capitalism evolves by claiming things that have their own life outside of the market dynamic and bringing them into the market dynamic so that they can be turned into commodities for sale and purchase. So famously, for example, industrial capitalism claimed nature. Nature lives its own life: the forests and the meadows and the waters of the rivers, the oceans, the mountains, its own life. Now claimed by industrial capitalism for the market dynamic to be reborn as real estate, as land to be sold and purchased. Famously, industrial capitalism claimed work, the kinds of things that people did in their fields, in their cottages, in their homes, in their gardens, for the market dynamic to be reborn as labor – labor that could have a price attached to it, wage labor to be sold and purchased.

Surveillance capitalism follows in this tradition by claiming private human experience as a source of free, raw material for subordination to the market dynamic, where it is reborn as behavioral data. These behavioral data are then combined with world historic computational capabilities that we generally refer to as things like machine intelligence, machine learning, artificial intelligence. Really the labels are secondary to the thing. But these are computational capabilities that have never before existed.

The combination of behavioral data and world-historic computational capabilities is aimed at one goal: to produce predictions of human behavior – what we will do now, soon, and later. These predictions – think of them as prediction products – are then sold in a new kind of marketplace in which business customers, not us, have an interest in betting on our future. And it is business customers who are now vying to lay their bets on these predictions, to purchase these prediction products, to know our futures. So these are markets in behavioral futures. Just as we have markets in pork belly futures and oil futures, these are markets in behavioral futures, which is why I call them behavioral futures markets.

This may seem sort of weird and science-fictiony, but really it’s just a tiny stroke of the dial of abstraction on what has become commonplace in our lives. That starts with online targeted advertising. This logic was pretty much invented at Google. As with mass production, there were pieces, elements of it, that were out and about, but it really all came together at Google in 2001, under the heat of the financial emergency of the dot-com bubble bursting. In the same way, elements of mass production had been around in the armories and the Singer sewing machine and various other places, but they really all came together at the Ford Motor Company at a certain place and time, under the heat of Ford’s own financial emergency, having gone bankrupt twice and now really trying to make it work the third time.

When we think about what was invented there – I’m not going to go into all the details, because it would take too much time – essentially, what is it that happened? They figured out that they could take data that they were not using to actually improve their products and services – stuff that was sitting around in data logs that at that time was called waste, called data exhaust, called digital bread crumbs, collateral left-over stuff from search and other behavior – and that it had great predictive power. They discovered that they could put that together with their even then considerable computational capabilities, and that with that they could produce a prediction of a piece of future behavior. In this case, that future behavior was pretty specific. It was a click-through. But that’s a future behavior.

And what those online targeted advertisers were doing was buying these predictions of future behavior. The way it happened was that Google said: Look, you used to pick the keywords, you used to decide where your ads were going to go. You’re not going to do that anymore. We’re going to use this special computational capability and our proprietary data. We’re going to tell you the result, we’re going to tell you where to put your ad. And if you just follow along, you will make money. At first the advertisers didn’t want to do that, because they didn’t like the black box idea: We want to know what’s going on, we want to pick. But eventually they agreed, and they went with the black box and took the computational result, the prediction product, that came out of these analyses. Lo and behold, Google made money.

Very interesting to note: at that moment of financial emergency, in the year 2000, Google’s revenues were about $86 million. When it IPO’d in 2004 and the fruit of these activities first became known to the world, we saw that its revenue line had increased by 3,590 percent.

That’s on the strength of a new economic logic that depended upon the social relations of the one-way mirror. Because they quickly understood that in order to find these very predictive data, they not only could scrape them from their data logs, but they could go hunt them in all kinds of online environments. They were very explicit about seeing that they could hunt and take outside of people’s awareness, because they knew even then that asking would not be a profitable undertaking. So from the start, all of this, the taking of private experience for translation into data, had to be secret. It was designed to keep us users in ignorance. And over the years that has proved to be extremely successful, that our ignorance has been their bliss. Ergo, surveillance capitalism.

I spent seven years literally locked away writing this book, and the thing that really guided me was my feeling for my children’s future. Seeing into my children’s future and being afraid of what I saw was a real big motivator for me. But I’m not naturally a very selfish person, so it’s a very quick step from there to being worried about everybody’s children, and all the young people that I know and all the young people that I meet. These are two paragraphs that I wrote for my children and for all the young people I meet when I’m teaching or when I’m on the road. These are things I want you to know, to tell your children – or, if you are one of the children in the room, I want you to know them in your heart. That’s why I’m reading this right now. This is way at the end of the book.

“When I speak to my children or an audience of young people, I try to alert them to the historically contingent nature of the thing that has us by calling attention to ordinary values and expectations before surveillance capitalism began its campaign of psychic numbing. It is not OK to have to hide in your own life. It is not normal, I tell them. It is not OK to spend your lunchtime conversations comparing software that will camouflage you and protect you from unwanted, continuous invasion: “5 trackers blocked,” “4 trackers blocked,” “59 trackers blocked,” facial features scrambled, voice disguised.

“I tell them that the word ‘search’ has meant a daring existential journey, not a fingertip to already existing answers; that friend is an embodied mystery that can be forged only face to face and heart to heart; and that recognition is the glimmer of homecoming we experience in our beloved’s face, not facial recognition. I say that it is not OK to have our best instincts for connection, empathy, and information exploited by a draconian quid pro quo that holds these goods hostage to the pervasive strip search of our lives. It is not OK for every move, emotion, utterance, and desire to be catalogued, manipulated, and then used to surreptitiously herd us through the future tense for the sake of someone else’s profit. These things are brand-new, I tell them, they are unprecedented. You should not take them for granted, because they are not OK.

“If democracy is to be replenished in the coming decades, it is up to us to rekindle the sense of outrage and loss over what is being taken from us. In this I do not mean only our personal information. What is at stake here is the human expectation of sovereignty over one’s own life and authorship of one’s own experience. What is at stake is the inward experiences from which we form the will to will and the public spaces to act on that will. What is at stake is the dominant principle of social ordering in an information civilization and our rights as individuals and societies to answer the questions: who knows, who decides, who decides who decides.”

So if we are coming up with the same words in different countries and different cities, across generations and across societies, isn’t this really about the beginning of an information civilization that we are all participants in? And isn’t this really the beginning of the contest over what kind of civilization it will be – what will be its moral milieu, what will be its values, what will be its balance of knowledge and power? Who knows? Who decides? Who decides who decides?

Early on the Google founders understood that human experience was the new virgin wood. They understood it would be the thing that could be claimed and monetized. And they understood that it would be very cheap to do so, not only in the online environment but out in the real world, where devices, sensors, cameras, blah, blah, blah, all of the digital infrastructure, all the saturation of digital architecture that now surrounds us would be both ubiquitous and extremely cheap. This was a vision early on.

It is certainly a vision of unilateral dispossession. I write about the many, many dispossession strategies that were honed over time, including when they were challenged, which is something that’s going on right now with Facebook. We’re seeing what I call the stages of the dispossession cycle unfold – incursion, habituation, adaptation, redirection – stages that they go through to cope with how to institutionalize the dispossession and how to demobilize the resistance that is offered to it, as is being offered now: Okay, no more pictures of self-harm on Instagram. Adaptation. Anyway, it’s all there.

When it comes to surveillance capitalism and its global exploitation, I’ve tried to isolate some of the economic imperatives at work here, because competition in surveillance capitalism revolves around predicting the future. So surveillance capitalists are competing on who’s got the best predictions. The best predictions approximate observation; the best prediction is one that is just like actually seeing the thing. You don’t have to predict it. It’s happening. These imperatives have become more dynamic and more pernicious over the last two decades.

Economies of scale. We need a lot of behavioral data, a lot of these surplus data to make good predictions. Economies of scope. We need different qualities of data. Now it’s got to come from your emotions, from your face, from your voice, from where you run, from where you shop, from what you do in the city, from how you look for a parking space. And then ultimately from what I call economies of action. Economies of action means the very best, the choicest predictive data comes from actually intervening in the state of play and tuning behavior, herding behavior, shunting, manipulating behavior so that it moves in the direction of the outcomes that are aligned with our commercial objectives.

I write about Pokémon GO, for example, as an experimental dry run at population-level herding – the skills of herding populations in ways strictly maintained to bypass the awareness of individuals, in order to nevertheless herd them along the lines and toward the spaces and places where guaranteed commercial outcomes will be fulfilled. And that is, in the case of Pokémon GO, the establishments – the restaurants, the pizza joints, the bars, the service stations, the McDonald’s franchises, blah, blah, blah – that paid Pokémon GO, paid Niantic Labs (Google-incubated, as you all probably know), for footfall, in exactly the same way that online targeted advertisers pay Google and Facebook and others for click-throughs. So footfall in the real world is click-through in the online world.

Ultimately, the digital architecture over which we have all poured so much effort, into which we have placed so much hope, into which we have placed our dreams for an empowering and a democratizing information civilization, this digital architecture, which I call not Big Brother but Big Other, doesn’t really care about us at all. It really doesn’t care who you are or what you do. It doesn’t care if you are happy or sad. It doesn’t care if you are alt-right or alt-left. It really does not give a toss about you. All it cares about is that whoever you are and whatever you do, you do it in a way that it can get the data.

This is what I call radical indifference. There is an economic compulsion toward totality of data. This is what pushes it first from online to offline, across the offline, deep into our personalities, across our activities, and then across our homes, our cars, our cities, our regions, our countries, our societies, and our world. There are no boundaries to totality. The more behavioral data, the more prediction, the better the prediction, the more lucrative, the more powerful these behavioral futures markets.

An economic logic has been created and institutionalized on which huge market capitalization now rests. This economic logic has imperatives, and you can predict the behavior that will occur in these companies by understanding the economic logic. That doesn’t mean that every individual in the organization actually even understands these economic imperatives or what they’re doing. But the economic imperatives, once you grasp them, predict the behavior.

In the same way, I quote Andy Bosworth’s amazing internal memo, which is the perfect description of what I call radical indifference. “Connectivity equals economic growth. We connect, we grow. We connect, someone uses our connection to pull off a terrorist plot, that’s too bad. But we continue connecting, because connection is growth. We connect, some people find each other and fall in love, marry, live happily ever after, that’s great. We continue to connect. No matter what, we connect.” This is the idea of radical indifference.

It’s not just Google, it’s not just Facebook – which, by the way, is the second category error, the idea that this is just about a couple of big companies. The CEO of Ford Motor says: We want PEs like Google and Facebook. We want market cap like Google and Facebook. Nobody wants to buy cars anymore on planet Earth. What are we going to do? Oh, I know. Let’s sell data. We have 100 million people driving around in Ford cars. Let’s get all the data from that. Let’s put it together with the data from Ford Credit, where, he says, we already know everything about you. Then we’re running with the big boys, we’re up there. Who wouldn’t want to source predictive data from Ford Motor? This is an economic logic, and it’s transforming industry after industry. We see it in insurance, we see it in finance, we see it in health, we see it in education. This is not about a planet suddenly gone evil.

These are economic imperatives, and they are pursued by normal people. It’s not like every bad person went over there to work on surveillance capitalism. This is all people like us, caught up in an economic logic that is unprecedented, that has not been named, that has barely been recognized even by the people who are practicing it.

Shoshana Zuboff is professor emerita at the Harvard Business School. She is the author of In the Age of the Smart Machine and The Age of Surveillance Capitalism. This article was transcribed by David Barsamian, founder and director of Alternative Radio, airing on Vancouver Co-op Radio CFRO-FM (100.5), Tuesdays, noon-1pm. www.alternativeradio.org
