
Professor Shoshana Zuboff on Surveillance Capitalism Q&A

Shoshana Zuboff is Charles Edward Wilson Professor Emerita at Harvard Business School and the author of several seminal texts on the rise of the digital and its relationship to capitalism. In her critically acclaimed new book The Age of Surveillance Capitalism, Professor Zuboff dissects how the business models underlying the data economy are influencing us, something we discuss in more depth in this interview.


Could you explain your concept of “surveillance capitalism”?

It has long been understood that capitalism evolves by claiming things that exist outside of the market dynamic and turning them into market commodities for sale and purchase. In historian Karl Polanyi’s 1944 grand narrative of the “great transformation” to a self-regulating market economy, he described the origins of this translation process in three astonishing and crucial mental inventions that he called “commodity fictions”. The first was that human life could be subordinated to market dynamics and reborn as “labour” to be bought and sold. The second was that nature could be translated into the market and reborn as “land” or “real estate”. The third was that exchange could be reborn as “money.”

The age of surveillance capitalism originates in an even more startling and audacious mental invention: surveillance capitalism declares private human experience to be free raw material for translation into production and sales. It relies on hidden operations intentionally designed to bypass “user” awareness.

Once private human experience is claimed for the market, it is translated into behavioural data for computation and analysis. While some of this data may be applied to product or service improvements, the rest is declared a proprietary behavioural surplus: data whose predictive value exceeds what is required for product or service improvement.

When it comes to “behavioural surplus” extracted from “private human experience,” nothing is exempt. These operations began with behavioural surplus drawn from online browsing, search, and social media behaviour, but they now encompass every movement, conversation, facial expression, sound, text, and image that is, or can be, or will be, accessible to the always-on ubiquitous internet-enabled digital extraction architecture that I call Big Other.

In this digital surround, every “smart” and “connected” device, interface, and internet touch point is redefined as a node in a vast supply network dedicated to relentlessly tracking, hunting, inducing, and taking more behavioural surplus. These new supply chains ultimately feed a new “means of production” known as “machine intelligence”. These are the factories of the new age, where behavioural surplus is fabricated into prediction products: calculations that anticipate what we will do now, soon, and later. Finally, these prediction products are rapidly swept up into the life of the market, traded in newly constituted marketplaces for behavioural predictions that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, as many companies are eager to lay bets on our future behaviour.

Why, and how, exactly is this happening?

The hunt for surplus is driven by a new competitive struggle over the prediction of human behaviour. In a world of highly commoditized products and services, companies now turn to behavioural surplus and its predictive value as the long-sought-after glory road to higher margins. The result is whole new ecosystems of behavioural surplus suppliers, computational hired guns, and market players, as companies from every sector seek ways to participate in this universal expropriation of private experience.

Among these growing stores of proprietary surplus, we find your tears, the clench of his jaw in anger, the secrets your children share with their dolls, our breakfast conversations and sleep habits, the decibel levels in my living room, the location of the furniture in your home, the thinning treads on your running shoes, your hesitation as you survey the sweaters laid out on the table in the shop, and the exclamation points that follow a Facebook post, once composed in innocence and hope…

The competitive dynamics of these new markets produce economic imperatives to acquire ever more predictive sources of behavioural surplus. The first competitive phase emphasised the volume of data and thus economies of scale. The second emphasised varieties of data – economies of scope. Eventually, surveillance capitalists discovered that the most predictive behavioural data come from intervening in real life behaviour in order to nudge, coax, tune, and herd human activity at scale, always pushing behaviour toward profitable outcomes: economies of action. Data scientists call this the shift from “monitoring” to “actuation”.

With this reorientation from knowledge to power, it is no longer enough to automate information flows about us; the goal now is to automate us. In this phase of surveillance capitalism’s evolution, the means of production are subordinated to an increasingly complex and comprehensive “means of behavioural modification”. As long as surveillance capitalism and its behavioural futures markets are allowed to thrive, ownership of the new means of behavioural modification eclipses ownership of the means of production as the fountainhead of capitalist wealth and power in the twenty-first century.

In this way, surveillance capitalism births a new species of power: instrumentarian power. This is the power to know and shape human behaviour toward others’ ends. It is an unprecedented quality of power, completely distinct from totalitarianism. Instead of armaments and armies, terror and murder, instrumentarian power works its will through the automated medium of an increasingly ubiquitous, internet-enabled, computational architecture of “smart” networked devices, things, and spaces. It has become difficult to escape this bold market project, whose tentacles reach from the gentle herding of innocent Pokémon Go players to eat, drink, and purchase in the restaurants, bars, fast-food joints and other service establishments that pay to play in its behavioural futures markets, to the ruthless expropriation of behavioural surplus from Facebook profiles for the purposes of shaping individual behaviour, whether it’s buying pimple cream at 5:45 PM on a Thursday, clicking “yes” on an offer of new running shoes as the endorphins race through your brain after your long Sunday morning run, or voting next week in response to subliminal cues in your Facebook feed.

Just as industrial capitalism was driven to the continuous intensification of the means of production, so surveillance capitalists and their market players are now locked into the continuous intensification of the means of behavioural modification and the gathering might of instrumentarian power. Surveillance capitalism runs contrary to the early digital dream. It strips away the illusion that the networked form has some kind of indigenous moral content, that being “connected” is somehow intrinsically pro-social, innately inclusive, or naturally tending toward the democratization of knowledge.

Digital connection is now a means to others’ commercial ends. At its core, surveillance capitalism is parasitic and self-referential. It revives Karl Marx’s old image of capitalism as a vampire that feeds on labour, but with an unexpected turn. Instead of labour, surveillance capitalism feeds on every aspect of every human’s experience. Under this new regime, the precise moment at which our needs are met is also the precise moment at which our lives are plundered for behavioural data, and all for the sake of others’ gain. The result is a perverse amalgam of empowerment inextricably layered with diminishment.

This new regime’s most poignant harms have been difficult to grasp or theorize, blurred by extreme velocity and camouflaged by expensive and illegible machine operations, secretive corporate practices, masterful rhetorical misdirection, and purposeful cultural misappropriation. On this road, terms whose meanings we take to be positive or at least banal (“the open internet,” “interoperability,” and “connectivity”) have been quietly harnessed to a market project in which individuals are definitively cast as the means to others’ market ends.

What is it about our current technological era that resulted in this model? Where did we go wrong?

Where we went wrong was to allow surveillance capitalism to run free in the dark, unimpeded by democratic oversight, law, or regulation. Surveillance capitalism is not technology; it is a logic that imbues technology and commands it into action. Surveillance capitalism is a market form that is unimaginable outside the digital milieu, but it is not the same as the “digital”. The digital can take many forms depending upon the social and economic logics that bring it to life. It is capitalism that assigns the price tag of extraction, subjugation, and helplessness, not the technology.

In the years before the logic of surveillance capitalism and its financial success were decipherable, scientists and engineers designed versions of the “smart home” or “telemedicine” in ways that explicitly protected the privacy and decision rights of a home’s occupants or individuals seeking remote healthcare. There was no secret taking of private experience for secret translation into behavioural data for secret processes of manufacture and sales, all of it aimed at others’ profits.

That surveillance capitalism is a logic-in-action and not a technology is a vital point because surveillance capitalists want us to think that their practices are inevitable expressions of the technologies they employ.

For example, in 2009 the public first became aware that Google maintains our search histories indefinitely: data that are available as raw-material supplies are also available to intelligence and law-enforcement agencies. When questioned about these practices, the corporation’s former CEO Eric Schmidt mused, “The reality is that search engines including Google do retain this information for some time”. In truth, search engines do not retain, but surveillance capitalism does. The truth is that we can easily imagine the digital without surveillance capitalism, but there can be no surveillance capitalism without the digital.

Schmidt’s statement is a classic of misdirection that bewilders the public by conflating commercial imperatives and technological necessity. It camouflages the concrete practices of surveillance capitalism and the specific choices that impel Google’s brand of search into action. Most significantly, it makes surveillance capitalism’s practices appear to be inevitable when they are actually meticulously calculated and lavishly funded means to self-dealing commercial ends. Despite all the futuristic sophistication of digital innovation, the message of the surveillance capitalist companies barely differs from the themes once glorified in the motto of the 1933 Chicago World’s Fair: “Science Finds— Industry Applies— Man Conforms”.

Surveillance capitalism was invented at a time and place. It lives in history, not in physics or any other source of determinism. Google invented and perfected surveillance capitalism in much the same way that a century ago General Motors invented and perfected managerial capitalism. The new economic logic emerged between 2000 and 2004 in the teeth of financial emergency triggered by the dot-com bust, as Google launched an unprecedented market operation into the unmapped spaces of the internet, where it faced few impediments from law or competitors. Its leaders drove the systemic coherence of their business at a breakneck pace that neither public institutions nor individuals could follow. During those first five years of invention and elaboration, Google kept its new operations shrouded in secrecy. Only when the company went public in 2004 was it possible to learn that its revenues during that period increased by 3,590 per cent.

Google was the pioneer of surveillance capitalism in thought and practice, the deep pocket for research and development, and the trailblazer in experimentation and implementation, but it is no longer the only actor on this path. Surveillance capitalism quickly spread to Facebook and later to Microsoft. Evidence suggests that Amazon has veered in this direction, and it is a constant challenge to Apple, both as an external threat and as a source of internal debate and conflict. Surveillance capitalism is no longer confined to the competitive dramas of the large internet companies, where behavioural futures markets were first aimed at online advertising. Its mechanisms and economic imperatives have become the default model for most internet-based businesses. Eventually, competitive pressure drove expansion into the offline world, where the same foundational mechanisms that expropriate your online browsing, likes, and clicks are trained on your run in the park, breakfast conversation, or hunt for a parking space.

Today’s prediction products are traded in behavioural futures markets that extend beyond targeted online ads to many other sectors, including insurance, retail, finance, education, healthcare, and an ever-widening range of goods and services companies determined to participate in these new and profitable internet-enabled markets. Every internet-enabled “smart” product or “personalised” service is a supply-chain interface for surveillance capitalism’s complex data flows. In the absence of a decisive societal response that constrains or outlaws this logic of accumulation, surveillance capitalism appears poised to become the dominant form of capitalism in our time.

Why do we, as consumers and regulators, allow this to happen?

First, the supply side:
Surveillance capitalism is inconceivable outside the digital milieu, but its ability to root and flourish over the last two decades was also the fruit of specific historical conditions, investment decisions, and public policy commitments.

One such legacy condition was the neoliberal ideology that fetishises “self-regulation” and regards law and regulation as an assault on freedom. Surveillance capitalists doubled down on this rhetoric, insisting that any legal impediments to their action would depress innovation and repress freedom of speech.

Google also benefited from historical events when a national security apparatus galvanized by the attacks of 9/11 prioritised “total information awareness” over comprehensive privacy legislation. In this new milieu, the state’s interest in the internet as a means of surveillance inclined it to nurture, buffer, borrow, and emulate the fledgling internet firms’ growing capabilities to monitor and shape behaviour.

Another key enabler was the culture of impatient money in Silicon Valley. Once Google established surveillance capitalism as the fast track to monetization, investors followed, raising the bar for every other internet company and driving surveillance capitalism’s dominance of the internet.

Second, the demand side:
First, surveillance capitalism is unprecedented, and this has made it difficult for us to grasp. Second, its operations are scrupulously designed to bypass our awareness and keep us in ignorance. Third, our alternatives have largely disappeared.

Consider that the internet has become essential for social participation, that the internet is now saturated with commerce, and that commerce is now subordinated to surveillance capitalism. Even our most basic requirements for effective life on a daily basis are ensnared in this economic web. When we go online to retrieve our children’s homework from their school’s portal, search a government website, see our health data from the doctor’s office, or arrange dinner with friends, we are forced to march through the same channels that are surveillance capitalism’s supply chains.

Our dependency is thus at the heart of the commercial surveillance project, in which our felt needs for effective life vie with the inclination to resist its bold incursions. This conflict produces a psychic numbing that inures us to the realities of being tracked, parsed, mined, and modified. It disposes us to rationalise the situation in resigned cynicism, create excuses that operate like defense mechanisms (“I have nothing to hide”), or find other ways to stick our heads in the sand. In this way, surveillance capitalism imposes a fundamentally illegitimate choice that twenty-first-century individuals should not have to make, and its normalisation leaves us singing in our chains.

How might things have turned out differently if the web had been designed differently?

The failures here are less failures of engineering than they are the political failures to grasp, impede, and outlaw the raw destructive excesses of this new market form.

Surveillance capitalism’s economic imperatives and institutionalised incentives are the sources of the most dangerous network dynamics that are typically attributed to the internet itself: the assault on privacy and individual autonomy, addiction, mental health harms, fake news, fraudulent information flows, behavioural manipulation at scale. Surveillance capitalism’s dominance of the internet favors centralisation over decentralisation for the sake of economies and network effects. It also favors boundary reduction over boundary maintenance for the sake of increased engagement and frictionless data flows.

The internet’s vulnerability to this hijack arose from an essential error: the notion that technology is a thing in itself, isolated from economics and society. There is no technological inevitability. Technologies are always economic means, not ends in themselves.

In modern times, technology’s DNA comes already patterned by what the sociologist Max Weber called the “economic orientation”. Economic ends, Weber observed, are always intrinsic to technology’s development and deployment. “Economic action” determines objectives, whereas technology provides “appropriate means.” In Weber’s framing, “The fact that what is called the technological development of modern times has been so largely oriented economically to profit-making is one of the fundamental facts of the history of technology”. In a modern capitalist society, technology was, is, and always will be an expression of the economic objectives that direct it into action.

A worthwhile exercise would be to delete the word “technology” from our vocabularies in order to see how quickly capitalism’s objectives are exposed.

Surveillance capitalism commandeered the internet and the wonders of the digital world, promising the magic of unlimited information and a thousand ways to anticipate our needs and ease the complexities of our harried lives. We welcomed it into our hearts and homes, unaware that its gifts come encumbered with a new breed of menace.

Surveillance capitalism employs many technologies, but it cannot be equated with any technology. Its operations may employ platforms, but these operations are not the same as platforms. It employs machine intelligence, but it cannot be reduced to those machines. It produces and relies on algorithms, but it is not the same as algorithms.

Surveillance capitalism’s unique economic imperatives are the puppet masters that hide behind the curtain orienting the machines and summoning them to action. These imperatives, to indulge another metaphor, are like the body’s soft tissues that cannot be seen in an X-ray but do the real work of binding muscle and bone. We are not alone in falling prey to the technology illusion. It is an enduring theme of social thought, as old as the Trojan horse. Despite this, each generation stumbles into the quicksand of forgetting that technology is an expression of other interests. In modern times this means the interests of capital, and in our time it is surveillance capital that commands the digital milieu and directs our trajectory toward the future.

Surveillance capitalism is not simply an accident of overzealous technologists or poor design. Rather it is a rogue capitalism that learned to cunningly exploit its historical conditions to ensure and defend its success.

On the whole, do you think the web, and the internet more broadly, has had a positive or negative influence on society?

The internet is now owned and operated by private surveillance capital. The internet promised to amplify human voice, connection, empowerment, and the democratisation of information. These promises have been fulfilled in many ways, but they have come at a cost. That cost now threatens to overwhelm the emancipatory potential of the digital future, setting it on a collision course with individual autonomy and the possibility of a democratic society.

Democracy both slept through and colluded with the rise of surveillance capitalism. The result is a threat to the very possibility of democracy, which is now eroded from below and from above. From below, surveillance capitalism erodes democracy by threatening self-determination, individual decision rights, agency, and autonomy, which are the elemental building blocks of democracy. From above it introduces profound new sources of social inequality. We enter the third decade of the 21st century with a new institutional pattern defined by unprecedented asymmetries in knowledge and the instrumentarian power that accrues to knowledge.

Surveillance capitalists know everything about us, while their operations are designed to be unknowable to us. They accumulate vast domains of new knowledge from us, but not for us. They predict our futures for the sake of others’ gain, not ours.

These asymmetries introduce a new dimension of social inequality: epistemic inequality and its corollary in epistemic injustice defined by the essential questions of knowledge, authority, and power. Who knows? Who decides who knows? Who decides who decides?
