CHRISTIANE AMANPOUR: And we turn now to big tech with our next guest, Shoshana Zuboff of Harvard Business School, who is warning of an information coup in the way tech giants collect our data. She is the author of "The Age of Surveillance Capitalism." And here she is speaking to our Hari Sreenivasan about her theory.
HARI SREENIVASAN: Christiane, thanks. Shoshana Zuboff, thanks so much for joining us.
SHOSHANA ZUBOFF, AUTHOR, "THE AGE OF SURVEILLANCE CAPITALISM": Thank you so much for having me, Hari. It's a pleasure.
SREENIVASAN: So, you have been studying power, society, culture, the workplace, the internet for decades now. And here we are at this interesting point where, while we're having this conversation, there is a marked threat to democracy, there is the influence of a former president, and a lot of that has to do with technology and social networks. How much of where we are today do you lay at the feet of these technology companies?
ZUBOFF: A great deal. I mean, we have never before had a situation in our society or any society where a private economic logic serving a corporation or set of corporations could drive disinformation at scale to so many people in such powerful and relentless ways, day after day, hour after hour. The conditions that we are living in now, as far as the digital domain goes, are unprecedented in any history, let alone in the history of our short digital century. So, a great deal is laid at the feet of this economic logic.
SREENIVASAN: So, explain surveillance capitalism to someone who is unfamiliar with it. I mean, you also call it almost a human futures market. What's being traded? What is the surveillance and where is the capitalism?
ZUBOFF: Surveillance capitalism was a breakthrough idea where it was discovered that it was possible to secretly capture private human experience and treat it as free raw material for translation into behavioral data. These behavioral data immediately then are declared as the private property of that corporation. Now, with their private property, which is data about us, they can take that into their manufacturing processes, which are, of course, computational, we call it artificial intelligence, and they can produce products which are computational products and they can sell them. What are the products they produce? They actually take all of this data about us and they produce products that are predictions of our behavior. What we will do soon and later. It turns out that many kinds of businesses and many kinds of industries really want to know what we will do. So, they have established a new kind of marketplace where they sell these behavioral predictions. These are predictions of our future behavior, and I call these markets, as you have noted, human futures markets, because now what has happened is our private experience has been commodified in the form of this behavioral data.
SREENIVASAN: Now, how different is what is happening in the digital domain than, say, if I am a customer and I go to McDonald's and I keep ordering the large fries? After a while they say, well, this guy probably wants the large fries, it's a pattern, I see that. Let's have some more large fries ready when he shows up. How is that different? What do I own in the types of transactions that I am making online, or what, I guess, should I own but don't really have any access to?
ZUBOFF: So, this is such a good question because it's a huge point of confusion. The techniques for capturing our data are designed to be hidden. They are designed to keep us in ignorance. So, we rely a lot on leaked documents and whistle-blowers and informants to find out what is really going on here, but the public at large, just — you know, we've just sort of proceeded really confused and really not understanding what is going on behind the curtain. So, here is a great piece of information from a leaked document, Facebook, 2018. We find out that it is describing what I call its factory, they call it the A.I. backbone. And we find out that in this A.I. backbone, the A.I. is computing trillions of data points every day and producing 6 million behavioral predictions each second. That is the kind of scale that we are talking about, Hari. Now, what folks don't understand is that a very small fraction of those trillions of data points are based on what we knowingly give to the companies. Most of what they have about us is what they secretly take. And when they write about their ability to do this, they celebrate the fact that they can lift personal information from us for translation into behavioral data, they can aggregate it in ways that allow them to infer things about us that they know we do not intend to disclose, sexual orientation, political orientation, personality profiles, just to get started. And they understood right from the start, and I mean from the start, going back to the very early 2000s when we began to have research and surveys on this, they understood from the start that as soon as people find out about what is really going on, they hate it, they rebel, they resist, they want to hide, they are looking for camouflage, they want an alternative. But what has happened over these past 20 years is those alternatives have disappeared.
And so, now, we are finding ourselves pretty much trapped, every internet interface is now a data supply chain for surveillance capitalists, artificial intelligence factories, and its profit-making operations.
SREENIVASAN: I want to be clear that this is not just about Facebook or Google. We are talking broadly about digital domains; anywhere we go on the web, there are tracking cookies and so forth constantly in effect. I want to bring this to the audience's attention. You say, they have declared our personal information their private property, so they own all the information. They have all the rights to the information based on their ownership. They know more than we do. The gap between what we can know and what can be known about us is growing exponentially every moment. So, there is new extreme inequality. Explain that.
ZUBOFF: So, just about every industry now has taken on this model as the key way to produce revenue. So, with this inequality, you know, we are used to talking in the 20th century about concentrations of economic power. But now, we have to talk about concentrations of knowledge, and that's important for two reasons. The basic problem is that I can't know nearly as much as these people can know about me, as their machine systems can know about me. Why is that so critical? Because knowledge, as everybody has heard, is power. And when we talk about the operations of these companies, we talk about a range of targeting mechanisms that use all the knowledge they have amassed about us to come back to us in very specific ways, manipulating what they know about us to get us to think things, do things, join things, sometimes just behave in subtle ways and sometimes behave in very overt, significant ways that we would not otherwise have done. The knowledge they have about us translates into a new kind of power to shape, tune, herd and modify our behavior individually and collectively at scale. This is a digital-born form of power that has never before existed.
SREENIVASAN: You know, many of us as consumers have been lured by the bargain that, hey, for this, I'm getting free e-mail, I'm getting free something or another, right? I mean, economists will say there ain't no free lunch, right, but there is always this notion that I don't have to participate in the social network, I don't have to use this particular e-mail provider. Is it possible for me to escape the clutches of what you are describing here?
ZUBOFF: The big deal is what we give them is a tiny fraction of what they have. Their systems are taking from us without asking. They are taking from us in a way that is engineered to bypass our awareness. Now, I know you have young children. You go home tonight, Hari, and ask one of your kids, hey, if somebody takes something from me without asking and then they say it belongs to them and they go off and they do stuff with it and they sell it, what should I do about that? And I guarantee you, your child will say, dad, that is called stealing. You should call the police, because somebody is stealing from you. But somehow, we missed that step. We have gotten so habituated to this fundamental illegitimacy, this license to steal that they have claimed for themselves, you know, that we have gone along with it. But the fact is that all around us, they are taking without asking. It is illegitimate and it is stealing.
SREENIVASAN: Is this the end game here of these companies? Why do they target me so that they can, what, deliver something to their customers which are all of the companies that are trying to sell me stuff? Is that all of it?
ZUBOFF: You've got to look at these capabilities and what they are able to do, and the sort of power that is at work here. Now, if we just pivot this a few degrees from commercial outcomes to political outcomes, I think that you will see my point. We now know, Hari, that in 2016, you know, the Facebook executives have claimed that Mr. Trump would not have won the 2016 election had it not been for Facebook. They said that his campaign used Facebook, exploited all of its capabilities, more potently than any other candidate had ever done. In the three key swing states that year, you will recall, Michigan, Wisconsin and Ohio, the Trump campaign had an explicit goal of getting the black citizens not to vote, to refrain from voting. And they used all of this mass of information that they had on black citizens to analyze their personalities, to analyze their political orientations, to analyze other things about them, to get an idea of what their emotional triggers might be, all of these analyses, and then they created a category out of all of this which they called the deterrables, the people who could be deterred from voting on election day. And then they used the targeting. We know from the larger picture that the black vote was successfully suppressed in these states. And there is every reason to believe that these efforts were effective. But here is what I want you to think about. Democratic citizens in the oldest democracy on earth relinquished, voluntarily relinquished, their most critical democratic right, which is the right to vote, and they did it without anyone ever knocking on their door and showing a gun. They did it without anyone ever threatening to drive them to the gulag or the camp, without anyone ever threatening murder or violence. They did it simply because they were manipulated through the milieu of the digital. They were manipulated through the medium of digital instrumentation.
Manipulated in ways that were designed to keep them in ignorance so they never saw it coming. They never knew it was happening.
SREENIVASAN: Right now, there seems to be some focus on Section 230 and figuring out whether the platforms should be liable for what's said on there. Should we be — is that kind of missing the point? I mean, are we looking downstream versus kind of more of the root of the problem? Should we be regulating them like utilities? I mean, is Europe doing it right? What should we be thinking of?
ZUBOFF: So, I am very motivated for us to really think hard about the solutions that we need right now, because we are on the cusp of the third decade of the digital century. If we think back to industrialization in the 19th and 20th centuries, that was also a very bleak time, very much like now. Employers had all the rights. Their property rights gave them all of the rights to dictate every aspect of what happened in a workplace. There were no workers' rights, there were no consumers' rights. We fought as a society, the public together with lawmakers, to eventually, in the third and even into the fourth decade of the 20th century, produce the new charters of rights and laws and the new kinds of democratic institutions to oversee it all that we would need to make the industrial century safe for democracy, and it is exactly that kind of challenge that we face right now. We need to be thinking about new kinds of rights. In every age, when the conditions of our lives change fundamentally, new kinds of rights become necessary. So, for the first time in the evolution of civilization, we have to have formal rights, call them epistemic rights, the right to know about our own experience, and these belong to the sovereign individual. So, I decide what to share and with whom and for what purpose. I decide what is public and I decide what remains private. What remains my secret, or maybe I share it with my family, or maybe I share it with my best friend. So, it is not that we are not going to have big data, it is not that we are not going to be able to learn from data, it's that the learning from the data is going to be tied to public service and the real needs of society. And it is going to be done transparently with the full participation of democratic citizens. But the fact is that, Hari, we have not yet begun. We have not yet begun. Now, it is really only in the past year or so that we are beginning to see democracy finally on the move.
It's on the move in Europe, where they are starting to take the lead with critical new legislation that is now in front of the European Parliament. And if this legislation is passed in the next year or so, which I believe it will be, it will quite firmly reassert democratic governance over these companies and their operations, and that will be, you know, the first big effort at turning this huge ship, turning the Titanic away from the iceberg, something that we will be building on for this decade and the decade to follow.
SREENIVASAN: Shoshana Zuboff, thank you.
ZUBOFF: Thank you, Hari.
About This Episode
Reviving the Iran Nuclear Deal is a major priority for the Biden administration, and the White House has now announced it is ready to engage in talks with Iran. Jake Sullivan held top foreign policy positions under President Obama and played a key role in hammering out the original deal. He now serves as President Biden’s national security advisor and joins the show from the White House.