Surveillance capitalism maintains that it is not selling our personal data, and this is true

The problem is neither the digital world nor the technology. It is the economic logic that guides them, which can be confronted only in political terms and with a political plan. Comprehending the economic logic of surveillance capitalism means exiting from the wonderland that has been constructed by the propaganda mechanisms of the technological giants.

This article was published on 30 June 2020 in the Greek newspaper “Avgi”. It was translated by Wayne Hall and given to The Barricade with the consent of the author, Theodora Kotsaka.

The plundering of the indigenous populations of the New World by the Spanish Conquistadors in the 16th century is a story of violent seizure of land and wealth and extermination of populations. But for some reason to do with established morality, the invaders felt the need to add a veneer of legality to their actions with the assistance of a legal text which they compiled and proclaimed in a spirit of celebration to the native populations they had compelled to be in attendance. The text – in the name of the Spanish king and the Pope – describes in immaculate Spanish the terms of surrender of wealth and subjection of the locals, naturally without troubling to enlist translators. Having taken this initiative, the Spanish occupiers very reasonably assumed that since the populations had been informed and had not manifested any disagreement with the content of the text, they therefore consented. The legitimation of the Spanish actions derived from that consent.

Shoshana Zuboff, author of The Age of Surveillance Capitalism, which was published last year (2019) and is already regarded as one of the most influential best sellers, argues that the story above provides an apt analogy for understanding our situation today as “users”.[1] This new identity of ours is predicated, it seems, on pressing an I AGREE button before installing a new app or new software on our electronic devices. Waiting for us at the end is a sibylline economic-technical text entitled “Terms of Service”, which it would be of statistical interest to ascertain how many people actually read, apart from the cadres of the companies who commission it and the lawyers who draft it.[2] These are texts deliberately written to be as incomprehensible as possible, on the basis of which we divest ourselves of our constitutionally underwritten rights, such as the right to privacy. We consent to activities such as the collection of our biometric data (voice, iris of the eyes, facial expressions depending on input content, etc.) or activation of the camera without our explicit permission. In essence we consent to the processing of our personal data by the technological colossi who harvest it for commercial utilization. When the companies in question are called to account in public debate for their dystopian practices, their representatives, invoking the I AGREE button that we have all pressed, respond: “But you agreed.”

It seems that the “Terms of Service”, apart from being impenetrable, are also required to be anything but succinct. By way of illustration, the approximate time required to read the relevant Facebook text is seventeen minutes and twelve seconds, as long as is required to read the United States Constitution.[3] For TikTok or Spotify more than half an hour is needed, and Microsoft wins hands down: reading the terms of use for its operating system presupposes the availability of more than an hour, as much time as a person needs to read Shakespeare’s “Macbeth”. It is calculated that for a user to acquire real knowledge of the terms he or she is accepting in the guise of digital contracts, a total of 250 hours would be necessary. Users often feel that they are wasting their time trying to read such a demanding contract, which they cannot amend or react to in any way, or even understand.
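The arithmetic behind these figures can be sketched with the 200–250 words-per-minute adult reading average cited in note [3]; the word count used below is an illustrative assumption, not a figure from any company’s actual document.

```python
# Back-of-envelope reading-time estimate for a terms-of-service text,
# using the 200-250 words-per-minute adult average cited in note [3].
def reading_time_minutes(word_count: int, wpm: int = 225) -> float:
    """Approximate minutes needed to read `word_count` words at `wpm` pace."""
    return word_count / wpm

# A hypothetical ~3,900-word agreement comes out at roughly the
# seventeen minutes quoted for the Facebook text.
print(round(reading_time_minutes(3900), 1))  # → 17.3
```

At this pace, the 250-hour total corresponds to millions of words of contractual text per user.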

Behavioral futures, a new market

All the above contravenes basic principles of contractual processes, as does the claim that “I have been notified”. Users are abdicating their rights without any genuine awareness of what they are surrendering. These applications and this software are the flagship of the most profitable entrepreneurial activity of our age: data collection. The user’s private activity is harvested and stored without any impediment. Databases are constructed of the user’s behavioral characteristics. The technology makes it possible for private life to be transformed into a commodity. The introduction of a new sector into the market is capitalism’s latest form of enclosure, the means by which it renews itself. This type of behavioral information stimulates the market today more than any other input. Human personal experience is treated as if it were the latest unexploited resource and is transformed into behavioral data. It is capitalism’s new El Dorado.

Behavioral data is channeled into a new productive process, identified with AI (artificial intelligence). The products of surveillance capitalism are commodities for predicting user behavior. A broad spectrum of entrepreneurial activity is interested in products of this kind. The newly constructed market trades in predictions and is designated by the term behavioral futures. It operates under the aegis of private capital, without constitutional protection, without control mechanisms, without regulation: a system based on concealed, secret technologies, designed with a priority on keeping us in the dark about its mode of operation.

New products and services make their appearance, such as Click Through Probability (CTP), the probability of a user proceeding to the next possible click, or Click Through Rate (CTR), denoting the ratio of users clicking on a specific link to the total number of visitors to a web page carrying an electronic message or advertisement. Products of this kind can be used for measuring the success of an internet advertising campaign on a particular web page, or the effectiveness of pre-electoral campaigns conducted via e-mail.
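The CTR metric named above is simple arithmetic; a minimal sketch follows, with illustrative figures (nothing here is taken from any company’s actual API — CTP, by contrast, is a per-user probability produced by a predictive model and is not shown).

```python
# Minimal sketch of the Click Through Rate (CTR) metric described above.
# The click and visitor counts are illustrative, not real campaign data.
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR: fraction of page visitors who clicked a given link or ad."""
    if impressions == 0:
        return 0.0
    return clicks / impressions

# 120 clicks out of 4,800 visitors to the page carrying the ad:
ctr = click_through_rate(120, 4800)
print(f"{ctr:.1%}")  # → 2.5%
```

An advertiser or campaign manager compares such ratios across pages or messages to judge which placement “resonates”; the predictive products discussed in the article go a step further, estimating this probability for each user in advance.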

The logic of the new economic model is encapsulated in the two mottos that the great technological giants communicate to their employees: “keep the data flowing”, through the creation of complex chains of data with an uninterrupted flow into the productive process, and “keep them involved” (keep the users online), through their car, telephone, thermostat, the internet of things, the smart city, through everything “smart”. Everything starts from an arbitrary assumption, without any legitimation: that our personal experience is free raw material for the enrichment of private individuals. Nevertheless, it would be good to bear in mind that the right to intervene in our future inclinations is not grounded anywhere.

Surveillance Capitalism

Until recently the situation could be summarized with the cliché “If you don’t pay for the product, you are the product.” Today we have progressed into a new phase. Technically we are not even the product, because we have been downgraded into a contributor of information for the real product: predictions of our future behavior, which are sold to the highest bidder. There are many companies that buy predictions as to what we are going to do, what our next moves are going to be. All sectors of our lives are of commercial interest. According to Shoshana Zuboff: “Digital links are today a means for achieving other commercial objectives. Surveillance capitalism is in essence parasitical and self-referential. It revives the old picture projected by Karl Marx of a vampire capitalism that feeds on labor, but with an unpredicted twist. Instead of labor, it feeds today on human experience.”

The processing of data and its conversion into predictive products is the reason that the technological colossi are able to assert publicly that “We don’t sell your personal data” and not be sued for claiming this. The same is true for the recent European legislation on privacy, the General Data Protection Regulation (GDPR), in the context of which the technological colossi argued that we can have access to the data they collect. But this is data we have already given them. Access to the procedure for processing the data, and to the predictive products, is not provided for.

As competition over predictive products intensifies, it becomes clear that surveillance capitalism is identifying the sources of the most useful data within everyday life, our actual reactions. This is where the possibilities lie for steering our actions in a particular direction, congruent with the kinds of results the companies promise their customers. It is the source of profitability: modification of our behavior in the direction of their goals, whether this entails the purchase of a product, an electoral choice, or participation in a political protest or a cultural event.

At the same time, scandals linked to violation of users’ privacy are seeing the light of publicity with ever greater frequency, involving either private companies or governments. These are not isolated instances but rather fleeting – albeit revealing – evidence of a new economic logic that engulfed the planet at a time when humanity was preoccupied with savouring the delights of Facebook and Instagram, along with much else. There is an abundance of data from the time after the American presidential elections of 2016 and the Brexit referendum. It has emerged that in both cases monitoring of the behavior of citizens made possible the development of programs with targeted messages aimed at specific social groups. For example, black voters in crucial states such as Pennsylvania received, a few hours before the election, derogatory messages characterizing Hillary Clinton as a racist, encouraging abstention and withdrawal of support, and ultimately contributing to the loss of the state to Trump. Similar methods were employed in Britain a few months earlier. In both cases the company Cambridge Analytica was a key protagonist, and a prime role was played by Trump’s chief strategist Steve Bannon.

Through a combination of collection, storage and analysis of data with the assistance of specially designed and secret artificial intelligence applications, Google and Facebook know the psychological triggers of the various social groups, but also of each user individually, and they press the button accordingly. A car manufacturer or a political party can contact Google or Facebook to buy consultation programs for predicting political or consumer behavior, which make it possible for political messages or automobiles to be designed with optimum resonance in a specific community. Zuboff revealed that 87% of the income of Google and 90% of that of Facebook comes from selling such predictive packages.

Out of wonderland…

In the 1990s, when new companies utilized the potential of the internet, the technology had functioned in a similar way, reinvigorating the market. Some of these “dotcom companies” achieved extraordinary results, attracting major investors. Huge profits were made. But because of Wall Street’s unregulated mode of operation and out-of-control speculative practices, the bubble burst in 2001, provoking a serious economic crisis. In the same year, 11 September 2001 totally changed the landscape for personal data management. It is indicative that legislators in Washington had become uneasy about the personal data being concentrated on the internet in the hands of private companies that were providing free services while monitoring and recording the activity of users, and had concluded that dotcom companies could not regulate themselves on issues such as cookies. Indeed on 10 September 2001, one day before the attack on the Twin Towers, a discussion was placed on the Congressional agenda which many believed would lead to the institution of strict rules and the abolition of the possibilities for monitoring and recording the activity of internet users. Within a single day all this changed.[4]

According to Zuboff: “11 September provided the perfect opportunity. The vote was cancelled on the laws and regulations that would have limited many of the mechanisms and practices of monitoring which would subsequently evolve into surveillance capitalism. Washington allowed internet companies to develop this surveillance potential because the USA as a state is inhibited by its constitution from spying on American citizens.” In other words, America surreptitiously privatized the spying. It assigned the surveillance tasks to the private sector, to popular companies which appeared, and appear, innocuous and user-friendly; companies to which we willingly award the right to keep us under observation, in exchange for their free services.

It is worth noting that until that time the predominant view within Google had been that the business model should not be based on advertisements, seen as something demeaning for its services. The crisis on Wall Street led to a re-evaluation of this policy. The search engine was generating not profits but losses. Some people in the company recalled all the data that users happened to be providing, which was being left behind unexploited. The deregulation of personal data protection offered new opportunities for its further collection, processing and conversion into predictive products, initially for advertising companies. Profit levels went through the roof.

Cognitive asymmetries

The technology companies know almost everything about us, while we know next to nothing about them. Time we spend investigating Google is time Google spends investigating us. The more we use social media, the more social media uses us. When they inform us of the company’s privacy policy, in reality it is the company’s surveillance policy. There has always been inequality of knowledge, but never with these characteristics, redefining ethical boundaries and ultimately the sense of self. We do not know about ourselves what they know about us. It is knowledge which comes from us but is not meant for us. At the same time, new and crucial questions emerge concerning the functioning of democracy: who decides, who is going to decide, who is going to know? Democratic institutions have lost the plot. The activities of the technological colossi remain effectively unregulated.

In the corporate discourse elaborated by the technological colossi, we are called “users”. This is a new ontological category with a content of its own, like that of consumer or lender. Zuboff argues that, just as in the 18th century citizens were obliged to organize under the attribute of worker to defend their interests, so today they should do the same under the attribute of user. Their activity is what generates the profit.

It is possible for us to avail ourselves of digital technology without the operational framework of surveillance capitalism. The problem is neither the digital world nor the technology. It is the economic logic that guides them, which can be confronted only in political terms and with a political plan. Comprehending the economic logic of surveillance capitalism means exiting from the wonderland that has been constructed by the propaganda mechanisms of the technological giants.

 


Photo: Pixabay, CC0


[1] Shoshana Zuboff, 2019, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, Profile Books, UK.

[2] https://www.theguardian.com/technology/2017/mar/03/terms-of-service-online-contracts-fine-print. Of particular interest is the following web page, where an attempt is made to summarize the terms of use of the most popular applications: https://tosdr.org/.

[3] The times have been calculated on the basis of the 200–250 words per minute that is the average for adults. Details of the times for the most popular applications can be found in the table at https://www.visualcapitalist.com/terms-of-service-visualizing-the-length-of-internet-agreements/

[4] The change in US policy on respect for privacy and protection of personal data is encapsulated in the mass surveillance program Total Information Awareness (TIA): https://en.wikipedia.org/wiki/Total_Information_Awareness. For the role of the technological colossi in bringing it about, see https://www.wired.com/story/darpa-total-informatio-awareness/
