
Power Asymmetries in the Medium of Data

by Max Grigutsch

Extraction – integration – analysis – personalization – sale. And repeat. The algorithm of data use has been written, compiled and executed innumerable times; and with it, we ourselves have been written, compiled and executed. Humans are the input variables and the output variables in the disciplinary apparatus of technology-enabled capitalism today. With data, people have once again created a means to overpower people. The Cambridge Analytica scandal is but the most politicized instance hereof; the tip of the iceberg. 

Data may well be the new oil, but how does that work, economically speaking? How does one turn data into profit? The most comprehensive examination of contemporary data use within the structures of Capital is offered in Shoshana Zuboff’s (2015, 2019) notion of Surveillance Capitalism. Grounded in the hitherto largely unregulated domain of Big Data, she (2019, ch. 1.III) argues that the default business model of most modern internet companies (and often even non-internet companies) consists of selling data or data products. Google, a frontrunner in this field, makes around 86 per cent of its revenue from such sources (“Annual Report 2017,” 2018). It does so in the following simplified process: First, a variety of data is distilled from users (extraction); this data is then combined into large electronic databases (integration); in a third step, the data is mined and further analyzed for predictive purposes (analysis); fourth, it is customized according to target audiences (personalization); eventually, the data or even finished prediction products are sold for profit (sale). Various corporations and government agencies employ an apparatus that follows this or a similar approach: Extraction – integration – analysis – personalization – sale.
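
To make this abstract sequence more tangible, the following minimal sketch mimics the five stages in code. It is a deliberately crude, hypothetical illustration: every function, field and price in it is invented for the example and does not depict the actual systems of Google or any other company.

```python
# A hypothetical, highly simplified sketch of the five-stage pipeline:
# extraction - integration - analysis - personalization - sale.
from collections import defaultdict

def extract(events):
    """Extraction: distil raw behavioural signals from user activity."""
    return [{"user": e["user"], "signal": e["action"], "ts": e["ts"]} for e in events]

def integrate(signals):
    """Integration: combine the signals into one large store, keyed by user."""
    profiles = defaultdict(list)
    for s in signals:
        profiles[s["user"]].append(s)
    return profiles

def analyze(profiles):
    """Analysis: mine each profile for a (here trivially simple) prediction."""
    return {user: {"predicted_interest": sigs[-1]["signal"]} for user, sigs in profiles.items()}

def personalize(predictions, audience):
    """Personalization: tailor the prediction product to a target audience."""
    return {user: pred for user, pred in predictions.items() if user in audience}

def sell(products, price_per_profile=0.01):
    """Sale: monetize the prediction products, e.g. via advertising auctions."""
    return len(products) * price_per_profile

events = [{"user": "a", "action": "search:running shoes", "ts": 1},
          {"user": "a", "action": "click:shoe ad", "ts": 2},
          {"user": "b", "action": "read:news", "ts": 3}]

revenue = sell(personalize(analyze(integrate(extract(events))), audience={"a"}))
print(revenue)  # and repeat
```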

Crucially, prediction and prescription go together. Zuboff (2019, ch. 13.I) holds that data-based predictive technologies are, in fact, means of behaviour modification. Companies do not take data sets as given; instead, they increasingly embark upon the quest of manipulating behaviour itself so that it fits desired and guaranteed outcomes (2019, ch. 7.I). Ironically, the unknowable neoliberal market of Hayek and Nozick transforms into a market of total certainty (2019, ch. 13.II). The goal: change consumption choices towards greater profitability! In the political arena, these prospects became apparent with the Cambridge Analytica scandal. Economically, we might experience this as ever more sophisticated advertisements, which are increasingly personal, attach to our deepest desires and emotions, and appear at exactly the right moments. As Hal Varian (2014, p. 28), chief economist of Google, states: “[Google] should know what you want and tell it to you before you ask the question.” Reality itself becomes a commodity, one that is realized through the anticipatory conformity of its inhabitants (Zuboff, 2015, pp. 82, 85).

Data is far from being a neutral medium for conveying content. It is the data structure itself which implies asymmetric power relations. “Media react to media” and “[h]umans are at best along for the ride” (Winthrop-Young, 2011, p. 65). By necessity, the critical theorist today is compelled to put information technology into question. Accordingly, what I investigate in the present essay is its heart: the medium of data. More precisely, I wish to scrutinize the ‘nature’ of data with an eye – or rather both eyes – turned towards the power structures this medium implies, involves, and incorporates; implying them, that is, for humans. Drawing on thinkers like Michel Foucault and Shoshana Zuboff, I ask what power relations are present in data, and presented to us through data. For if I am correct in detecting certain power asymmetries in the medium, then projects like equality, freedom, and democracy are under siege – a siege that needs to be broken.

Pursuing my analysis, I inquire, in section I, into the origins of specific power structures within the data form itself. This is followed by an examination of power in the contemporary use of data (section II). In an outlook, section III finally considers some corresponding implications for the human subject.

I. Power Relations and the 'Nature' of Data

In general, the data form reveals itself in various interrelated ways, all of which are worth analyzing: its quantitative essence, reducing subjectivities into objectivities; its connection to profit and the capitalist system; its alleged neutrality regarding the contents different data represent (Zuboff, 2015, p. 79); and so forth. What interests me here, though, are the power relations inherent to data. For I hold that the latter is not the neutral medium its proponents would have us believe. When it comes to issues of privacy, it is not unusual for data giants like Google, Facebook, Microsoft, Amazon, and others to grandly announce that they are unconcerned with the contents of users’ internet behaviour, such as their emails or messages (Burlacu, 2018). Or, in a more recent instance, Facebook’s Chief Privacy Officer Erin Egan claimed that the company does not “watch or listen to your audio or video calls” (Biddle, 2020). The occasional internal document leak and subsequent scandal notwithstanding, this is probably largely true (Price, 2019). The unfortunate fact is: they do not need to. As long as Google and co. have access to their users’ metadata, they have more than enough information to profitably sustain their business. Indeed, we are talking high profitability, with data having become the world’s most valuable resource to date (Selz, 2017; Tarnoff, 2018; The Economist, 2017). It is predominantly metadata, or what Zuboff (2015, p. 79) calls ‘data exhaust’, that underlies this fountain of wealth. To Big Data, it is not necessarily what you write that matters but, for example, how you write it – Google knows how to turn even the most trivial sets of data trash into money. As the documentary Nothing to Hide (Meillassoux, 2017) vividly displays, sophisticated technology can, with nothing but superficial data, map out a personality and behaviour profile that is more accurate than a self-description by one’s conscious self. Significantly, “Google knows far more about its populations than they know about themselves” (Zuboff, 2015, p. 83).
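
To make the point concrete, consider a minimal, purely hypothetical sketch of such ‘data exhaust’ profiling: the contents of the messages are never read, yet timing and length alone already yield a crude behavioural fingerprint. All field names and values here are invented for the example.

```python
from statistics import mean

# Illustrative message metadata only: contents are deliberately absent;
# what is recorded is 'how you write' (when, and how much).
messages = [
    {"ts": 1_589_000_000, "length": 42},
    {"ts": 1_589_003_600, "length": 310},
    {"ts": 1_589_090_000, "length": 12},
]

span_days = (messages[-1]["ts"] - messages[0]["ts"]) / 86_400
profile = {
    "messages_per_day": round(len(messages) / span_days, 2),
    "avg_length": mean(m["length"] for m in messages),
    "active_hours_utc": sorted({(m["ts"] // 3600) % 24 for m in messages}),
}
print(profile)  # a behavioural fingerprint, derived without reading a single word
```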

The foregoing ‘problem statement’ points to crucial injustices and inequalities related to data, which go beyond the mere knowledge of another’s secrets. These inequalities are best evaluated under the header of power. In this regard, I contend that asymmetries of power become apparent when one scrutinizes data, both by virtue of its nature and of its current utilization. I shall treat both these aspects under what I term ‘the inaccessibility problem’.

In data qua data, an imbalance is already exposed by its two most common definitions. According to Merriam-Webster (“Data,” n.d.), data refers either to “factual information (such as measurements or statistics) used as a basis for reasoning, discussion, or calculation” or to “information in digital form that can be transmitted or processed”. We thus have: data as information and data as bits and bytes.

As for the former, let us remember that we humans are the original computers: humans have a very refined capacity for processing, storing and retrieving information. This ability exists in every human to some degree, in some fashion. Nonetheless, issues arise with scale. The sheer mass of available information exceeds the individual capacity to cope: there is only so much information one person can effectively manage. Moreover, the bulk of information permits one entity to possess more than another. As scientia potentia est, it is here that power inequalities attach. While possibly insignificant at the micro-level of individuals, information lends itself to being accumulated and conglomerated by meso- and macro-level groups, such as interest groups, companies and states. Masses of information can hardly be handled by one person alone, but become increasingly accessible the larger the handler is.

There are limits to the amount of information that humans can naturally deal with, even when gathered into larger entities. In this field, machines have long outraced humans, and, so long as something like Moore’s law holds, will continue to do so exponentially. Enter data as bits and bytes. Here the inaccessibility problem is even more apparent, since humans simply cannot understand data in this form. Bits, ones and zeros, digital abstractions, are not within the repertoire of human language or cognition. They are, in this sense, unintelligible to humans. Certainly, individual pieces of data can be translated – binary can be converted into decimal, no doubt. However, strings of data, an ever-growing and convoluted vastness of digital information, are not open to penetration by the human mind.
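
The contrast can be made concrete with a small, illustrative snippet: an isolated bit string is trivially translated into a decimal number, yet even a tiny, randomly generated blob of bytes resists direct human interpretation, let alone the volumes a large platform processes every day.

```python
import os

# Individual pieces of data are easily translated by hand ...
print(int("1010", 2))  # the bit string 1010 is simply the number 10
print(bin(77))         # and 77 is 0b1001101

# ... but a realistic stream of digital information is opaque to the unaided mind.
blob = os.urandom(32)  # 32 random bytes standing in for a sliver of 'data exhaust'
print(blob.hex())      # 64 hexadecimal characters with no humanly readable meaning
```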

One might therefore suspect, in the data form, a propensity for power asymmetries. To sum up my endeavours to this point: I have described the inaccessibility problem, which is evident both in data as information, where mass opens up possibilities for power, and in data as bits, where direct unintelligibility renders humans secondary interpreters at best. The medium of data is, so it appears, not altogether neutral. On the contrary, data in itself appears to be a potent carrier of power relations.

 

II. Power Relations in the Current Use of Data

Fortunately, we are presumably far from an AI takeover à la Terminator. Nevertheless, in current reality, other actors have mastered the problems of mass and unintelligibility. In free market economies, corporations are at the forefront of data instrumentalization, and the profit margins of Google and the like bear witness to this. They thrive under the mantra that knowledge is power. Constantly perfecting the process from data extraction to the sale of prediction products, as outlined in the introduction, the ‘reality business’, so termed by Zuboff (2019, ch. 7), modifies human behaviour for profit. This is hardly done under conditions of equality of power between the modifiers and the modified. In this section, I shall analyze the characteristics of these power relations in the contemporary use of data. While various interpretations seem suitable here – I personally would be interested in an examination according to Gramscian hegemony – the interrelations indicated so far occasion a power-knowledge analysis as described by Foucault.

Foucault (1984, p. 175) cautions us that “power produces knowledge […]; that power and knowledge directly imply one another”. A key ingredient of informed power in this sense is the gaze, of which he (1975, p. 173) writes that perfect discipline – behaviour control – would be administered through a single point of view that observes everything. As the most fitting analogy, he identifies Bentham’s Panopticon, where a single guard in a watchtower at the centre of a prison is able to monitor all inmates simultaneously, without being seen themselves. The crucial point: the potentiality of being watched at any moment disciplines the inmates into conformity (Foucault, 1975, p. 201).

In a 21st-century panopticism, this principle is perfected. On the one hand, we have extensive apparatuses of data utilization, in which humanity’s best and highest-paid scientists are coupled with enormous economic resources and the most advanced technologies to administer data mining and advertising practices ominously termed neuromarketing, nanomarketing, predictive analytics, patterns-of-life analysis, and reality mining (Mileti, Guido, & Prete, 2016; Zuboff, 2015, p. 80). Using state-of-the-art insights from psychology and neuroscience, marketing, data science, and so on, these apparatuses aim to make human behaviour conform to specific goals – mostly, profitable consumption. On the other hand, we encounter the average internet user going about their daily life.

Note that computer mediation makes the panopticon possible in a new way: Bentham’s design functioned by way of the mere conceivability of being watched at any time – the limits of human capacities rendered total observation a theoretical possibility at best, since no guard or guards could ever really absorb the whole mass of information occurring in the entire prison – whereas computers are able to keep track of all data that can be gathered. True, no human could manually supervise all the data extracted today; but automated analysis programs can. Similarly, the exponentially increasing processing power and storage capacities of modern computing make possible the accumulation of vast amounts of data in the hands of a few corporations. In doing so, the latter have found a way to turn the inaccessibility problem to their favour. By controlling the technological means of Big Data, a new priesthood has domesticated the mass and unintelligibility of data (Zuboff, 2019, ch. 6.IV).

Importantly, the average person simply does not have access to the apparatuses that Big Data possesses, and consequently does not have access to the data itself. Even if she did, she would have no way of using it as these companies do. This phenomenon hints at asymmetries of knowledge – and ergo, power – that urge further scrutiny. As Zuboff (2015, p. 79) rightfully notes, data extraction is a one-way process, not a reciprocal relationship. However, with Foucault, we must argue that a simple top-down overpowering is too easy an analysis. Instead, Foucault (1975, pp. 174-177) posits a hierarchized, pyramidal organization of power which operates “from top to bottom, but also to a certain extent from bottom to top and laterally”. A simple exemplary visualization of this idea is provided in figure 1.

The medium of data is instrumentalized in the context of a modern knowledge-power nexus, featuring top-down information extraction and unilateral behaviour modification. The top of the pyramid has extensive knowledge, in data form, of the bottom layer. Conversely, bottom-up, power is very limited. It exists only in the form of weak and indirect feedback loops, extending at most to the possibility of rating apps or, more relevantly, of protesting and shaping public opinion on issues such as privacy and data protection. Yet knowledge of the practices of companies like Google is scarcely available, and accessible only with difficulty and restriction (Zuboff, 2015, p. 83). Again, Zuboff (2015, p. 83) clarifies:

The work of surveillance […] is not to erode privacy rights but rather to redistribute them. Instead of many people having some privacy rights, these rights have been concentrated within the surveillance regime. Surveillance capitalists have extensive privacy rights and therefore many opportunities for secrets. These are increasingly used to deprive populations of choice in the matter of what about their lives remains secret.

Laterally, power and knowledge reinforce themselves in the form of Foucault’s (1975, pp. 177-184) notion of normalizing judgement, under which conformity is achieved not by compulsion but by defining the norm and punishing the abnormal. In the case at hand, this is represented by the felt necessity to participate in the structures of Big Data. For instance, the network effects of social media are only too prevalent. Non-participation may be punished with social exclusion: not knowing about events, exclusion from culture, or lacking information that the Google search engine could readily provide. Further, while it is possible – strictly speaking – not to use services like Google Maps, employing them makes life so much easier. Zuboff (2015, p. 83) calls this a 21st-century Faustian pact: exchanging personal integrity for effective living. Nonetheless, through all these power relations, it remains evident which entity sits at the top of the pyramid.

Hence, we have established that the data structure incorporates non-negligible power asymmetries which operate ubiquitously. Are we entering the age of the transparent self; of constant surveillance; of puppeteering based on the power of Big Data; the prediction society?

III. Prospects: The Power of Data Internalized

It was Marx (1859, preface) who initiated many of today’s structuralist debates when he declared that “[i]t is not the consciousness of men that determines their existence, but their social existence that determines their consciousness”. Various thinkers, more or less in the Marxian tradition, have since picked up on similar ideas. For instance, Herbert Marcuse theorized that societal needs are merged with subjectivities in the shape of false needs, penetrating the ‘biology’ of humans by means of mimesis, a direct identification of personal with structural needs; in effect, breaking with the Establishment, the status quo, would therefore require a break with oneself (Marcuse, 1964, pp. 7-12; 1969, p. 18). Another angle is taken by the Toronto School, which asserts that the medium of writing restructures consciousness in a manner different from primary oral cultures (Ong, 1982, p. 78). Or, finally, Friedrich Kittler (1999 [1986], p. 16), who writes of a media-technological shift that ends the reign of symbolic writing, making possible the “fabrication of so-called Man”; the essence of the fabricated human “escapes into apparatuses” where “[m]achines take over functions of the central nervous system”. Kittler warns explicitly of information technology.

Can we expect a shift of human consciousness as a result of Big Data? And if so, are the power asymmetries intrinsic to data transmuted into human subjectivity? A mimesis with the data form; subjectivity transformed into objectivity? An affirmative answer would greatly problematize notions of autonomy, free will and democracy. Furthermore, the project of equality would be reduced to a deception of equality at best, and to blatant hierarchy and external control at worst. Are we being guided into an age of data hegemony, where the subordinated participate in their own subordination in a blissful euphoria in unhappiness? Arguably a grim outlook, but one worth considering for the purpose of its prevention. Further research and opposition are required, and urgently so.

IV. Conclusion

Grounded in the nature of the medium of data itself – data as information and data as bits – one finds the origins of power asymmetries that manifest themselves in its current utilization. The inaccessibility problem proves insurmountable for the average person, but entities like corporations have found technological ways to address the mass and unintelligibility of data. In what can be laid out as a Foucauldian power pyramid, the data structure reveals itself as the opposite of what the internet originally promised. From freedom of speech, participation, and equal accessibility, we have proceeded to administered speech, compulsory participation, and accessibility based on one’s relation to the means of data harvesting. Far from being a neutral medium of information transfer, data represents the fulcrum of an unprecedented knowledge-power nexus.

References

Annual Report 2017. (2018). Alphabet Investor Relations. Retrieved from https://abc.xyz/investor/

Biddle, S. (2020). There’s No Telling What Data Facebook Will Collect If You Use Its Zoom Clone. The Intercept. Retrieved from https://theintercept.com/2020/05/20/facebook-messenger-rooms-video-call/

Burlacu, A. (2018). Facebook Messenger Monitors Users’ Messages, But For Good Reason. Tech Times. Retrieved from https://www.techtimes.com/articles/224501/20180406/facebook-messenger-monitors-users-messages-but-for-good-reason.htm

Data. (n.d.). In Merriam-Webster.com dictionary. Retrieved May 30, 2020, from https://www.merriam-webster.com/dictionary/data

Foucault, M. (1975). Discipline & Punish. The Birth of the Prison. New York: Vintage Books.

Foucault, M., & Rabinow, P. (1984). The Foucault reader. New York: Pantheon Books.

Kittler, F. (1999 [1986]). Introduction. In Gramophone, Film, Typewriter (G. Winthrop-Young & M. Wutz, Trans., pp. 1-21). Stanford: Stanford University Press.

Marcuse, H. (1964). One-Dimensional Man. London: Routledge & Kegan Paul.

Marcuse, H. (1969). An Essay on Liberation. Retrieved from https://www.marxists.org/reference/archive/marcuse/works/1969/essay-liberation.htm#s1

Marx, K. (1859). A Contribution to the Critique of Political Economy. 

Meillassoux, M. (Writer). (2017). Nothing to Hide [Documentary film]. France.

Mileti, A., Guido, G., & Prete, M. I. (2016). Nanomarketing: A New Frontier for Neuromarketing. Psychology & Marketing, 33(8), 664-674. doi:https://doi.org/10.1002/mar.20907

Ong, W. (1982). Orality and literacy : The technologizing of the word. London: Routledge.

Price, R. (2019). Facebook fought to keep a trove of thousands of explosive internal documents and emails secret. They were just published online in full. Business Insider. Retrieved from https://www.businessinsider.nl/facebook-internal-documents-executive-emails-published-six4three-court-leak-2019-11?international=true&r=US

Selz, D. (2017). Data: The World’s Most Underused Valuable Resource. insideBIGDATA. Retrieved from https://insidebigdata.com/2017/12/06/data-worlds-underused-valuable-resource/

Tarnoff, B. (2018). Data is the new lifeblood of capitalism – don’t hand corporate America control. The Guardian. Retrieved from https://www.theguardian.com/technology/2018/jan/31/data-laws-corporate-america-capitalism

The Economist. (2017). The world’s most valuable resource is no longer oil, but data. The Economist, May 6th 2017 Edition. Retrieved from https://www.economist.com/leaders/2017/05/06/the-worlds-most-valuable-resource-is-no-longer-oil-but-data

Varian, H. R. (2014). Beyond Big Data. Business Economics, 49(1), 27-31. 

Winthrop-Young, G. (2011). Kittler and the Media. Cambridge, UK: Polity Press.

Zuboff, S. (2015). Big other: surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75-89. doi:10.1057/jit.2015.5

Zuboff, S. (2019). The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power. London: Profile Books.

 
