The Quest for Stewardship in the Digital Age

Digital transformation is advancing at virtually light speed, with far-reaching repercussions for people’s lives and livelihoods. Disruptive innovation is causing turbulence, and governance challenges are piling up. From a political science perspective, the author identifies the most burning issues and sketches out a research agenda for “stewardship in the digital age.”

As Shoshana Zuboff points out in her seminal book, The Age of Surveillance Capitalism (2019), an unparalleled accumulation and concentration of information, wealth, and power has been under way over the last two decades or so. Research published by Oxfam in January 2022 reports that the ten richest men in the world (there are no women among them!) managed to double their fortunes in just the two-year period of the COVID-19 pandemic (from $700 billion to $1.5 trillion).[1] Some scholars see the United States as developing into a plutocracy (Kuhner 2015, Pierson 2017). In Austria, a country commonly regarded as a (post-)traditional welfare state, the top 1 percent owns between 30 and 50 percent of the country’s wealth; the top 10 percent account for about 70 percent (National Bank of Austria 2022; the figures are estimates due to the state’s bank secrecy regime). Facebook increased its earnings by 20 percent in the last quarter of 2021 (and yet its shares lost a quarter of their value in January 2022, “because of the rapid expansion of its competitors,” according to its CEO, Mark Zuckerberg).

Such evidence may be anecdotal, and inequality is certainly not a new phenomenon. Recent developments in the unequal distribution of wealth, however, appear to exceed anything in history (Piketty 2017, Streeck 2021). Wealth has not always been a necessary and sufficient condition for power (Dartnell 2018). Today, a few very powerful persons dominate the entire world in an unprecedented manner. They are literally driving a biopolitical project to govern individual human behavior in order to attract and retain ever more people as their customers, clients, users, followers, and products (Foucault 1975, 1979; Zuboff 2019).


The tasks and risks for the generation “post festum nati” (Generation PFN); Digital natives: born into data-slavery?

For the young generation of “post-millennials” (commonly called Generation Z), those born after the millennium, the challenges and risks are enormous, and not always obvious. They are “Generation PFN”: grown up and socialized in the midst of financial, migration, and climate crises and the COVID-19 pandemic; apparently, after the party is over. They are fed the current “dogma,” namely that the blessings of digital transformation and the “new” (“sharing,” “green,” “smart,” etc.) economy will solve sustainability issues and further human development. Honestly, I am puzzled by the question of how the commercialization of all areas of life through ever-new appliances and gadgets is supposed to fix our problems, and whether “transhumanism” should really be considered the next step in the evolution of the super-sapiens (or meta-sapiens). Personally, as a teacher of politics, I have a hard time convincing my students that TikTok will probably not save the world. Nor do I see how online shopping can be regarded as a sustainable practice (especially when we know that half of the goods are shipped back and destroyed). Don’t misunderstand me: I am not anti-tech, and I do not deny the potential benefits of scientific and technological progress; quite the opposite: I happily embrace the advantages offered by automation, acceleration, and increasing interconnectivity, such as smartphone apps and information systems. And I have the choice; I do not need to adopt all of digitalization.

Yet, on closer inspection, much of this novelty seems to be a huge lure, and it may turn out to be a massive trap. The pandemic has, in my view, acted as a catalyst for the establishment of global surveillance capitalism. Moreover, it is manifest that our young people have suffered tremendously from two years of (self-)isolation and, if not “self-incarceration,” then at least extreme (self-)restrictions in almost all countries of the world. The German philosopher Peter Sloterdijk finds that governments’ protective measures against the pandemic amount to a “medico-securitarian regime” (Sloterdijk 2021). Youngsters and young adults were left with their communication devices and info- or entertainment electronics, and have been encouraged to make the best use of them for learning and exchange with peers. The online-gaming sector and dating platforms are booming, as are streaming formats, consumption services, and providers of pornographic content, and the big tech companies (GAFA + Alibaba, etc.) have been able to build imperial power structures reaching into the bedrooms of virtually anyone. What this will do to this generation (and even younger kids) is not yet known. According to research in the cognitive and neuro-sciences, too much exposure to digital devices and a lack of direct, physical social interaction (including tactile sensing and closeness to other humans) may affect the brain and basic neuronal structures, such as the limbic system (Damasio 2017, 2021; Roth 2021). Sociologists, in turn, point to the crucial real-life embedding in structures, habitus, and rituals that connects humans with each other and creates its own resonance space, which is suddenly being replaced by virtual social networks and soon to be absorbed into a “metaverse” (Bourdieu 1979, 1980; Rosa 2016).

We should be prepared for a somewhat gloomy outcome of this “reality check.” Perhaps part of the disruptive and turbulent innovation experience we are witnessing has detrimental and worrying externalities, which we may have overlooked in our enthusiastic adaptation to the new realities that command our attention and offer a plethora of commodities.

The following “governmentality dispositives” (Dean 2010) demand our urgent attention:


Biopolitics meets bio-economics: Creating the conditions for neo-illiberal/authoritarian, cyber-capitalist data extraction and production; Silicon Valley and the military-industrial complex

What is behind this, admittedly presumptuous, title and the further neologism I want to introduce, namely “data-colonialism”? Zuboff speaks of “behavioral surplus capture” via an ever-expanding “extraction architecture” used to develop prediction and manipulation products, and ultimately to attract and retain more customers, users, or followers, in order to collect even more data and information for the purpose of big-data trading. Our personal information is then sold to and used by other actors, such as insurers, banks, healthcare businesses, and sales companies. By transforming human behavior into a tradeable good, a commodity, we achieve the commercialization of all areas of life, including private life, family, friends, love-life, personal relationships, and health (The Economist 2018). Humans themselves become, or in fact are treated and traded as, a commodity (Polanyi 1957). And we willingly give up our most private data to all sorts of businesses for the sake of better services and a more comfortable life.


The state-capitalist surveillance and oppression apparatus: China’s repressive, authoritarian hegemony

Some time ago, we learned that the Chinese government has introduced a social credit system to evaluate, control, and punish (surveiller et punir) its population. By intruding into all areas of citizens’ lives, this total technological surveillance is far more effective than Foucault’s “panopticon” (le panoptique) and has robbed the Chinese people of their freedom and basic human rights. Large sections of public and private space are under constant video and electronic monitoring, analyzed with the help of advanced software such as facial recognition. Any minor misdemeanor is recorded by the authorities, by businesses, and by fellow citizens, filed and stored, and a certain number of points are deducted from the citizen’s social score (ORF 2022). If the accumulated offenses reach a certain level, a citizen may no longer be allowed to travel, live in a certain area, use public services (education, health, housing), or embark on certain career paths. This resembles an Orwellian state, where opponents and members of ethnic minorities, for instance the Uighurs, are oppressed and silenced, captured, and brought to “re-education camps” to infuse them with the correct views on the world (Amnesty International 2021; The Economist 2018).

As is widely known, these dual-use appliances are developed and distributed by Chinese state-owned companies (SOCs), and China is eager to export its surveillance products to other markets, regardless of whether they are used to support suppression and autocratic regimes and rulers abroad. And in the midst of (or, hopefully, towards the end of) this strange and straining pandemic, nearly all states of the world took part in the Beijing Olympic Winter Games 2022, probably the best-monitored games in Olympic history.[2]


The ethics of Artificial Intelligence and machine learning: Checks and balances for the military-industrial complex

Etymologically speaking, “technics” or “technology” stems from the ancient Greek word τέχνη (téchne), meaning craft, skill, or cunning; in its martial usage, which the German rendering “Kriegslist” captures, it denotes a war ruse designed to kill as many opponents as possible. In this tradition, technology would be the science of warfare skills and competence. More recently, “cyberwar” has entered our vocabulary, and research on strategic human autonomy is building scenarios for tomorrow’s world order designed by computer systems (Gill 2019). At the same time, warfare robotics and drone technology can be used by terrorist groups [read: “the enemy”], according to recent reports by Foreign Affairs and the European Council on Foreign Relations (ECFR), and thereby increasingly contribute to “destabilize global politics” and pose direct threats to, among others, US troops themselves.[3]

Instrumentally analyzing technological progress in armament industries, Gill (2019) finds that “it is certain that combat systems will have much more autonomy and humans will be working much more closely with machines than they do today.”

He writes:

With the introduction of AI techniques such as machine learning, deep neural networks and reinforcement learning, there has been a shift from rule-based deterministic systems to more data-driven and outcome-oriented systems [… able …] to extract insights from training data in ways that [are] not always apparent to human programmers […] (Gill 2019).


Automated decision-making in the public and private sectors: public services, banking, insurance, transport, welfare, and influencing public opinion (Cambridge Analytica, etc.)

It is often argued that machines are more efficient and more just decision-makers, since their calculations are based purely on logical computation and are therefore totally unbiased. Now, I am not an information scientist or a programmer, but it would appear to me that a potential bias has merely been transferred from the individual decision-making act to the production of a certain algorithm that computes the factors and transforms them into a decision. So, if there is a bias in the algorithm, it applies to all decisions made by the machine instead of to a single case (Sfez 1992). We all experience the bias of the algorithms used by Google or Facebook, so we should be particularly careful in applying these technologies to highly important decision-making processes, such as social and welfare services or questions of life and death of the kind that war-machines would have to settle. To my mind, this sort of decisional outsourcing appears to be an extreme instance of desynchronization and alienation, potentiating an escalation with the most imponderable and severe consequences (Reckwitz and Rosa 2021, pp. 201 seq.).
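This transfer of bias from the single decision to the algorithm can be sketched in a toy example; the decision rule, threshold, and postcode list below are entirely hypothetical and purely illustrative:

```python
# Toy illustration: a single biased term in a scoring algorithm
# replicates across every decision the system makes, unlike a
# one-off biased human judgment.

DISADVANTAGED_POSTCODES = {"1100", "1160"}  # hypothetical proxy feature

def approve_loan(income: int, postcode: str) -> bool:
    """Hypothetical decision rule: the postcode penalty encodes a bias,
    since postcode often proxies for neighborhood, class, or ethnicity."""
    score = income / 1000
    if postcode in DISADVANTAGED_POSTCODES:
        score -= 20  # the bias, now baked into every single decision
    return score >= 30

# Two applicants with identical incomes receive different outcomes,
# and so would every further pair like them, at any scale.
applicants = [(45_000, "1010"), (45_000, "1100")]
decisions = [approve_loan(income, postcode) for income, postcode in applicants]
print(decisions)
```

Auditing such a system therefore means inspecting the rule itself rather than individual decisions, which is exactly the shift in oversight the paragraph above describes.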

While encouraging the development of AI in the EU in principle, the European Commission has set ethical guidelines for its use and application (European Commission 2020). To date, however, it remains unclear whether these guidelines are sufficient to safeguard the fundamental rights of citizens. The guidelines are: “1. Human agency and oversight; 2. Technical robustness and safety; 3. Privacy and data governance; 4. Transparency; 5. Diversity, non-discrimination, and fairness; 6. Societal and environmental wellbeing; 7. Accountability” (EC 2020). Altogether, it would seem that these ethics rules are rather general and abstract principles, and their implementation is left to the EU member states, which may interpret the Commission’s recommendations very differently.

Most recently, EU member states have been planning legislation against hate speech that would force social media platforms to moderate user content. The platform Telegram is particularly difficult to tame, as the company is based outside the EU (ORF Abendjournal 04.02.2022). Here, the question is how to weigh freedom of expression against potential threats from disinformation or criminal content.


“Digital elites”: What, who and where are they? What type of power and resources do they control? Whom do they influence? How do they exercise power? What is their agenda?

In the United States, populist plutocracy (Pierson 2017) is forged in the politico-financial-technological bubble, as the example of Peter Thiel and his substantial contribution to Donald Trump’s election campaign demonstrates. In December 2021, after his resignation from the office of Austrian Federal Chancellor because of his alleged implication in a case of serious corruption, the (populist) politician Sebastian Kurz joined Thiel’s team, apparently on the grounds of his political networks and close contacts to European leaders. This is how wealthy individuals in the fin-tech bubble systematically work on gaining influence and power at the political level. This, in turn, is the recipe for institutionalizing what, in line with (or in prolongation of) Pierson (2017), could be termed “plutocratic populist autocracy.”

Concomitantly, the directional manipulation of voters alarmed the international public in the aftermath of the US presidential election of 2016 and ahead of the German parliamentary election of 2021. The internet has become the favorite playground, or rather battlefield, for actors spreading all sorts of misinformation and conspiracy theories, hate speech, and radicalization by different kinds of extremists. Quite obviously, such developments pose a direct threat to the functioning of our democratic institutions and demand urgent attention.


Implications for the “real world”: personal safety, vulnerability, interoperability, and over-exposure; Towards a “private equity society”?

The disappearance and replacement of older cultural techniques by new ones is not a recent or novel development. Yet the speed and scale of transformation make digitalization one of the mega-trends (next to climate change and aging populations in the industrialized world) of our era, which earth scientists have termed the Anthropocene.

The digital divide is increasing inequality and the helplessness of the non-illuminati, i.e., those who lack the knowledge to critically assess the opportunities and risks of digitalization. There is an urgent need for progressive education (critical approaches and analogue knowledge) and applied techniques of governance (taxing, taming, limiting, reining in) vis-à-vis disruptive innovation and its externalities. My focus here is to point out some of the risks and dangers, to raise awareness of them, and to look for solutions or mitigation strategies. Governments are on the reactive side, so they are always (at least) one step behind the entrepreneurs of innovation, and they need to build structures and competences to act effectively, to protect citizens, and to guarantee the best possible outcome of this transformative process.

The change-management literature suggests that every intentional re-design of our work- and life-world will produce unintended consequences and provoke resistance. This is true, for example, of permanent accessibility via ICT in a professional context. As a result of tech-facilitated (self-)exploitation, burnout and depression seem to be the conditions of our time (as, apparently, are obesity and myopia[4]).

Furthermore, cyber-mobbing and cybercrime, such as identity theft, internet fraud, and digital money-laundering via cryptocurrencies, have become familiar phenomena in practically all countries, just as hacker attacks on, and blackmailing of, companies, hospitals, and public administrations are frequent occurrences. This puts the vulnerabilities of our societies and, especially, of our critical infrastructures in the spotlight, as power outages reportedly caused by hostile coding activities, and the recurring large-scale theft of bank details, health and medical records, and other important personal information, clearly demonstrate. Hence, government action to put in place countermeasures, defense systems, and safety standards to protect our life-sustaining systems (security, transport, energy, health, and administration) is of the essence.

Whether you look at the energy consumption of server farms, the subversion of labor standards for workers in online delivery, production, and distribution services, or at the lawless space harbored by the so-called “darknet,” there are many important items on the digital agenda of decision-makers at all levels of governance.


“The Sorcerer’s Apprentice” and the need for stewardship in the digital age: Governance challenges of the digital transformation

All is not lost, and there is no point in doomsday scenario-building. In writing this, I hope to reach out to educators and network with them to build the competences for the digital age, and to raise awareness of the need for stewardship of the transformation we are experiencing. Of course, we want to continue to make ample, even excessive, use of technology for information exchange, knowledge production and dissemination, as well as for social networking and nurturing our personal relationships. But lawmakers must address the problems related to behavioral surplus extraction, manipulation, the ethics of AI, data privacy, and intellectual property and copyright in this age of the permanent and ubiquitous online-ness of all types of appliances and devices.

On the governance side, the European Union has made inroads with its legislation on data protection and privacy rights. The EU Parliament, especially, has taken on the role of a policy entrepreneur in these issue-areas, as well as in the efforts to tax big tech firms adequately. However, national governments also need to live up to their responsibilities in this fast-moving and ever-expanding part of our reality. Balanced education is one of the key priorities, preparing pupils, students, professionals, and senior citizens to recognize and manage the opportunities, challenges, and risks related to the use of ICT. In any event, we need to be careful that individuals are not pushed out of the public space by technology into an obscure private sphere, where they are no longer part of the “polis,” our common space, the polity.



Thomas E. Henökl is an Associate Professor of Public Policy and a member of the Jean Monnet Centre of Excellence at the University of Agder (Norway) and a Senior Research Associate at the German Development Institute (Bonn). His research and teaching cover European politics, public administration, EU foreign and security policy, international cooperation and development, and, more widely, comparative politics and organization theory. Previously, Thomas Henökl worked for the European Commission’s External Relations DG (from 2011 the European External Action Service, with assignments including the EU Delegation in Tokyo) and at the European Institute of Public Administration (EIPA). His scholarly work has appeared in the Journal of European Public Policy, West European Politics, the Journal of European Integration, and the European Foreign Affairs Review. He holds a PhD in Political Science from the University of Agder (Norway), as well as three Master’s degrees in Political Science, European Public Policy, and Public Administration from the University of Innsbruck (Austria), the Institut d’Études Politiques (Sciences Po), Paris, and the Graduate School of Public Administration at the International Christian University, Tokyo.




Bourdieu, Pierre. 1979. La Distinction. Critique sociale du jugement. Paris: Les Éditions de Minuit.

Bourdieu, Pierre. 1980. Le Sens pratique. Paris: Les Éditions de Minuit, coll. « Le sens commun ».

Bourdieu, Pierre. 1982. “Der Sozialraum und seine Transformationen.” In Die feinen Unterschiede: Kritik der gesellschaftlichen Urteilskraft. Frankfurt am Main: Suhrkamp.

Bourdieu, Pierre. 1997. “Zur Genese der Begriffe Habitus und Feld.” In: Bourdieu: Der Tote packt den Lebenden. VSA-Verlag, Hamburg.

Damasio, Antonio. 2021. Feeling and Knowing. New York: Penguin Books.

Damasio, Antonio. 2017. The Strange Order of Things. New York: Penguin Books.

Dean, Mitchell. 2010. Governmentality. Power and Rule in Modern Society. London: Sage.

Dartnell, Lewis. 2019. Ursprünge [Origins]. Munich: Hanser (German translation).

European Commission. 2020. White Paper “Trustworthy Artificial Intelligence.” Brussels: DG Digitalization.

Foucault, Michel. 1975. Surveiller et punir. Naissance de la prison. Paris: Gallimard.

Foucault, Michel. 1979. Naissance de la biopolitique (1978–1979). Paris: EHESS/Gallimard/Le Seuil, coll. « Hautes études ».

Gill, Amandeep Singh. 2019. “Artificial intelligence and international security: the long view.” Ethics & International Affairs 33(2): 169–179.

Kuhner, T. K. 2015. “American Plutocracy.” King’s Law Journal 26(1): 44–75.

Larsson, Stefan, Ingram Bogusz, Claire, and Andersson Schwarz, Johan. Human-centred AI in the EU: Trustworthiness as a strategic priority in the European Member States. Brussels: ELF/Fores.

ORF (Austrian Public Broadcaster). 2022. Available online at: OE1.ORF.AT.

Pierson, P. 2017. “American hybrid: Donald Trump and the strange merger of populism and plutocracy.” The British Journal of Sociology 68: S105–S119.

Piketty, Thomas. 2013. Le Capital au XXIe siècle. Paris: Seuil.

Polanyi, Karl. 1957. The Great Transformation. The political and economic origins of our time. Boston: Beacon Press.

Reckwitz, Andreas and Rosa, Hartmut. 2021. Spätmoderne in der Krise: Was leistet die Gesellschaftstheorie? Frankfurt am Main: Suhrkamp.

Rosa, Hartmut. 2016. Resonanz: Eine Soziologie der Weltbeziehung. Frankfurt am Main: Suhrkamp.

Roth, Gerhard. 2021. Über den Menschen. Frankfurt am Main: Suhrkamp.

Sfez, Lucien. 1992. Critique de la Décision. Paris: Presses de Sciences Po.

Streeck, Wolfgang. 2021. Zwischen Globalismus und Demokratie: Politische Ökonomie im ausgehenden Neoliberalismus. Frankfurt am Main: Suhrkamp.

Simmel, Georg. 1900. Die Philosophie des Geldes. Berlin: De Gruyter.

Sloterdijk, Peter. 2021. Der Staat streift die Samthandschuhe ab. Frankfurt am Main: Suhrkamp.

The Economist. 2018. “Modern love: Dating in the digital age.” August 18th–24th.

Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books.

[1] Inequality kills | Oxfam International


[3] “Drones Are Destabilizing Global Politics,” Foreign Affairs; “The Drone Threat Comes Home,” Foreign Affairs; “Turkey’s drone diplomacy: Lessons for Europe,” European Council on Foreign Relations.

[4] “Why Nearsightedness Is on the Rise in Children,” The New York Times; “What is driving global obesity trends? Globalization or ‘modernization’?,” Globalization and Health.

