27 September 2023

Pandemic’s box: The coming changes that should have come anyway


Zac Rogers* says the coronavirus pandemic has sped up changes that were already happening across society, from remote learning and work to e-health, supply chains and logistics, policing, welfare and beyond.


Big tech companies have not hesitated to make the most of the crisis.

In New York, for example, former Google Chief Executive Eric Schmidt is leading a panel tasked with transforming the city after the pandemic, “focused on telehealth, remote learning, and broadband”.

Microsoft founder Bill Gates has also been called in to help create “a smarter education system”.

The Government, health, education and defence sectors have long been prime targets for “digital disruption”.

The American business expert Scott Galloway and others have argued they are irresistible pools of demand for the big tech firms.

As author and activist Naomi Klein writes, changes in these and other areas of our lives are about to see “a warp-speed acceleration”.

All these transformations will follow a similar model: using automated platforms to gather and analyse data via online surveillance, then using it to predict and intervene in human behaviour.

The control revolution

The changes now under way are the latest phase of a socio-technical transformation that sociologist James Beniger, writing in the 1980s, called a “control revolution”.

This revolution began with the use of electronic systems for information gathering and communication to facilitate mass production and distribution of goods in the 19th century.

After World War II the revolution accelerated as Governments and industry began to embrace cybernetics, the scientific study of control and communication.

Even before COVID-19, we were already in the “reflexive phase” of the control revolution, in which big data and predictive technologies have been turned to the goal of automating human behaviour.

The next phase is what we might call the “uberisation of everything”: replacing existing institutions and processes of Government with computational code, in the same way Uber replaced Government-regulated taxi systems with a smartphone app.

Information economics

Beginning in the 1940s, the work of information theory pioneer Claude Shannon had a deep effect on economists, who saw analogies between signals in electrical circuits and many systems in society.

Chief among these new information economists was Leonid Hurwicz, winner of a 2007 Nobel Prize for his work on “mechanism design theory”.

(Image: Claude Shannon’s early experiments in artificial intelligence included a maze-solving mechanical mouse. Bell Labs)

Economists have pursued analogies between human and mechanical systems ever since, in part because they lend themselves to modelling, calculation and prediction.

These analogies helped usher in a new economic orthodoxy formed around the ideas of F.A. Hayek, who believed the problem of allocating resources in society was best understood in terms of information processing.

By the 1960s, Hayek had come to view thinking individuals as almost superfluous to the operation of the economy.

A better way to allocate resources was to leave decisions to “the market”, which he saw as an omniscient information processor.

Putting information-processing first turned economics on its head.

The economic historians Philip Mirowski and Edward Nik-Khah argue economists moved from “ensuring markets give people what they want” to insisting they can make markets produce “any desired outcome regardless of what people want”.

By the 1990s this orthodoxy was triumphant across much of the world.

By the late 2000s it was so deeply enmeshed that even the global financial crisis – a market failure of catastrophic proportions – could not dislodge it.

Market society

This orthodoxy holds that if markets are the most efficient processors of information for allocating resources, it makes sense to put them in charge.

We’ve seen many kinds of decisions turned over to automated data-driven markets, designed as auctions.

Online advertising illustrates how this works.

First, the data generated by each visitor to a page is gathered, analysed and categorised, with each category assigned a predictive probability of a given behaviour, such as buying a particular product or service.

Then an automated auction occurs at speed as a web page is loading, matching these behavioural probabilities with clients’ products and services.

The goal is to “nudge” the user’s behaviour.

As Douglas Rushkoff explains, someone in a category that is 80 per cent likely to do a certain thing might be manipulated up to 85 per cent or 90 per cent if they are shown the right ad.
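
To make those mechanics concrete, the sketch below is a toy model in Python of the kind of automated auction described above. The advertiser names, bids and probabilities are invented for illustration; real ad exchanges are vastly more elaborate, but the basic matching of predicted behaviour to paying clients follows this shape.

```python
# A toy model of the real-time ad auction described above.
# All names, bids and probabilities are invented for illustration;
# real ad exchanges use far richer data and pricing rules.

# The visitor has been categorised, and each category carries a
# predicted probability of a behaviour (e.g. buying a product).
visitor_profile = {
    "running_shoes": 0.80,   # 80 per cent likely to buy running shoes
    "meal_kits": 0.35,
    "car_insurance": 0.10,
}

# Each advertiser bids an amount it will pay per expected conversion
# in the category it cares about.
advertiser_bids = [
    {"advertiser": "ShoeCo",  "category": "running_shoes", "bid": 2.50},
    {"advertiser": "FitFeet", "category": "running_shoes", "bid": 2.10},
    {"advertiser": "MealBox", "category": "meal_kits",     "bid": 4.00},
]

def run_auction(profile, bids):
    """Score each bid by expected value (bid x predicted probability)
    and award the ad slot to the highest scorer, charging the
    second-highest score (a simplified second-price rule)."""
    scored = []
    for b in bids:
        probability = profile.get(b["category"], 0.0)
        scored.append((b["bid"] * probability, b))
    scored.sort(key=lambda pair: pair[0], reverse=True)

    if not scored or scored[0][0] == 0.0:
        return None  # no advertiser values this visitor

    winner = scored[0][1]
    price = scored[1][0] if len(scored) > 1 else scored[0][0]
    return {"winner": winner["advertiser"],
            "category": winner["category"],
            "price_paid": round(price, 2)}

if __name__ == "__main__":
    # All of this happens in the milliseconds it takes a page to load.
    print(run_auction(visitor_profile, advertiser_bids))
```

In this hypothetical case the visitor’s 80 per cent probability of buying running shoes makes ShoeCo’s bid the most valuable, so its ad is the one the visitor sees: the prediction, not the person, decides what appears on the page.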

This model is being scaled up to treat society as a whole as a vast signalling device.

All human behaviour can be taken as a bid in an invisible auction that aims to optimise resource allocation.

To gather the bids, however, the market needs ever greater awareness of human behaviour.

That means total surveillance is here to stay, and will get more intense and pervasive.

Growing surveillance, combined with algorithmic interventions in human behaviour, constrains our choices to an ever greater extent.

Being nudged from an 80 per cent to an 85 per cent chance of doing something might seem innocuous, but that diminishing 20 per cent of unpredictability is the site of human creativity, learning, discovery and choice.

Becoming more predictable also means becoming more fragile.

Videoconferencing has boomed in schools and workplaces, with software like Zoom and Microsoft Teams reporting enormous increases in usage.

In praise of obscurity

The pandemic has pushed many of us into doing even more by digital means, hitting fast-forward on the growth of surveillance and algorithmic influence, bringing more and more human behaviour into the realm of statistical probability and manipulation.

Concerns about total surveillance are often couched as discussions of privacy, but now is the time to think about the importance of obscurity.

Obscurity moves beyond questions of privacy and anonymity to the issue, as Matthew Crawford identifies, of our “qualitative experience of institutional authority”.

Obscurity is a buffer zone – a space to be an unobserved, uncategorised, unoptimised human – from which a citizen can exercise her democratic rights.

The onrush of digitisation caused by the pandemic may have a positive effect, if the body politic senses the urgency of coming to terms with the widening gap between fast-moving technology and its institutions.

The algorithmic market, left to its optimisation function, may well eventually come to see obscurity as an act of economic terrorism.

Such an approach cannot form the basis of institutional authority in a democracy.

It’s time to address the real implications of digital technology.

*Zac Rogers is Research Lead at the Jeff Bleich Centre for the US Alliance in Digital Technology, Security, and Governance at Flinders University of South Australia.

This article first appeared at theconversation.com
