Are we facing a digital dystopia? Nobel Laureate Jean Tirole sounds a warning

Senior Lecturer in Economics Dr Simona Fabrizi reports.

Nobel Laureate Professor Jean Tirole

How transparent should our lives be to others? Modern societies are struggling with this issue as connected devices, social networks, ratings, artificial intelligence, facial recognition, cheap computing power and various other innovations make it increasingly easy to collect, store and analyse personal data.

While digital technology holds the promise of a more civilised society, many fret over the prospect of mass surveillance by powerful players collecting bulk data shrouded in secrecy.

In the recent University of Auckland Business School Dean’s Distinguished Virtual Public Lecture, Nobel Laureate Professor Jean Tirole from the Toulouse School of Economics (TSE) used this dystopian scenario to emphasise the excesses that may result from the unfettered use of data integration in the digital era.

Professor Tirole comprehensively dissected the behaviour and incentives of individuals, governments, and corporations in the face of increasingly pervasive mass surveillance in modern societies.
He began with the existing debate surrounding privacy and transparency. Whether one should be concerned about the availability of information depends on how that information might be used, by whom, and to what end. On the one hand, transparency may increase accountability, facilitate trade, help eliminate unlawful behaviour, and assist public adherence to norms considered essential for a well-functioning society. On the other hand, excessive transparency could unravel medical insurance markets, for instance through the mandatory disclosure of DNA test results. Transparency can also be the enemy in online transactions: because the internet never forgets, no one gets a second chance to restore a tainted reputation.

How transparent should our lives be to others, when private information can be appropriated by governments to construct ‘social scoring’ mechanisms?

Social scoring could be aimed at promoting certain ‘desired’ behaviours, thereby potentially limiting people’s freedom, including freedom of expression. Referencing the Chinese social scoring system, Professor Tirole offered credible scenarios in which individuals who are concerned about their image and their identity could alter their behaviour to comply with what is expected by others.

This applies to both close contacts (strong ties), such as friends or work partners, and more distant or transient contacts (weak ties). By adopting a social scoring system, an authoritarian regime seeks to control individuals’ actions as well as their belief systems. So when should we start questioning the benefits we have so readily taken for granted as we adopt ever more sophisticated gadgets, sign up to a variety of social platforms, or simply interact with others who do? Professor Tirole’s answer: when technology allows governments to effortlessly trace, record and exploit a person’s every move, decision, declaration, or association with others. Such technology would also make it possible to map every individual’s social network and to exploit that knowledge to ‘keep tabs’ on both close contacts and weak ties.

The dystopian scenarios that could emerge would be far scarier than anything people experienced under the scrutiny of the Stasi secret police in the former German Democratic Republic.

Professor Tirole’s lecture sounded a warning to remain vigilant and not to be mindlessly seduced by the distractions and delights of the digital revolution. Many of us are still reluctant to see the dark side of this shiny new technology: a side in which we are constantly scrutinised and rated, with consequences for our worth, our access to markets, our standing in society, and our willingness to give up freedom of speech and action in order to avoid persecution or ostracism.

After the lecture, there was a lively discussion amongst the distinguished panellists on how to avert a digital dystopia. Could we expunge our digital records, thereby restoring the option of a ‘second chance’? Should we retain property rights over our private information and be responsible for ‘selling’ and ‘using’ it ourselves? Legal regulations may be introduced, but they will need to strike a balance between letting markets operate without interference and safeguarding individuals’ rights. The COVID-19 tracing app is a pertinent example: we want to unite in the fight against COVID-19, but can we be sure the information collected won’t be kept and used for other purposes?

In the end, the panel concluded that education about these previously unrecognised dangers prepares and empowers us to withstand a digital dystopia. It is up to every one of us to play our part in making sure this science fiction-like scenario never becomes our reality.