Can individuals control the monetisation of their data?

Tech giants rule the world, making billions of dollars from our personal data. Could the ordinary Googling individual get a share of that?

Fernando Beltran researches how individuals might regain control of the value of their data.

Tech giants make billions of dollars as people hand over their personal information in exchange for free apps, search engines and social media platforms.

The raw material of our digital lives – location data, online searches, purchases, “likes” – is transformed into behavioural data that companies use to predict and guide our choices, from the shopping mall to the voting booth.

Creepily targeted ads aside, it’s a staggering asymmetry of power and knowledge. Now, the University of Auckland’s Fernando Beltran and Gehan Gunasekara are looking at the potential for reversing that power imbalance. What if people could monetise their personal data? How would that work? Would it even be a good idea? And can new privacy-enhancing technologies play a role in giving people confidence that what they sell will – really – stay private?

The project was triggered by Beltran’s nagging sense, after watching the development of the internet over the past three decades, that it really doesn’t have to be this way. Consumers constantly hand over their personal information in exchange for “free” apps without reading any of the fine print that explains where their data is going and how it’s being used. (Think of the Houseparty app that became New Zealand’s most popular during a Covid-19 lockdown – and its automatic collection of users’ data.)

“Monetisation of personal information on the internet is happening on such a big scale, why can’t the producers of that information benefit somehow?” asks Beltran, who’s an Associate Professor of Information Systems in the School of Business. “As digital consumers, we’re still quite ignorant.”

Beltran and Gunasekara, who’s an Associate Professor of Commercial Law in the Business School, along with PhD student Mengxiao Zhang, will this year bring students into the laboratory in Auckland to get a sense of just what people think their personal data is worth.
 

Gehan Gunasekara is exploring how 'differential privacy' could work in Aotearoa.

The imbalance of power was famously laid out in The Age of Surveillance Capitalism, the 2018 book in which Harvard Business School professor emerita Shoshana Zuboff depicted tech giants such as Facebook and Google running amok, powered by personal data.

There’s the precisely targeted advertising, but also a lot more. In one experiment, Facebook tinkered with people’s moods by altering their news feeds.

In the Auckland project, students will be paid to take part in experiments in the Business School’s DECIDE Lab, a facility for research on decision-making, to discover what monetary values they put on their personal information. First, they will be asked to rank different types of information – financial, health, religious and political – by value. Then, they’ll be asked to name their price for surrendering specific data.

An example might be, say, a health condition. If a deal is struck, they’ll be paid in real money. (The experiment will be structured to prevent people simply naming a sky-high price and grabbing the cash, drawing on Beltran’s past experience in designing auctions.)
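One standard way to stop participants simply naming a sky-high price is the Becker-DeGroot-Marschak (BDM) procedure, in which an inflated ask only costs the participant potential sales. The sketch below shows a single BDM round; it is illustrative only and is not necessarily the design the Auckland team will use.

```python
import random

def bdm_round(ask_price: float, max_price: float = 10.0) -> tuple[bool, float]:
    """One Becker-DeGroot-Marschak round for selling a piece of personal data.

    A buyer's price is drawn at random. The sale only goes ahead if that price
    is at least the participant's ask, and the participant is then paid the
    drawn price, not the ask. Overstating the ask therefore only forfeits
    profitable deals, so the best strategy is to name one's true minimum.
    """
    buyer_price = random.uniform(0.0, max_price)
    if buyer_price >= ask_price:
        return True, buyer_price   # deal struck: data surrendered, cash paid
    return False, 0.0              # no deal: data stays private, no payment

# Hypothetical example: a participant asks $3.50 to disclose a health condition.
sold, payment = bdm_round(ask_price=3.50)
print("sold" if sold else "not sold", f"payment: ${payment:.2f}")
```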

Establishing how much money a person wants for their data raises an obvious question: would anyone in the real world pay that much? Beltran says it is still early days in our understanding of these issues.

The DECIDE Lab: Better business decisions

In the DECIDE Lab in the School of Business, researchers run experiments on decision-making. The researchers and their questions come from a broad set of disciplines, from economics and psychology to management. At its core is a focus on the rapidly growing area of behavioural economics. The Lab is equipped with 32 workstations wired to a local area network, with removable privacy partitions on three sides of each machine, a glass-partitioned control room at the back of the laboratory, projection equipment, a large screen and sliding whiteboards. Researchers also have access to a “mobile lab” consisting of 30 Samsung Galaxy tablets.

There have been some entertaining explorations before. “I’ve data mined myself. I’ve violated my own privacy. Now I am selling it all,” New York-based Federico Zannier said of his 2013 art project, A Bit(e) of Me. Zannier raised US$2,733 on Kickstarter by selling bundles of records of his online activity – from his history of websites visited to GPS locations to a recording of all his mouse cursor movements – with the smallest package consisting of a single day’s digital footprint.

“I’m selling this data for $2 a day,” Zannier said on his website. “If more people do the same, I’m thinking marketers could just pay us directly for our data. It might sound crazy, but so is giving away all our data for free.”

However, an average person might be pushing it to match Zannier – at least if they’re being paid the rates the Financial Times reported to be typical of data brokers. General information, such as age, gender or location, could be worth a mere US$0.0005 per individual, the FT said, offering a calculator for working out what your personal data may be worth (https://ig.ft.com/how-much-is-your-personal-data-worth/).

“Some of the most personal and secretive troves of data rank as the most expensive,” it adds. “For US$0.26 per person, buyers can access people with specific health conditions or taking certain prescriptions.” Some people find their data is worth US$1 or so. The FT tool was devised in 2013 and updated in 2017.
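Taken at face value, the broker rates above make the arithmetic easy to see. Below is a toy version of such a calculator using only the two figures quoted in this story; the FT’s actual tool weighs many more attributes and is not reproduced here.

```python
# Illustrative per-person broker rates, taken from the figures quoted above;
# the FT's real calculator uses many more attributes and weightings.
RATES = {
    "general_demographics": 0.0005,  # age, gender, location
    "health_condition": 0.26,        # a specific condition or prescription
}

def data_worth(attributes: list[str]) -> float:
    """Sum the per-person rates for each attribute a profile exposes."""
    return sum(RATES[attr] for attr in attributes)

# A profile exposing basic demographics plus a health condition.
print(f"US${data_worth(['general_demographics', 'health_condition']):.4f}")  # US$0.2605
```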

Beltran doesn’t see the minuscule amounts of money cited by the FT as indicating that data monetisation isn’t worth pursuing. The explorations by Gunasekara and Beltran tie in with developments that suggest a swelling interest in the topic. In the United States, lawmakers have proposed legislation requiring tech giants to tell customers how much their data is worth. “You don’t own your data and you should,” tweeted congresswoman Alexandria Ocasio-Cortez in February this year, an idea supported by presidential candidate Andrew Yang.

Differential privacy

A second part of the academics’ research focuses on a technique for ensuring that individuals’ identities are kept secret when sets of anonymised data are moved around. Called “differential privacy”, the method was co-devised by Harvard computer scientist Cynthia Dwork and is being used by organisations such as Apple, Google and the US Census Bureau.

Differential privacy aims to block the privacy villains who can reverse-engineer “anonymous” data sets to discover the information of identifiable individuals. Simply, it involves injecting “noise” into a data set so that it remains useful at the general level but – hopefully – impregnable at the individual level.
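In practical terms, one standard way to inject that noise is the Laplace mechanism, which calibrates the randomness to how much any single person can change the answer. A minimal sketch for a simple counting query follows; the epsilon value is arbitrary and chosen only for illustration, and real deployments such as the Census Bureau’s are far more elaborate.

```python
import numpy as np

def noisy_count(true_count: int, epsilon: float = 0.5) -> float:
    """Release a count under the Laplace mechanism.

    A counting query changes by at most 1 when any single person is added to or
    removed from the data (sensitivity 1), so adding Laplace noise with scale
    1/epsilon gives epsilon-differential privacy: the published figure looks
    almost the same whether or not any one individual is in the data set.
    """
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Illustrative use: the true number of people in a small group who share some
# attribute is 1. The released value is still useful in aggregate, but it no
# longer reveals whether that one person is really there.
print(round(noisy_count(1), 1))
```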

Take the US Census Bureau, which collects personal data on everyone living in the country once a decade. Its smallest grouping of individuals, the “census block”, can get down to as few as 20 people. Imagine if only one of them is Filipino-American – the potential for working out that person’s identity is obvious.

The bureau already uses a protective measure called “data swapping” to make that harder, but this year will take the extra step of using differential privacy. The bureau says that is, in part, to counter increases in computer power that have made it easier for data analysts to cross-reference census data sets with each other or with outside data sources.

To Beltran, the method may help facilitate the monetisation of data by making consumers confident that their privacy will be preserved. Gunasekara is looking at how this type of privacy-protecting technology would fit into New Zealand’s privacy laws.

At the same time, arguments rage over whether helping individuals to sell their data will really prove empowering. Wouldn’t monetisation just play into the hands of the rich tech giants, when tighter privacy rules are what’s needed to keep them under control?

“Putting a monetary value on it won’t necessarily fix the mischief,” says Gunasekara. “The tech companies may pay peanuts – and still use the information to make vast profits.”

Certainly, the giant market valuation of a tech firm that didn’t even exist 20 years ago reinforces the data-is-the-new-oil cliché. Founded in 2004, Facebook had a market value of US$689 billion in July 2020. The company’s average revenue per user was US$29 in 2019.

“We rushed to the internet expecting empowerment, the democratization of knowledge, and help with real problems, but surveillance capitalism really was just too lucrative to resist,” according to Harvard professor and author Zuboff. She sees the trend only exacerbated by the spread of smart devices and personalized services, all collecting data begging to be analysed and exploited for commercial gain.

Beltran can imagine a different future, in which an app on his phone negotiates payment with the companies that want to tap his data – and a huge power imbalance has been rectified.

Story by Paul Panckhurst
Researcher portraits by Billy Wong

The Challenge is a continuing series from the University of Auckland about how our researchers are helping to tackle some of the world's biggest challenges.

To republish this article please contact: gilbert.wong@auckland.ac.nz