Andrew Chen: bringing a digital lens to research

From 2020, Dr Andrew Chen was called on by many to explain the Covid-19 app. It helped that he had a genuine interest in its evolution.

Andrew Chen brings a ‘digital lens’ to research across various disciplines. Photo: Billy Wong

Dr Andrew Chen is an expert in AI, machine learning, video analytics and technology ethics.

An excellent communicator of complicated technology concepts, he is bemused that the only time he’s made it onto one of the world’s biggest broadcasters, CNN, was to discuss something of global importance – Wordle.

“It was the weirdest, most surreal experience of the past year. I wasn’t talking about my academic work or my expertise …

“I was talking about the person who came up with the Wordle emojis, its familiar coloured squares. CNN found a BuzzFeed article with me and two others talking about emojis and contacted me. I thought I might as well, even though I hadn’t officially had media training.

“There’s a five-minute segment that aired at around 4am in the US, so not many will have seen it. After having put so much time and effort into Covid-19 communications, turns out the thing I’m on CNN for is Wordle.”

He’s not grumpy about it at all – it’s hard to imagine Andrew being grumpy.

And that media training he mentions? There’s a funny story about that too. When he was finishing his PhD in 2019, he was offered a part-time role as a research fellow at Koi Tū: The Centre for Informed Futures, directed by Professor Sir Peter Gluckman, to research digital technology and its impacts on society, including ethics and privacy.

But his partner, now wife, had just been offered a job in Wellington so he asked if he could do the research from there. That was fine and, as it turns out, fortuitous in many ways.

“We didn’t know that Covid was going to happen, and, of course, it didn’t matter where I was once that happened and we were working from home.”

Ultimately, it was useful for Andrew to be in the capital while the pandemic was proceeding at full throttle.

“It was surprising for a few of the policymakers when I’d say, ‘I’m actually in Wellington, so I can come to the office and have a chat.’ In terms of building relationships, it’s easier to do that.”

Before the first 2020 lockdown, Andrew had been booked to do the Science Media Centre’s media training course in May. Instead, he learned to swim by being thrown in the deep end as the local go-to expert on technologies developed in response to Covid-19. He appeared on radio, TV, online and in print media, as well as penning opinion pieces.

People were saying ‘just do it’ ... Yes, people were building their own systems overnight, but they weren’t good. They didn’t think about the things a government needs to think about. 

Dr Andrew Chen, Koi Tū: The Centre for Informed Futures, Waipapa Taumata Rau, University of Auckland

His PhD was titled “The Computers Have a Thousand Eyes: Towards a Practical and Ethical Video Analytics System for Person Tracking” and he was able to adapt some of its concepts to the questions being asked early on about QR codes, the Covid-19 app and Bluetooth tracing. It all seems like a blur now, but at its peak during the pandemic, there were almost four million QR code scans in a day, from almost 1.5 million phones, and around 2.4 million devices exchanging Bluetooth handshakes around the country.

Before that, when the app was just being mooted by the government, Andrew was pushing back against technologists who claimed building a good app was easy.

“People were saying ‘just do it’, like it should be done overnight. And, yes, people were building their own systems overnight, but they weren’t good systems. They didn’t think about all the things a government needs to think about. I was trying to paint that picture.

“Then the government released its app in May 2020. Before that, they didn’t know anything about me. But one of my friends found the app on a server and said it was available, the day before the government intended to release it.

“Then somebody else started tweeting about it and I thought, ‘Okay, well if they’ve already started talking about it, I should talk about it too.’ Previously I didn’t want to spoil their plans. So I told everybody ‘hey, the app is here if you want to try it’ and gave the link and apparently it crashed the system because they weren’t ready for it. So that’s how the government found out about me!”

The Ministry of Health asked to meet him.

“I explained to them that I was just a researcher. They weren’t angry. They were just kind of pleased someone was actually interested.”

There were already many science communicators talking about the disease and modelling, but not so much about the app and digital contact tracing.

“I was interested in the ethics of it as well and did some study on that.”

Andrew points out he was never paid by the government to comment on any aspect of the Covid-19 app.

“I wanted the freedom to be objective. And because I was in Wellington I could meet with officials and get an idea of what they were trying to do. It was always going to take them some time to roll out all of the features. But I got the chance to understand what was going to happen before they announced it, so that when the media asked me for an opinion, I was knowledgeable.

“Of course, there were times when the app could have been better. But that wasn’t down to the developers as much as getting the decision-makers and ministers to understand it. If I noticed something that wasn’t good, that’s where I could elevate issues a bit more.”

One example came in August 2021, when Covid reappeared in the community but public health officials didn’t seem to be using the Bluetooth information coming in from around a million people, even after more than a week.

“I said ‘all these people are contributing data to the system, something should be appearing by now.’ I was asking questions and wrote an article on The Spinoff and the NZ Herald pointing out there must be a problem. In the second week the health officials said, ‘ah, yes, there’s something wrong’ and had to fix it.”

This was important because the original theory around contact tracing was that if the system could get people to isolate faster, it would slow the spread of the disease.
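The logic is easy to see in a toy model: the sooner a case isolates, the less of their infectious period they spend transmitting. The figures below – an R0 of 3, a ten-day infectious window, uniform infectiousness – are purely illustrative assumptions, not numbers from the Covid-19 response.

# Toy model in Python (illustrative numbers only): a case with R0 = 3
# transmitting evenly over a 10-day infectious period stops transmitting
# once isolated, so earlier isolation shrinks the effective R.
R0 = 3.0
INFECTIOUS_DAYS = 10

def effective_r(isolation_day: int) -> float:
    """Scale R0 by the fraction of the infectious period spent unisolated."""
    return R0 * min(isolation_day, INFECTIOUS_DAYS) / INFECTIOUS_DAYS

for day in (3, 6, 10):
    print(f"isolated on day {day}: effective R = {effective_r(day):.1f}")
# Day 3 gives R = 0.9 (the outbreak shrinks); day 6 gives 1.8; day 10, 3.0.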

We’re looking at AI systems and how people contribute to these systems without even knowing it.

Dr Andrew Chen, research fellow, Koi Tū: The Centre for Informed Futures, Waipapa Taumata Rau, University of Auckland

Andrew now has a couple of government contracts that largely arose from the way in which he communicated during the pandemic. He is also working for Matū in Wellington, a venture capital group focused on science and technology commercialisation.

“My Matū job involves looking at interesting projects, trying to figure out which might have commercial potential and then picking the ones we’re going to invest in and getting them through that process.”

Andrew has always been interested in figuring out how things work. He had a passion for robotics at high school on the North Shore, and when he started at the University he did computer systems engineering, thinking perhaps that he could merge those two interests at some point.

“But in my final year of undergraduate studies in 2014, I did a course on the fundamentals of AI. I realised there was so much that was about to happen with AI, I’d like to learn more about it. I ended up doing a PhD and it went from there.”

Right now, the University segment of his working week involves being part of a research project funded by the University of Auckland’s Transdisciplinary Ideation Fund.

“It’s the Hidden Humans Project. We’re looking at AI systems and how people are contributing to these systems without even knowing it.”

There are examples everywhere online. One that might surprise people is Captcha.

“If you want to buy tickets, for example, a Captcha will pop up with images or a word in funny type, or you can click on all the examples of traffic lights. These systems are advertised to you as being used to detect whether you are a bot or a human.”

But, in fact, that’s only one of Captcha’s functions.

“If we take the ‘word Captcha’ example, generally it will show you two words. For one word, the people at the back-end know the correct answer. The other word, they actually don’t know what the word is. When you type in the first one correctly, they assume you will get the second one correct.

"Then they use the data from the second entry to learn how to read printed text. They are using you to train optical character recognition (OCR) models. This has then ended up being used to digitise all the books in the Library of Congress, which is a laudable goal, but they’ve been harvesting brain power from millions of people, for free, to help them learn how to digitise the books.

“Likewise, when you click on the images of traffic lights, that becomes data used to train autonomous vehicles, because they’re trying to help teach the car that ‘this is what a traffic light looks like’. They just don’t tell you this.”
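In code, the dual-purpose mechanism Andrew describes boils down to something like the sketch below. The names, the data structure and the consensus rule are all hypothetical; real systems such as reCAPTCHA are considerably more elaborate.

# Sketch of the two-word Captcha idea (hypothetical names and thresholds).
# One word is a control with a known answer; the other is unlabelled text
# the operator wants transcribed. Passing the control makes the second
# answer trustworthy enough to keep as OCR training data.
harvested: dict[str, list[str]] = {}  # unknown word id -> transcriptions

def check_captcha(control_answer: str, control_truth: str,
                  unknown_id: str, unknown_answer: str) -> bool:
    """Verify the user on the known word and harvest the unknown one."""
    if control_answer.strip().lower() != control_truth.lower():
        return False  # failed the known word: treat as a bot, or retry
    # The user has passed as human, so record their reading of the
    # unknown word as a candidate training label.
    harvested.setdefault(unknown_id, []).append(unknown_answer.strip().lower())
    return True

def consensus_label(unknown_id: str, min_votes: int = 3) -> str | None:
    """Once enough verified users agree, promote the majority answer."""
    votes = harvested.get(unknown_id, [])
    if len(votes) < min_votes:
        return None
    best = max(set(votes), key=votes.count)
    return best if votes.count(best) > len(votes) / 2 else None

The traffic-light version has the same shape: the clicks that pass the control tiles double as labelled training data for image models.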

It all sounds vaguely conspiratorial, but Andrew says this is no secret.

“Lots of people know about it. It’s just that it’s not in the public consciousness. They don’t reveal the true reason the system exists. They don’t ask for user consent. In fact, the user is presented a barrier because you have to do this in order to buy your concert ticket and you’re not compensated for it. At an individual level, you wouldn’t think of that as being work you’d be paid for. But when we take millions of people doing it, and contributing data, there’s value ascribed to that.”

But does anyone care?

“Well, this is what we’re testing. I’m open to the idea that maybe nobody cares about this and that maybe that’s why everyone’s fine with it. But I think there are some arguments we haven’t tested.

“For example, it’d be difficult to make the New Zealand government regulate the companies doing this. But if there are four million adults all doing these Captchas, maybe 50 times a year, each time it uses x amount of time and there’s value associated with that. When you aggregate it, New Zealand is actually exporting millions of dollars’ worth of time and labour and getting nothing back for it. Would the government care about it then?”
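To make the aggregation concrete, here is a back-of-envelope version of that sum; the seconds per Captcha and the dollar value of an hour are invented placeholders standing in for the ‘x’ in Andrew’s example.

# Back-of-envelope aggregation; every input here is an assumption.
adults = 4_000_000         # NZ adults, from the example above
captchas_per_year = 50     # from the example above
seconds_each = 10          # assumed time per Captcha
value_per_hour_nzd = 25.0  # assumed value of an hour of attention

hours = adults * captchas_per_year * seconds_each / 3600
print(f"{hours:,.0f} hours per year")                    # ~555,556 hours
print(f"NZ${hours * value_per_hour_nzd:,.0f} per year")  # ~NZ$13.9 million

Even with conservative placeholders, the total lands in the millions of dollars a year – the scale Andrew is pointing at.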

Food for thought at least.

If the public keeps saying, ‘We do actually care about our data. We want our data to be used well, not sold,’ that will urge governments to keep people informed.

Dr Andrew Chen, research fellow, Koi Tū: The Centre for Informed Futures, Waipapa Taumata Rau, University of Auckland

Those doing the thinking alongside Andrew on that project come from all over the University: Dr Brent Burmester (Business), Dr Ellie Bahmanteymouri (School of Architecture and Planning), Dr Fabio Morreale (Music Technology), Matt Bartlett (Faculty of Law) and Dr Katerina Taskova (Computer Science).

“We’re a group of people who otherwise wouldn’t have been brought together, but we all shared this interest in AI and ethics. We’re about three-quarters of the way through the one-year project. We’ve run a couple of workshops recently with stakeholders to test some of these arguments and see if they resonate.”

In 2021 Andrew worked with police to help them better understand the software they were using for facial recognition.

“We did an audit and it helped them understand how they were using it because they weren’t sure at the time.”

He is now on the expert panel for emerging technologies for police, the Data Science Review Board for MBIE and the public advisory committee for disarmament and arms control for MFAT, with nonproliferation and arms control being another of his interests.

In the little spare time he has, he’s contributing to other Koi Tū projects and submissions to “bring a digital lens” to work being done there.

Are there enough people like Andrew in New Zealand?

“Oh, there’s a community. I see the same people at events. But I remember during the vaccine pass rollout saying to the Ministry of Health folks, ‘Are you talking to any other academics about the vaccine pass rollout ... so I don’t have to do all of it’. And they said ‘you’re the only one we’re talking to’. So I guess you’re right, the roster isn’t very deep.”

As for deepening that pool in future, Andrew thinks students would benefit from doing a programme like Global Studies.

“It’s a great course to get a transdisciplinary view of the world. We have technical degrees like engineering or science, but there are other opportunities to engage in things like public policy and to ask, ‘How does this algorithm I developed affect society? What are the ethics of it?’”

He edited Shouting Zeros and Ones: Digital Technology, Ethics and Policy in New Zealand (BWB Texts), a book released in August 2020. Its aim was to help inform people so they can put more pressure on the government to do better in the area of data.

“If the public keep saying, ‘We do actually care about our data. We want our data to be used well, not sold,’ then that will urge governments to keep people informed and be transparent about what they allow to happen with data.”

It all sounds a bit overwhelming to the average human, so I return to a very important subject, Wordle. Andrew was an early adopter. So did he get the word ‘parer’ that confuddled so many recently? Of course he did. In three.

“I didn’t realise it was a difficult one until afterwards when the New York Times said that 40 or 50 percent of people failed that day.”

By Denise Montgomery

This article first appeared in UniNews November 2022 edition. 

Email: uninews@auckland.ac.nz