Alternative realities

Many of us were introduced to alternative realities by Pokémon Go, which seemed to spring into life from the pages of science fiction. But now the world of mixed realities is expanding all around us, with multiple applications of a surprisingly practical kind. Professor Mark Billinghurst, a world-renowned expert, is pushing the boundaries far beyond what we ever thought was possible.

Uh oh. A field technician working at a remote electricity substation in the Australian outback has accidentally shut a generator down, resulting in a power outage affecting thousands of homes.

The technician has no idea how to fix the problem, and the expert who does is a 45-minute drive away. When help arrives, the matter is sorted in minutes. But in the meantime, the power company has been stung with a six-figure bill by the state government for the hour-long outage.

World-renowned Augmented Reality (AR) expert Professor Mark Billinghurst, enlisted in May this year to spearhead the new Empathic Computing Laboratory at the University’s Auckland Bioengineering Institute, reckons the hefty fine could’ve been avoided if the expert assistance had been delivered through “remote collaboration”: teleconferencing rebooted for the 21st century via next-generation AR.

His vision is for the field technician to slip on an AR headset that streams 360-degree live video to a remote colleague’s computer screen. The remote helper would then be able to see what the technician sees, and more: the 360-degree video allows the helper to look anywhere, rather than simply where the technician happens to be looking. Eye-tracking technology lets each worker gauge where the other is looking. Inside the headset, the technician would see the virtual ‘ghost hands’ of the helper pointing out the wires that need to be fixed.

And then there’s the ‘empathic’ element: the headset would have the capacity to relay emotional and physiological information about the wearer, such as their facial expressions and heart rate.
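To give a concrete sense of what “sharing what you see and feel” might look like in software, here is a minimal Python sketch of a per-frame update such a headset could stream alongside the 360-degree video. The field names, units and values are illustrative assumptions, not the lab’s actual protocol.

```python
# Hypothetical sketch only: the kind of data an "empathic" AR headset could
# stream to a remote helper alongside 360-degree video. Field names, units
# and the update itself are invented for illustration.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class EmpathicFrame:
    timestamp: float        # seconds since the epoch when this update was captured
    gaze_yaw_deg: float     # where the wearer is looking, left/right
    gaze_pitch_deg: float   # where the wearer is looking, up/down
    heart_rate_bpm: int     # physiological cue relayed to the helper
    expression: str         # e.g. "neutral", "smile", "frown"

def encode_frame(frame: EmpathicFrame) -> bytes:
    """Serialise one update so it can be sent over an ordinary network socket."""
    return json.dumps(asdict(frame)).encode("utf-8")

# Example update: the technician glances up and to the right, heart rate elevated.
update = EmpathicFrame(time.time(), 32.0, 12.5, 96, "frown")
print(encode_frame(update))
```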

Facial-expression recognition technology of this kind already exists in AffectiveWear, glasses fitted with small photo-sensors that measure the distance between the frame and the skin, a distance that changes when we smile, frown or gasp.
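As a rough illustration of that principle, the sketch below maps a handful of frame-to-skin distance readings to an expression by finding the closest match among previously recorded reference readings. The sensor values, layout and classifier here are invented for illustration and are not AffectiveWear’s actual method.

```python
# Illustrative sketch only: classify a facial expression from a few
# frame-to-skin distance readings (millimetres) by finding the closest match
# to previously recorded reference readings. The numbers are invented; a real
# system such as AffectiveWear would calibrate per wearer.
import math

REFERENCES = {
    "neutral": [4.0, 4.1, 3.9, 4.0],
    "smile":   [3.2, 3.4, 4.3, 4.2],  # cheeks rise, so the lower sensors read closer
    "frown":   [4.5, 4.4, 3.6, 3.7],  # brow lowers, so the upper sensors read closer
}

def classify(readings):
    """Return the reference expression with the smallest Euclidean distance."""
    def dist(ref):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(readings, ref)))
    return min(REFERENCES, key=lambda name: dist(REFERENCES[name]))

print(classify([3.3, 3.5, 4.2, 4.1]))  # -> "smile"
```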

The Empathic Computing Lab’s mission, says Mark, is to develop software systems using emerging hardware technologies that allow people to share with others what they’re seeing, hearing and feeling.

“We wanted to relate the cues you usually have in face-to-face conversation, as these show understanding. Using our technology, it all becomes a much more immersive experience. You feel as though you’re standing inside the body of the person wearing the headset … In trials of sharing eye gaze, we’ve found people experience a stronger sense of collaboration and communication.”

The possibilities for sharing experiences remotely are endless, says Mark. Search and rescue. Surgery. Games and entertainment. “You could be live with the band onstage.” Sports.

“You’re watching the 2024 Olympics at home, and an athlete competing in downhill mountain biking streams 360-degree content from a video camera atop their helmet. You get to see where they’re biking, hear their heart rate and the sounds around them.”

As he explains, the work conducted in the Empathic Computing Lab sits at the junction of three computer-interface trends.

“The first trend is the way we capture content, which has advanced from still photography in the 1850s to today’s streaming 360-degree video on a portable machine. The second is increasing network bandwidth – the super-speedy networks that allow you to download a Netflix movie at home in seconds, and do higher-quality video conferencing.”

The third trend is towards ‘implicit understanding’, where computers are able to watch and listen to us in order to understand what we are doing. “In the 50s it was hard to learn how to program computers – you had to use punch cards, binary code and flick all sorts of switches. But now we have systems in our smartphones such as voice recognition.

“The Xbox has a camera which enables you to play games by moving your body around. So to some extent computers have become more human-like; they can recognise our behaviour.”

Mark’s curiosity about AR was first aroused in the 80s, back when headsets weighed three to four kilos and cost about $50,000. (These days they’re about 100 grams, and you can get a cardboard VR lens to wrap around your mobile phone for about five bucks.) The final-year project of his mathematics degree at the University of Waikato involved building a mathematical model of solar flares.

Dissatisfied with plotting the data visually on a screen, he undertook a summer internship in Virtual Reality at Seattle’s University of Washington, where he could access VR technology on $250,000 computers which New Zealand simply didn’t have.

The university offered him a PhD, so he stayed on, his attention turning to Augmented Reality.

“With AR the goal is to overlay computer graphics on the real world and still be able to see your surroundings. With VR you separate yourself from the real world. For some tasks it’s better to be able to see the real world.”

In the 90s, for his PhD, Mark and his colleagues developed the world’s first collaborative AR systems.

“Using our technology, two people could put a headset on, sit opposite each other at a table, see each other and manipulate virtual objects they could see in their headsets. An example we worked with was city planning. They could see buildings in their headsets that they could rearrange and work on together.”

The inspiration? A Star Wars movie. “There’s a scene in Return of the Jedi where they’re figuring out how to attack the new Death Star and they have a meeting, and a hologram of the Death Star appears in front of them. What we developed in the nineties would have made that possible. We could’ve had a Death Star on that table instead of city buildings.”

One of the challenges that Mark and his colleague Hirokazu Kato cracked – it took them six months – was to devise algorithms that allowed headset wearers to track their position relative to the real world, a key element of AR. Rapt with the results, they opted to share their knowledge in 2001 as ARToolKit, the world’s first open-source AR software developers’ kit. It’s since been downloaded more than a million times.

“It helped spur the whole AR research community, because instead of having to spend months and months solving this important problem, we gave people the tool to do that, so they could focus on building applications that used it.”
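To give a flavour of what that tracking problem involves: given a square marker of known physical size and the pixel positions of its corners in the camera image, the software must recover where the camera sits relative to the marker so that graphics can be drawn in the right place. The Python sketch below does this with OpenCV’s general-purpose solvePnP routine rather than ARToolKit itself, and the corner positions and camera parameters are made up for illustration.

```python
# A minimal sketch of the core idea behind marker-based AR tracking: given the
# known real-world size of a square marker and where its corners appear in the
# camera image, recover the camera's position and orientation relative to it.
# This uses OpenCV's solvePnP rather than ARToolKit, and the corner pixels and
# camera intrinsics below are invented for illustration.
import numpy as np
import cv2

MARKER_SIZE = 0.08  # an 8 cm square marker, in metres

# 3D corner positions in the marker's own coordinate frame (it lies in the z = 0 plane).
object_points = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float64)

# Where those corners were detected in the image (pixels) - made-up values.
image_points = np.array([
    [310.0, 220.0],
    [390.0, 222.0],
    [388.0, 300.0],
    [308.0, 298.0],
], dtype=np.float64)

# A simple pinhole camera model: focal length and principal point in pixels.
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])
dist_coeffs = np.zeros(5)  # assume an undistorted image

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    # tvec is the marker's position in camera coordinates; rvec its orientation.
    # An AR renderer would use this transform to draw graphics "attached" to the marker.
    print("marker is", float(np.linalg.norm(tvec)), "metres from the camera")
```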

The consequent kudos also translated to more research funding for Mark, who continued his career at the University of Canterbury, where he created the HIT (Human Interface Technology) Lab NZ.

A highlight of his 13-year directorship was creating the world’s first mobile phone AR advertising campaign in 2007 with Saatchi & Saatchi. A downloadable app allowed punters to point their phones at newspaper pages and watch zoo animals pop out in 3D. He’s spent sabbaticals working with the likes of Nokia and Google. At the latter he worked on Google Glass: ‘smart glasses’ with an optical head-mounted display.

In 2015 he went to the University of South Australia, where he helped devise a free app that live-tracked the top 10 cars and drivers competing in the Bathurst 1000 on a 3D model of the circuit.

Home again and happy to be closer to his whānau, Mark has also been tasked with forming an Australasian consortium of AR/VR researchers – around 140 academics and students from nine universities across Australia and Aotearoa – in order to grow the industry: to create jobs and potentially millions of dollars in export revenue.

Collectively the groups have already secured more than US$33 million in funding for projects from a variety of local government, national and international sources.

Who knows what alternative realities the Australasian collective will dream up? Mark believes science fiction is the inspiration for many who work in computer interface technology – so perhaps our future lies in another scene from Star Wars.

By Stacey Anyan

Ingenio: Spring 2018

This article appears in the Spring 2018 edition of Ingenio, the print magazine for alumni and friends of the University of Auckland.
