Fixing data security at the source

Opinion: Nalin Arachchilage asks us to imagine a world where we could be confident that our use of software minimised the risk of losing our data.

Google creates a strange digital copy of your life that most of us never know about. Photo: iStock

When we use Google and Facebook, the internet giants create a strange digital copy of our lives that most of us never know about. In return for free services, individuals give up their data to be collated into the datasets that drive highly profitable digital advertising.

In some cases, giving away that information might look like a good deal. In others, there is an accompanying loss of privacy that, while not necessarily abused, could easily be.

For example, the AA Road Service mobile app provides a driver with roadside assistance. However, within the app, detailed GPS logs reveal not only where she travels but also how fast she drives, which routes she takes, which ATMs she stops at and which medical clinics she visits.

Data privacy consistently ranks among the top five IT management issues and investment priorities. Despite all this attention, security lapses continue. Zoom, the video conferencing software that has enjoyed a boom during the pandemic, experienced a serious data breach in which hackers obtained some 500,000 passwords, reportedly now for sale on the dark web. Media reports suggest that more than 12 percent of professionals stopped using Zoom, which amounts to a serious loss of consumer trust and a large dent in revenue.

A central issue is human behaviour. We’re lazy: if we can avoid a barrier, it’s human nature to do just that. Meanwhile, software designers worry about data breaches, so when they create software they sacrifice usability for security, while human behaviour leads us to sacrifice security for usability.

I have a favourite Dilbert cartoon that shows the kind of barriers human behaviour creates. The cartoon features Dogbert offering a password recovery service for ‘morons’. A hapless salaryman comes up to Dogbert. “I don’t remember my password,” he says. Dogbert replies: “Is it 1, 2, 3?” The salaryman returns to his cubicle, his thought bubble reading, “That’s just spooky.”

If far too many of us have easily guessed passwords, just as many never read the terms and conditions of how their data will be used. We tend to just tick the agree box and move on. Our research suggests it would not be too hard for policies to be written with the user in mind rather than by lawyers briefed to protect their client. Twitter has a great example: “What you say on Twitter Services may be viewed all around the world instantly. You are what you tweet.”

Twitter still has a 19-page policy that nobody except a lawyer is likely to read. But what if we could ensure apps explained, in plain language, that their users have a right to privacy and need to understand how the app collects, stores and shares their data? And, even better, embedded the user’s right to privacy in the heart of the code itself?
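As a rough illustration of what “privacy in the heart of the code” could look like, the sketch below (in Python, using hypothetical names such as ConsentRegistry and collect_location) makes data collection impossible unless the user has agreed to that specific purpose, with the explanation given in plain language at the point of consent. It is a minimal sketch under those assumptions, not a description of any particular app.

```python
# Minimal sketch: data collection is gated on explicit, purpose-specific consent.
# Names (ConsentRegistry, collect_location) are hypothetical, for illustration only.

class ConsentError(Exception):
    """Raised when code tries to collect data the user has not agreed to."""


class ConsentRegistry:
    """Stores the user's consent decisions, keyed by a plain-language purpose."""

    def __init__(self):
        self._granted = set()

    def ask(self, purpose: str, explanation: str) -> None:
        # In a real app this would be a UI prompt; here we simply show the
        # plain-language explanation and record the answer.
        answer = input(f"{explanation}\nAllow '{purpose}'? (y/n): ")
        if answer.strip().lower() == "y":
            self._granted.add(purpose)

    def require(self, purpose: str) -> None:
        if purpose not in self._granted:
            raise ConsentError(f"No consent recorded for: {purpose}")


def collect_location(consent: ConsentRegistry, lat: float, lon: float) -> dict:
    # The privacy rule lives in the code path itself: no consent, no collection.
    consent.require("roadside-assistance-location")
    return {"lat": lat, "lon": lon}


if __name__ == "__main__":
    consent = ConsentRegistry()
    consent.ask(
        "roadside-assistance-location",
        "We use your current location only to send help to where you are.",
    )
    print(collect_location(consent, -36.8485, 174.7633))
```

The point of the design is that consent is not a screen the user clicks past; it is a precondition the code cannot skip.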

Our study gave 36 software developers a design task: embed privacy into code in response to a set of privacy problems. The developers came back with some specific issues. They found it hard to marry privacy requirements with software engineering techniques and said that, in practice, privacy concepts might not work in software development environments. Most lacked knowledge of privacy concepts, and without that knowledge the developers’ own opinions and their wider brief took precedence over privacy requirements. Yet in the world of software development there are a number of well-established and widely known practices for embedding privacy into software systems: Privacy by Design, Fair Information Practices and Data Minimisation. These practices align with the principles of the EU’s world-leading General Data Protection Regulation (GDPR).
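To make one of those practices concrete, here is a hedged sketch of Data Minimisation applied to the roadside-assistance example above: before anything is stored, the record is stripped down to the fields the service actually needs and the GPS coordinates are coarsened so the log can no longer trace a precise route. The field names and the choice of precision are illustrative assumptions, not a prescribed standard.

```python
# Minimal sketch of Data Minimisation: keep only what the service needs,
# and keep it at the lowest useful precision. Field names and the choice of
# two decimal places (roughly 1 km) are illustrative assumptions.

REQUIRED_FIELDS = {"timestamp", "lat", "lon"}  # enough to dispatch roadside help


def minimise(raw_event: dict) -> dict:
    """Reduce a raw telemetry event to the minimum needed for the service."""
    event = {k: v for k, v in raw_event.items() if k in REQUIRED_FIELDS}
    # Coarsen coordinates so stored data cannot reconstruct exact routes,
    # ATM stops or clinic visits.
    event["lat"] = round(event["lat"], 2)
    event["lon"] = round(event["lon"], 2)
    return event


raw = {
    "timestamp": "2022-01-13T09:30:00Z",
    "lat": -36.848461,
    "lon": 174.763336,
    "speed_kmh": 87.4,          # not needed to send help
    "nearby_poi": "ATM",        # not needed to send help
}

print(minimise(raw))
# {'timestamp': '2022-01-13T09:30:00Z', 'lat': -36.85, 'lon': 174.76}
```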

Imagine a world where we could be confident that our use of software minimised the risk of losing our data. Through education and training, software developers can be upskilled to build privacy principles into new software as they create it.

But alongside that there needs to be cultural and organisational change. Software and app companies need to listen to what their customers want and need when it comes to privacy. Unfortunately, that is unlikely to happen without legislation. The GDPR (or an equivalent standard) should be made international law, adopted by all nations and protecting privacy for everyone.

Dr Nalin Arachchilage is a lecturer in cyber security and privacy, School of Computer Science, Faculty of Science.

This article reflects the opinion of the author and not necessarily the views of the University of Auckland.

Used with permission from Newsroom, Fixing data security at the source, 13 January 2022.
