Our use of data analytics and automated decision-making
We use data and technology to make our products and services better, improve our efficiency and generally provide our data subjects with a better and more meaningful experience.
Data-driven automation brings real benefits to organisations and their people. Automated processing – including profiling and automated decision-making – can inform the development of more responsive and accurate processes and services, and can lead to faster, more consistent and more predictable decision-making.
But we’re also very aware that such processing can create risks to our people if it is not managed with care. For this reason, we ensure that our use of large datasets and technology adheres to a number of important principles intended to protect privacy and human rights.
First of all, we ensure that automated processing is consistent with our guiding privacy principles:
- Data minimisation – we will use only the personal information we really need to drive an analytics project or automated decision-making process
- Transparency – we will always be open with people about the automated processing we’re undertaking
- Security – we take all reasonable steps to ensure that we protect the data we’re using
- Use limitation – we will make sure automated processing is only used where it is necessary to support our lawful purposes, and we will make sure it’s proportionate
- Rights focused – we will make sure that people can exercise their important privacy rights, including to access and correct their information or object to processing
Secondly, we make sure it is consistent with our learning analytics principles (from our Learning Analytics Policy), including:
- Learning analytics must be ethical, transparent and focused on the enhancement of the student experience
- We understand that data will always provide an incomplete view of students’ capabilities or likelihood of success
- When data is used to inform an action at an individual level, it will always be accompanied by personal intervention by University staff
- We recognise that data and algorithms can contain and perpetuate bias and we will work to avoid this
- Good governance is core, to ensure that analytics are conducted according to ethical principles and align with our values
- Where possible, analytics will only be shared on an aggregated and anonymised basis
Overall, we aim to be transparent and ethical about the use of analytics and automated decision-making, and to ensure that there is always a way for people affected by our automated processing to challenge our decisions with a human.
Using data to drive better outcomes
We’re working hard to improve the student digital journey. To enable this, we analyse the personal information we collect about students to understand their behaviour and use of our systems and platforms. We also use this data to better identify students who are disengaging with our processes in a way that might indicate risk to their success or wellbeing.
Our major data analytics projects include:
- The Student Canvas Engagement Report – we use information about student engagement with Canvas, in addition to other datasets we collect, to monitor early indications of reducing engagement in academic life. We use identifiable data about student engagement to generate identifiable operational reports that are shared with relevant student advisors, and de-identified strategic reports that are shared with faculties more widely in order to develop and refine student support services. Our staff may use this information to intervene early to help at-risk students become more engaged in their studies.
- Academic progress tracking – we use information about a student’s Grade Point Average (GPA) and other academic information to track their academic progress. We provide students with visualisations of these analytics on the Student Portal. We also use this tracking information, combined with information about student engagement, to intervene where necessary to help at-risk students become more engaged in their studies.
- Monitoring student wellbeing – we use information about student use of certain services – including Wi-Fi available at University premises – in order to identify patterns or changes that might indicate a risk of harm to a student. For example, where we see that a student’s Wi-Fi use changes suddenly and in an unusual way, this may indicate a wellbeing issue that we may investigate further.
Using automation to improve efficiency and experience
We use automated decision-making tools (that analyse personal information) to make many of our processes more streamlined. For example, we use automated tools to:
- Suggest study pathways and possible career outcomes to school leavers, as part of the Future Student Guide programme (we access and use data for this purpose with the authorisation of the school leavers)
- Determine your eligibility for programmes of study, scholarships or other services
- Assist us with the marking of multiple choice assessments
- Target our advertising, using services such as Google AdWords, Google Analytics, Facebook Custom Audiences and Google Display Network. This allows us to provide a better online experience and connect with visitors, based on their past interactions with the University’s social and digital platforms
Talk to us about your concerns
To ask about, or object to, any analytics or automated decision-making process that has affected you, please:
- call us on +64 9 373 7999
- email us at firstname.lastname@example.org
- write to us at Privacy Officer, University of Auckland, Private Bag 92019, Auckland 1142, New Zealand