AI workshops

Register for training sessions on responsible AI use, AI transcription, or specialised AI-assisted qualitative analysis.

Responsible AI in research

An online workshop introducing the benefits and considerations of using AI tools for conducting research.

This one-hour workshop is aimed at research students, but all staff are welcome. We recommend that academic staff attend the Responsible AI in research for supervisors version of this workshop.

What is responsible AI?

Artificial intelligence (AI) tools and technologies are increasingly used in research, e.g. for data collection and generation, analysis, interpretation of results and writing. While there can be significant benefits, responsible use of AI requires researchers to also understand the risks and limitations of these tools.

Dates for 2026:

  • Wednesday 15 April, 2-3pm
  • Wednesday 13 May, 2-3pm
  • Tuesday 9 June, 11am-12pm

Learning outcomes

Attendees will be able to:

  • Describe how AI tools work and define important terminology
  • Describe the potential benefits of using AI when conducting research
  • Understand the potential risks and limitations of AI tools
  • Apply University policies, data classification and best practices when using AI tools
  • Critically evaluate the potential inclusion of AI tools for conducting research

Resources

After this workshop

Postgraduate and doctoral students:

  • Should discuss and formally agree on any use of generative AI technologies and tools with their supervisor before integrating them into their research
  • Are encouraged to attend workshops, such as AI in literature review workflows, to ensure the ethical and effective use of these tools
  • Are invited to engage with peers across the University research community by joining Hacky Hour to post questions and exchange ideas

Responsible AI in research for supervisors

An online workshop, designed for supervisors, introducing the benefits and considerations of using AI tools for conducting research.

What is responsible AI?

Artificial intelligence (AI) tools and technologies are increasingly used in research, e.g. for data collection and generation, analysis, interpretation of results and writing. While there can be significant benefits, responsible use of AI requires researchers to also understand the risks and limitations of these tools.

Attendance counts toward the supervision development activities required for the five-yearly reaccreditation of doctoral supervisors.

Dates for 2026:

  • Tuesday 24 February, 1-2.30pm
  • Monday 13 July, 1-2.30pm
  • Thursday 15 October, 2-3.30pm

Learning outcomes

This introductory workshop aims to enable supervisors to:

  • Describe how AI tools work and define important terminology
  • Describe the potential benefits, risks and limitations of using AI when conducting research
  • Support their students to apply relevant University policies and processes, including the School of Graduate Studies' proposed guidelines for the use of AI in doctoral research
  • Guide students to critically evaluate the potential inclusion of AI tools for conducting research

View the workshop materials.

Transcription using AI

An online workshop introducing AI tools for transcribing research data and important considerations for their use.

Transcription tools using artificial intelligence (AI) can transcribe audio into text quickly, accurately, and cheaply, but researchers need to consider the risks of their use.

Dates

No sessions are currently scheduled, but research groups are welcome to request a custom workshop on this topic.

Learning outcomes

Attendees will be able to:

  • Identify University-approved tools to support transcription, translation, and diarisation
  • Understand the benefits, policies, limits, risks, and compliance aspects of AI transcription tools
  • Know where to seek support for transcription, translation, and diarisation needs

Resources

Introduction to AI-assisted workflows for qualitative analyses

An online workshop demonstrating the benefits of using large language models (LLMs) to label and analyse qualitative research data.

Qualitative research often involves analysing large amounts of information, which can be very time-consuming. In this workshop, we introduce LLMs as a tool for identifying key ideas, patterns, and themes in text and images. Our goal is to take you beyond the chatbot, demystify this technology, and show how these tools can be used responsibly and effectively.

Dates for 2026:

  • Tuesday 3 March
  • Friday 14 August

Learning outcomes

  • Understand the core concepts of building an LLM-assisted research workflow
  • Identify key ethical risks, such as biases and hallucinations
  • Critically evaluate and interpret the outputs of LLM-assisted analysis methods

View the workshop materials.

Embedding large language models (LLMs) into qualitative research workflows

A three-hour, online, hands-on workshop where participants use a large language model (LLM) to label and analyse qualitative research data.

Qualitative research often involves analysing large amounts of information, which can be very time-consuming.

Building on concepts showcased in the Introduction to AI-assisted workflows for qualitative analyses workshop, participants will build an AI-assisted workflow that uses LLMs to extract features and identify themes in text and image collections. These techniques enable you to scale up your analyses and derive insights from large collections of text and image data.
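The kind of workflow described above can be sketched as a loop that applies one coding prompt to every document and collects the resulting labels. This is a minimal illustration only: `call_llm`, the prompt wording, and the theme list are hypothetical placeholders (here a stub so the sketch runs end to end), not the workshop's actual materials.

```python
def call_llm(prompt: str) -> str:
    """Placeholder for a real chat-completion API call.

    In a real workflow this would send the prompt to a model endpoint;
    this stub returns a fixed theme so the sketch is runnable.
    """
    return "workload"

# A single, consistent prompt template keeps the labelling reproducible
# across the whole corpus (the themes here are invented examples).
PROMPT_TEMPLATE = (
    "You are a qualitative coding assistant. Label the following "
    "interview excerpt with one theme from: workload, wellbeing, "
    "supervision.\n\nExcerpt: {text}"
)

def label_corpus(texts):
    """Apply the same prompt to every document and collect the labels."""
    labels = {}
    for i, text in enumerate(texts):
        response = call_llm(PROMPT_TEMPLATE.format(text=text))
        labels[i] = response.strip().lower()
    return labels

excerpts = ["I never have enough hours in the week to analyse my data."]
print(label_corpus(excerpts))  # {0: 'workload'}
```

In practice, the outputs of such a loop still need human review, which is why critical evaluation of LLM-assisted analyses is a stated learning outcome.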

Dates for 2026:

  • Tuesday 14 April
  • Tuesday 6 October

Learning outcomes

  • Build an LLM-assisted research workflow to aid analyses of qualitative data
  • Practise effective prompting techniques to prepare, label, and analyse unstructured text and images
  • Identify and mitigate key ethical risks, such as biases and hallucinations
  • Critically evaluate and interpret the outputs of LLM-assisted analysis methods

After this workshop

Contact

Research Data Support Services
Email: researchdata@auckland.ac.nz