
Key Insights from the NZ Perspectives on GenAI in Higher Education Symposium

Evelyn Levisohn

Jul 24, 2025

At our most recent 'Students First' Symposium, which explored generative AI's impact on New Zealand higher education, leading academics and a student representative came together to discuss the multifaceted challenges facing the sector in 2025. Chaired by Prof Judyth Sachs, the panel explored developments, challenges, and considerations unique to the New Zealand context.

Panelists:

  • Prof Catherine Moran, Deputy Vice Chancellor Academic, University of Canterbury
  • Prof Naomi Cogger, Strategic Lead - GenAI Engagement, Massey University
  • Prof Martin Carroll, Deputy Chief Executive Academic, Manukau Institute of Technology and Unitec
  • Luc MacKay, President, University of Canterbury Student Association (UCSA)

Watch the full recording here. 

Unique challenges in the New Zealand context

While acknowledging that New Zealand faces AI challenges similar to those of other countries, Professor Naomi Cogger from Massey highlighted a crucial nuance: the integration of Māori perspectives. "There are some nuances that we need to consider around Māori and how that happens," Cogger said, emphasising the importance of addressing AI's foundation in Western knowledge systems, which risks "damaging information and knowledge systems for Māori in particular." As a Tiriti partnership-led institution, Massey University has a responsibility to take this seriously, with Cogger's role being to "listen and support and if asked, help with the heavy lifting" for Māori-led initiatives.

Professor Martin Carroll echoed this, noting that New Zealand is "putting quite a lot of effort into the local contextualisation of culture, of language, Te Tiriti-led." He also pointed to the emergence of Māori data sovereignty principles influencing the selection and training of Large Language Models (LLMs), citing the work being done by Peter Lucas-Jones from Te Hiku Media around Māori language in LLMs.

Reimagining assessment and the "Illusion of Knowledge"

A recurring theme was the need to rethink assessment in the age of AI. Professor Cogger identified two "illusions" universities must confront: the belief that AI detection tools are effective and the "Illusion of Knowledge," where students don't always understand how their AI use can harm learning if it's solely for generating answers. The challenge, she argued, is creating curricula that teach students "to develop evaluative skills".

The student representative on the panel, Luc MacKay, affirmed this, stating, "I think there are a lot of people who aren't aware of the impact that it's having on their education, on how they learn about things, how they interface with the world." He suggested that much of the potential 'damage' that AI does can be mitigated by rethinking the way that we assess students. Professor Catherine Moran shared that the University of Canterbury is redesigning assessments to focus on critical thinking and questioning.

The challenge of "Authentic Voice" and human trust

The panel delved into the concept of "authentic voice" and the erosion of trust when AI is used inappropriately. Professor Sachs recounted an experience where an AI-polished email "was not my voice... it wasn't authentic." Professor Cogger acknowledged this, but also highlighted the benefits of genAI for individuals with learning disabilities such as dyslexia, noting how it has significantly reduced the time and anxiety she used to spend polishing her own writing.


Luc spoke about the importance of a trusting relationship between lecturer and student. He believes transparency is key: if a lecturer uses AI to produce course content, for example, that use should be clearly acknowledged and explained to students. Professor Cogger and Professor Moran agreed, stressing that the critical issue is not whether AI was used, but whether the lecturer or educator created a "high quality learning environment". If so, the use of AI is neither here nor there.

Addressing risks and future directions

The panel also explored the risks associated with AI. Professor Carroll identified three major risks: the existential challenge to the value of tertiary education, the authentication of learning, and accountability for AI-generated errors or biases. Luc echoed the concern about critical thinking, building on the analogy of climbing a mountain versus taking a helicopter to the top (outlined in this article by Associate Professor Grant Blashki from the University of Melbourne). He agreed that relying solely on AI, the "helicopter" approach, means missing the journey of learning and developing foundational skills and perseverance. 

Source: Google Gemini, an AI-generated image of a helicopter flying to the top of a mountain. 

The discussion highlighted the need for a balanced approach to assessment, recognising that genAI can fast-track to answers, potentially bypassing the crucial processes of "collection, synthesis, analysis, evaluation". Professor Carroll underlined the challenge in the VET sector, where prescriptive learning outcomes leave "very little room to include additional approaches to learning and assessment". Professor Cogger talked about the "two-lane approach" where secure assessment is used for foundational skills and more process-focused methods are used when AI assistance is permissible. The panel ultimately underscored the ongoing evolution of AI in education and the collective effort required to adapt effectively while protecting student learning and prioritising ethical considerations.


About Studiosity

Studiosity is ethical, AI-powered writing feedback and study support at scale for student learning. Universities partner with Studiosity to support student success, with a 4.4x return in retention, protecting integrity and reducing risk.