<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=1005154772848053&amp;ev=PageView&amp;noscript=1">

Higher Education's Thoughtful Response to Robot Writing - a panel discussion recap

Professor Judyth Sachs


Feb 28, 2023

To start with, I must point out: this session was by far the largest webinar we have ever hosted, with over 1,700 registrations and over 1,000 people in attendance in real time. Clearly, there is a strong appetite across the sector for thoughtful discussion and reflective debate around this topic.

I wouldn't be surprised if ChatGPT were 'the word' of 2023 - or, in fact, by December it could be so 'old hat' that people will think, 'what was ChatGPT?'. Who knows? Since its release in November 2022, there has been some mention of AI or ChatGPT in our media feeds and social media sites every day. The first response a few weeks ago was something of a panic: is the end of universities as we know them imminent? Is the essay 'dead'? As some of this reactive heat subsided, we brought together a group of university leaders to discuss the topic from a particular perspective.


On 7 February 2023, I had the honour of chairing the Robot Writing Symposium session and discussing this topic with four engaging experts: Professor Rowena Harper, Deputy Vice-Chancellor (Education), Edith Cowan University; Professor Giselle Byrnes, Provost and Assistant Vice-Chancellor at Massey University; Dr Julia Christensen Hughes, President and Vice-Chancellor, Yorkville University; and Professor Theo Farrell, Deputy Vice-Chancellor and Vice President (Academic and Student Life) at the University of Wollongong.

>> Watch the full recording here.

Each panelist generously shared their time, expertise and thinking around the emergence and popularisation of generative AI technologies in higher education, while a lively chat discussion unfolded in session among the thousand-strong attendees.

Specifically, we discussed the impact this technology is having on learning, teaching and assessment: the opportunities it presents, the areas of concern, and the strategies - both long and short term - that institutions can start to implement in response. Some key themes emerged:

Higher Education is at an exciting 'tipping point'

Each of the panelists agreed that this is a significant moment in history for our sector. Prof Byrnes introduced Malcolm Gladwell's concept of a 'tipping point', from his debut book of the same name: "the term is often used just to signify a moment of crisis and opportunity when we can really leverage something that is new". In other words, the issues brought to the surface by ChatGPT and other robot writing tools are not new; rather, the new tool is a catalyst for upheaval of the status quo. Although some in the chat disagreed, the panelists all felt this tipping point to be a reason for excitement.

Dr Julia Christensen Hughes:

Is this a profound assault on the Academy, or is this a natural evolution of the technological revolution that we've been experiencing forever? I do think it is profound, yet a natural next step. I am excited about this, actually. I think the questions that ChatGPT and other forms of artificial intelligence present cause us to question much more deeply than we have in the past: what is the role of the faculty, and what is it that we want students to learn in terms of their knowledge, skills and values? How do we facilitate that learning and how do we assess it? So I think this is going to cause us all to reflect deeply and ensure we put in place the very best learning opportunities for our students.

And that's exactly what we have underway at Yorkville University right now. We've embraced some signature learning outcomes. We are going to be articulating signature learning pedagogies and signature approaches to assessment, focusing on authentic assessment that we feel will support our students and prepare them for this brave new world.

Dr Julia Christensen Hughes, Yorkville University, at the Studiosity Symposium

Prof Giselle Byrnes:

We really need to think hard about what we want students to learn, what teachers teach. And I would argue for much, much more emphasis on graduate attributes, skills, those higher order skills that we talk about all the time, and how do we really redesign assessment to speak to the acquisition of those skills rather than the content and the focus on information? And I know many of us have been on that journey for some time.   

Prof Theo Farrell:

... our institutions have a common mission, which is to prepare and empower students for their futures. And we've known for many years that the future is going to be profoundly reshaped by AI and automation technologies; it's been pretty well signposted, actually. And so this is a good example of how we have to get on with the business of reshaping our higher education offerings to better prepare our students for the future. It's a very exciting moment in higher education, actually. There's lots of opportunity here, and I'm really looking forward to how we can get on with realising those opportunities for our students, and support our staff to do that.

Prof Rowena Harper:

I think artificial intelligence has been lurking on the periphery for most unis for some time, but I think ChatGPT has ... really crossed a threshold that forces us to look at this technology and really incorporate it into how we function as universities - into our learning, teaching and research.

Assessment has needed a rethink for a while, and the integrity crisis has been escalating for a decade

Many institutions are already implementing all kinds of authentic assessment strategies, and have been for some time. Those working in Teaching & Learning (T&L) are very familiar with the rising tide of contract cheating websites, academic integrity best practices, and government bodies tasked with sector oversight to mitigate these issues.  

Prof Rowena Harper:

For some time, particularly since contract cheating really came to the fore as an academic integrity issue, there's been a really intensive focus in universities on assessment security. Within our current structures - where, at most universities, every single unit has probably at least three assessment points - staff put an enormous amount of work into securing every single assessment point in a unit, every single assessment point in a course. And increasingly, that's becoming unsustainable.

Prof Rowena Harper, ECU, at the Studiosity Symposium

A lot of educational researchers and academic integrity researchers have been pointing to programmatic assessment as a really useful way to step out of this unsustainable system - this kind of rod we're making for our own back, if you like, around assessment security. Programmatic assessment really entails fewer summative assessment points. The assessment points you do have in a course are highly authentic and invested with lots of resources - really high quality, valid assessment. And then all the other resources that would typically go into lots of other summative assessment points go into actual teaching practice: formative feedback, developmental feedback for students. There are some really interesting models of this happening around the world. I'm not aware of anywhere where it's yet happening at scale, I think because the kind of structural inertia within universities makes it very difficult. But I'm sort of hopeful that this is a moment where we might actually begin to seriously look at programmatic assessment.

We really need to start distinguishing better between learning - that is, actual learning that's happened - and performance. We've historically assessed students via artefacts: essays, reports, etc., through which we infer that learning has actually happened. And what we're now seeing with ChatGPT is a tool that can replicate the artefact pretty successfully, so we're really going to need to step up our assessment approaches to better assess learning itself and not just the artefact. We really need to work harder to observe learning, support learning and assess the learning process, not just products, which are not going to be a good proxy for that process anymore. So I think what's happening here is that we're confronting our historical use of the artefact as a form of assessment, and we might see that, in the longer term, start to fade away as a form of assessment.

I'm hearing lots of people start to talk about 'collaborative intelligence', which is the recognition that the future will involve humans and machines working together - the intersection of human learning and machine learning. This is really an opportunity for universities to focus on those uniquely human skills that we need to develop more powerfully in students: empathy, communication, teamwork, problem identification, and problem solving. It's really our opportunity to lift those up and elevate them in the curriculum, in the way we teach and assess.


Dr Julia Christensen Hughes:

What is it we want our students to be able to know, do, and value? And then we need to be very thoughtful and creative about the learning experiences that we provide our students. So how do we facilitate their learning? This is all about becoming much more intentional and creative. I think we've used artefacts as a proxy for learning - "provide me with a paper on this topic" - without really unpacking the task for students, deliberately breaking it apart and giving the student feedback at every step of the way.

So I just don't think that we have been sufficiently thoughtful. We've made too many assumptions that when we assign work that all of this learning is going to occur. So I think we have to become much more specific about, again, what we want our students to know, do, and value; be very creative, very deliberate about how we facilitate their learning, and then make sure that the assessment is valid.

We've had academics, people working in our teaching centres, educational developers and instructional designers asking these questions forever. And we can't ignore them any longer. And that's why I'm excited about this. It's an opportunity to get really serious about student learning and to understand our own role in facilitating and assessing that. 

Prof Theo Farrell:

It's an opportunity for us to take a step back and think about assessment in a more deliberative way, at a course level or programmatic level. Many universities are full of colleagues who are really passionate about learning and teaching, really committed to doing the best for their students. We all have to recognise that, certainly in Australia and New Zealand and many other countries, our academic colleagues have just been through a couple of really challenging years where we've asked a lot of them. Many have upskilled. Many have transformed what they deliver through the integration of digital technologies, and a lot of that digital transformation has been locked in and is now obviously improving the experience of students. And now with ChatGPT, there's probably a sense that many academics are breathing a bit of a sigh of, "Oh my gosh, now again, we have to redesign what we do". As academic leaders, we have to recognise the burden that's now falling on academic labour and be really empathetic in supporting our staff through that.

Where do we put the emphasis on protecting academic integrity, but where else do we put the emphasis in terms of innovation, particularly around authentic assessment? That's obviously been on the radar for ages; it's just quite difficult to do authentic assessment, and it's resource-intensive... And broadly speaking, we over-assess, and we perhaps don't explain enough to students about the purpose of assessments. So, again, as part of a very genuine engagement with students to help them understand why they're doing an assessment and how it helps them develop skills, we should be thinking, in that process, about where we can protect academic integrity in a resource-efficient way but also do some creative things that leverage the opportunities AI provides - understanding what we can expect from AI and what we cannot.

Prof Giselle Byrnes:

On the assessment piece: we need to keep amending what we do and reflecting on how that is actually preparing our students for the world beyond university. I mean, the reality is that AI is with us in every aspect of our lives. So I think it's a huge challenge for universities globally to really think about what the purpose of education is. Actually adopting that kind of critical, skeptical, self-reflective lens and taking that approach - I think that's the best gift we can give our students. Now, think about the impact that has on our teachers in the short term; I mentioned, you know, a couple of weeks' runway to the start of semester one. Exams seem like an easy option to go back to - and I do say go back, because I think we made great strides forward, particularly those of us who have been distance providers for a long, long time, like my university. I think exams have their place. Absolutely. But let's not lose the momentum and the innovation that we have turned ourselves inside out for over the last three years. And let's keep challenging ourselves. Let's take that challenge that we push to our students and challenge ourselves to be a learning community, and really leverage this moment to redefine what we mean by university so that we are relevant, because those big questions are being asked out there at the moment.

Dr Julia Christensen Hughes:

I think the challenge for all of us is how we embrace our humanity, have our students embrace their humanity in the learning process, and then apply what they're learning to their own lives. I think that's some of the answer: we can create that learning context and that assessment context in the classroom and beyond - exams can only test to a certain extent.

There is an ethical dilemma here, which needs to be explored further

In the chat, some participants were asking about the ability for teachers and educators to use AI tools for marking, or for reducing their own workload. Which raises the question: if students are using it to write, and teachers are using it to mark, where does the learning come in?

Prof Rowena Harper:

[If we] try to prevent or prohibit the use of ChatGPT as a strategy, students wouldn't actually learn how to use it. And I think in the learning how to use it... we've talked about a lot of things. We've talked about the importance of information literacies, critical literacies, etc. in learning to use ChatGPT. But I think there's also a really significant ethical dimension to learning about how to use ChatGPT, its strengths and weaknesses, and also its threats. And I do think our students are ready for that. Our students are really attuned now to the ethics of what they use and consume every day. They know where their coffee beans come from, they know where their furniture comes from, where their clothing comes from. They're tuned into issues like fast fashion and single-use plastic. They're tuned in to things like worker exploitation and other social justice issues. Students are therefore quite concerned about integrity in its broadest sense.

So I think we do need to do better in universities at actually connecting students' concerns about integrity to how knowledge gets made and how information gets generated. I think if we can connect those issues of information and knowledge to the broader social discourse around integrity, that will be an important step forward.

Prof Theo Farrell:

Rowena's point about connecting these issues around integrity and ethics back to the broader social concerns that our students are bringing with them into university is really fabulous - I couldn't agree more, actually. In the area that I work on, which is defence policy and military affairs, there's a ton of research on the ethics of AI and the use of AI in warfare, because it's endemic in warfare now. So there's a large body of academic work on the ethical use of AI - and I'm sure likewise in health and so forth. It's a very rich stream that we can tap and bring into our teaching practice. So it's actually an important and good moment for us in the university sector to really ponder and think, in the broader sense, around the ethics of AI. And it's one of the ways that we, as anchor institutions in our societies, are going to help shape a positive future for all as we move towards an age of automation.

Prof Theo Farrell, UOW, at the Studiosity Symposium

What can university leaders do, right now, to help mitigate these issues?

Panelists took us through some strategies.

Dr Julia Christensen Hughes:

I think by presenting [ChatGPT] to our students, or having them play with it, and then critiquing it together - again, that's back to becoming a savvy consumer of things that are written. I think the recent pandemic has shown us how much under threat the Academy is in terms of science and evidence. And we need all of our students to be equipped to be savvy consumers of information, to turn that into knowledge, to have confidence in what they know. So I think there are so many things we can be doing.

Prof Giselle Byrnes:

We expect our academic faculty to be flying an aircraft while building it at the same time. We do that all the time, and leaders like us expect colleagues to spin on a dime; "pivot" is one of the most overused words in our language from the last couple of years, I think. I just want to pick up Theo's observation from a couple of comments ago, which loops into the question that was asked: why doesn't authentic assessment take hold? Why doesn't programmatic assessment stick? It's because assessment is part of a much more complex ecosystem. So I think it actually behoves those of us in leadership positions to cut through some of the bureaucracy that we've created for ourselves. We were liberated through COVID to be able to do this, and I suspect there's been a bit of a swing of the pendulum back to some of those processes, back to the sort of ritual and custom that we were comforted by. But actually I think it's about making swift decisions, giving real clarity to our academic teaching colleagues and to students, and really thinking about how we have the responsibility as leaders to try and shield teaching academics from the bureaucratic rain that they cope with every single day. So I'm constantly telling myself: it's a great idea to keep the university innovating, to keep us ahead - but then what is the impact on our teachers in the classroom? So I think authentic assessment and programmatic assessment and the other things that have been talked about here today are fabulous ideas. But I think the challenge is for us, how do we really start to lead? How can we speak directly to students about the nature of information and knowledge? You know, how do we square the circle in our communications? How do we really give that clarity of expectation? So that's what I'm challenging myself with.

"...how do we really start to lead? How can we speak directly to students about the nature of information and knowledge? You know, how do we square the circle in our communications? How do we really give that clarity of expectation? So that's what I'm challenging myself with."

Prof Giselle Byrnes, Massey University, at the Studiosity Symposium

Prof Theo Farrell:

[Regarding] the structure that we have in our universities of courses and units: it's very structured how we deliver education offerings to our students. We could be at a moment in time where, because of new technologies, we move to a much more flexible mode of how students package their learning and their learning journeys.

So I would predict that over the next few years we're going to move from our existing structure of hundreds of courses to perhaps a smaller number of courses, with technology enabling students to curate their own learning journey. It's already happening in the private sector, and AI is going to power this. So in fact, we're probably at a moment where we're only really seeing the impact of AI around how it's helping students access content, and the implications for academic integrity, but AI is probably going to transform our future structuring and delivery of education offerings.

So the thing that's going to slow us down in leveraging technology to provide better opportunities for our students is our bureaucratic structures and our regulation. Partly it's down to our university cultures; partly it's down to external regulations and regulators. Somebody previously in the chat made a really good observation with respect to accrediting bodies: it's going to take a lot of cultural work, actually, and cultural change - working with our external regulators and accrediting bodies, and within our own university communities, to really reimagine the future. Can we imagine a future where we guide but empower student choice, and liberate students to actually curate their learning journeys too? Because the bottom line is there's a whole set of jobs coming down the line that right now we can't even imagine. But they're going to happen, you know.

Prof Rowena Harper:

Thinking, too, in the short term, on that question of how we teach students: one of the things we're encouraging staff at ECU to do is get AI in some form into your marking criteria and assessment rubrics, to give yourself an opportunity to give students feedback on their use of it. At ECU, we're following the approach of many other universities in acknowledging that students are likely to be trying to use ChatGPT in their assessments this semester. And we're requiring that, if they do, they acknowledge it through some kind of citation acknowledgement, which we're giving them advice on at the moment. But we keep reminding staff that if a student uses ChatGPT in their assignment and acknowledges it, it doesn't mean that it's a high quality assignment. It doesn't mean they have to pass. It might not be academic misconduct, but we have to distinguish between misconduct and a passing assignment. So incorporating it into your rubrics and marking criteria is a really good way to start having that conversation with students.

Pottery by hand instead of the pottery wheel

This was a popular metaphor that came up in a question from an attendee named Deborah. "...students are moving from making pottery by hand to using the pottery wheel." She went on to ask the panel what new skills would be needed to "use the tool effectively while still understanding the contents of the material used to create the artefact."  Are the skills and capabilities that students need to develop - and that employers require - fundamentally different now that tools such as ChatGPT are readily available?

Dr Julia Christensen Hughes:

I think maybe a starting point is just to consider the profound ways in which all kinds of professions are being changed. I'm thinking of a student in a law school, for example: it wasn't that many years ago that those students had to graduate prepared to look through all kinds of case law, synthesise something, and come up with a recommendation - and now it can happen instantly. Or I'm thinking of journalism students, or students in marketing: artificial intelligence - and the Internet even before it - is revolutionising the workplace and what students need to know how to do to be fully competent. So I think we just have to move that back into the universities.

"And of course, our role isn't just to prepare students for careers, but lives of meaning and purpose."

And of course, our role isn't just to prepare students for careers, but lives of meaning and purpose. So there is a host of things we want them to learn. But I think if we can really understand how profoundly the workplace is changing, and use that, we're going to have to engage more effectively with employers and the professions, and partner with them to bring that into our classes.

One of the biggest implications of all of this is: what do we do to support our faculty? How do we help them continue to develop the skills that they need, and what needs to happen in the graduate education programs that are preparing the future professoriate, so they can then bring this into their undergraduate classrooms? So I guess I'm just seeing a lot of thinking, collaboration, partnering and skills development going on for all of us.

Prof Rowena Harper:

I think the pottery wheel is a really interesting analogy - and I've done one pottery class in my life - because students are really going to need to learn: what can a pottery wheel help me do more efficiently and more effectively, and what do I need to continue doing by hand? The analogy works in lots of ways. In relation to artificial intelligence, and particularly these kinds of algorithmic or machine learning tools that generate information, students are going to need to learn in much more depth what's under the hood. How do those tools work? What information sources do they draw from? How do I know they're credible? They're really going to need to take those critical information literacies to the next level by learning how these tools function, because that's an important part of assessing the veracity of what comes out of them - their outputs.

I think they're also going to need to learn to analyse very critically the quality of the outputs - what the outputs do and don't do. If we think about what ChatGPT can do: it's designed, as we've said, to basically generate the most statistically likely responses to a prompt. So it's not very good at divergent thinking; it's going to give you the most likely answer. So how can we use ChatGPT as a kind of leaping-off point for creativity? Looking at what ChatGPT did not think of - what's missing? How can we build on what's come out of the tool? It's really good at lists and it's really good at summarising. So what other formats of writing do we need to teach students?

I think writing will remain an absolutely vital skill; it's not going away. We need to teach students to be powerful writers. When we hear from employers, the number one skill they look for in graduates is communication, and writing is a really powerful tool for getting things done in a workplace. Think about persuasive writing. Think about the ability to argue, the ability to reassure, the ability to inspire through a piece of writing. AI tools might get better at doing that over time, but students will still need the ability to assess the output of any tool against their purpose: what are they trying to achieve, with a particular audience, in a particular context, to solve a particular problem? So it's that thinking about what's under the hood, but also what's coming out of it - being able to critically reflect on it and build on it - that is going to be the key skill set.

Prof Giselle Byrnes:

A very similar response for me - I love the pottery analogy, so thank you to the person who volunteered that. I'm going to come back to graduate attributes: the kind of higher-level metacognitive skills that we say we assess and develop in our graduates - leadership, teamwork, all of those things that used to be called 'soft skills', completely erroneously, because they are really the things that make us human. And going back to that theme of complementarity, in regard to the question that was asked, I think that's where we should be focusing. What can the robots do? What can the robots not do? What are the weaknesses, the pitfalls and the limitations? What does it mean to be human in the learning enterprise and endeavour?

"What does it mean to be human in the learning enterprise and endeavour?"

Dr Julia Christensen Hughes:

As part of this discussion, we also have to think about how we have assessed faculty, and this whole sort of 'publication game' - I'm going to say that in quotation marks - where it seems to me that the number of publications has started to count more than the quality of the work in some quarters. I think it's driven by metrics that rank universities and rank different business schools, for example - I've done some work in that area. So as well as talking about authentic assessment of students, we might want to consider authentic assessment of the faculty, and really understand the faculty member's role in terms of the creation and dissemination of knowledge with impact. I'd like to see faculty rewarded more for working in partnership with organisations, contributing to the development of policy and the improvement of society. There's been a proliferation of journals around the world, and people can't get peer reviewers anymore; there's so much work that needs to be done in that regard.

I'm actually quite concerned about the reputation of the Academy writ large, in terms of both the work of the faculty and the degrees we confer. So I just wanted to throw that in as well. I think we've got a lot of thinking to do about publication in general.


And what a great statement to finish on. I think we have our work ahead of us. But the thing that I really picked up on is the intersection of human learning and machine learning, and the 'soft' skills that are required to learn in universities and prosper in the world beyond academia. Institutions remain committed to ensuring that learning is authentic and deep, and that students develop the skills they need to succeed in the workplace, regardless of whatever new 'shortcuts' may emerge.

>> Watch the full session here

To see past sessions and register for upcoming sessions, please visit our Symposium series webpage

About Studiosity

Studiosity is personalised study help, anytime, anywhere. We partner with institutions to extend their core academic skills support online with timely, after-hours help for all their students, at scale - regardless of their background, study mode or location. 
