ChatGPT has dominated news headlines since its release in November 2022, and it has been followed by an explosion of GenAI tools and services. These tools and services are likely to keep growing and developing new capabilities. How can the academic community stay up to date with the latest developments and be supported to leverage AI for academic purposes?
Earlier this year, colleagues at the Centre for Higher Education Research, Teaching, and Learning (CHERTL) created and shared three guides to support students and lecturers in using AI. The guides are available publicly (linked below) and on RUconnected. Rhodes University is one of the few South African universities to have produced such guidelines. The student guidelines introduce Artificial Intelligence (AI) broadly and outline some current debates about the advantages and disadvantages of using AI. They focus on Generative Artificial Intelligence (GenAI) tools and explain how these tools generate outputs, how to evaluate those outputs, and how to use the tools ethically and responsibly. The two guides for lecturers focus on learning and teaching, and on assessment. Both are underpinned by a critical AI literacies approach, which involves understanding how GenAI works, recognising the inequalities and biases associated with GenAI, examining ethical issues in GenAI, crafting effective prompts and assessing appropriate uses of GenAI (Bali, 2024).
The guidelines are research-informed and the result of CHERTL colleagues' extensive engagement over the past two years in local, national and international conferences, workshops and conversations in various formats, where they have been called upon to share their expertise on GenAI tools and services. CHERTL has facilitated cross-departmental conversations, departmental workshops, and sessions with student residences and with postgraduate students on using AI ethically in research. Colleagues have also presented at the Registrars’ Imbizo, a Pharmacology conference and the national SAERA conference last year and, more recently, at the international Higher Education Close-Up 11 (HECU 11) conference hosted at Rhodes University.
Among these contributions, Neil Kramm co-authored a paper with former CPGS director Prof. Sioux McKenna for Teaching in Higher Education entitled “AI amplifies the tough question: What is higher education really for?”. Mr Kramm is currently working on a PhD study investigating the implications of GenAI for assessment design. He has delivered a keynote for the University of Fort Hare, presented at the University of Kent, Imperial College London and the Central University of Technology, and facilitated workshops for the University of Zululand, among others.
In her role as the Digital Learning and Teaching team project lead for the Higher Education Learning and Teaching Association of Southern Africa (HELTASA), Dr Pallitt collaborates with colleagues across universities to host and facilitate regular online events. The most recent digital dialogue explored the implications of AI for curricula. Her interest is in critical AI literacies, and she has co-facilitated online workshops on this topic with Prof. Maha Bali (American University in Cairo, Egypt). Dr Pallitt has also co-authored papers on equity-oriented learning design and brings an interest in socially just participation to her AI support work.
Associate Prof. Mags Blackie’s scholarship on knowledge building has also given CHERTL colleagues’ work on AI a critical edge. She has written ‘Reimagining Assessment Practices at Rhodes University’, which is currently under discussion. She encourages lecturers to draw on knowledge-building frameworks (Blackie, 2022) and a taxonomy of restrictions in assessment tasks (Dawson et al., 2023) to reconsider appropriate assessment practices in light of GenAI. Her point of departure is that GenAI will substantially change the world of work, and that we therefore need to do better at preparing students for lifelong learning.
Educational Technology Specialists Neil Kramm and Dr Nicola Pallitt emphasise the importance of keeping up to date with the latest developments in AI and of engaging in ongoing conversations with staff and students to ensure the guidelines remain relevant and effective. The guidelines aim to support lecturers and students in using AI appropriately and in ways that promote disciplinary knowledge building. Lecturers in particular are encouraged to use the guidelines to facilitate conversations about appropriate uses of AI in their disciplines. Kramm and Pallitt note that quite a few Rhodes University lecturers have also contributed to AI conversations in their disciplines internationally, which is particularly exciting.
Links to the guides:
Student Guidelines for the Use of ChatGPT and other Generative Artificial Intelligence Tools and Services https://bit.ly/3HW4xPr
Learning and Teaching with AI tools https://bit.ly/3UBO9eD
Guidelines for Assessment in the time of AI https://bit.ly/3UEIT9U
Links to a selection of articles and event recordings for interest:
Blackie, M. A. L. (2024). ChatGPT is a game changer: detection and eradication is not the way forward. Teaching in Higher Education, 29(4), 1109–1116.
Kramm, N., & McKenna, S. 2023. AI amplifies the tough question: What is higher education really for? Teaching in Higher Education, 28(8), 2173–2178.
The need for critical responses to AI within and by Higher Education (e/merge Africa webinar, recording here)
Critical AI literacy workshop (MyFest23, recording here)
ChatGPT is the push higher education needs to rethink assessment, The Conversation