Artificial Intelligence in Education
The release of OpenAI's ChatGPT in November 2022 disrupted predominant methods of teaching, learning, and assessment in education overnight. And ChatGPT is just one Large Language Model (LLM) tool; thousands of others exist. Academic integrity was obviously a primary concern: which assessments could be generated by an LLM? Which exam questions could be answered? Is essay writing dead?
The world, and the world of education, are still working to determine how LLMs and other Artificial Intelligence tools (generative or predictive) are going to change what we do, how we do it, why we do it, and when and where we should do it. Teaching, learning and assessments all need to change.
This page will try to keep the UC San Diego community up-to-date on all things related to artificial intelligence and academic integrity in higher education.
Generative AI (GenAI) tools, like Claude or ChatGPT, produce text, images, and code in response to a prompt, and they do it quickly and in ways that are increasingly difficult to detect. GenAI tools present both challenges and opportunities for education. They can replace or augment human thinking. They can take over or enhance problem-solving. They can hinder or amplify human creativity. And they can hinder or accelerate learning.
Students are already widely using the technology. In one survey, 46% of teens reported having used the tools, and 46% of those had used them to complete assignments or exams. In another survey, 35% of teens admitted to using the tools at least once a week, and 74% said they use them to improve their school performance. Yet students are also asking for help in using these tools ethically and responsibly. Students worry that their critical thinking skills will decline with AI use, and 47% believe it's easier to cheat with the tools. Meanwhile, 70% believe universities should provide AI training, and 55% feel unprepared to use generative AI technology in the workforce.
Although there is not yet campus-sponsored AI Literacy training, you can help educate students. This module, adapted from one created by Rush University, aims to help you engage students in conversations about AI, even while it is still relatively new to all of us. Your expertise in the course content, along with this AI Literacy Module and the AI Office's Guide for Instructors, can help you guide students in using GenAI tools ethically and responsibly for your course(s).
"We assume our students are inherently digitally native, but they don’t always understand the professional, ethical ways to deploy new technology like gen AI. They need to be taught how to use these tools responsibly, effectively, and efficiently" ~Sid Dobrin (Warner et al., 2024).
Education and technology are increasingly interconnected, from the prevalence of Microsoft Office and Google products, to learning management systems (Canvas), to Zoom rooms, and now GenAI. It has become challenging to teach, learn, and assess without technology.
AI literacy matters because we - instructors and students - are not always cognizant of what tools we're using, why we're using them, and how they impact our approaches to teaching, learning, and assessment. Students accept Grammarly suggestions without realizing that they are AI-generated, which can make their writing sound robotic. Instructors may falsely believe that so-called "AI detectors" are valid and accurate, and use them to "catch" students who are cheating with GenAI. Other AI tools work behind the scenes - like the AI summary that appears atop a Google search - so innocuously that many do not realize that reading the summary has replaced the critical evaluation of search results they used to perform themselves.
AI Literacy helps instructors and students:
AI literacy is the ability to identify if, when, where, and why to use GenAI tools. Being able to do this requires that users understand the capabilities, limitations, and broader implications (e.g., ethics) of GenAI.
Key aspects of AI literacy include:
It's crucial to note that AI literacy is not about promoting AI use. Instead, it's about equipping individuals with the knowledge and critical thinking skills to navigate a world where AI is increasingly prevalent. This includes recognizing situations where human judgment and skills are irreplaceable.
By fostering AI literacy, we aim to create a society of informed individuals who can engage with AI technologies thoughtfully and judiciously, always prioritizing human values and societal well-being.
The Critical AI Literacy Canvas Module was adapted by the Academic Integrity Office and the Teaching & Learning Commons from a module designed by The Center for Teaching Excellence and Innovation (CTEI, Rush University). It is meant as a resource to support instructors in introducing students to fundamental AI competencies, so that students might become critical users and understand if, when, and how they can leverage the technology responsibly and ethically.
The module aims to build in students the four competencies of AI literacy - Recognition, Comprehension, Critical Thinking, and Proficiency - through key questions that students ask about GenAI: What is artificial intelligence? How does it work? How do I use it? And how can I use it responsibly?
By integrating this module and these additional strategies into your course, you can empower students to use GenAI tools ethically and responsibly.
"The only bad way to react to AI is to pretend it doesn’t change anything" (Mollick, 2023).
New Book by our Director, Tricia Bertram Gallant
Tricia and her coauthor David are recording podcast episodes with instructors and practitioners who have put the strategies from the book into practice. These episodes offer timely, inspiring lessons learned to help others figure out which strategies might work in their own context.