
Incorporating AI in the classroom ethically

What do you think of AI? (Redux)

In the blog post “Generative-AI-resistant assignments,” we posed the question, “What do you think of AI?” Whether we love it, hate it, or fall somewhere in the middle, AI in general and generative AI in particular are here to stay. Students will need to be able to use these tools in the workplace of the not-so-distant future, but how can and should we teach students to use them responsibly and ethically?

AI detection: A losing battle?

The concept of a program to detect the use of generative AI is obviously appealing, but unfortunately what these detectors mostly provide is a false sense of security. Research has shown that AI detectors are unreliable and inaccurate, that they are biased against non-native English writers, and that students can easily get around them. Another issue is the lack of a paper trail: faculty are expected to trust the results without any details about the methodology used to determine whether the writing is AI-generated and without any information about the original source(s) of the writing, details faculty would normally include in a plagiarism report.

Some faculty are even turning to unethical practices to catch “AI cheaters.” One method gaining popularity is embedding a “Trojan horse” in the assignment prompt: the instructor inserts an absurd or nonsensical instruction somewhere in the prompt, highlights that text, and changes its font color to white so that students cannot see it, but it will be fed into ChatGPT if a student copies and pastes the prompt there. The premise is that if the student submits AI-generated writing, it will include a reference to this “Trojan horse.” This strategy is unethical for a number of reasons. First, it can erode trust between students and their educators. Students will catch on to this strategy, and it will break whatever connection a student and their faculty member have built.

Second, some of our core values at NIU include that “We model ethical behavior in and out of the classroom” and “We promote our students’ success through advising and mentoring.” We are not modeling ethical behavior if we assume our students will cheat and use deception to “catch” them out. We don’t promote our students’ success if we aren’t advising them on when and why they shouldn’t use AI—and when, why, and how they can and should use AI. Generative AI isn’t going anywhere. Our students will need to engage with it ethically, and we need to help them figure out how to do so. We also need them to understand that using AI unethically is detrimental to their learning and future success. We don’t accomplish any of these things by engaging in deceptive teaching and assessment practices.

Finally, the “Trojan horse” strategy may discriminate against non-native English writers and students with disabilities, particularly students who use screen reading technology. If a non-native English writer uses a translator to help them understand the prompt better, they will likely be confused by the “Trojan horse,” but since it is in the instructions, they may include it in their essay anyway. The same thing may happen with a blind student who uses a screen reader. The screen reader will read out the instructions to the student, including the “Trojan horse.” The student may believe it is a legitimate, though odd, requirement for the assignment and include it in their submission.

Why should we integrate AI into our courses? 

We are already using AI in higher education, often without recognizing it. We use it when we assign adaptive learning programs, such as Wiley Plus, Cengage MindTap, Pearson MyLab, or Lumen Learning Waymaker. We use it to interpret data and learning analytics. We use it to check our grammar and spelling. We use it to translate a scholarly article that’s written in a language we aren’t fluent in. We use it to target students for recruitment, admission, and retention efforts. We even use it to try to detect student plagiarism.

Moreover, workers are using AI in their jobs right now. Employers want students to graduate and enter the workforce with the skills to use AI effectively and efficiently. Students want to be prepared for jobs that will invariably incorporate AI in some capacity. To enable our future graduates to be competitive candidates, we need to help them demonstrate that they have the skills prospective employers are looking for, including AI skills.

Beyond employment, students who are taught the capabilities and limitations of AI along with the ethical implications around its use may be better prepared to engage with misinformation, disinformation, and deep fake images and videos. They may be more inclined to be skeptical of information they encounter online. By integrating AI into our classes thoughtfully and ethically (alongside information literacy), we can help our students become more skilled users of AI as well as more discerning consumers of information and media.

How can we integrate AI into our courses? 

How can we assess students’ learning and progression toward learning outcomes in meaningful and engaging ways while integrating AI thoughtfully and ethically? What assignments can we create that students will actually want to do and that will work best in tandem with AI? The task of changing our approaches to teaching and assessment in the age of generative AI may feel compelling, repellent, or an uneasy combination of both. However, as we prepare our students for the workplace of the future, we cannot ignore the role AI will play in their future careers and the impact it will have on which careers remain available to them, are transformed, or are supplanted entirely by AI. There are some specific ways we can integrate AI into our courses, and even small changes we can make to ease ourselves and our students into the process.

AI policy collaboration

Collaborate with students on an AI policy for your class. Have students do some research so they can learn more about AI and provide informed suggestions for how it should be handled in your particular course. They could discuss and decide together when it would be appropriate to allow the use of AI and for which tasks and activities it should be limited or prohibited. Through this process, students will develop a better understanding of AI and when it’s appropriate or inappropriate to use, and they will have a sense of ownership over the course policy (FYI, this also works for collaborating with students on other course policies).

Enhance learning

Teach students how to use AI tools in ways that will enhance their learning. For example, have students use AI to brainstorm ideas for a project or paper. Show students how to use AI to generate an outline for a paper they will ultimately write themselves. Go a step further and have them critique and revise the brainstorming output or outline instead of just using it unquestioningly. If you don’t want to require students to create a login or feed their work into AI algorithms, you could do these exercises as a whole-class activity projected on a classroom screen. You can also use Microsoft Copilot, which is included in NIU’s Microsoft 365 subscription. Students can use their NIU account for access and benefit from commercial data protection to maintain their privacy.

Critique AI with students

Assign students to evaluate and critique AI-generated content in class. Students could identify sections of AI-generated writing that need revision, refinement, or fact-checking. You could task students with discovering a factual error along with a source to support their finding. Speaking of sources, encourage students to evaluate the lack of source citations—or, if you have AI generate citations, have students investigate whether those sources actually exist, are cited properly and correctly, and are used appropriately within the text. Discuss with students what context may be missing from the AI-generated content that leaves it “lacking.” Also, have them point out the strengths of the AI-generated content. Encourage students to reflect on what we lose (or give up) if we rely on AI to generate content for us, and conversely, how AI could be used (ethically) to benefit humankind.

Teach about AI

Discuss how AI content is generated and the ethical implications therein. Discuss where AI images come from. Explore the intellectual property and copyright issues around generative AI. Explore plagiarism and citation issues around AI-generated content. In teaching about AI, you could also teach students about AI in your field. What does AI look like in your discipline? How is it used? How could it be used? Have students discuss the practical and ethical implications of AI in/on your specific discipline. This is a discussion that would look very different for different courses and disciplines and could illuminate for students how AI is more than just ChatGPT. 

Explore AI as a component of digital literacy

As stated by the American Library Association: “Like information literacy, digital literacy requires skills in locating and using information and in critical thinking. Beyond that, however, digital literacy involves knowing digital tools and using them in communicative, collaborative ways through social engagement.” Consider how you already integrate discipline-appropriate and specific digital literacy into your courses. How could you go a step further to include AI in ways that make sense for your course and that complement your existing efforts to increase students’ digital and information literacy?

Final Thoughts

We’ve been using AI in our everyday lives for years. We use it to check our spelling, open our smartphones, play music for us on smart speakers, change the temperature on our thermostat or preheat the oven, avoid traffic while using navigation apps, detect fraudulent charges on our bank accounts, and get recommendations on streaming apps. Faculty, students, and higher education staff and administrators have also used AI in a multitude of ways, as mentioned above. AI isn’t going to disappear, and our students’ lives and careers will be impacted by it in one way or another. Rather than ignoring or resisting it, let’s use AI as a catalyst for rejuvenating and inspiring our approaches to teaching and assessment to prepare our students for responsible citizenship and the job market of the future.
