Date Uploaded
Summer 7-19-2024
Activity source
Original
Summary
The instructor asks ChatGPT questions about a key concept until finding a response with conceptual errors, then asks students to identify and explain those errors.
Extended Summary
AI often generates information that, while sounding confident and correct, simply isn't. Particularly with deeper concepts, ChatGPT is likely to make mistakes similar to those of students who have only a surface-level understanding. This is an idea for an exam question (or in- or out-of-class activity) in which the instructor asks deeper conceptual questions of ChatGPT (or another AI) until finding a response with numerous issues. This process works particularly well with concepts related to computational equations and the assumptions behind them, a common topic in engineering, physics, and chemistry. As part of the exam, you can ask students to annotate where a few of the issues are and explain them. I add some lightheartedness by framing it as students correcting their "fellow intern, Chet Geepeatee".
Student Learning Objectives
Students will:
1. Test their conceptual understanding of a topic
2. Gain an understanding of how accurate AI is (and isn't)
Assignment Type
Both in-class and out-of-class
Course level
Multiple Course Levels Apply
Used in course?
yes
If yes, enter name of course in box
Process Analysis and Thermodynamics
Type of Student-AI Collaboration Required
Completely AI-generated (human evaluation only, no intervention)
Type of AI Task(s)
Summary
Second Type of AI Task
Problem Solving
Uploader/Author Affiliation
Faculty
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Document Type
Teaching Material
Recommended Citation
Brennan, Janie, "Activity: Students find conceptual errors in AI output" (2024). Generative AI Teaching Activities. 6.
https://openscholarship.wustl.edu/ai_teaching/6
Included in
Chemical Engineering Commons, Chemistry Commons, Mechanical Engineering Commons, Physics Commons