Teaching with Chad

“Human beings have dreams. Even dogs have dreams, but not you, you are just a machine. An imitation of life.” -- Detective Del Spooner, I, Robot


Alex Proyas’s movie I, Robot presents a dystopian future replete with robots who interact with humans on a daily basis. Eerie in their human-like resemblance, the robots operate under “Laws of Robotics,” designed to protect humanity at all costs, even if it means protecting humans from themselves. The potential for overreach is blatantly clear and is explored throughout the remainder of the film, which serves as a warning against human dependence on technological advancement.

Set in 2035 but released in 2004, I, Robot featured technology that seemed far-fetched and impossible to most viewers at the time. Within the past few years, however, AI (artificial intelligence) has evolved and infiltrated even the most mundane parts of our lives, suggesting an even heavier presence in the near future.


Major tech companies have embraced and promoted AI chatbots. Both Google and Meta now have their own versions to compete with the uber-popular ChatGPT. 


My knowledge of this level of technology is limited, to say the least. My students’ skills with chatbots and AI, however, are not. 


Beginning in January, my students started talking more and more about getting help from “Chad” with their papers and essays. References to “Chad” came with “the side eye,” smirks, and giggles all around. Just who was this “Chad” character?


Thank God for NPR. Not too long after Chad became a hot topic in my classroom, NPR ran a story about ChatGPT, and I started connecting the dots. ChatGPT is the elusive “Chad”!


The NPR correspondent presented the pros and cons of chatbots, highlighting their potential to save real estate agents time by generating efficient and compelling descriptions of houses for sale, freeing the realtors to devote more attention to open houses and other responsibilities. And, of course, ChatGPT can write an essay or a research paper on a variety of topics. 


ChatGPT is trending in pop culture, too. A recent episode of South Park featured students using ChatGPT to write essays and text girls. The kids were trying to score in the classroom and out! The best part of the whole thing: the creators co-wrote the episode about ChatGPT with ChatGPT. Talk about meta!


It’s almost too easy to invite “Chad” into your assignments. All you have to do is open up ChatGPT and enter a command, like, “Write a multi-paragraph essay on the French Revolution.” And voilà! An essay emerges. 


I had to have “the talk” with my students about the ethics surrounding the use of “Chad.” I compared chatbots to other online sources like SparkNotes or GradeSaver, although it’s not really the same. Through the years, English teachers have become regular Sam Spades, pasting portions of essays into Google searches or using turnitin.com to catch online plagiarism. Those investigations now involve running student writing through “AI checkers” (which aren’t very reliable), looking for robot-crafted passages that don’t resemble our students’ voices. I ended my speech with a warning: Chad might write an essay for you, but he also might invent sources and quotes to populate it (true story).


The first question after I finished: “Who ratted us out? Who told y’all teachers about ChatGPT?”


Um, The New York Times?


As if ChatGPT’s sole purpose is to allow students to plagiarize, and as if some student at my school spilled the beans and ruined everything for students around the world. 


Second question: “Yeah, but is it really plagiarizing? You can’t find the source I used, and there’s no record of the writing anywhere else. AI isn’t even a real person.”


Tougher to answer. The ethics surrounding chatbots are unique and troublesome, particularly for schools. 


Teachers have noticed “red flags” that often indicate AI-created material. Many paragraphs include several sentences in a row that begin with the same word or a series of sentences that are just reworded versions of each other. 


Revision histories offer insight into the students’ creative processes as well. Opening a document at 7:56 pm, pasting a block of information at 7:57 pm, and then submitting the assignment at 7:58 pm might raise suspicion.  


Most telling, though, is the lack of “heart” and “humanity” within the generated writing. My years of experience teaching writing have taught me to recognize my students’ individual narrative voices, styles, and expressions.  In comparison to these unique compositions, bot writing feels stiff and “mechanical.”


Still, AI has a lot of potential to help us, even in the education setting. Chatbots’ primary function is to create conversation and help users perform “routine” work that will ultimately free them up to complete other (more important) tasks. (At least I think that’s the goal! This article is definitely not designed to explain the technical side of AI!)



As a classroom teacher, I can ask “Chad” to create discussion questions, a quiz, or a writing prompt for the novel that I am teaching. Students can ask for an explanation of semicolon rules with examples or help solving a complex math problem, almost like having a private tutor at their beck and call whenever they need it. 


So is this the new “normal” for classroom instruction? Perhaps. But maybe students will begin to see the flaws in the system that we already recognize, and the novelty will eventually wear off. Maybe teachers will restructure assignments to avoid the temptations of bot-generated content, and the entire landscape of education will be transformed. It wouldn’t be the first time education had to change to accommodate an event of worldwide significance (see March 2020).


(This article was originally published March 25, 2023, in the Southern Spice section of Times-Georgian.)
