POV: With ChatGPT’s Arrival, Should Educators Be Mourning the End of the College Essay?
“With these advances in AI technology, we have an opportunity to think deeply about our ‘why’ and reevaluate what we can and should be teaching in a changing world”
In the weeks since the release of ChatGPT, a new AI technology that can write convincingly humanlike passages of text, we have seen a flood of worry among educators that students will use AI to write their term papers and that “ChatGPT will make cheating easier than ever.” While this technology will certainly impact how we use writing assignments, plagiarism isn’t high on our list of concerns. Rather than focusing on work-arounds—so we can use the same assignments we have always used—we think we need to have a much bigger conversation. In the midst of a changing world, we need to ask ourselves what we teach students and why, and sort out which learning objectives we should retain, which will become obsolete, and which we should add to the curriculum.
Automated text generation has the potential to be as transformative as the printing press. Trying to prevent students from using this technology, as the New York City education department has attempted, seems as impossible and unnecessary as trying to force people to travel by horse and buggy after cars were invented. Moreover, trying to maintain the status quo might actually do a disservice to our students. This technology will change many of the professions where our students will ultimately work. They’ll have to navigate a world that makes use of language generation technology once they graduate, and we should let them explore how to use these tools appropriately in spaces designed for learning and exploration.
Instead of trying to preserve writing assignments just because that is how we’ve always done things, let’s look at the purposes these assignments serve and consider how best to achieve them. Through writing, students learn to organize their thoughts, draw together evidence, synthesize complex ideas, develop compelling arguments, and much more. While AI can help with some of the mechanics of writing, students will still need to hone many of these skills. In one course this semester, we spent several class meetings on how to prepare and format a research manuscript. If formatting a paper is not a skill students will need in the future, perhaps that time could have been better spent working on these deeper intellectual skills. Writing assignments are also a way for instructors to evaluate what students have learned and what more we need to teach them. In some cases, we imagine writing assignments are still the best means of serving these purposes, but in other cases, other kinds of assignments might be more useful.
Just as society had to build literal guardrails to make driving cars safe, we are going to need to figure out how to prepare students to handle the risks inherent in this new technology. With the ability to generate text instantly, we will have to redouble our efforts to teach students how to wade through the coming influx of text to identify misinformation. ChatGPT can make spectacular mistakes that students will need to learn to spot. In an essay where we asked it to write about American Sign Language, it wrote, “A sign for ‘I’m scared’ might be made by pressing a flat hand against the forehead.” This gesture—::facepalm::—is not correct. Students will need to know when AI works well and when it doesn’t (e.g., in smaller fields, like deaf education, the technology doesn’t have much text to learn from, so it doesn’t work very well). We will need to teach (and create) conventions for properly attributing sources so readers can differentiate between AI- and human-written text. AI is trained to write based on human writing, and so it has learned to reproduce our ableism, racism, sexism, and other biases, especially in response to biased prompts. We will need to teach students how to root out and respond to toxic, biased, and harmful text, no matter how it is produced. Students will also need to learn strategies for getting useful output and employing it in constructive ways. For fun, we prompted ChatGPT to write this essay; while the resulting essays were not very creative (they shared much of the same content, structure, and wording), the prompt made a big difference in essay quality. These are just some of the foreseeable skills students will need to learn.
This is hardly the first time educators have had to grapple with how and when to use a new technology, or the first time people have feared that new technologies would destroy students’ thinking. Even Socrates worried that teaching students to write at all would lead to a populace with weak memories. Before calculators and computers, statistics was taught with paper and pencil. Now most statistics courses not only allow students to use computers but explicitly teach students how to use the software. Certainly, there are times when it is pedagogically important to ask students to put their calculators away, but we have also made space for technology in our instruction. By delegating the tedious parts of the calculations to a computer, statistics courses can spend more time on “why” and delve into much more sophisticated techniques that would be impractical to carry out by hand. With these advances in AI technology, we have an opportunity now to think deeply about our “why” and reevaluate what we can and should be teaching in a changing world. In the end, making our assignments more meaningful to students may actually be a very effective way of dealing with plagiarism, as students are less likely to cheat when they are engaged in work they believe will help them become the future selves they envision.
Note: This essay was generated by human beings, with the help of spell-check.
Naomi Caselli (Wheelock’09, GRS’10) is a Wheelock College of Education & Human Development assistant professor of deaf studies and deaf education and codirector of BU’s AI and Education Initiative. She can be reached at nkc@bu.edu. Christina Dobbs is a Wheelock assistant professor and codirector of the English Education for Equity & Justice program. She can be reached at cdobbs@bu.edu. Derry Wijaya is a College of Arts & Sciences professor of computer science and codirector of the AI and Education Initiative. She can be reached at wijaya@bu.edu.