Educational institutions across the world are bracing themselves for the widespread dissemination of artificial intelligence (AI) technology. Monmouth professors and students weighed in on the topic, discussing their thoughts on technology that may change how educators teach and assess students’ writing in the future.
Stanley Blair, Ph.D., Associate Professor in the Department of English, began, “Like any technology, this type of artificial intelligence has the potential to be a beneficial tool or a harmful weapon. The difference between the two has less to do with the technology itself than with the person who uses it.”
AI tools are not limited to technology akin to the highly debated ChatGPT, one of the more popular chatbot assistants to debut this past year. Rather, spellcheck and autocorrect also fall under the umbrella of artificial intelligence.
Heide Estes, Ph.D., Professor in the Department of English, put ChatGPT to the test, evaluating its efficacy and analyzing its results from the perspective of an educator.
“I asked ChatGPT to provide an environmentally focused reading of some Old English riddles. It provided a brief summary of ecocriticism, the critical area that deals with environmental readings of literature, and a brief summary of Old English riddles. However, it did not do a good job in connecting them, provided no detailed analysis of text, and invented citations by famous academics (but not academics who write about ecocriticism),” said Estes.
She concluded that ChatGPT cannot analyze literature, and if it doesn’t have the information someone asks for, it will make something up.
“This program and others like it, as they currently exist, will challenge faculty to write better paper assignments. If students are required to connect things they read to experiences they have had in or outside the classroom, ChatGPT will not produce an appropriate answer,” said Estes.
John Morano, Professor in the Department of Communication, elaborated on the distinction between what is familiar AI and what is not, and how it translates in the classroom.
“As a journalism professor, most of what my students produce is original news writing…Typically, the material they use is generated from Monmouth sources, making it somewhat harder for them to use online applications. That said, once they have their information in hand, I am concerned that they don’t use any outside technology to craft the story,” said Morano.
He expanded, “But to be honest, in a smaller way, students have used spelling and grammar checkers for years— I encourage it. Yet, I do think we’ve crossed over onto another level of magnitude with recent AI developments.”
Aaron Furgason, Ph.D., Chair and Associate Professor in the Department of Communication, elaborated, “While AI is new to the college experience, we can all agree that it will alter the classroom experience, just as computers and cell phones have.”
Blair continued to discuss the positives and negatives of AI in academia, prefacing that its effect will likely depend on the academic discipline and the type of assignment.
“A claimed positive is that the technology will somehow liberate faculty and students to focus on more advanced approaches to disciplinary content and skills. This position seems based on assessments that the technology is good at summarizing and not so good at having and arguing for an opinion,” said Blair.
“[On the other hand], a concern in my field and others is always the validity of writing samples. One aspect of validity is academic honesty, the idea that a student’s submitted writing is substantially their own work. This issue may be addressed through technologies that detect AI-generated text. But it can also be prevented through writing-assignment designs that consider the AI’s weaknesses and that are not based on its strengths.”
Michael Phillips-Anderson, Ph.D., Associate Professor in the Department of Communication, echoed Blair’s concerns about academic dishonesty with a hypothetical scenario.
“ChatGPT might be helpful in putting together a first draft of a short paper. Nonetheless, the student would still need to do considerable research to confirm that the information used is correct and relevant. The concern is always that students are short-changing themselves by copying the work of others,” explained Phillips-Anderson.
Blair responded, “It has always been true that some students see the fraudulent acquisition of academic credentials as indicating, after graduation, that those alums have acquired the knowledge and skills the credentials represent. That fraud is eventually discovered. The student or alum may suffer, but less obviously, the University’s reputation suffers as well, and therefore the value of the degree current students are pursuing. When others cheat, you lose.”
“Clearly, students who cut corners, who complete assignments without gathering, processing, and presenting information themselves are fraudulently pursuing a degree. And while they might be able to get away with submitting an assignment in a classroom, trying to perpetrate the same fraud in a newsroom will likely fail,” reflected Morano.
Estes added, “If students use ChatGPT and similar text generation programs to write papers, they will not learn how to find information, how to evaluate the information they have found for accuracy, relevance, or currency, or for the authors’ affiliations and potential biases. They will not learn to think about the information they have found or to organize their ideas in written or spoken forms.”
Monmouth students appear just as torn about the issue as their professors.
Delaney Buday, senior English student and peer writing assistant, commented, “Artificial intelligence is a topic that I—and I’m sure many others—have mixed feelings about. While it’s a marvel of what can be done with technology and machine learning, it can also shine a rather ugly light on the ways in which these systems are used to infringe on and steal from creative communities.”
Anna Huber, a senior English student and peer writing assistant, said that although she has not had enough experience with AI as a student or future educator to evaluate its effectiveness, she believes it can be a valuable tool in the classroom. “For students with learning difficulties or special accommodations particularly, there seems to be unlimited ways for AI tech to support them in the classroom. However, I do stress that AI should remain a tool. I do not think that AI can become the center of education,” said Huber.
Buday concluded, “For professors, I can only imagine how confusing this new form of academic dishonesty can be, and how it may sow distrust in extreme cases between student and teacher. As a tutor myself at writing services – and one who has had the fear of plagiarism, accidental or otherwise, drilled into my very core from both professors and my line of work – this also adds a layer of complication to the help I and others try to provide.”