Editorial

AI in higher education

Artificial intelligence has moved quickly from a background tool to a central point of tension in higher education. As universities increasingly introduce AI into coursework, students are already confronting how it will shape their futures.


Editors were first asked whether the degrees they are currently earning will still be relevant in ten years, or if AI will fundamentally change their fields. One editor pursuing journalism expressed cautious confidence, saying, “AI will never completely take over the field of journalism because AI can’t do what journalists can. In fact, journalists have to be that much better than AI.”


Another editor echoed that belief, emphasizing the importance of human interaction: “While AI can write, yes, it currently doesn’t have the same ability to interview people that a human interaction allows for.”


A third editor viewed the issue through a slightly different lens, explaining, “The end product of said labor can most definitely be replicated, but not replaced. AI could push the industry to set a new bar given the albeit easy alternatives.”


The discussion became more divided when editors considered whether it is ethical for professors to use AI for grading or lesson planning while students face penalties for similar tools. One editor found the imbalance troubling, noting, “If I put my time and effort into a writing assignment and my professor is unwilling to sit down and read my work, that’s upsetting.” Another editor was more blunt, stating, “If I can’t use AI, professors should be held to the same standard. Otherwise, I’d consider anti-AI regulations total hypocrisy.”


One editor rejected the idea outright, arguing that trust is central to evaluation. “Definitely not. I would not trust a professor to grade my work if it meant they would be using the same software confidently telling users 2+2 somehow equals 5.”


Editors also reflected on where the line for cheating should be drawn in an era when AI can brainstorm, outline, and edit within seconds. One editor warned that even limited use can erode skill building, explaining, “I think AI is taking away skill. To think a machine can do that for you without getting any creative juices flowing is sad.”


Another editor offered a more flexible perspective, saying, “Relying on AI for help with an email, or an outline, or for grammatical edits similar to Grammarly can be alright. I think using ChatGPT to do an entire assignment can be fair ground for academic dishonesty.”


One editor built on that point, adding, “Could it do the latter? Absolutely. But what is there to gain from such besides a bigger risk than reward, AKA academic dishonesty.”


When asked what human quality AI will never be able to replace, editors found rare agreement.


One editor answered simply:
“Authenticity.”


Another emphasized connection, saying, “The human connection, especially in the communication field with journalism, marketing, and public relations, can’t be replaced.”


A third editor expanded on the idea, stating, “Empathy. An AI could be trained on thousands upon thousands of texts, documents, and reports, but it will still lack the most basic human quality, the ability to put oneself in another’s shoes.”


As universities continue to navigate how AI fits into academic life, students remain caught between innovation and integrity. The challenge ahead is not whether AI should exist in education, but how it can be used without undermining trust, effort, and human growth. If classrooms at Monmouth University are intended to prepare students for the real world, they must recognize that while technology evolves, the human qualities behind learning still matter most.