AI: Good for students, bad for education?

Artificial intelligence (AI) writing tools like ChatGPT are gaining traction in academia. While these tools help students breeze through assignments with auto-generated essays, experts argue this can undermine actual learning and thinking skills. As AI capabilities advance, difficult questions are emerging on how to leverage the technology ethically to enrich education rather than obstruct it.

Artificial intelligence (AI) tools like ChatGPT are gaining popularity among students for completing assignments and essays. While these tools can help students work more efficiently, experts warn of the adverse effects on learning and education.

ChatGPT and similar AI writing assistants can generate entire essays within seconds on virtually any topic when prompted. They are able to do so because they are trained on vast text datasets. Students are increasingly using these tools to speed up essay writing and bypass the learning process.


“It’s tempting for students to use ChatGPT to quickly generate an essay that would take hours to research and write manually,” said Dr. Sarah Green, an education psychologist at University College London. “But this prevents genuine learning and defeats the purpose of the assignment.”

Artificial intelligence (AI) writing tools like ChatGPT are gaining traction in academia. Photo: Getty Images

While AI tools may help students save time and meet deadlines, experts highlight that this leads to shallow learning. “The algorithm generates text based on its training data, without any actual understanding of the topic,” explained Dr. Green. “So the student misses out on acquiring knowledge or skills in analysis, critical thinking, forming logical arguments, and citing credible sources - abilities that are crucial both in academia and the workplace.”

Some worry that dependence on AI writing assistants like ChatGPT will degrade writing abilities among students. “If students get into the habit of outsourcing essay writing to algorithms, it may impact how they structure arguments, adopt critical perspectives, and express ideas in their own words,” said Michelle Brown, an English professor at the University of Birmingham.

There are also ethical concerns given that AI tools can generate paragraphs that sound convincing but may include factual inaccuracies or plagiarised content. Students risk submitting work that is partly or fully AI-generated without appropriate attribution which would constitute academic misconduct.


Hamza Haroon, an academic writing specialist and content marketer, added: “This situation seems to be getting out of hand. We all know that we cannot ban AI - there’s no question that we should be using AI in education. But we need to regulate things properly.

Some universities use Turnitin or similar tools, which can detect whether content is copied from published sources.

Such tools struggle to detect AI content, however, because AI writers are built on large language models. They produce rephrased, original-seeming text, so plagiarism checkers won’t be able to flag it.

To detect AI content, educational institutions have started deploying AI checkers, but we know they aren’t perfect.


So a student can use generative AI tools like ChatGPT or Bard to write an entire essay or assignment, then head to an online AI tool to reword it and make the work look human-written. They will then run it through an AI writing detector to make sure the work shows no AI similarity. This makes it almost impossible to find out whether a particular assignment or essay was actually written by the student.

The problem isn’t the tools themselves - we all know these tools are here to help students and writers improve their content. What matters is how someone uses them.”

In response to such concerns, the UK government has outlined measures to update academic integrity policies. Schools and universities are also putting systems in place to detect AI-generated text. But tight regulation also risks limiting innovation. “We must find the right balance so creativity and productivity enabled by AI are encouraged, not stifled - but learning outcomes are still met,” said an education ministry spokesperson.

As AI capabilities grow more advanced, questions continue to emerge about how education itself must change. Courses should involve dynamic assessment models focused less on memorisation and more on the evaluation and application of knowledge. We need to reimagine learning objectives to develop holistic understanding and skills that AI cannot replicate.


The consensus is that responsibility lies not just with policymakers but also with educators and students to ensure ethical usage. Students need guidance on avoiding misuse while also taking advantage of AI judiciously to enrich learning - for instance, by using it as a tutoring tool. Used strategically, AI can improve outcomes; used improperly, it defeats the very purpose of education.