In a recent Public Lecture on Large Language Models (LLMs) and the research industry, a heated debate erupted over the use of AI tools like ChatGPT in academia.
An LLM is a type of artificial intelligence algorithm that uses deep learning techniques and massively large data sets to understand, summarize, generate, and predict new content. The term generative AI is also closely connected with LLMs, which are, in fact, a type of generative AI specifically architected to generate text-based content. The most widely used LLM-based generative AI tools include ChatGPT and Gemini.
While proponents of AI maintained that tools like ChatGPT can be valuable aids for students, critics feared that students could become overly reliant on these tools, leading to a decline in their critical thinking and problem-solving abilities.
But as with all technological advancements, when we resist embracing innovations like ChatGPT and other AI tools in academia or any other space, we risk falling by the wayside ourselves. Remember the infamous “shape up or ship out” mantra? Well, the same principle applies here.
Innovations don’t thrive without support and proper integration. And guess what? Students are struggling to understand why they can’t use these tools for their exams or assignments. To be honest, ChatGPT helps with multiple tasks, such as research, writing, formatting, and grammar. Moreover, millions of people are already using these tools, and banning them may be futile.
The Great Innovation Dilemma
Professor Boris Steipe from the University of Toronto had an epiphany: “I figured it made no sense to ban ChatGPT within the university; it was already being used by 100 million people. Over the Christmas break I figured that as professors, we shouldn’t be focusing our energy on punishing students who use ChatGPT, but instead reconfiguring our lesson plans to work on critical-thinking skills that can’t be outsourced to an AI.”
His perspective packs a punch, right? If an algorithm can ace exams, then what value are educators truly adding to their students?
Calvin Klein, an IT student from the Technical University of Kenya, shared his own take on the issue. “Students ask for ‘the answer’ in an assignment instead of working through the problem themselves.” He’s worried that students might get too comfortable with AI doing the heavy lifting. The fear, echoed by many educators, is that tools like ChatGPT could erode critical thinking and problem-solving abilities. And then there’s the risk of misinformation—AI might not fully grasp the complexities of a subject, leading to inaccurate or misleading responses.
But AI has its limits. However ‘good’ ChatGPT and similar tools may appear at assignments, the truth is they may not have read an entire course curriculum, nor do they truly “understand” the material the way students do (or at least not yet!). It’s not as if ChatGPT can accurately answer every exam question, especially if those questions are specific enough. It might give you something, but that “something” could very well miss the point entirely.
Way Forward?
Professor Steipe became so consumed by the ChatGPT issue that he decided to pour an entire semester’s worth of energy into addressing it. He introduced the “Sentient Syllabus Project,” a collaboration between professors from around the globe, including a philosopher in Tokyo and a historian at Yale, on a mission to create a publicly available resource that helps educators harness the power of AI like ChatGPT without compromising academic integrity.
The idea is genius. Teach students to use AI tools for what they’re good at—expediting grunt work like formatting spreadsheets or summarizing literature—while keeping the heavy lifting of critical thinking squarely on their shoulders. The syllabus includes guidelines like “Create a course that an AI cannot pass,” and practical advice on normalizing honesty around AI use.
The ball is in our court. It’s not about banning ChatGPT or pretending it doesn’t exist. It’s about integrating it in ways that enhance learning, not replace it. If we reconfigure our courses to focus on skills that AI can’t replicate—like deep analysis, creativity, and problem-solving—we’ll be setting students up for success in an AI-augmented world.
So, professors, students, and everyone in between, the question isn’t whether we should use AI – it’s how we use it.
Molly is a versatile and detail-oriented writer with a background in journalism and PR. She is passionate about technology, science, arts, and culture, and delves into extensive research in her writing. She is a published author.