Cheating Ourselves: When AI Replaces Thinking Instead of Supporting It


  • Research links the increasing use of LLM tools like ChatGPT to reduced critical thinking and higher cognitive offloading.
  • Students who lean on these tools often fail to grasp the reasoning behind a problem and end up scoring lower than students who don’t use them.
  • AI platforms like Cluely encourage people to ‘stop thinking’ and instead cheat on everything, including exams and jobs.
  • There’s a need for smarter AI tools like Khanmigo that aid the thinking process instead of replacing it.

The value of AI in education and teaching has been debated for quite a few years now. While some actively advocate for AI learning tech, far fewer notice the slow erosion of human intellect and critical thinking that can come with it.

In this article, we’ll unwrap the effects of artificial intelligence on human thinking and cognitive offloading. Sit tight because this could be a brain-melting one.

Why Think When You Can Prompt?

Let’s start with a simple example of a college assignment.

Back when AI was little more than an odd pair of letters, you’d actually have to do the work yourself, read through tons of material, and then compile the whole assignment. That required research, summarizing, understanding, analytical thinking, higher-level discernment, and even rephrasing.

Today, a single prompt gets you a computer-generated, human-sounding assignment within a matter of minutes. The need to think, learn, and compile has gone for a toss, leaving no room for critical thought or understanding.

Remember the old adage that ‘the journey is what matters’? Well, that journey has just been tossed out the window, learning and development opportunities included.

During our research for this article, we found ourselves genuinely wondering why students are resorting to such shortcuts. Well, one possible reason could be that education today is largely viewed as a set of tasks you need to complete to get a certificate. That’s about it.

The ‘willingness’ to learn new things seems to be dying among new-age students. Many submit LLM-generated assignments simply because they don’t see the act of writing a project as important or value-adding to their education.

Oh, and using GPT to complete your assignment is significantly easier than putting in the work yourself. That’s also a large part of it.

The Changing Education Landscape

A University of Pennsylvania study titled ‘Generative AI Can Harm Learning’ sheds more light on the matter. The experiment split students at a Turkish high school into two groups: one with access to ChatGPT and one without.

Students who used ChatGPT solved 48% more math problems correctly during practice. However, when a test was conducted on the same topic, the students without access to any LLM tool scored better: those who had used ChatGPT for their practice problems scored 17% lower.

What does this teach us? ChatGPT, or any other LLM, only helps you ‘complete the task.’ It adds little to your overall intellect and contributes even less to actual learning. In other words, many students who used ChatGPT simply copied the answers the tool provided without understanding the process behind the solution.

Some argue that we already have technology like spreadsheets and calculators that automate such ‘mundane’ tasks. Why don’t we eliminate those and do stuff manually for the sake of ‘learning’?

Well, a key difference is that you still need to understand the formulas you put into a spreadsheet to produce any usable output.

It’s you who decides which formula to apply and why. A spreadsheet or a calculator doesn’t alter the way we think; these tools still require us to use our intellect and our critical judgment.
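
To make that contrast concrete, here’s a minimal, purely illustrative sketch (our own example with made-up numbers, not from the article): even with a ‘dumb’ tool, the formula has to come from you.

    # Illustrative only: a spreadsheet-style calculation still forces you
    # to pick and understand the formula yourself.
    principal = 1000   # amount invested
    rate = 0.05        # annual interest rate
    years = 10

    # You have to know that compound growth is principal * (1 + rate) ** years;
    # the tool merely evaluates what you already understand.
    future_value = principal * (1 + rate) ** years
    print(round(future_value, 2))  # 1628.89

Swap that for a chatbot prompt like ‘how much will $1,000 grow to in 10 years at 5%?’ and the formula, and the understanding behind it, never has to pass through your head at all.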

However, AI tools are completely different.

Increasing Cognitive Offloading: Outsourcing Thinking

Another study, ‘AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking,’ examined the relationship between cognitive offloading and the use of AI tools. Cognitive offloading is the practice of reducing mental effort by handing a task over to an external aid, which in this case means AI apps.

The research shows that the youngest participants (aged 17 to 25) reported the highest use of AI tools and also the lowest critical thinking scores. Cognitive offloading was highest in this age group too. In contrast, older participants (46 and above) reported low AI tool usage and higher critical thinking scores.

And remember that these younger participants only started using AI tools in the last couple of years. Unlike the children growing up right now, they didn’t have access to these tools throughout their childhood or early adolescence. Even with that relatively short exposure, we’re already seeing lower critical thinking and higher cognitive offloading in this age group.

Extrapolate these results to the current batch of students who rely heavily on AI tools for their education, and it’s reasonable to expect that if the same research were repeated five years down the line, those students might score even lower on critical thinking than today’s participants.

The same study shows a strong negative correlation between the use of AI tools and critical thinking.

This means that the more people use these tools, the lower their critical thinking scores tend to be. There was also a strong positive correlation between AI tool use and cognitive offloading: the more people rely on AI tools, the more of their thinking they hand over to them.
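
If you’re wondering what a ‘strong negative correlation’ actually looks like, here’s a minimal, hypothetical sketch in Python; the numbers below are invented purely for illustration and are not the study’s data.

    # Hypothetical survey-style scores: each position is one participant.
    # Higher ai_usage values pair with lower critical_thinking values.
    from statistics import correlation  # Pearson's r, Python 3.10+

    ai_usage          = [1, 2, 3, 4, 5, 6, 7, 8]
    critical_thinking = [9, 8, 8, 6, 5, 5, 3, 2]

    r = correlation(ai_usage, critical_thinking)
    print(round(r, 2))  # about -0.98: heavier users tend to score lower

A coefficient close to -1 is what ‘strong negative’ means. It describes how the two measures move together across participants; on its own, it doesn’t prove that one causes the other.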

AI Is Making Cheating the New Fad

Enter Cluely. It’s an AI assistant with a tagline that reads, ‘Let’s Cheat on Everything.’ Unfortunately, that’s exactly what it’s built for: cheating the human intellect.


What’s more concerning is Cluely’s ad campaign, which shows two people on a date. The guy wears some sort of smart glasses that tell him what to say next, listening to the woman’s responses and curating the ‘perfect reply.’

The guy can be seen lying about his age, job, and preferences in the video. What’s more shocking is how blatantly the brand says, ‘We built Cluely so you never have to think alone again.’ Since when was that a good thing?

While this may sound fun to some, it’s pretty bone-chilling to think about where such tech might eventually take us. If the death of critical thinking and problem-solving weren’t scary enough, there’s also the chance of AI killing our ability to form genuine human connections.

Tools like Cluely don’t set out to improve the human experience but to replace it. Eventually, it’d just be two computer programs dating each other in human skin. Ana de Armas’s AI companion in Blade Runner 2049 is pretty depressing, if you really think about it.

The scarier part is that Cluely is not a backyard school project. It’s a real AI tool that has raised $5.3M in seed funding. People want tools like Cluely to make their lives easier, which does make sense. But should we really sacrifice human thinking at the altar of comfortable utility in the process?

AI: A Double-Edged Sword

Not everything is as grim as it looks.

AI has surely changed the way young minds learn. Earlier, we had to dig through page after page of a textbook to understand complex equations and theorems; now, the right AI tool can walk us through them in minutes.

Sal Khan, the founder of Khan Academy, showed us the positive side of AI in education two years back in a TED Talk, where he unveiled Khanmigo, an AI learning assistant. Unlike LLMs that blurt out the answer to any question, Khanmigo gives students hints about a math problem and encourages them to solve it themselves, letting them keep thinking independently.

Most of our concern about AI in education is that students won’t learn how to write or think. Khanmigo solves this problem to an extent.

Instead of writing ‘for’ you, this AI assistant ‘writes with you.’ For example, if you want to write a story, it will direct you to write the first two lines, after which it will add another two lines and repeat the process until you have the full story. Khanmigo doesn’t kill creativity but rather stimulates it.
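
As a rough sketch of that back-and-forth (our own illustration, not Khanmigo’s actual code), a tutor-style assistant alternates turns with the student instead of generating the whole piece at once:

    # Illustrative co-writing loop; not Khanmigo's real implementation.
    # suggest_next_lines() stands in for whatever model call a tutor app would make.
    def suggest_next_lines(story_so_far: str) -> str:
        # Placeholder: a real app would ask an LLM for a short continuation here.
        return "(two assistant-written lines that build on what you wrote)"

    def co_write_story(turns: int = 3) -> str:
        story = ""
        for _ in range(turns):
            # The student always writes first -- the thinking starts with them.
            story += input("Write the next two lines of your story: ") + "\n"
            # The assistant only responds to what the student has produced.
            story += suggest_next_lines(story) + "\n"
        return story

    if __name__ == "__main__":
        print(co_write_story())

The point of that structure is that the tool can’t finish the piece on its own; every turn depends on something the student wrote first.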

This is a perfect example of how AI should be used in education. Rather than being a way to offload our thinking, it should act as a stimulus to human thought and to the learning process as a whole.

The bigger question is, whose responsibility is it to ensure learning stays a human experience? To put it bluntly, companies are not going to stop producing LLM tools like ChatGPT, Gemini, or Claude simply because they’re stifling human thinking. There’s money to be made, so goodbye moral responsibility.

Therefore, a large part of the responsibility falls on teachers, parents, and students themselves. Guardians should regulate the kind of AI tools a child has access to. At the same time, students should become more self-aware of the long-term harm caused by replacing their thought process with an AI prompt.

While tools like Khanmigo can play a genuinely useful role in the learning process, ‘feed me the answer’ AI tools can kill human creativity and critical thinking. And the responsibility for which kind we encourage rests on our shoulders.

Krishi Chowdhary


