22.12.2025
10:27
AI prompts damaging your thinking skills, experts warn

Earlier this year, the Massachusetts Institute of Technology (MIT) published a study which found that people who used ChatGPT to write essays showed less activity in brain networks associated with cognitive processing while undertaking the exercise, El.kz cites BBC.

These people also couldn't quote from their essays as easily as those in the study who didn't use an AI chatbot.

The researchers said their study demonstrated "the pressing matter of exploring a possible decrease in learning skills".

All 54 participants were recruited from MIT and nearby universities. Their brain activity was recorded using electroencephalography (EEG), which involves electrodes being placed on the scalp.

Some of the prompts used by participants asked the AI to summarise essay questions, track down sources, and refine grammar and style.

It was also used to generate and articulate ideas, though some users felt the AI wasn't very good at this.

Separately, Carnegie Mellon University and Microsoft, which operates Copilot, found people's problem-solving skills could diminish if they became too reliant on AI.

They surveyed 319 white-collar workers who used AI tools for their jobs at least once a week, asking how they apply critical thinking when using them.

They looked at 900 examples of tasks given to AI, ranging from analysing data for new insights to checking whether a piece of work satisfies particular rules.

The study found that higher confidence in the tool's ability to perform a task was related to "less critical thinking effort".

"While GenAI can improve worker efficiency, it can inhibit critical engagement with work and can potentially lead to long-term overreliance on the tool and diminished skill for independent problem-solving."

Prof Holmes points to research on cognitive atrophy, in which someone's abilities and skills deteriorate after relying on AI.

He says this has been a problem for radiologists who use AI tools to help them interpret X-rays before they diagnose patients.

A study by Harvard Medical School published last year found that AI assistance improved the performance of some clinicians but worsened that of others, for reasons researchers don't fully understand.

The authors called for more work to be done on how humans interact with AI so we can figure out ways of using AI tools that "boost human performance rather than hurt it".

Prof Holmes fears that students, whether in school or university, could become too reliant on AI to do their work for them and not develop the fundamental skills an education provides.

A student's essay might receive better marks thanks to help from AI, but the issue is whether they end up understanding less.

As Prof Holmes puts it: "Their outputs are better but actually their learning is worse."