The increasing presence of generative artificial intelligence across digital life is understandably sparking both excitement and significant concern, particularly within the education sector. A recurring worry is that these tools could encourage a form of cognitive laziness by reducing the mental effort required for learning, thinking, and memorization.
As more tasks are delegated to AI, there’s a growing question of whether it could weaken our ability to think independently. The issue is drawing scrutiny from scientific studies, public debate, and pedagogical perspectives.
What Cognitive Science Reveals
In a recent article published by Polytechnique Insights, neuroscientist Ioan Roxin, emeritus professor at the University Marie and Louis Pasteur, warns of a potential risk of cognitive atrophy linked to the intensive use of generative AI. This concern comes as AI tools become more integrated into daily workflows across industries.
The article cites a study showing that users assisted by ChatGPT write faster, but with a 32% reduction in relevant cognitive load – the mental effort actually used to transform information into lasting knowledge. Even more strikingly, 83% of participants couldn’t recall a passage they had just written when aided by the tool.
These findings raise a fundamental question: is AI replacing thought, or is it changing how we learn?
For some policymakers and educators, the findings are alarming. By offering instant answers, AI could encourage students to quickly delegate mental effort, potentially hindering the development of complex skills.
A Real and Documented Concern
These concerns aren’t merely intuitive. A recent study by Michael Gerlich, published in 2025 in the journal Societies, highlights a significant negative correlation between frequent AI tool use and critical thinking abilities. This effect is largely driven by cognitive offloading.
The research, conducted with 666 participants from diverse backgrounds, shows that individuals who heavily use AI tend to delegate key cognitive tasks – such as analysis, memorization, and evaluation – which correlates with lower scores in critical thinking. Young adults (17-25 years old) and those with lower levels of education appear particularly vulnerable.
Correlation is Not Causation
These widely discussed results warrant caution. Several researchers emphasize a key methodological point: the study establishes a correlation, not causation. It’s therefore impossible to definitively state that AI causes a decline in critical thinking. The findings are prompting further investigation into the long-term effects of AI on cognitive development.
The alternative hypothesis is equally plausible: individuals already less inclined to cognitive effort might turn to AI as a shortcut. This distinction is crucial to avoid technophobic overreactions.
The challenge isn’t to blame AI, but to understand how, why, and under what conditions it’s being used.
From Fear to Action: Insights from Margarida Romero
That’s precisely the approach advocated by Margarida Romero, a professor and researcher specializing in the creative uses of digital technology at the University Côte d’Azur. Rather than fueling fears of widespread “cognitive laziness,” she urges the education community to take a more nuanced look at how students are actually using AI.
According to her research, the real issue isn’t the tool itself, but pedagogical intention and the learner’s mindset. AI doesn’t inherently make students passive; it can instead reveal or amplify existing attitudes toward learning.
She identifies two fundamental dynamics:
- AI as support for the will to learn, when used to explore, experiment, create, improve, or deepen reflection;
- AI as excessive delegation of effort, when it becomes a shortcut to avoid research, analysis, or problem-solving.
In other words, AI acts as a mindset amplifier: it can strengthen curiosity and engagement… or encourage cognitive avoidance in the absence of a clear framework.
This perspective aligns with that of Natasha Nobben (University of Luxembourg), whose MINE model emphasizes that the real impact of a digital tool always results from the interaction between the tool, its use, and the human environment. Any universal or decontextualized measure remains incomplete.
Structuring AI Use to Support Cognitive Engagement
Building on this approach, Margarida Romero proposes a model for shifting from passive to cognitively engaging uses of AI. This continuum ranges from simply consuming content generated by the tool to forms of co-creation, problem-solving, and expansive learning, where the student remains fully involved in their learning process.
The more a teacher encourages a student to:
- formulate clear intentions;
- analyze AI’s responses;
- compare, adjust, critique, and improve outputs;
- anchor the tool in meaningful projects;
…the more AI becomes a lever for developing higher-order cognitive skills, rather than a substitute for intellectual effort. Gerlich’s work supports this, finding that higher levels of education and activities promoting deep reflection mitigate the observed negative effects.
Concrete Action Steps for Teachers
Far from being a simple tool, AI is a significant disruption that forces us to rethink what it means to learn in the age of instant answers. Rather than fearing cognitive laziness, several pedagogical strategies can transform AI into an ally:
Introduce AI at the Right Time
Encourage students to think for themselves first before using AI to structure, enrich, or deepen their ideas.
Structure Uses According to a Pedagogical Progression
Leverage frameworks like the #PPai6 model, which helps students evolve from passive consumers to active and creative users.
Make AI a Thinking Partner
Design tasks where AI helps generate ideas, critique hypotheses, or test solutions, without replacing human reasoning.
Cultivate Critical Thinking and Digital Literacy
Explicitly encourage source verification, bias identification, and solution comparison.
Value the Process as Much as the Result
Remind students that the richness of learning lies in the intellectual journey, effort, perseverance, and ability to solve problems.
Toward a Reflective Use of Technology
The challenge isn’t to restrain AI, but to reaffirm the value of intellectual effort in the age of instant answers. Training learners to think with AI requires developing essential human skills: critical thinking, creativity, intellectual autonomy, ethical awareness, and the joy of learning.
As both scientific research and emerging educational practices suggest, AI can become a powerful ally if integrated into a thoughtful, progressive, and stimulating pedagogical approach.
Rather than fearing cognitive laziness, let’s make school a space where technology enriches learning… without ever short-circuiting its meaning.