

Is AI making us “cognitively lazy”?
AI has renewed debate about learning, memory, and creativity in an age of cognitive offloading. Cognitive offloading means using tools to do mental work we could do ourselves. Many ask whether students who use AI to draft essays, solve problems, or recall facts will stop building core skills. These concerns echo older worries about writing, print, calculators, and search engines. History shows that learners adapt: new tools shift what we learn and how we learn it. This report examines how AI-based offloading affects learning, draws on recent research (including Kosmyna et al., 2025) on brain activity with AI writing tools, and argues for responsible AI integration in education. Our core claim is simple: use AI to augment human cognition, not replace it. We offer a brief history, review current evidence, and present recommendations for policy and practice.
From writing to smartphones: Old fears, new tools
Concerns that technology weakens the mind are ancient. In Plato’s Phaedrus, Socrates warned that writing could erode memory and give the appearance of wisdom without its substance. Later, people feared that the printing press would flood readers with books and reduce disciplined study. Similar claims met the telephone, calculators, and the internet. Ideas such as “digital amnesia” and “Google makes us stupid” reflect this pattern. Research suggests these claims are overstatements. Tools change how we remember and think, but broad declines in intelligence or memory are not supported. The more accurate view is that technologies shift cognitive load and skill demands.
Humans share memory with others and with tools, a system psychologists call “transactive memory.” GPS use, for example, offloads spatial memory. People who navigate with GPS recall routes less well than those who navigate unaided. Classic “Google effect” studies (see esp. Sparrow, Liu, & Wegner, 2011) show that when people expect a computer to store information, they remember the location of the information more than the information itself. Offloading can also help. When students save notes, they free attention for new material and often learn the next topic better. In short, technology can free mental resources for higher-order tasks, even if some routine skills receive less practice.
Learners adjust to the tools they have. After calculators spread, teaching shifted from hand computation toward conceptual problem-solving. With search and smartphones, judging source quality matters more than memorizing isolated facts. The challenge today is balance: integrate AI so that students keep practicing the thinking that builds understanding and wisdom.
New research insights
Generative AI can produce text and solutions on demand. This raises a fair question: if a chatbot can do the work, will students still learn the skills? Recent studies explore this issue. A preprint by Kosmyna and colleagues (2025) at MIT examined how assistance changes neural activity during writing.
In that study, 60 students wrote short essays under three conditions: with ChatGPT, with standard web search, or with no online tools. The team recorded brain activity with electroencephalography (EEG). Students who wrote without tools showed the strongest and widest brain engagement, linking memory-related and decision-making regions. Those who used ChatGPT showed the lowest engagement. The search group fell between. Memory results matched this pattern: when later asked to quote a sentence from their own essay, 83% of the ChatGPT group could not recall one, compared with about 11% in the other two groups. Active use appears to deepen processing and recall.
Students’ sense of authorship also shifted. Most unplugged writers felt clear ownership. Many AI-assisted writers felt less certain and described shared authorship with the tool. Teachers who scored the essays found the AI-influenced work more fluent and well structured but thinner in voice and original insight. An automated scorer did not reliably detect AI help.
The timing of AI support seems important. In a follow-up, students who first struggled on their own and then used AI showed higher engagement when they added AI. Those who began with AI and later wrote without it did not reach the same level. This suggests a simple rule: do some thinking before you ask the model. Other work aligns with this view. Studies report that heavy early reliance on AI can reduce active engagement and long-term retention, while pre-testing or initial solo attempts improve later learning with AI (Bai, Liu, & Su, 2023; Akgün & Toker, 2024; Jose, Akgün, & Toker, 2025). Research on critical thinking shows similar risks: students who defer quickly to AI often score lower on reasoning tasks than peers who analyze first (Uwosomah & Dooly, 2025). Creativity studies show a mixed picture: collaboration with AI can increase idea fluency and flexibility, but some students fixate on AI suggestions and report lower creative confidence (e.g., Bai et al., 2023). The lesson is context: design use so AI supports, rather than replaces, core cognitive work.
Reimagining learning in the AI era: A systemic shift
If AI can take on routine tasks, what should learners focus on? I argue AI exposes a larger design problem in education. Many curricula still center on goals from an information-scarce era: memorize facts, follow set procedures, and produce formulaic essays. Today, intelligent systems can generate, translate, evaluate, and co-create knowledge on demand. This invites us to revisit outcomes and assignments. If a standard five-paragraph essay is easy for an algorithm, we should ask what we really want to assess: analysis, originality, judgment, and the capacity to build knowledge.
Used well, AI can help learners reallocate time and attention toward higher-order work. As calculators shifted effort from arithmetic to concepts, AI can handle some routine drafting or summarizing so students can design projects, interpret results, and work across disciplines. Studies with pre-service teachers suggest that offloading routine steps can reduce cognitive load and improve performance on complex tasks (Iqbal et al., 2025). This frames AI as a tool that, when used with care, can amplify human capacity.
Practice must change to realize this promise. Students may otherwise seek shortcuts. Teachers remain essential, but their role changes. In AI-rich settings, they act as learning designers and cognitive coaches. They help students ask better questions, check AI outputs, and integrate results with prior knowledge. Assessment should also change. Rather than grading only a final product, evaluate process: prompts, drafts, notes, and reflections. This keeps thinking visible and values the student’s own contribution.
An equity crisis looms. Access to AI tools is uneven, controlled by a handful of actors, and models can reflect cultural and linguistic bias. Policies should expand access, build digital and media literacy for students and teachers, and require transparency and fairness in educational AI. Design should include diverse voices to reduce bias and support inclusion.
In sum, AI can augment learning without displacing human cognition. Machines are fast and tireless. People bring context, ethics, empathy, and the ability to create meaning. Aim for a partnership in which AI carries routine load so learners invest in inquiry and insight. This will require updated curricula and a culture that rewards quality of understanding, not mere output. As Moravec notes, AI will not replace education or educators, but it will change what counts as learning. The most resilient institutions will treat that change as a chance to evolve.
Recommendations for policy and practice
Policy and leadership will decide whether AI becomes a partner in learning or a crutch. The following steps support the former:
- Integrate AI with clear purpose. Give guidance on when and how to use AI. Invite AI for brainstorming, feedback, and revision after an initial individual attempt. Discourage first-pass answers produced by AI. Design tasks that require personal experience, reasoning, and evidence.
- Prioritize higher-order skills and media literacy. Shift standards away from rote recall toward critical thinking, source evaluation, creativity, and metacognition. Treat information literacy and responsible AI use as core outcomes, including prompt design, verification, and clear acknowledgment of assistance.
- Prepare teachers for new roles. Invest in professional learning so teachers can orchestrate AI-supported lessons, detect over-reliance, and coach students as AI mediators. Share practical models where AI handles sub-tasks and students focus on argument, method, and judgment.
- Align assessment with desired learning. Permit AI in some tasks and grade the student’s value-add, or use performance tasks that reveal understanding. Ask students to document their process: prompts used, edits made, and reasons for choices.
- Set guardrails. Protect privacy, test tools for bias, and require transparency about model behavior and data. Run equity reviews to see who benefits and who is left out, and set clear procedures for correcting gaps where they appear. Include educators and learners in design and selection.
- Support research and innovation. Fund studies on attention, reasoning, creativity, and long-term outcomes. Back tools that prompt inquiry rather than supply final answers. Share effective models across systems and countries.
- Preserve spaces for analog practice. Keep some work unaided to build confidence and mastery. Activities such as mental math, first-draft writing, and live debates strengthen internal skills and make learners better judges of AI output.
AI in education is not an existential threat. It is a powerful ingredient that needs careful use. Evidence shows that over-reliance can reduce engagement and skill growth, but thoughtful use can personalize learning, reduce unhelpful load, and support creativity. The goal is human–AI collaboration that helps students learn how to learn and adapt. With sound policy and practice, AI can serve learning rather than replace it.
References
Akgün, M., & Toker, S. (2024). Evaluating the effect of pretesting with conversational AI on retention of needed information. arXiv, abs/2412.13487. https://doi.org/10.48550/arXiv.2412.13487
Bai, L., Liu, X., & Su, J. (2023). ChatGPT: The cognitive effects on learning and memory. Brain-X. https://doi.org/10.1002/brx2.30
Dahmani, L., & Bohbot, V. D. (2020). Habitual use of GPS negatively impacts spatial memory during self-guided navigation. Scientific Reports, 10, 6310. https://doi.org/10.1038/s41598-020-62877-0
Iqbal, J., Hashmi, Z. F., Asghar, M. Z., & Abid, M. N. (2025). Generative AI tool use enhances academic achievement in sustainable education through shared metacognition and cognitive offloading among pre-service teachers. Scientific Reports, 15(1), Article 1676. https://doi.org/10.1038/s41598-025-01676-x
Jose, B., Akgün, O., & Toker, E. (2025). The cognitive paradox of AI in education: Between enhancement and erosion. Frontiers in Psychology, 16. https://doi.org/10.3389/fpsyg.2025.1550621
Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X.-H., Beresnitzky, A. V., Braunstein, I., & Maes, P. (2025, June 10). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task [Preprint]. arXiv. https://arxiv.org/abs/2506.08872
Moravec, J. (2025, March 28). Does the modern university end at AI? Education Futures. https://educationfutures.com/post/does-the-modern-university-end-at-ai/
Pearson, H. (2025). Are the Internet and AI affecting our memory? What the science says. Nature, 638(8049), 26–28. https://doi.org/10.1038/d41586-025-00292-z
Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: Cognitive consequences of having information at our fingertips. Science, 333(6043), 776–778. https://doi.org/10.1126/science.1207745
Uwosomah, E. E., & Dooly, M. (2025). Pre-service teachers’ evolving perspectives on AI in education. Education Sciences, 15(2), 152. https://doi.org/10.3390/educsci15020152
Ward, A. F. (2021). People mistake the internet’s knowledge for their own. Proceedings of the National Academy of Sciences, 118(43), e2105061118. https://doi.org/10.1073/pnas.2105061118