ChatGPT may alter brain engagement, MIT study suggests

Researchers find reduced neural activity in students using AI for essays, but timing of use could be key

A new MIT study led by researcher Natalyia explores how AI tools like ChatGPT might influence cognitive processes, raising questions about their impact on academic tasks.

A new study from MIT has sparked debate about whether AI tools like ChatGPT could affect cognitive engagement, particularly in academic settings.

Led by researcher Natalyia, the study examines how students’ brain activity changes when using AI for essay writing, revealing intriguing patterns that suggest the timing of AI use may play a critical role in its effects.

Methodology

Structured Experiment Design

The study involved 54 students from universities in the Greater Boston area, who were invited to MIT’s lab for an in-person essay-writing task.

Participants were split into three groups: one used only ChatGPT to write essays, another relied on a traditional search engine without AI capabilities, and the third, termed “brain-only,” used no external tools.

Each group completed three essay-writing sessions, selecting prompts of personal interest to ensure engagement.

Measuring Brain Activity

Researchers employed electroencephalography (EEG), a non-invasive technique that records the brain’s electrical activity through scalp electrodes, to monitor neural connectivity during the tasks. They also evaluated essay outputs for linguistic patterns and conducted interviews to gather qualitative insights.

This multi-faceted approach allowed the team to compare cognitive engagement across the groups, focusing on how tool usage influenced brain activity and writing outcomes.
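To make the “neural connectivity” comparison concrete, here is a minimal, purely illustrative sketch of one common way connectivity is quantified: correlating EEG channel signals pairwise and averaging the result into a single score per recording. This is not the MIT team’s actual analysis pipeline; the function names and the synthetic data are assumptions for illustration only.

```python
import numpy as np

def connectivity_matrix(signals):
    """Pairwise Pearson correlation between EEG channels.

    signals: array of shape (n_channels, n_samples).
    Returns an (n_channels, n_channels) correlation matrix whose
    off-diagonal entries give a crude connectivity score per pair.
    """
    return np.corrcoef(signals)

def mean_connectivity(signals):
    """Average absolute off-diagonal correlation: one summary number
    per recording that can be compared across groups."""
    corr = connectivity_matrix(signals)
    n = corr.shape[0]
    off_diag = np.abs(corr[~np.eye(n, dtype=bool)])
    return off_diag.mean()

# Synthetic example: 4 channels, 1000 samples each.
rng = np.random.default_rng(0)
base = rng.standard_normal(1000)
# "Engaged" recording: channels share a common component.
engaged = np.stack([base + 0.5 * rng.standard_normal(1000) for _ in range(4)])
# "Disengaged" recording: channels are independent noise.
idle = rng.standard_normal((4, 1000))

print(mean_connectivity(engaged) > mean_connectivity(idle))  # True
```

Under this toy metric, recordings whose channels move together score higher, which is the intuition behind comparing connectivity across the three groups.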

Findings

Uniformity in AI-Generated Essays

During the initial three sessions, the ChatGPT group produced essays that were strikingly similar in wording and structure, often using identical phrases, names, and entities.

This homogeneity suggested a reliance on AI-generated content, limiting individual creativity. In contrast, the brain-only group’s essays displayed greater diversity, reflecting more independent thought.
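The uniformity the researchers describe can be quantified in simple ways. As a hedged sketch (not the study’s actual metric), one might measure the overlap between two essays’ vocabularies with Jaccard similarity; the example sentences below are invented for illustration.

```python
def vocab_overlap(essay_a, essay_b):
    """Jaccard similarity of the word sets of two essays:
    1.0 means identical vocabularies, 0.0 means no shared words."""
    words_a = set(essay_a.lower().split())
    words_b = set(essay_b.lower().split())
    return len(words_a & words_b) / len(words_a | words_b)

# Toy illustration: two near-identical AI-flavored openings
# versus a more individual one.
ai_1 = "choosing a career is one of the most important decisions in life"
ai_2 = "choosing a career is one of the most important choices in life"
human = "i picked my job almost by accident after a summer internship"

print(vocab_overlap(ai_1, ai_2))   # high overlap
print(vocab_overlap(ai_1, human))  # low overlap
```

A high score across many essay pairs, as with the ChatGPT group, signals the kind of homogeneity the study observed; diverse essays score low.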

Neural Connectivity Findings

EEG data revealed distinct differences in brain engagement. The brain-only group exhibited the highest neural connectivity, indicating robust cognitive activity.

The search engine group showed moderate engagement, while the ChatGPT group displayed significantly lower neural activity. “This scaling down of brain engagement when using AI isn’t surprising,” said Natalyia, the study’s lead author.

“Similar patterns have been observed with tools like GPS or Google in past research.”

Results

Declining Engagement in AI Group

Over the three sessions, the ChatGPT group showed a decline in effort, with many participants resorting to copying and pasting AI outputs by the third session.

This trend suggested a growing dependency on the tool, potentially reducing critical engagement with the writing process. The brain-only and search engine groups, meanwhile, maintained more consistent effort levels.

Switching Groups Experiment

For a fourth session, about a third of the original participants returned, and groups were reassigned. Those previously in the brain-only group were given ChatGPT, while former ChatGPT users were restricted to brain-only writing.

Participants revisited topics from earlier sessions. Surprisingly, students who had initially relied on their own cognition performed exceptionally well with ChatGPT, suggesting that prior independent effort enhanced their ability to use AI effectively.

Implications

Timing Matters

The reassignment experiment highlighted the importance of when AI tools are introduced. “Students who first tackled the task without tools were better equipped to use ChatGPT strategically,” Natalyia explained. “They asked more thoughtful questions, likely because they had already invested time thinking deeply about the topic.”

This finding points to a potential synergy between human cognition and AI when used in the right sequence.

Broader Implications

The study suggests that premature reliance on AI could bypass critical cognitive processes, but strategic use after independent effort may amplify outcomes.

This insight challenges the rush to integrate AI tools in education without considering their cognitive impact.

Optimism

Potential for AI Augmentation

The findings offer a cautiously optimistic view. “If you’ve already engaged deeply with a task, AI can enhance your work by providing new perspectives or refining your output,” Natalyia noted.

This suggests that AI could serve as a tool for augmentation rather than replacement, provided users first develop their own understanding.

Encouraging Thoughtful Use

The study underscores the value of “thinking before acting” in an era of instant AI access. By fostering independent critical thinking before introducing AI, educators and students might harness these tools to boost creativity and productivity rather than diminish them.

Critical Thinking

No Direct Evidence of Erosion

The study does not conclusively show that ChatGPT erodes critical thinking. “We’re careful not to make bold claims,” Natalyia emphasized. “Our sample size was small, and the task was specific to essay writing.”

However, the observed reduction in neural engagement among ChatGPT users aligns with prior research on tool dependency, suggesting a need for further investigation.

Contextualizing with Past Studies

Drawing on previous studies, the MIT team noted that tools like GPS and search engines have similarly reduced cognitive effort in certain tasks.

Yet, some research indicates AI can support critical thinking in specific contexts, reinforcing the study’s call for more data to clarify these dynamics.

Why It Was Released Early

Urgency in the AI Era

Despite its limitations, the study was released as a preprint due to the rapid integration of AI tools in education and workplaces. “With AI deployment moving so fast, waiting two years for peer review could be too late,” Natalyia said. “We need data now to guide how these tools are used, especially for developing minds in schools.”

Call for Broader Input

The researchers advocate for involving educators, parents, and policymakers in evaluating AI’s role. “Let’s test these tools thoroughly—does using them make a difference? We don’t need brain scans to start; simple experiments can help,” Natalyia added.

This approach aims to balance innovation with caution, ensuring AI enhances rather than hinders learning.

This article is based on an interview with Natalyia, published on Jun 27, 2025. Additional context was drawn from posts on X discussing AI’s impact on cognitive skills.
