AI homework use surges in 2025 as critical thinking fears rise

More students are turning to AI for schoolwork, but concerns about weaker critical thinking and unclear school rules are growing.

Student use of AI for homework is rising fast—yet many students worry it’s changing how they think.

Misryoum reports that a new RAND analysis found AI use for homework increased from 48% to 62% between May and December 2025 across middle school, high school, and college students. The jump was driven largely by younger learners, while college students showed comparatively steady use. That shift matters because homework is often where students build habits: how they read a prompt, test an idea, and decide what counts as evidence.

At the center of the debate is critical thinking. In the same period, the share of students who said AI use for school harmed critical thinking rose sharply: 67% said it was harmful, up from 54% earlier in the year. The concern wasn’t limited to students who use AI. Among students who do not use AI, 78% reported that AI harms critical thinking, compared with 60% among AI users. In other words, even students who rely on the tools are hearing the warnings, and some are internalizing them.

The report also shows how students are using AI when they sit down to do work. For school-related activities, 71% of students said they used at least one AI tool. Chatbots were the most common entry point (60%), followed by writing helpers such as Grammarly or Quill (21%) and general homework help platforms like Chegg, Brainly, or Course Hero (15%). ChatGPT was reported as the most used chatbot (53%), while Google Gemini usage more than doubled from May to December, reaching 28%. Misryoum readers may recognize the pattern: once students find an interface that feels easy, they keep returning to it.

What students do with AI helps explain why the disagreement is so intense. The most common uses were getting better explanations of assignments (38%), brainstorming ideas (35%), looking up facts (33%), and drafting or revising writing (33%). Interestingly, older students were more likely than younger students to use AI for these purposes, except for fact-checking. That exception suggests a crucial boundary in how students trust AI: explanations and ideation may feel helpful, while verification may be treated as more sensitive or more demanding.

Not all students frame AI as cheating. Aside from getting direct answers to homework, which 45% of students considered cheating, many said a range of AI uses are acceptable. Nearly 80% said using AI to understand an assignment was not cheating, while 72% and 67% said the same about brainstorming ideas and looking up facts, respectively. Misryoum interprets this as a sign that students are separating “learning support” from “answer retrieval,” even if schools struggle to draw the same line consistently.

A second major friction point is policy clarity. The analysis found that many schools lack clear, schoolwide rules: only about one-third of students reported that their school has a schoolwide policy. Students also reported that rules can vary by teacher, especially at the college level. That unevenness can create a messy student experience: when guidance changes from class to class, students may default to the tools that feel safest or most efficient.

Gender patterns in concern add another layer. Male and female students were equally likely to use AI, but female students expressed greater worry about its impact. Misryoum highlights that 75% of female students said AI harmed critical thinking skills, compared with 59% of male students. Female students were also more likely to worry about cheating. While the report did not explore the reasons, the gap is consistent with how academic risk and assessment anxiety can vary across groups.

For educators and policymakers, the headline is not simply “students use more AI.” It’s that students are using AI in ways that they believe blur learning and assistance, and they are increasingly unsure whether their critical thinking is being strengthened or undermined. RAND’s framing, echoed in Misryoum’s coverage, points to the need for schools to be explicit about when and how AI can be used so the tools support deeper thinking rather than replace it.

The practical challenge now is moving from blanket bans or vague permission to workable classroom expectations: what counts as understanding, what counts as drafting, what must be cited, and what must be done independently. As AI tools become more embedded in student routines, Misryoum expects the most successful approaches to look less like policing and more like teaching students to use AI as a tutor while still demonstrating their own reasoning.
