COMMENTARY: AI in schools could be a disaster, but it doesn't have to be

Misryoum breaks down why generative AI in classrooms risks harming learning, trust, and independence, and what state-level rules can do to keep it helpful.
The promise of generative AI in education is moving quickly from headlines to lesson plans—and that speed may be creating the wrong incentives.
Generative AI in schools arrives on a familiar backdrop: education has repeatedly tried new technologies that were supposed to change everything, from MOOCs to large-scale "one device per student" spending. In many places, those efforts fizzled, and the backlash is now real enough that some states are considering screen bans for younger students. The key question for Misryoum readers isn't whether AI can assist learning; it can. The question is what happens to the learning process when AI does so much of the work.
The concern raised by recent policy and research discussions is that generative AI could do more than distort academic outcomes; it may weaken foundational development. That includes not only mastery of content but also students' social and emotional growth, their sense of autonomy as learners, and even their trust in school as an institution that reliably supports them. When learning becomes too convenient, the skills students build along the way (thinking, revising, questioning, and explaining their reasoning) can quietly erode.
The early signs of risk are showing up in survey and observational data. Misryoum's framing here is straightforward: more than half of teens say they already use AI for schoolwork, and a substantial share describe using it for nearly all or most assignments. Many teens also report that classmates use AI to cheat, while expressing worry that the technology is replacing the practice that strengthens their ability to think independently. In classroom-centered analysis, a portion of interactions involving AI are described alongside troubling behaviors such as cheating, bullying, or even self-harm, an indication that generative tools can enter school life in ways educators may not be prepared to address.
A parallel theme is the mismatch between what parents believe and what students say they are doing. Misryoum notes that parents often underestimate how frequently teens use AI for school tasks, and they may not be aware of the specific rules schools adopt. That gap matters because policy can't work only on paper; families need clarity to reinforce what schools want students to do, and what they must not rely on.
The human impact is easy to overlook when conversations focus on apps and capabilities. A student who routinely outsources drafting or problem-solving may finish assignments faster, but the learning that would normally happen during the struggle (getting stuck, seeking a method, checking work, and revising) is replaced with a smoother output. Even for motivated learners, AI can make the process feel optional. Over time, the habit of relying on shortcuts can spread beyond one subject or one grade level, shaping confidence and competence in ways that are difficult to reverse.
Misryoum also draws an important editorial distinction: education isn't just about producing the "product" (a final answer, a polished essay, a submitted worksheet). Learning is largely about the process that produces understanding. That process depends on effort, feedback, and reflection, elements that generative AI can bypass when students treat outputs as the goal rather than as drafts in progress. The fear, raised repeatedly in risk-focused discussions, is that students may leave school less prepared for what comes next: more advanced coursework, real-world tasks that require judgment, and the everyday ability to evaluate information critically.
So what would a better approach look like? Misryoum argues that the central lever is policy coherence, especially at the state level. Leaving AI guidance to hundreds of districts creates a patchwork in which families and teachers get inconsistent rules, and students learn to navigate loopholes instead of mastering skills. State-level guidance should define what good use looks like for both teachers and students, and it should do so with enough specificity that schools can implement it.
For teachers, the guiding principle Misryoum supports is simple: AI should make instruction easier without lowering the quality of learning. That can mean using AI to handle repetitive tasks, such as summarizing common misconceptions across a class or helping generate differentiated materials for different learners, so teachers can spend more time on feedback, coaching, and instruction that requires professional judgment. The goal should be to reduce friction in teaching, not to automate teaching into a substitute for teacher-student interaction.
For students, the approach has to start from a reality schools can't ignore: many students will use AI regardless of policy. Misryoum's implication is that schools need to redesign learning so that meaningful work happens where cheating is harder and thinking is more visible. That may include shifting more writing, problem-solving, and assessment into the classroom; cutting busywork that can be easily generated on demand; and offering coaching for students and families about the risks of overreliance. Clear expectations can also support safer classroom norms, especially where AI can facilitate inappropriate behavior or unsafe disclosures.
Misryoum's bottom line is that the threat isn't "AI is bad" as a slogan. The threat is a system-wide incentive to treat outputs as learning while underestimating how quickly students adapt to tools that do the hard parts for them. If leaders move now, with coherent rules, teacher-focused safeguards, student-centered assessment changes, and family communication, the technology can be directed toward support rather than replacement.
Neither students nor society can afford a future where educational experiences are diluted by automation and critical preparation falls behind. If schools want AI to help, they’ll need policies that protect the process of learning, not just the final submission.