AI use is on the rise—are schools catching up?

AI adoption in schools has jumped fast, but training and clear academic-integrity rules lag behind, leaving students and families uneasy.
Generative AI is no longer a classroom novelty; it’s becoming part of everyday schoolwork. The big question for educators and families is whether guidance is keeping pace.
AI use in schools surged during the 2024–2025 academic year, according to recent national survey results highlighted by Misryoum. By 2025, more than half of students (54%) and core subject teachers (53%) reported using AI for schoolwork or instruction, a rise of more than 15 percentage points over the prior one to two years. Adoption also followed a clear pattern: high school students were the most frequent users, and teachers were more likely to report AI use as students’ grade levels increased.
But the story is not just about the speed of uptake. Misryoum’s reading of the survey findings points to a widening gap between what schools are doing and what they’re preparing for. While AI use grew quickly, training, student guidance, and written expectations have lagged, leaving many stakeholders to figure things out as they go. In practice, that means classrooms are adapting to new tools faster than districts can build consistent rules for how those tools should be used.
Students and parents are feeling that uncertainty most sharply. According to the same survey results, 61% of parents believe increased AI use could harm students’ critical-thinking skills, compared with 22% of district leaders. Among students, concerns are widespread too: half reported worrying about being falsely accused of cheating. Middle and high school students also leaned toward skepticism about AI’s impact on thinking skills, with 48% of middle school students and 55% of high school students expressing that belief.
Those reactions raise an immediate, human issue for schools: academic integrity isn’t only a policy question; it’s a trust question. When students worry about false accusations, it can affect how willing they are to ask questions, seek help, or even revise their work. Misryoum sees the risk as twofold: students may start hiding their process, and teachers may find themselves spending more time resolving disputes than teaching how learning should work.
Meanwhile, district leaders appear less alarmed, even as adoption rises. By spring 2025, only 35% of district leaders said their schools provide students with training on how to use AI. More than 80% of students reported that their teachers had not explicitly taught them how to use AI for schoolwork. In addition, policy coverage remains uneven: just 45% of principals said their schools or districts have AI-use policies, and only 34% of teachers said those policies specifically address academic integrity and AI.
This is where the gap becomes most consequential for classroom life. Without clear training and consistent rules, AI use can turn into a moving target: different teachers, different expectations, and students unsure where collaboration ends and dishonesty begins. Misryoum’s editorial takeaway is that the challenge isn’t simply “banning” or “allowing” AI; it’s defining responsible use in a way that students understand before problems happen.
The survey also points to a curriculum-level challenge: guidance hasn’t fully caught up for younger learners. Nearly half of elementary teachers reported experimenting with AI, yet elementary students often lack age-appropriate explanations of what AI can and cannot do. Misryoum’s view is that early habits matter: these are the years when students learn how to justify their work, how to distinguish their own thinking from a tool’s output, and how to cite or document the help they receive. Introducing coherent, age-appropriate instruction in the early grades could reduce misuse and confusion later, especially as AI systems become more capable.
Misryoum recommends that trusted education guidance, particularly from state education agencies, be consistent and regularly updated, so that districts can translate broad expectations into daily classroom practices. District and school leaders also need to clearly define what counts as responsible AI use versus academic dishonesty, and communicate those expectations to both teachers and students. In the near term, clarity about what qualifies as cheating, especially when AI is involved, should be treated as urgent, not optional.
Ultimately, the real test for schools is whether they can turn AI adoption into a learning advantage without undermining student development. If training and integrity policies continue to lag, the adoption curve may accelerate anxiety instead of improving instruction. But if districts invest in clear, grade-appropriate guidance, AI could become what educators intend it to be: a complement to learning, not a shortcut around it.