From Chalkboards to AI
By Valerie Bennett, Ph.D., Ed.D., and Christine Anne Royce, Ed.D.
Posted on 2025-08-11
Disclaimer: The views expressed in this blog post are those of the author(s) and do not necessarily reflect the official position of the National Science Teaching Association (NSTA).
AI is rapidly becoming part of science classrooms—from AI agents to personalized feedback on lab reports. Yet not all students are eager adopters. In fact, some resist using any AI tools at all, believing it diminishes their learning or feeling uncomfortable with AI for other reasons. Others are unsure of its purpose or fear being accused of cheating. Parents, too, often express concern about overreliance on technology and data privacy.
Science teachers are in a powerful position to guide students—and families—through this shifting landscape. But doing so requires both empathy and clarity. Teachers must still center the student voice, recognizing that AI integration is a balance of acknowledging the limits of AI, spotlighting its role in later stages of learning, and reinforcing human strengths in scientific reasoning. These issues are the focus of this blog post.
Why Students Feel Anxious About AI in the Classroom
Recent research shows that many students feel “anxious, confused, and distrustful” when it comes to using AI tools in their coursework (Flittner et al. 2024). These feelings can stem from several sources.
Beyond these general concerns, some students hesitate to use AI because they have learned how to earn consistently good grades by meeting teacher expectations. They may worry that AI will not produce the same level of performance and that their grades could suffer, or they may believe that using an AI tool simply requires more work.
Even when tools are introduced responsibly, students may see them as replacing effort rather than extending it. Teachers, therefore, should frame AI not as the first step in the learning process, but as step four or five. This gives students more space and time for ideation, creativity, and critical thinking.
Within the Learning Journey, AI Belongs Later
Consider embedding AI into the 5E learning cycle (Engage, Explore, Explain, Elaborate, Evaluate), aligned with three-dimensional science instruction.
When students see that they, not the tools, drive understanding and the learning process, and that teachers themselves are not overly reliant on AI, AI becomes a scaffold, and students are more likely to trust the process.
Trusting the process, however, must be paired with the ability to be a critical consumer of what AI produces: Students must ultimately rely on their own understanding and, when needed, their own investigation to accept or reject what AI provides.
Where Humans Excel and Where AI Falls Short
A fascinating and insightful article by Bryan (2025) demonstrates the power of students' thinking. The article discusses the work of University of Washington (UW) researchers who created AI Puzzlers, a game in which children solve ARC (Abstraction and Reasoning Corpus) puzzles. The puzzles, originally designed by François Chollet, are a benchmark for evaluating AI systems’ reasoning and problem-solving abilities, particularly their skill acquisition. The game, which consists of ARC-style visual reasoning puzzles, was tested with more than 100 students in grades 3–8 and 21 children ages 6–11 through UW’s KidsTeam programs. The game included an “Assist Mode” in which children helped guide AI chatbots toward correct solutions, and children saw how various chatbots attempted, and usually failed, to solve the same tasks (Bryan 2025). Through gameplay, kids learned to critically evaluate AI limitations and to see how model explanations are often confidently wrong, prompting reflection on the differences between human creativity and AI computation. The findings emphasize that children can become savvy skeptics of AI when given tools that reveal its limitations (Bryan 2025).
One of the most powerful ways to address skepticism is to demonstrate what AI can’t do.
Teachers can design “human vs. AI” comparisons to illustrate these limitations, letting students improve on, critique, or rewrite AI-generated explanations.
Strategies for Earning Student and Parent Buy-In
Even when students want to use AI, some parents may still need persuasion. Here are several ideas for making AI integration more human-centered.
Respecting Skepticism, Encouraging Inquiry
Students’ discomfort with AI is not a roadblock: It’s a teachable moment. It reflects the kind of critical inquiry we want in science. By helping students view AI as a tool that enhances human intelligence rather than competing with it, we affirm their agency in a tech-infused world. Teachers don’t need to be AI experts. But we can serve as bridges, connecting curiosity with critical thinking and innovation with integrity.
Students can consider AI an “add-on” or “plug-in,” if you will, that brings life to an aspect of their work that they have already imagined. Many may feel more comfortable using it for artistic purposes, to bring an idea or story to life, than for generating content.
References
Bryan, C. 2025, July 24. Game by UW researchers shows kids the limits of AI. GovTech. https://www.govtech.com/education/k-12/game-by-uw-researchers-shows-kids-the-limits-of-ai.
Flittner, A., T. Arvanitis, and C. Rigby. 2024, July 15. University students feel anxious, confused, and distrustful about AI in the classroom—and among their peers. The Conversation. https://theconversation.com/university-students-feel-anxious-confused-and-distrustful-about-ai-in-the-classroom-and-among-their-peers-258665.
Seldon, A., and O. Abidoye. 2018. The Fourth Education Revolution: Will Artificial Intelligence Liberate or Infantilise Humanity? Buckingham, UK: University of Buckingham Press.
Valerie Bennett, Ph.D., Ed.D., is an Assistant Professor in STEM Education at Clark Atlanta University, where she also serves as the Program Director for Graduate Teacher Education and the Director for Educational Technology and Innovation. With more than 25 years of experience and degrees in engineering from Vanderbilt University and Georgia Tech, she focuses on STEM equity for underserved groups. Her research includes AI interventions in STEM education, and she currently co-leads the Noyce NSF grant, works with the AUC Data Science Initiative, and collaborates with Google to address CS workforce diversity and engagement in the Atlanta University Center K–12 community.
Christine Anne Royce, Ed.D., is a past president of the National Science Teaching Association and currently serves as a Professor in Teacher Education and the Co-Director for the MAT in STEM Education at Shippensburg University. Her areas of interest and research include utilizing digital technologies and tools within the classroom, global education, and the integration of children's literature into the science classroom. She is an author of more than 140 publications, including the Science and Children Teaching Through Trade Books column.
Note: We would like to thank Debra Shapiro, NSTA’s Communication Specialist, Content, for her support related to this topic and recommended resources.
This article is part of the blog series From Chalkboards to AI, which focuses on how artificial intelligence can be used in the classroom in support of science as explained and described in A Framework for K–12 Science Education and the Next Generation Science Standards.
The mission of NSTA is to transform science education to benefit all through professional learning, partnerships, and advocacy.