UMaine experts leading conversations around best practices for AI in schools
Whenever she talks about teachers using artificial intelligence, University of Maine associate professor of special education Sarah Howorth likes to remind her audience that educators have always used technology to support student learning.
“Probably the most relatable example for most people is the use of calculators in math class,” Howorth says. “We’ve also had computers in schools for a couple of decades now. So throughout history, educators have adapted to the latest emerging technologies.”
While much of the public discussion around AI in education has focused on the potential negatives, such as cheating, information bias and concerns over technology replacing the human element of teaching and learning, Howorth and colleagues across the country are exploring some of the ways the technology can be used to help teachers and their students. The latest issue of the Journal of Special Education Preparation, which Howorth guest edited, features research on AI’s potential as a game-changing tool for educators, learners and families.
“The genie is already out of the bottle in terms of AI in schools,” Howorth says. “So the questions then become: How can we use AI to enrich learning for all students? And how can we use it to support teachers?”
The special issue of the journal is part of a project led by Howorth titled “Leading the Way: AI in Special Education Teacher Education,” which launched last year. It’s supported by a $9,000 grant from the Council for Exceptional Children (CEC), the largest professional organization focused on improving the educational success of youth with disabilities and/or gifts and talents. The project also includes a free webinar series based on the articles in the special issue. Howorth is featured in a video introduction to the series, which was produced by the Center for Innovation, Design and Digital Learning (CIDDL) at the University of Kansas.
Howorth says AI can be especially beneficial for students with special needs. For example, a student with attention deficit hyperactivity disorder might be able to reduce their cognitive load by using an AI note taker to summarize classroom presentations and create action items for homework assignments. Generative AI can also be used to level a text to a student’s reading ability, making assignments more inclusive, or to create social stories that teach students with autism about social norms and how to communicate with others.
“AI is great for creating educational materials that appeal to students’ interests,” Howorth says. “If I’m a teacher and I have a student who’s really into horses, I can use AI to create stories and images of horses that I can incorporate into my lessons in ways that are more engaging for them. Students can also use AI tools to express their creativity and knowledge.”
At the same time, Howorth says it’s important for teachers to recognize when and how to use AI. A good rule of thumb for incorporating AI into instruction, she says, is to design assignments that can’t be completed with the technology alone: students should still have to demonstrate their own thinking, including how and why they used AI to do the assignment. She notes that no technology can supplant the knowledge and skill of professional educators.
“We still need teachers to be teachers,” she says. “A skilled and compassionate human being is needed for effective instruction.”
UMaine senior lecturer of education Tammy Mills has also been working through some of the complexities around AI with both undergraduate teacher education students and graduate students who are already working in schools. For example, she asks her students to prompt ChatGPT to produce things like lesson plans and assessments for student learning.
“For the most part,” she says, “they’re not happy with the results, because they know best practices for instruction and assessment. They look at what ChatGPT comes up with and they can do it better.”
Like Howorth, Mills thinks AI will transform education, but she says it needs to be employed safely and ethically. She considers herself a co-learner alongside her students as they figure out best practices together.
“You have to recognize that whatever you put into AI is going to be reflected in what you get out of it,” says Mills. “We talk about privacy and making sure you’re not putting personal information about students into any AI tools. We also talk about making sure we’re being culturally sensitive and aware of the biases inherent in AI, so that when we use it we’re able to get something that represents the demographics of the learners.”
“If anything,” she adds, “it makes human knowledge and skill more important than ever. Teachers know their students’ strengths, preferences, needs and interests, and can use this information to support kids with a variety of technological and pedagogical tools.”
Contact: Casey Kelly, casey.kelly@maine.edu