When ChatGPT and other new generative AI tools emerged in late 2022, educators’ main concern was cheating. After all, word quickly spread among students on TikTok and other social media platforms that, with a few simple prompts, a chatbot could write an essay or answer an assignment in ways that would be difficult for teachers to detect.
But these days, when it comes to AI, another concern has come to light: that the technology could lead to less human interaction in schools and universities, and that school administrators might one day try to use it to replace teachers.
And it is not just educators who are concerned: this is becoming an educational policy issue.
Last week, for example, a bill aimed at ensuring that courses at the state’s community colleges are taught by qualified humans, not AI bots, passed both houses of the California state legislature.
Sabrina Cervantes, a Democratic member of the California State Assembly who introduced the legislation, said in a statement that the goal of the bill is to “offer guardrails for the integration of AI into classrooms while ensuring that community college students are taught by human professors.”
To be clear, no one appears to have proposed replacing the state’s community college professors with ChatGPT or other generative AI tools. And even the bill’s backers say they can imagine positive uses for AI in teaching, and the bill would not prevent colleges from using generative AI to help with tasks like grading or creating educational materials.
But proponents of the bill also say they have reason to worry about the possibility of AI replacing teachers in the future. Earlier this year, for example, a Boston University dean raised concerns among graduate workers striking for higher wages by floating AI as a possible way to handle course discussions and other classroom activities affected by the strike. University officials later clarified that they had no intention of replacing any graduate workers with AI software.
While California’s bill has made the most progress, it is not the only such measure under consideration. In Minnesota, Rep. Dan Wolgamott of the Democratic-Farmer-Labor Party proposed a bill that would prohibit Minnesota State College and University System campuses from using AI “as the primary instructor for a credit-bearing course.” The measure has stalled for now.
Elementary and secondary school teachers are also beginning to push for AI protections for educators. The National Education Association, the nation’s largest teachers union, recently released a policy statement on the use of AI in education that stressed that human educators must “remain at the center of education.”
It’s a sign of the mixed but intense mood among many educators, who see both promise and potential threat in generative AI technology.
Careful language
Even education leaders pushing for steps to prevent AI from displacing educators have been at pains to point out that the technology could have beneficial applications in education. They are choosing their language carefully to ensure they don’t ban AI use altogether.
The bill in California, for example, faced initial pushback even from some supporters of the concept, who feared moving too quickly to legislate the rapidly changing technology of generative AI, says Wendy Brill-Wynkoop, president of the California Community College Faculty Association, whose group led the effort to draft the bill.
A draft version of the bill explicitly stated that AI “may not be used to replace teaching staff for the purposes of providing regular instruction and interaction with students in a course of instruction, and may only be used as a peripheral tool.”
According to Brill-Wynkoop, the internal debate almost led leaders to abandon the effort. Then she suggested a compromise: removing all explicit references to artificial intelligence from the text of the bill.
“We don’t even need the word AI in the bill; we just need to make sure that humans are at the center,” she says. So the final text of the proposed legislation, which is very brief, reads: “This bill would explicitly require that the official instructor of a course of instruction be a person who meets the minimum qualifications described above to serve as a faculty member delivering credit-bearing instruction.”
“Our intention was not to put a giant wall in front of AI,” says Brill-Wynkoop. “That’s crazy. It’s a train that’s moving very fast. We’re not against technology, but the question is: how do we use it intelligently?”
And she admits that she doesn’t believe there’s an “evil mind in Sacramento saying, ‘I want to get rid of these nasty teachers.’” But, she adds, in California, “education has been massively underfunded for years, and with limited budgets, there are a number of tech companies there saying, ‘How can we help them with their limited budgets by driving efficiency?’”
Ethan Mollick, a professor at the University of Pennsylvania who has become a leading voice on AI in education, wrote in his newsletter last month that he’s concerned that many companies and organizations are focusing too much on efficiency and downsizing as they rush to adopt AI technologies. Instead, he argues, leaders should focus on rethinking how they do things to take advantage of tasks that AI can do well.
He noted that even the companies building these new large language models have not yet figured out which real-world tasks they are best suited for.
“I worry that the lesson of the Industrial Revolution is being lost in AI implementations in businesses,” he wrote. “Any efficiency gains need to be converted into cost savings even before anyone in the organization understands what AI is for. It’s as if, after gaining access to the steam engine in the 18th century, all manufacturers decided to keep output and quality the same and simply lay off staff in response to the new efficiency, rather than building companies that span the globe by scaling up their output.”
The professor wrote that his university’s new Generative AI Lab is trying to model the approach he’d like to see, in which researchers explore evidence-based uses of AI while working to avoid what he calls “downside risks”: the concern that organizations might make ineffective use of AI while ousting expert employees under the guise of cutting costs. And he says the lab is committed to sharing what it learns.
Keeping humans at the center
The AI Education Project, a nonprofit focused on AI literacy, surveyed more than 1,000 American educators in 2023 about how AI is influencing the world, and education more specifically. Participants were asked to choose from a list of top concerns about AI, and the one that came up first was that AI could lead to “a lack of human interaction.”
That could be a response to recent announcements from major AI developers, including ChatGPT maker OpenAI, about new versions of their tools that can respond to voice commands and see and respond to what students type on their screens. Sal Khan, founder of Khan Academy, recently posted a video demonstration in which he uses a prototype of his organization’s chatbot, Khanmigo, with these features to tutor his teenage son. The technology shown in the demo isn’t yet widely available and is at least six months to a year away, according to Khan. Still, the video went viral and sparked debate about whether any machine can replace a human in something as deeply personal as one-on-one tutoring.
Meanwhile, many of the new features and products launched in recent weeks focus on helping educators with administrative tasks, such as creating lesson plans and other classroom materials. Those are the kinds of behind-the-scenes uses of AI that students may not even know are happening.
This was clear on the exhibit floor at last week’s ISTE Live conference in Denver, which drew more than 15,000 educators and edtech leaders. (EdSurge is an independent newsroom that shares a parent organization with ISTE. Learn more about EdSurge’s ethics and policies here and about sponsors here.)
Small startups, tech giants and everything in between touted new features that use generative AI to support educators with a variety of responsibilities, and some companies offered tools designed to serve as virtual classroom assistants.
Many teachers who attended the event were not actively concerned about being replaced by bots.
“I don’t even think about it, because what I’m bringing into the classroom is something that AI can’t replicate,” said Lauren Reynolds, a third-grade teacher at Riverwood Elementary School in Oklahoma City. “I have that human connection. I’m getting to know my students on an individual basis. I’m reading more of what they’re telling me.”
Christina Matasavage, a STEM teacher at Belton Preparatory Academy in South Carolina, said she believes COVID closures and emergency shifts to remote learning proved that devices can’t replace human instructors. “I think we realized that teachers are so necessary when COVID hit and we went virtual. People realized very (quickly) that we can’t be replaced” by technology, she said.