A few weeks ago, Ferris State University made an eye-catching announcement that it planned to enroll two chatbot “students” in its classes, billing it as a novel way for universities to test their curricula.
The unusual idea seems something of a publicity stunt to draw attention to the academic specialization the university offers in artificial intelligence, and local television news stations pounced on the idea that non-human classmates would sit side by side with T-shirt-clad youth in hybrid college classes. But the experiment points to interesting possibilities (and raises ethical questions) about how the latest AI technology could be used to improve teaching.
In fact, you could say that the experiment at the public university in Michigan marks a new generation in an area known as “learning analytics.” It's an approach that has grown over the past decade, as universities try to take advantage of the digital breadcrumbs students leave as they move through digital platforms and online course materials, looking for patterns that can improve course design and even customize material for individual students.
“AI could give us a novel way to see something we haven't seen before,” says Kyle Bowen, deputy chief information officer at Arizona State University. “Now we can have the notion of a data double… the notion that we have something that reflects a person at the data level.”
In other words, rather than just watching students click, generative AI tools like ChatGPT let educators create simulations of students who embody different profiles (for example, a first-generation student, or a student struggling in a certain subject) and see what happens when those simulated students encounter material from university courses.
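To make the idea concrete, here is a minimal sketch of how such a simulated-student prompt might be assembled. The profiles, wording, and function name are illustrative assumptions for this article, not anything Ferris State or Arizona State has published; a real system would send the resulting prompt to a generative AI model.

```python
# Hypothetical sketch of the "simulated student" idea: compose a role-play
# prompt that asks a language model to react to course material as a
# particular kind of student would. All names and text are illustrative.

PROFILES = {
    "first_generation": "a first-generation college student unfamiliar with campus norms",
    "struggling": "a student who is struggling with the prerequisites for this subject",
}

def build_student_prompt(profile: str, course_material: str) -> str:
    """Compose a system prompt asking a model to respond as a simulated student."""
    persona = PROFILES[profile]
    return (
        f"You are role-playing {persona}. "
        "Read the following course material and respond as that student would, "
        "noting anything you find confusing:\n\n"
        f"{course_material}"
    )
```

Course designers could then compare how the different personas react to the same lesson, which is the kind of “data double” feedback Bowen describes.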
“How can we fine-tune AI responses to reflect the diversity of our student body or the needs of a first-year student?” Bowen asks, suggesting that doing so could bring new insights to the people who design learning experiences.
While Arizona State has not created virtual learners, it recently announced a major commitment to experimenting with AI to improve its teaching. Last month the university became the first higher education institution to partner with OpenAI, the organization behind ChatGPT, with the goal of “improving student success” and “streamlining organizational processes.”
And other universities are also pushing the latest AI to better understand student data. When Paul LeBlanc resigned as president of Southern New Hampshire University late last year, he announced that his next step would be to lead a project at the university to use ChatGPT and other artificial intelligence tools to reshape university teaching.
So what could generative AI do to improve learning?
Creating AI 'students'
Few details of the Ferris State experiment have been released so far, and university spokesperson Dave Murray told EdSurge that the chatbot students have not yet started taking classes.
Officials say they are still being built. The two chatbots are named Ann and Fry, with the former named after university librarian Ann Breitenwischer and the latter a nod to the fact that a leader of the effort, Kasey Thompson, once worked in McDonald's corporate office. Thompson interviewed real students to help develop the personalities of the AI bots.
The bots will reportedly be equipped with speech recognition and voice capabilities that will allow them to participate in class discussions with real students and ask questions of teachers. The AI agents will also receive course curriculum information and submit assignments.
“The entire role of a university and a college is evolving to meet the needs of society,” Thompson, special assistant to the president for innovation and entrepreneurship at Ferris State, told a local television station. “And what we hope to learn from Ann and Fry is: What is that like? How can we improve that experience for students?”
Murray says “the goal is to have them in classes this semester.”
Seth Brott, a sophomore at Ferris State University majoring in information security, plans to give his robot classmates a warm welcome.
He says he was excited when one of his teachers told him about the plan. “I would love to be in a class with one of these robots and see how they perform,” he says.
Brott says he has experimented with ChatGPT on some class assignments. He says the technology helped him generate ideas for a public speaking class, but was less useful when he was allowed to use it in an information security class to suggest ways to protect a data system.
So does he think the chatbots will be able to pass his courses?
“At the moment, the chatbots probably can't perform very well,” he surmises, “but they can learn. When they make a mistake, they get feedback very similar to ours.” And he says that over time he can imagine the university honing a student chatbot until it can thrive in the classroom.
He says he's excited the university is trying this innovative experiment. And he also hopes it can push the university to improve its teaching. A friend of his, for example, recently told him about a course in which the class averaged only 60 percent on the midterms. To him, that seemed like an opportunity to send in a chatbot to see how the instruction could be made clearer for students.
However, not all students are enthusiastic. Johnny Chang, a Stanford University graduate student who hosted a national webinar last summer to encourage more educators to learn about and try AI, had some questions about the approach at Ferris State.
“If the goal is to get feedback on the student experience, they should create tools to help administrators better talk to real students,” Chang says.
He is currently pursuing a master's degree in computer science with a focus on artificial intelligence, and says the danger of creating chatbot students is that they could carry “inherent bias,” depending on how they are trained. For example, if chatbot students are trained solely on students of a certain type, Chang says, “the underrepresented student population could end up feeling unsupported.”
However, that doesn't mean AI can't play a role in improving a university. He suggested that Ferris State leaders could create a tool that prompts students at different points in their learning process to answer quick survey questions. AI could then be used to sort, organize, and synthesize all that data in ways that would have been too difficult with previous technologies.
“If the goal is to get information about student behavior, these chatbots are good at analyzing and summarizing, almost like a co-pilot for administrators,” Chang says.
Ferris State spokesperson Murray says the university is willing to try several approaches.
“We often talk to students about their experiences and make changes based on their feedback. This is an additional approach,” he says. “We are interested in seeing what types of educational applications we can develop. We will learn what works, but also what needs to be refined and what might not work at all.”
Building a 'Syllabot'
At Arizona State, Bowen says that after putting out a call to the community for ideas on how to use ChatGPT, leaders approved more than 100 different projects involving hundreds of faculty and staff members. They later plan to invite students to lead projects as well.
“We want a lot of experimentation to take place,” he says.
One idea being explored is a project he says they “jokingly call Syllabot.” The concept: What if a syllabus were something students could ask questions of, rather than a static document?
“If you have a task to work on (for example, a writing assignment), you might ask it, ‘How might I approach this?’” he says.
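The Syllabot concept can be sketched in a few lines. This is a toy keyword-matching lookup, not ASU's actual implementation (which would presumably sit on top of a language model); the syllabus entries and the `ask_syllabot` name are made up for illustration.

```python
# Hypothetical sketch of the "Syllabot" idea: a syllabus students can query.
# The topics, answers, and matching logic are illustrative assumptions.

SYLLABUS = {
    "late policy": "Assignments lose 10 percent per day late, up to three days.",
    "writing assignment": "A 1,500-word essay due in week 6; drafts reviewed in week 5.",
    "office hours": "Tuesdays 2-4 p.m. in Room 214, or by appointment.",
}

def ask_syllabot(question: str) -> str:
    """Return the syllabus entry whose topic words best match the question."""
    q = question.lower()
    best_topic, best_score = None, 0
    for topic, answer in SYLLABUS.items():
        # Naive substring matching; a real system would use an LLM or
        # embedding search over the full syllabus text.
        score = sum(1 for word in topic.split() if word in q)
        if score > best_score:
            best_topic, best_score = topic, score
    if best_topic is None:
        return "Sorry, I couldn't find that in the syllabus."
    return SYLLABUS[best_topic]
```

A student asking “How might I approach the writing assignment?” would get back the essay entry, while an off-topic question falls through to the apology; swapping the keyword match for a model call is what would turn this from a lookup table into a conversational syllabus.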
Overall, he says, the university is working on a strategy around “an AI platform for ASU that combines our data here.”
And once large language models can be combined with university-specific analytics data, Bowen says, the big question will be: “How can it help us take action on that information?”