When Satya Nitta worked at IBM, he and a team of colleagues took on a bold task: using the latest in artificial intelligence to create a new type of personal digital tutor.
This was before ChatGPT existed, and fewer people were talking about the wonders of AI. But Nitta was working with what was perhaps the highest-profile AI system of the time: IBM's Watson. That tool had achieved notable feats, including beating humans on the quiz show Jeopardy! in 2011.
Nitta says he was optimistic that Watson could power widespread tutoring, but knew the task would be extremely difficult. “I remember telling the top brass at IBM that this will be a 25-year journey,” he recently told EdSurge.
He says his team spent about five years trying, and along the way they helped create some small-scale learning products, such as a pilot chatbot assistant that was part of a Pearson online psychology course system in 2018.
But in the end, Nitta concluded that while the generative AI technology generating buzz these days brings new capabilities that will change education and other fields, the technology simply isn't up to the task of becoming a widespread personal tutor, and it won't be, at least for decades, if ever.
“We will have flying cars before we have AI tutors,” he says. “It is a deeply human process that AI is hopelessly unable to address in any meaningful way. It's like being a therapist or a nurse.”
Instead, he co-founded an AI startup called Merlyn Mind that is building other types of AI-based tools for educators.
Meanwhile, many companies and education leaders these days are working hard to pursue the dream of creating AI tutors. Even a recent White House executive order seeks to advance the cause.
Earlier this month, Sal Khan, leader of the nonprofit Khan Academy, told The New York Times: “We are on the verge of using AI for probably the biggest positive transformation education has ever seen. And the way we're going to do that is by giving every student on the planet an amazing, but artificially intelligent, personal tutor.”
Khan Academy has been one of the first organizations to use ChatGPT to try to build such a tutor, called Khanmigo, which is currently in a pilot phase in a number of schools.
Khan's system, however, comes with a notable caveat: it “sometimes makes mistakes.” The warning is necessary because all of the latest AI chatbots suffer from what are known as “hallucinations,” the term for situations in which a chatbot simply fabricates details when it does not know the answer to a user's question.
AI experts are busy trying to offset the hallucination problem, and one of the most promising approaches so far is to bring in a separate AI chatbot to check the output of a system like ChatGPT and see whether it has likely invented details. That's what researchers at Georgia Tech have been trying, for example, in hopes that their multi-chatbot system can reach the point where any false information is stripped from a response before it is shown to a student. But it is not yet clear whether this approach can achieve a level of accuracy that educators will accept.
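The check-before-showing idea can be sketched in a few lines of Python. Everything below is illustrative, not the Georgia Tech system: the two "models" are stand-in functions, and the toy fact table plays the role of whatever trusted source a real checker model would consult.

```python
def generator_model(question: str) -> list[str]:
    """Stand-in for a tutor chatbot: returns an answer as a list of claims.
    A real system would call a language model here."""
    return [
        "The mitochondrion is the powerhouse of the cell.",
        "Mitochondria were first described on the moon in 1969.",  # fabricated
    ]

# Toy stand-in for a vetted knowledge source the checker trusts.
KNOWN_FACTS = {
    "The mitochondrion is the powerhouse of the cell.",
}

def checker_model(claim: str) -> bool:
    """Stand-in for a second model that verifies one claim at a time
    against a trusted source; here, a simple lookup."""
    return claim in KNOWN_FACTS

def vetted_answer(question: str) -> list[str]:
    """Generate an answer, then keep only the claims the checker accepts,
    so fabricated details are stripped before a student sees them."""
    return [c for c in generator_model(question) if checker_model(c)]

print(vetted_answer("What do mitochondria do?"))
```

The design question the sketch makes visible is the one the article raises: the pipeline is only as accurate as the checker, so a weak verifier lets hallucinations through while an overcautious one strips correct answers.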
However, at this critical point in the development of new AI tools, it is worth asking whether a chatbot tutor is the right target for developers. Or is there a better metaphor than “tutor” to describe what generative AI can do to help students and teachers?
An 'always-available helper'
Michael Feldstein spends a lot of time experimenting with chatbots these days. He is a long-time edtech blogger and consultant, and in the past he had no qualms about calling out what he considered excessive hype on the part of companies selling edtech tools.
In 2015, he famously criticized promises about what was then the latest in artificial intelligence for education: a tool from a company called Knewton. Knewton CEO Jose Ferreira said his product would be “like a robot tutor in the sky that can semi-read your mind and figure out what your strengths and weaknesses are, down to the percentile.” That led Feldstein to respond that the CEO was “selling snake oil” because, Feldstein argued, the tool didn't come close to delivering on that promise. (Knewton's assets were quietly sold off a few years later.)
So what does Feldstein make of the latest promises from AI experts that effective tutors could be on the near horizon?
“ChatGPT is definitely not snake oil, far from it,” he tells EdSurge. “Nor is it a robot tutor in the sky that can semi-read your mind. It has new capabilities, and we need to think about what kinds of tutoring functions today's technology can deliver that would be useful to students.”
Even so, he does think tutoring is a useful lens for understanding what ChatGPT and other new chatbots can do. And he says that view comes from personal experience.
Feldstein has a relative who is battling a brain hemorrhage, and he has turned to ChatGPT for personal lessons in understanding his loved one's medical condition and prognosis. As he receives updates from friends and family on Facebook, he says, he asks questions in an ongoing thread on ChatGPT to try to better understand what's going on.
“When I ask it the right way, it can give me the right amount of detail about, 'What do we know today about the chances of getting well again?'” Feldstein says. “It's not the same as talking to a doctor, but it has taught me significantly about a serious topic and helped me become better informed about my family member's condition.”
So while Feldstein says he would call it a tutor, he maintains that it's still important for companies not to oversell their AI tools. “We've done a disservice by saying they are these all-knowing boxes, or that they will be in a few months,” he says. “They are tools. They are strange tools. They misbehave in strange ways, just like people.”
He notes that even human tutors can make mistakes, but most students have an idea of what they're getting into when they make an appointment with a human tutor.
“When you go to a tutoring center at your university, they don't know everything. You don't know how well trained they are. There's a chance they'll tell you something that's wrong. But you go in and get the help you can.”
Whatever these new AI tools end up being called, he says, it will be useful “to have an always-available helper that you can ask questions,” even if their results are just a starting point for further learning.
'Boring' but important support tasks
What new roles can generative AI tools play in education, if tutoring turns out not to be the right metaphor?
For Nitta, the most important role for AI is to serve as an assistant to experts rather than a replacement for an expert tutor. In other words, instead of replacing, say, a therapist, imagine chatbots helping a human therapist summarize and organize notes from a session with a patient.
“That's a very helpful tool rather than an AI pretending to be a therapist,” he says. Though some may see that as “boring,” he maintains that the technology's superpower is “automating things that humans don't like to do.”
In the educational context, his company is building AI tools designed to help teachers or human tutors do their jobs better. To that end, Merlyn Mind has taken the unusual step of building its own large language model designed for education from scratch.
Even then, he maintains that the best results come when the model is tuned to support specific educational domains, trained on vetted data sets rather than relying on ChatGPT and other mainstream tools that pull vast amounts of information from the internet.
“What does a human tutor do well? They know the student, and they provide human motivation,” he adds. “What we're after is AI that enhances the tutor.”