AI-based education technology has gained enormous popularity over the past two years. And while there are many promising tools, there are also many that are overhyped, poorly designed, or simply unnecessary, no matter how well-intentioned they may be. The Los Angeles Unified School District learned this the hard way: the district recently had to abandon a chatbot system it had spent $3 million on just months after launching it, because the company that developed it had laid off half of its staff.
To help educators avoid these AI missteps and properly evaluate AI tools, we turned to Tal Havivi, managing director of research and development, and Joseph South, chief innovation officer, both at ISTE/ASCD.
Here are their tips for properly evaluating AI tools.
1. Beware of AI products that promise too much
If you see an AI tool that claims it can do a teacher’s job, approach it with the same skepticism you would bring to a product that claims to turn water into wine.
“Anyone who suggests their product will make teachers obsolete is overselling,” South says. “AI is a powerful ally for a teacher, but it does not replace their work or their judgment.”
South adds that AI tools for education should meet the same privacy standards as any other tool. “If a vendor can’t give you immediate answers to questions like, ‘How do you protect user privacy?’ ‘Is this product FERPA compliant?’ ‘How do you handle personally identifiable information and where is it stored?’ then they’re probably not ready to sell to educators.”
Additionally, AI tools must meet the same accessibility standards as any other tool a school might use, South says.
2. Focus on existing learning objectives
“It’s really important that we don’t get distracted by the glitz of AI,” South says. “AI can do some really amazing things, but it’s essentially just making predictions. Ultimately, it’s up to humans to make decisions about what to do with those predictions.”
That’s why, says South, the best place to start isn’t with technology, but with learning objectives. He advises asking yourself the following questions: “What do you want students to learn?” “How do you want them to learn it?” and “What evidence do you need to see to determine that they actually learned it?”
He adds: “Once you have a very clear idea of those three things, you can go to the AI market and see which solutions best fit your learning objectives and your pedagogical approach.”
3. Beware of hallucinations
There has been a lot of talk about AI hallucinations, the inaccurate or inappropriate outputs these systems can produce, and they can be especially harmful in an educational setting. Havivi recommends looking for AI tools that give educators some control over what the AI can and cannot do.
“Since GenAI doesn’t ‘think,’ it’s important to see if products have controls and checks in place that school systems and educators can use to further reduce unexpected or undesirable outcomes,” Havivi says. “Educators should ask vendors what feedback loops are in place to help product developers improve controls based on educators’ experience and product usage.”
South adds: “Virtually all vendors focused on the K-12 market are taking steps to make their products safe and effective to use, but the depth and consistency of those efforts vary widely.”
He suggests reviewing the tool’s safeguards and then conducting a “red team” exercise. “You and your team log in as a typical user would, but with the sole intention of trying to overcome those protections as best you can, to verify that they work as intended,” he says. “AI should never be the final word on any fact. It’s the user’s job to verify everything that AI suggests. Remember, generative AI is simply predicting the next word. It does a mind-blowing job at this, but it doesn’t actually know what it’s saying.”
4. Avoid long-term contracts
When it comes to purchasing an AI-powered edtech tool, a little wariness about committing is probably a good thing.
“The restructuring of the AI tool vendor market hasn’t happened yet,” says South. “There are far more tool vendors than the market can sustain. Signing multi-year contracts is a bet that the vendor will be here in 24 months, which may or may not happen.”
It’s also a good idea to make sure your contract includes free upgrades as the technology improves. “AI is evolving so quickly that current functionality is likely to feel dated in just a few months,” says South. “You don’t want to be contractually stuck with the current version when everyone else has moved on to something that’s twice as fast and twice as good.”
5. Make sure your approach to AI is evolving
Of course, as with many aspects of education, solutions for properly integrating AI are neither one-size-fits-all nor static.
“In evaluating products, it is especially important to understand the changing capabilities and limitations of generative AI,” says Havivi. “One of the challenges is that the capabilities and limitations of generative AI are not fixed.”
Early large language models, like GPT-3 and even GPT-4, were especially good at tasks requiring probabilistic or heuristic reasoning, where there were “many paths to a ‘good enough’ answer,” Havivi says. These models were competent at tasks like summarizing content or generating creative text, but they weren’t very good at giving accurate, predictable answers to things like complex math problems.
“That is changing: models are improving,” says Havivi. “This rapid evolution means that the evaluation of AI tools will be dynamic, and it is even more important to ensure that the fundamentals of edtech evaluation remain consistent.”