Because digital education platforms generate data on how millions of students learn, they also sit on veritable gold mines of information for researchers trying to improve education.
An ethical and legal conundrum stands in the way: how to share that data responsibly without exposing students' personal information to third parties.
Now, a consortium of education researchers and learning platforms is developing what they hope will be a solution: researchers will never see the actual data.
The project, called SafeInsights and led by OpenStax at Rice University, is supported by a five-year, $90 million grant from the National Science Foundation.
The idea is for SafeInsights to serve as a bridge between its learning platform partners and its research partners, along with collaborators who are helping to design how the exchange will work so that student privacy is safeguarded.
“Typically, you end up taking data from websites and learning apps and giving it to researchers to study and analyze so they can learn from it,” says JP Slavinsky, executive director of SafeInsights and chief technology officer of OpenStax. “Instead, we are bringing researchers' questions to that data. This creates a safer environment for research that is easier for schools and platforms to participate in, because the data stays where it already is.”
Deeper insights at scale
Another way to think of SafeInsights is as a telescope, say Slavinsky and his colleague Richard Baraniuk, founder and director of OpenStax, which publishes open-access course materials. It will let researchers peer into the vast amounts of data held by learning platforms such as the University of Pennsylvania's massive open online courses and Quill.org.
Researchers will develop questions and then translate them into computer code that can examine the data, which is sent to the learning platforms. Once results are generated, they are returned to the researchers without the underlying data ever being shared directly.
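The project's actual interface isn't described here, but the pattern Slavinsky outlines, shipping the researcher's code to the platform and returning only a result, can be sketched in a few lines of Python. Every name and record field below is a hypothetical illustration of the idea, not SafeInsights' API.

```python
# Minimal sketch of the "bring the question to the data" pattern:
# the researcher's analysis code runs inside the platform's own
# infrastructure, and only the computed result travels back.

from dataclasses import dataclass
from typing import Callable


@dataclass
class QueryResult:
    description: str
    value: float


def run_on_platform(records: list[dict],
                    analysis: Callable[[list[dict]], QueryResult]) -> QueryResult:
    """Executed on the platform's side; raw student records never leave."""
    return analysis(records)


# A hypothetical researcher question, expressed as code rather than
# as a request for the data itself: average quiz score among students
# who used hints.
def avg_score_with_hints(records: list[dict]) -> QueryResult:
    scores = [r["score"] for r in records if r["used_hints"]]
    return QueryResult("mean quiz score, hint users", sum(scores) / len(scores))


# Invented sample data standing in for a platform's private store.
platform_data = [
    {"score": 0.82, "used_hints": True},
    {"score": 0.64, "used_hints": False},
    {"score": 0.91, "used_hints": True},
]

# The platform runs the query locally and returns only the aggregate.
print(run_on_platform(platform_data, avg_score_with_hints))
```

The key property is that `run_on_platform` is the only code with access to the records; the researcher sees just the `QueryResult` it sends back.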
“It's really a partnership where we have researchers meeting with schools and platforms, and together we try to solve some problems of interest,” Slavinsky says. “We are providing that telescope for others to bring their research agenda and the questions they want answered. So we are less involved in what specifically is going to be asked and more in making it possible to answer as many questions as possible.”
Part of why this model would be so powerful is the scale it would bring to educational research, Baraniuk says. Many studies rely on small samples of about 50 college students, he explains, often participating as part of a psychology class.
“A lot of the studies are about first-year college students, right? Well, that's not representative of the huge variety of students out there,” Baraniuk says. “The only way to see that breadth is to do large-scale studies, so really the first key behind SafeInsights is partnering with these digital education websites and apps that host literally millions of students every day.”
Another way he sees the project opening new doors for researchers is the diversity of student populations represented by the learning platform partners, which include educational apps for reading, writing and science, along with learning management systems.
“By putting all these pieces of the puzzle together, the idea is that we can, on a very large scale, get to see a more complete picture of these students,” Baraniuk says. “Our big goal is to try to remove as much friction as possible so that more useful research can be done and then more research-backed pedagogies and teaching techniques can be applied. But as we eliminate that friction, how do we keep everything truly protected?”
Building trust, protecting privacy
Before any research takes place, SafeInsights partners at the Future of Privacy Forum are helping to develop the policies that will shape how the program protects student data.
John Verdi, senior vice president of policy at the Future of Privacy Forum, says the goal is to build privacy protections into how everything works. Part of that is helping to develop what he calls the “data enclave,” the process by which researchers can query a learning platform's data without having direct access to it. Other aspects include helping design the review process for selecting research projects, training researchers on privacy, and publishing lessons learned about operating with privacy at the forefront.
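Verdi doesn't spell out the enclave's output rules, but one common safeguard in enclave designs is minimum-cell-size suppression: an aggregate is released only if enough students contribute to it. The sketch below illustrates that idea; the threshold and the `safe_mean` helper are assumptions for illustration, not documented SafeInsights policy.

```python
# Sketch of a typical enclave output control (an assumption here, not
# SafeInsights' documented policy): suppress any aggregate computed
# over fewer than k students, so a result cannot single anyone out.

MIN_CELL_SIZE = 10  # hypothetical threshold


def safe_mean(values: list[float], k: int = MIN_CELL_SIZE) -> float | None:
    """Return the mean only if enough students contribute; else withhold."""
    if len(values) < k:
        return None  # too few records: releasing this could identify students
    return sum(values) / len(values)


print(safe_mean([0.7] * 12))  # large enough group: prints the mean
print(safe_mean([0.9] * 3))   # too small a group: prints None
```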
“Even if you have great technical safeguards, even if you do a lot of ethical research,” he says of the training component, “at the end of the day, the researchers themselves have to make decisions about how to use the system responsibly. They need to understand how the system works.”
Protecting the privacy of student data in education is generally “woefully underfunded,” he says, but it is safeguarding that information that allows students to trust learning platforms and, ultimately, creates opportunities for research efforts like SafeInsights.
“Putting the burden of protecting data on students and parents is the wrong place to assign that responsibility,” Verdi says. “Instead, what we need to do is build a digital infrastructure that respects privacy by default and provides assurances that information will be kept confidential and used ethically.”