The pandemic has greatly changed public perception about the appropriate use of technology by young people, says Katie Davis, an associate professor at the University of Washington’s school of information.
“The pandemic forced us to face the fact that technology is absolutely essential in our lives, and especially during crises,” she says. Now, she says, the discussion is shifting toward questions like: When is technology good? When is it bad? And what role should it play in young people’s development at each stage, from toddlerhood through emerging adulthood and beyond?
The EdSurge Podcast recently interviewed Davis, who has conducted research at the intersection of child development and technology for nearly 20 years. In a new book, “Technology’s Child: Digital Media’s Role in the Ages and Stages of Growing Up,” she lays out a framework for how best to match technology with each stage of growth. The book celebrates when technology can help children thrive and warns when it can get in their way.
Sometimes the problems posed by devices can arise in unexpected ways, she says, such as when literacy apps aimed at young readers introduce too many features, like a word’s definition popping up on the screen when children touch it, or rich sound effects that play while children read.
“You think it must be great, when you’re learning to read, to hear the word spoken aloud. And in theory, these seem like good ways to enhance the learning experience,” says Davis. “However, we must remember that, especially for young children, there is a limit to their information-processing bandwidth. To use a computer analogy, they just have less CPU than we do as adults.”
And she says there’s a growing awareness of how some tech companies design their systems to do things that aren’t in the best interest of users, a phenomenon known as “dark patterns.” A common example of a dark pattern, Davis says, is the autoplay feature on YouTube that often keeps viewers watching and can make it harder for a parent to convince a young child to put the device down.
Davis calls for greater regulation of technology companies to control such design features.
“Relying on tech companies to regulate themselves doesn’t work,” she argues, “because it simply isn’t in their financial interest to put users’ well-being at the forefront. Unfortunately, that’s not what makes them a lot of money.”
But she acknowledges that regulations can have unintended consequences that can also be harmful. That’s why she calls on academics to conduct more research to help inform best practices for technology tools, so they promote well-being and are more effective for education.
Listen to the episode on Apple Podcasts, Overcast, Spotify or wherever you get your podcasts, or use the player on this page.