This article and the accompanying image <a href="https://news.ku.edu/news/article/ku-research-team-develops-virtual-reality-ai-boosted-system-to-help-students-with-autism-improve-social-skills" target="_blank" rel="noreferrer noopener">originally appeared on KU's news site</a> and are republished here with permission.
For more than a decade, researchers at the University of Kansas have been developing a virtual reality system to help students with disabilities, especially those with autism spectrum disorder, learn, practice and improve the social skills they need during a typical school day. Now, the KU research team has secured funding to add artificial intelligence components to the system, giving these students an extended reality, or XR, experience to sharpen social interactions in a more natural environment.
The U.S. Office of Special Education Programs has awarded a five-year, $2.5 million grant to researchers in the KU School of Education and Human Sciences to develop a system for building knowledge and natural opportunities for social-emotional competence, known as I-KNOW. The system will build on previous work and will provide students and teachers with an immersive, authentic experience that combines extended reality, real-world elements and artificial intelligence.
I-KNOW will expand the capabilities of VOISS, or Virtual Reality Opportunities to Integrate Social Skills, a virtual reality system developed at KU that has proven successful and statistically valid in helping students with disabilities improve social skills. The system contains 140 unique learning scenarios aimed at teaching knowledge and understanding of 183 social skills in virtual school environments, such as a classroom, hallway, cafeteria or bus, that students and teachers can access through multiple platforms such as iPads, Chromebooks or Oculus VR headsets. The system also helps students use social skills such as receptive or expressive communication in multiple environments, not simply in the isolation of a single classroom.
I-KNOW will combine the VR aspects of VOISS with AI features such as large language models to improve the system's capabilities and allow interactions more natural than listening to pre-recorded narration and responding by pressing buttons. The new system will support user-initiated speech that can be precisely transcribed in real time. I-KNOW's AI technology can also generate appropriate responses from the avatars students interact with, using audio analysis of users' responses, real-time image integration and graphics paired with instruction to increase students' contextual understanding.
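The article does not describe I-KNOW's implementation, but the interaction loop it outlines (a student speaks, the speech is transcribed in real time, and a large language model drafts the avatar's in-scenario reply while simple progress data are logged) can be sketched roughly as follows. This is an illustrative sketch only, not the actual I-KNOW or VOISS code: the transcribe_audio and query_language_model helpers, the scenario prompt and the tracked metrics are hypothetical stand-ins.

```python
# Illustrative sketch only; not the I-KNOW or VOISS implementation.
# transcribe_audio and query_language_model are hypothetical stand-ins
# for a real-time speech-to-text service and a large language model API.
from dataclasses import dataclass, field


def transcribe_audio(audio_chunk: bytes) -> tuple[str, float]:
    """Hypothetical speech-to-text call: returns the student's words
    and the length of the utterance in seconds."""
    return "Hi, can I sit here at lunch?", 2.4  # canned values for the sketch


def query_language_model(prompt: str) -> str:
    """Hypothetical LLM call that drafts the avatar's next line."""
    return "Sure, have a seat! How is your day going?"  # canned reply


@dataclass
class ProgressLog:
    """Minimal stand-in for the progress data the article mentions:
    how often and how long the student spoke, and keywords used."""
    turns: int = 0
    seconds_spoken: float = 0.0
    keywords_used: set = field(default_factory=set)


def avatar_turn(audio_chunk: bytes, scenario: str,
                target_keywords: set, log: ProgressLog) -> str:
    """One exchange: transcribe the student's speech, record simple
    progress metrics, and ask the language model for an in-character
    avatar reply grounded in the current scenario."""
    transcript, seconds = transcribe_audio(audio_chunk)
    log.turns += 1
    log.seconds_spoken += seconds
    log.keywords_used |= {w for w in target_keywords
                          if w.lower() in transcript.lower()}

    prompt = (
        f"Scenario: {scenario}\n"
        f"Student said: {transcript}\n"
        "Reply briefly as a friendly classmate avatar, modeling the target social skill."
    )
    return query_language_model(prompt)


if __name__ == "__main__":
    log = ProgressLog()
    reply = avatar_turn(b"", "school cafeteria, asking to join a table",
                        {"please", "sit", "lunch"}, log)
    print(reply)
    print(log)
```

In a real system the stubs would be replaced by streaming speech recognition and a moderated language model; the log here simply mirrors the kind of progress data the article describes for educators and families.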
"Avatars in I-KNOW can have certain reactions and behaviors based on what we want them to do. They can model the practices we want students to see," said Amber Rowland, assistant research professor at the Center for Research on Learning, part of the KU Life Span Institute, and one of the grant's co-principal investigators. "The system will take advantage of AI to make sure students have more natural interactions and will put them in the role of 'human in the loop' by allowing them to speak, and it will respond as in a normal conversation."
The spoken responses will not only be more natural and closer to everyday situations, but cues for contextual understanding will also help students better grasp why a certain response is preferred. Rowland said that when students were presented with multiple-choice options in previous versions, they would often know which answer was correct but indicated that it was not how they would have responded in real life.
I-KNOW will also provide a real-time student progress monitoring system, telling educators and families how long students spoke, how often they spoke, the number of keywords they used, where students may have had difficulty in the system and other data to help improve understanding.
All of the avatar voices users hear in I-KNOW are provided by real students, educators and administrators. This helps improve the natural feel of the system without the drawbacks of having students practice social skills with classmates in supervised sessions. For example, users do not have to worry about what the people they are practicing with think of them while they are learning. They can practice the social skills they need until they feel comfortable moving from the XR environment to real life.
"It will take advantage of our ability to take something off teachers' plates and provide tools for students to learn these skills in multiple environments. Right now, the closest we can get to that is training peers. But that puts students with disabilities in a different box by saying, 'You don't know how to do this,'" said Maggie Mosher, an assistant research professor at KU's Achievement & Assessment Institute and a co-principal investigator on the grant.
Mosher, a KU graduate who completed her doctoral dissertation comparing VOISS with other social skills interventions, found that the system was statistically significant and valid in improving social skills and knowledge across multiple domains. Her study, which also found the system to be acceptable, appropriate and feasible, was published in the high-impact journals Computers & Education and Issues and Trends in Learning Technologies.
The grant supporting I-KNOW is one of four OSEP innovation and development grants aimed at stimulating innovation in educational technology. The research team, including principal investigator Sean Smith, professor of special education; Amber Rowland, associate research professor at the Center for Research on Learning and the Achievement & Assessment Institute; Maggie Mosher, assistant research professor at AAI; and Bruce Frey, professor of educational psychology, will present its work on the project at the annual I/ITSEC conference, the world's largest modeling, simulation and training event. The conference is sponsored by the National Training and Simulation Association, which promotes international and interdisciplinary cooperation within the fields of modeling and simulation, training, education and analysis and is affiliated with the National Defense Industrial Association.
The research team has implemented VOISS, available in the Apple App Store and Google Play, in schools across the country. Anyone interested in learning more can find information, demonstrations and videos on the I-KNOW site and can contact the developers about using the system through the site's "Work With Us" page.
I-KNOW will also add resources for teachers and families who want to implement the system on a website called I-KNOW Tools (Teaching Occasions and Opportunities for Learning Support) to support the generalization of social skills to real-world environments.
"By combining our research-based social-emotional work in VOISS with the growing power and flexibility of AI, I-KNOW will even further customize the learning experience for individuals with disabilities, along with classmates who struggle with these skills," said Smith. "Our hope and expectation is that I-KNOW will engage students even more in developing essential social-emotional skills and then applying them in the real world to improve their overall learning outcomes."