Virtual-reality based gaze-sensitive adaptive response technology for children with autism spectrum disorder.
Impairments in social communication skills are considered core deficits in children with autism spectrum disorder (ASD). In recent years, several assistive technologies, particularly Virtual Reality (VR), have been investigated to promote social interaction in this population. These children are known to demonstrate atypical viewing patterns during social interactions, so monitoring eye gaze can be valuable for designing intervention strategies. However, currently available VR-based systems are open-loop, designed to chain learning based on aspects of one's performance alone, permitting only a limited degree of individualization. My research bridges this gap by closing the loop: I developed a novel VR-based interactive system with Gaze-sensitive Adaptive Response Technology that seamlessly integrates VR-based tasks with eye-tracking techniques to intelligently encourage a participant to engage in social communication tasks. Specifically, the system objectively identifies and quantifies a participant's engagement level by measuring real-time viewing patterns, subtle changes in eye physiological responses, and task performance metrics, and responds adaptively in an individualized manner to foster improved social communication skills. The developed system was tested in a usability study with eight adolescents with ASD. The results indicate the system's potential to promote improved social task performance along with socially appropriate mechanisms (e.g., improved attention toward the face of a communicator) during VR-based social conversation tasks.
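To illustrate the closed-loop idea described above — fusing real-time gaze, eye physiology, and performance signals into an engagement estimate that drives an individualized system response — here is a minimal sketch. This is not the authors' implementation; the signal names, weights, and thresholds are all hypothetical placeholders.

```python
# Illustrative sketch of one gaze-sensitive closed-loop adaptation step.
# All weights, thresholds, and signal names are hypothetical.

def engagement_score(face_fixation_ratio, pupil_change, task_accuracy,
                     weights=(0.5, 0.2, 0.3)):
    """Fuse three normalized signals (each in [0, 1]) into one engagement score."""
    w_gaze, w_pupil, w_perf = weights
    return (w_gaze * face_fixation_ratio
            + w_pupil * pupil_change
            + w_perf * task_accuracy)

def adapt_response(score, low=0.4, high=0.7):
    """Map the engagement score to an individualized system response."""
    if score < low:
        return "prompt"      # e.g., redirect attention toward the avatar's face
    elif score < high:
        return "encourage"   # e.g., verbal encouragement, keep current difficulty
    return "advance"         # e.g., move to a harder social conversation task

# Example: low fixation on the communicator's face triggers a redirecting prompt.
print(adapt_response(engagement_score(0.2, 0.3, 0.6)))  # -> prompt
```

In a running system this step would execute each time the eye tracker delivers a new gaze sample window, closing the loop between the participant's viewing behavior and the VR task's response.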