Intelligent Systems for Autism Spectrum Disorders and Schizophrenia Intervention – Design, Development and User Studies
Bekele, Esube Tamirat
Autism spectrum disorders (ASD) are characterized by difficulties in social communication as well as repetitive and atypical patterns of behavior. According to the Centers for Disease Control and Prevention (CDC), an estimated 1 in 68 children, and 1 in 42 boys (a prevalence roughly five times that of girls), in the United States have ASD. The average lifetime cost of care for an individual with autism is estimated to be around $3.2 million, with average medical expenditures for individuals with ASD 4.1–6.2 times greater than for those without ASD. Schizophrenia (SZ) is a debilitating psychotic disorder that affects about 1% of the population and costs more than $100 billion annually in the USA. It causes emotional and cognitive impairments for which there are currently no effective pharmacological treatments. Given the present limits of intervention science and the profound impact of early impairments across the lifespan, there is an urgent need for the development and application of novel treatment paradigms capable of substantially more efficacious, individualized impact on the early core deficits of ASD and SZ. Given rapid progress in technology, it has been argued that innovative computer- and robot-based technologies could be effectively harnessed to provide novel clinical treatments for individuals with ASD. This doctoral research explores the design and implementation of intelligent robotic and virtual reality systems for personalized ASD and SZ intervention. Recent advances in robotic, virtual reality, and sensor technologies are utilized by designing, implementing, and testing a novel robot-assisted intervention for children with ASD, virtual reality-based facial emotional expression recognition for teenagers with ASD and adults with SZ, and virtual reality-based social interaction and contextual emotion understanding in a virtual environment.
To address this problem, we have designed novel human-robot and human-computer interaction architectures that fuse multimodal information and consider both explicit and implicit responses to optimize interaction for learning and intervention. The designed systems are not only sensitive to user performance but also monitor implicit affective cues inferred from peripheral physiological signals, eye gaze patterns, and EEG signals. These multimodal cues were processed to infer psychological as well as task-related performance patterns, both to assess differences among subjects and to generate dynamic feedback online that assists the subject in learning the specific targeted skills. The design and implementation of the robotic and virtual reality systems were followed by user studies to evaluate the efficacy of the developed systems. A total of four novel systems, one robot-centric and three virtual reality (VR)-based, were designed in this dissertation: a robot-mediated joint attention framework for young children with ASD; a VR-based facial expression recognition system for adolescents with ASD; a VR-based facial expression recognition system compared against the standard International Affective Picture System (IAPS) for adults with schizophrenia; and a multimodal VR-based social interaction platform for adolescents with ASD. The results of the user studies indicated that these novel technological systems are indeed viable intervention paradigms, with the added benefits of controllability and objective performance metrics as compared to traditional human-centric interventions. In addition to contributing to the literature on robot-mediated and VR-based intelligent intervention systems, multimodal affective signal processing, and pattern recognition, this work benefits the ASD and SZ communities by providing cost-effective intervention tools that may help alleviate the shortage of trained therapists in these fields.