dc.creator | Bian, Dayi
dc.date.accessioned | 2020-08-24T11:52:50Z
dc.date.available | 2020-02-26
dc.date.issued | 2019-08-30
dc.identifier.uri | https://etd.library.vanderbilt.edu/etd-08292019-104321
dc.identifier.uri | http://hdl.handle.net/1803/15511
dc.description.abstract | Autism Spectrum Disorder (ASD) is characterized by deficits in social interaction and in verbal and non-verbal communication, as well as repetitive behaviors and fixed interests. Given the limited availability of professionals trained in autism intervention and diagnosis, emerging technology will play an important role in providing more accessible intervention and earlier diagnosis in the future. One critical skill for attaining independence, securing a job, and maintaining social relationships is driving, yet researchers have only recently turned their attention to driving in the ASD population. VR-based skill training systems have demonstrated value for training specific skills in individuals with ASD. We designed a VR-based Driving Environment with Adaptive Response technology (VDEAR) for individuals with ASD that can adapt task difficulty in real time based not only on task performance but also on participant engagement. To achieve this adaptive behavior, we developed a closed-loop, physiology-based engagement recognition module embedded within a dynamic difficulty adjustment mechanism that controls task presentation in the driving platform. With respect to early diagnosis of ASD, sensory processing differences, including responses to auditory, visual, and tactile stimuli, are ideal targets. However, most existing studies focus on audiovisual paradigms and ignore the sense of touch. We present a Multisensory Stimulation and Data Capture System (MADCAP) that can deliver auditory, visual, and tactile stimuli in a controlled manner and capture peripheral physiological, eye gaze, and electroencephalographic response data. The design of MADCAP paves the way for future research into multisensory perception and processing in infancy, and MADCAP could be utilized to explore multisensory processing differences between infants who will and who will not develop ASD.
dc.format.mimetype | application/pdf
dc.subject | affective computing
dc.subject | human computer interaction
dc.subject | physiological computing
dc.subject | autism detection
dc.subject | autism intervention
dc.subject | autism spectrum disorder
dc.subject | machine learning
dc.title | Physiology-based Intelligent Systems for Individuals with Autism: Novel Platforms for Autism Intervention and Early Detection
dc.type | dissertation
dc.contributor.committeeMember | Zachary Warren
dc.contributor.committeeMember | Maithilee Kunda
dc.contributor.committeeMember | Gabor Karsai
dc.contributor.committeeMember | Amy Weitlauf
dc.contributor.committeeMember | D. Mitchell Wilkes
dc.type.material | text
thesis.degree.name | PHD
thesis.degree.level | dissertation
thesis.degree.discipline | Electrical Engineering
thesis.degree.grantor | Vanderbilt University
local.embargo.terms | 2020-02-26
local.embargo.lift | 2020-02-26
dc.contributor.committeeChair | Nilanjan Sarkar