Adam Naples’ grant will enhance a series of interactive experiments using eye-tracking and EEG to study social brain function in autism. In these experiments, participants interact with onscreen faces that respond to their gaze with reciprocal eye contact or emotion. The Autism Science Foundation award will support implementation of hardware to monitor a child’s facial expression, gaze, speech, and posture in real time during recording of neural activity. This technology will enable simulation of interpersonal interactions based on a child’s verbal and nonverbal behavior. For example, an onscreen face can reciprocate emotions, responding not only to gaze but also to a smile or a frown. This study, conducted in collaboration with Jamie McPartland, Ph.D., and the Developmental Electrophysiology Laboratory, will investigate the brain mechanisms of multimodal reciprocal social interaction for the first time.
Submitted by Emily Hau on November 15, 2013