The eyes may or may not be the gateway to the soul, but they are giving Yale researchers Ami J. Klin, Ph.D., Warren R. Jones and Fred R. Volkmar, M.D., an unprecedented glimpse into the workings of the autistic mind. Their tools: lightweight eye-tracking devices and motion-capture systems like those Hollywood directors use to create computer-generated characters.
Autistic children and adults are often less threatened by technological devices than they are by people, and many are infatuated with television programs and films. In their eye-tracking work, Klin and Jones have benefited from both tendencies in naturalistic studies that mirror real-world behavior better than typical, highly controlled psychological experiments.
Eye tracking allows scientists to show a film or still image and see precisely where in the frame the subject is looking. The newest head-mounted trackers are relatively inexpensive, and subjects—even infants—can move their heads freely during experiments. Experimenters secure a lightweight rig, which looks a bit like a futuristic baseball cap, to a subject’s head with a comfortable leather headband. Two metal tubes, each equipped with an infrared lamp and a high-speed infrared camera, swoop down from the headband to a spot just under the eyes.
The infrared lamps invisibly illuminate the eyes for the cameras, which make recordings at the rate of 60 frames per second and stream the information into a computer. Sophisticated image-analysis software instantly finds the centers of the pupils in each of these thousands of images. All this happens so quickly that a cursor corresponding to a subject’s direction of gaze can be superimposed on the experimental image as the subject is looking at it.
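The core of that image-analysis step can be illustrated in a few lines. In an infrared image the pupil shows up as the darkest region of the frame, so a simple intensity threshold plus a centroid calculation recovers its center. This is only a toy sketch of the general idea, using a synthetic frame; the lab's actual software is far more sophisticated.

```python
import numpy as np

def pupil_center(frame, threshold=50):
    """Estimate the pupil center as the centroid of dark pixels.

    In an infrared eye image the pupil is typically the darkest region,
    so thresholding on intensity isolates it. (Toy illustration only.)
    """
    ys, xs = np.nonzero(frame < threshold)
    if len(xs) == 0:
        return None  # no dark region found -- e.g., during a blink
    return xs.mean(), ys.mean()

# Synthetic 120x160 "IR frame": bright background, dark pupil disk at (100, 40)
frame = np.full((120, 160), 200, dtype=np.uint8)
yy, xx = np.ogrid[:120, :160]
frame[(xx - 100) ** 2 + (yy - 40) ** 2 <= 10 ** 2] = 20  # dark disk, radius 10

cx, cy = pupil_center(frame)
print(round(cx), round(cy))  # prints: 100 40
```

At 60 frames per second per camera, a calculation this cheap easily keeps up in real time, which is what lets the gaze cursor appear on the stimulus while the subject is still watching it.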
This technique has revealed that the gaze of autistic subjects has a distinctive signature. Normal subjects who watched the classic Elizabeth Taylor film Who’s Afraid of Virginia Woolf? focused mostly on the actors’ eyes, and made appropriate shifts in gaze when characters made pointing gestures. Autistic subjects, however, often focused on irrelevant objects away from the center of the frame, and their eye movements were tentative and unpredictable in response to the actors’ gestures.
To create the Gollum character for the Lord of the Rings trilogy, director Peter Jackson relied on motion-capture technology, in which an actor’s body is fitted with lights and the resulting points of light are used to reconstruct the actor’s movements and embody them in a new, computer-generated “skin.”
Klin and Jones also use motion capture to create point-light displays of actors’ movements. The raw point-light displays look like constellations when seen as still images but they are recognizable as moving human forms, even to very young children, the instant they begin to move. “When my son saw a still version, he said ‘Stars!’ ” Klin recalls. “But when he saw it move, he said, ‘Ah! It’s a man of stars!’ ” This exquisite sensitivity to “biological movement” is so critical for survival that it can be demonstrated even in nonhuman primates.
Using eye trackers to monitor children’s shifts in gaze while they watched moving point-light displays that were either correctly oriented or upside-down, Klin and Jones discovered that normal children have a decided preference for right-side-up displays. Autistic children seem to detect no difference between the two, which Klin and Jones interpret as a deficit in perceiving biological movement in autism.
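A preference like this is typically quantified as the fraction of looking time each display attracts. The sketch below shows the arithmetic with made-up numbers: gaze samples are assigned to one side of the screen or the other, samples lost to blinks or track loss are discarded, and the remainder are tallied. The layout and data here are hypothetical, not taken from the Yale studies.

```python
def looking_preference(gaze_x, boundary=0.5):
    """Fraction of valid gaze samples left vs. right of a screen midline.

    gaze_x: normalized horizontal gaze positions in [0, 1]; None = track loss.
    Returns (fraction_left, fraction_right) computed over valid samples.
    """
    valid = [x for x in gaze_x if x is not None]
    left = sum(1 for x in valid if x < boundary)
    return left / len(valid), (len(valid) - left) / len(valid)

# One second of 60 Hz samples: mostly on the left display (say, the
# right-side-up point-light figure), with a few samples lost to blinks.
samples = [0.2] * 45 + [0.8] * 10 + [None] * 5
left_frac, right_frac = looking_preference(samples)
print(left_frac, right_frac)  # 45/55 vs. 10/55 of the valid samples
```

A strong asymmetry in these fractions indicates a looking preference; roughly equal fractions, as Klin and Jones report for their autistic subjects, suggest the two displays are not being distinguished.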
Klin and Volkmar have just received a grant from the Doris Duke Charitable Foundation for a remarkable new collaboration with Brian Scassellati, Ph.D., assistant professor of computer science. Scassellati builds robots with human-like facial expressions to study children’s social development, and he and Klin plan to examine whether the robots might be a less threatening way for autistic patients to develop social skills.
Klin says that in order for this newest work to be successful, children would have to perceive the robots in social terms, but he says he has few worries on that front after watching his three-year-old daughter’s reaction to a prototype that seemed to ignore her: “It took her only a few seconds to start fighting with this robot, because it was snoring!”