A paper describing a novel approach to classifying individuals with autism spectrum disorder (ASD), one that also produces robust representations of brain activity to help interpret which brain regions are most related to autism, was named Best Paper at the 10th International Workshop on Machine Learning in Medical Imaging (MLMI) in Shenzhen, China. The workshop was a satellite event held in conjunction with the 22nd International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI 2019).
The paper, which was chosen from a group of 158 submissions, is from the Image Processing and Analysis Group. The lead author is Nicha Dvornek, assistant professor in the Department of Radiology & Biomedical Imaging at Yale School of Medicine. The paper was presented by Xiaoxiao Li, a graduate student in Biomedical Engineering whose advisors are Dvornek and Jim Duncan, professor of Radiology & Biomedical Imaging and of Biomedical Engineering.
The paper, “Jointly Discriminative and Generative Recurrent Neural Networks for Learning from fMRI,” describes how Dvornek used multi-task learning to design a novel model based on a recurrent neural network (RNN) that learns to discriminate between data classes. Because gathering large functional magnetic resonance imaging (fMRI) datasets for learning can be difficult, Dvornek and her team designed the model to simultaneously learn to generate fMRI data.
“Nicha added the idea to not only classify autism vs. healthy controls, but also to predict the data at the next time point, to create a more robust and interpretable model,” Duncan said.
This was the first time recurrent neural networks had been used to create predictive models from fMRI time-series data. “Generative RNN models have been used extensively in language processing (generating text, for example), but their application to the medical imaging field has been limited,” Dvornek said.
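To illustrate the idea, the sketch below shows one way such a joint discriminative-generative RNN could be wired up in PyTorch. It is a minimal illustration of the concept described above, not the authors' released code; the module names, layer sizes, and loss weighting are assumptions made for the example.

# Minimal sketch (assumed names and sizes): an RNN over ROI-averaged fMRI time
# series with two heads, one that classifies the subject and one that predicts
# the signal at the next time point, trained with a multi-task loss.
import torch
import torch.nn as nn

class JointRNN(nn.Module):
    def __init__(self, n_rois=116, hidden=64, n_classes=2):
        super().__init__()
        self.rnn = nn.LSTM(input_size=n_rois, hidden_size=hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, n_classes)   # discriminative head
        self.generator = nn.Linear(hidden, n_rois)       # generative head: next time point

    def forward(self, x):
        # x: (batch, time, n_rois) fMRI time series
        h, _ = self.rnn(x)                       # hidden states at every time step
        logits = self.classifier(h.mean(dim=1))  # pool over time for the class label
        x_next = self.generator(h[:, :-1])       # predict frame t+1 from hidden state at t
        return logits, x_next

def joint_loss(logits, labels, x_next, x, lam=0.5):
    # Multi-task objective: classification loss plus next-frame prediction loss.
    ce = nn.functional.cross_entropy(logits, labels)
    mse = nn.functional.mse_loss(x_next, x[:, 1:])   # compare against the true later frames
    return ce + lam * mse

# Toy usage on random data.
model = JointRNN()
x = torch.randn(8, 100, 116)            # 8 subjects, 100 time points, 116 ROIs
labels = torch.randint(0, 2, (8,))      # ASD vs. control labels
logits, x_next = model(x)
loss = joint_loss(logits, labels, x_next, x)
loss.backward()

The generative head acts as an auxiliary task: by forcing the network to predict the upcoming time point, the shared hidden states must capture the temporal structure of the fMRI signal, which is one way a model of this kind can remain robust when labeled datasets are small.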
The team applied its approach to the classification of subjects with autism vs. healthy controls using several datasets from the Autism Brain Imaging Data Exchange. Experiments on these datasets showed that the new model improves the ability to classify subjects and identifies three influential functional “communities,” or brain subnetworks, that are predictive of ASD.
“Autism spectrum disorder is characterized by impaired social skills and communication; thus we expect to find communities related to associated neurological functions,” Dvornek said.
The top extracted community includes the temporal lobe and ventromedial prefrontal cortex, which are associated with social and language processes. The second community includes the ventromedial prefrontal cortex, hippocampus, and amygdala, which are associated with memory. The third community contains the ventromedial prefrontal cortex and ventral striatum, which are involved in reward processing and decision making. “Dysfunction of all these brain regions and processes in ASD has previously been shown,” Dvornek said.
Dvornek’s work, which is supported by a National Institutes of Health grant from the National Institute of Neurological Disorders and Stroke (NINDS), has the potential to be translated to patient care.
“If we can get a sense of what part of the brain is most important, we can look at quantitative information in these datasets and use them at baseline to know who is going to respond to therapy,” Duncan said. “We’ve seen some evidence that the work Nicha has done can predict the change in social responsiveness scores after behavioral therapy.”
Dvornek’s work has exceptional clinical relevance, said Pamela Ventola, PhD, an associate professor at Yale’s Child Study Center. “With greater understanding of the neural underpinnings of ASD, we can develop more targeted therapies, both behavioral and pharmacological,” Ventola said.
“As for her work in predicting response to treatment, this has the most substantial direct impact, as we are working towards using these neuroimaging results to help match patients with the treatment approaches that will be most efficacious for them and as early efficacy indicators -- evidence that the treatment may be working before we are able to measure it with direct behavioral observations,” Ventola added.
Transfer Learning Challenge
A team from the Image Processing and Analysis Group also won the Best Challenger Award at the Transfer Learning Challenge, held during the Connectomics in Neuroimaging Workshop, another satellite event of MICCAI 2019.
The team, led by Dvornek, included Juntang Zhuang, a graduate student in Biomedical Engineering advised by Duncan. The goal of the challenge was to classify individuals with attention deficit hyperactivity disorder (ADHD) versus neurotypical controls using resting-state fMRI data. The team’s approach, based on recurrent neural network models, achieved the highest classification performance in the live challenge.
Yale competed against teams from Russia (the Institute of Cognitive Neuroscience in Moscow) and England (the University of Sheffield and a combined team from King’s College London and University College London).