Dorothea Floris “Linking structural and functional imaging modalities to characterize face processing in autism”
March 09, 2023

Information
- ID
- 9632
- To Cite
- DCA Citation Guide
Transcript
- 00:06Good afternoon, everyone.
- 00:07Welcome to the [inaudible].
- 00:11And so I'm also very excited to
- 00:13be back at this great meeting
- 00:15after the three-year break.
- 00:17And I'm also excited to present
- 00:18some of my work here today,
- 00:19which is on multimodal
- 00:21analysis in autism within the
- 00:23context of face processing.
- 00:28OK. So autism is a very common
- 00:31neurodevelopmental condition.
- 00:32So it occurs in roughly
- 00:34one in 44 people,
- 00:36and it is characterized by social
- 00:38communicative difficulties and
- 00:40restricted repetitive behaviors.
- 00:42And autism is also still exclusively
- 00:45diagnosed based on behavioral symptoms,
- 00:48which is pretty much the case for any
- 00:50psychiatric condition as we know.
- 00:52But because of this,
- 00:53our goal is to eventually develop
- 00:56biomarkers that can help us
- 00:58diagnose the condition more objectively
- 01:00and also to identify more targeted
- 01:03support services which help individuals.
- 01:07And for this reason,
- 01:08it's of course really important
- 01:10that we better understand the
- 01:13neurobiological underpinnings of the
- 01:15condition and its clinical features.
- 01:17And in this context,
- 01:19there has been a lot of neuroimaging
- 01:22studies conducted already to date
- 01:25and so many different neuroimaging
- 01:28modalities have been applied that try
- 01:31to characterize different biological
- 01:33aspects of the condition and its
- 01:37associated features individually.
- 01:39But then the question is,
- 01:41can we maybe obtain a more holistic
- 01:43understanding of the neurobiology
- 01:45of autism if we study different
- 01:48modalities in combination.
- 01:52This is what happens on
- 01:55transatlantic flights. OK.
- 01:58So luckily it is quite normal that we
- 02:03acquire multimodal neuroimaging data
- 02:05and like MRI data from the same subject
- 02:09when we conduct new imaging analysis.
- 02:12And the reason why we do so is because
- 02:15we know that every modality captures
- 02:18like different unique aspects of the
- 02:20neural organization of the brain.
- 02:22So for example,
- 02:23EEG has very high temporal resolution and
- 02:26most accurately reflects neuronal activity.
- 02:29And then MRI, you know,
- 02:31has very high spatial resolution
- 02:33and provides more detailed insights
- 02:35into the structural and functional
- 02:37organization of the brain.
- 02:39And then depending on our research question,
- 02:43we study these different aspects
- 02:45either individually or in combination.
- 02:48But what we usually do is,
- 02:49we conduct unimodal analyses.
- 02:52So we study each modality individually
- 02:55and search for like disease or
- 02:57like phenotype related changes
- 02:59in each modality separately.
- 03:05However, we know that any
- 03:08single imaging modality alone can
- 03:10only provide a limited view into the
- 03:13neural organization of the brain,
- 03:15and there should be certain benefits to
- 03:17combining different imaging modalities.
- 03:19And so the most obvious benefits
- 03:21are first of all,
- 03:22it should be biologically more valid,
- 03:24of course, because we know that
- 03:26neural changes occur across different
- 03:28biological systems at the same time.
- 03:31Theoretically,
- 03:31we would also expect a greater signal
- 03:36to noise ratio and like greater
- 03:39sensitivity to detect effects.
- 03:42And it should also give us
- 03:43probably like a more comprehensive
- 03:44view of the question we study.
- 03:54So this figure I borrowed from one of
- 03:56Vince Calhoun's papers and it shows you
- 03:58that there are different approaches
- 04:00to doing multimodal analysis and
- 04:01depending on which approach we adopt,
- 04:03we can increase the amount of the
- 04:06shared joint variance that we extract
- 04:08from the different modalities.
- 04:10So the most simple or like the simplest
- 04:13way would be to do unimodal analysis
- 04:15and then visually compare our unimodal
- 04:18results, or do unimodal analysis
- 04:20and then overlay our results.
- 04:22So these of course can be very useful,
- 04:24but they would not account for any
- 04:26interactions between the modalities.
- 04:28The next level would be asymmetric
- 04:31fusion and that's when one modality
- 04:33is used to restrict another one,
- 04:36like for example in fMRI-seeded
- 04:39EEG reconstruction or when
- 04:41we co-register images.
- 04:43And then finally the most powerful
- 04:45technique would be symmetric fusion,
- 04:48where we can extract most of
- 04:50the shared common variance.
- 04:51So this means that we would merge the data
- 04:55before the statistical interpretation
- 04:57stage and model the cross subject
- 05:00variability in a data-driven way.
- 05:05And so there are different methods
- 05:07of course to do multimodal analysis.
- 05:09So earlier Maria showed us how you can
- 05:12do structure-function coupling.
- 05:14And then what we usually do at
- 05:16Donders is so-called linked
- 05:19independent component analysis.
- 05:20So the toolbox for this was
- 05:22developed by Alberto Llera at Donders,
- 05:24together with Christian Beckmann.
- 05:26And as you can see,
- 05:28it's pretty much an extension
- 05:30of single modality ICA,
- 05:32which you know is like a
- 05:34multivariate data-driven method
- 05:36to decompose the MRI data into
- 05:39statistically independent components.
- 05:41And different spatial patterns
- 05:43and associated time courses.
- 05:44And so here the difference is
- 05:46it's a Bayesian tensor extension
- 05:49of single modality ICA,
- 05:51with the difference that we decompose
- 05:54the data simultaneously across the
- 05:56different modalities.
- 06:01Yeah. And as a result,
- 06:02we would then get independent components
- 06:04and these are then associated with
- 06:07spatial maps for each modality,
- 06:09but with one shared subject course.
- 06:11And this is the key feature
- 06:12that the subject course is
- 06:14shared across the modalities.
- 06:15And this you can then use to
- 06:17relate to different clinical
- 06:18features or behavioral measures.
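The key output, one subject course shared across modalities, can be illustrated with a toy simulation. This is not the actual Bayesian linked ICA model described in the talk; as a simplified stand-in it uses a joint SVD on concatenated, standardized modality matrices, and all data and dimensions here are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sub, v1, v2 = 100, 300, 200

# Simulate a ground-truth subject course driving two modalities
h = rng.normal(size=n_sub)
map1 = rng.normal(size=v1)  # spatial map for modality 1
map2 = rng.normal(size=v2)  # spatial map for modality 2
X1 = np.outer(h, map1) + 0.5 * rng.normal(size=(n_sub, v1))
X2 = np.outer(h, map2) + 0.5 * rng.normal(size=(n_sub, v2))

# Standardize each modality so neither dominates, then concatenate features
Z = np.hstack([(X - X.mean(0)) / X.std(0) for X in (X1, X2)])

# Joint SVD: the left singular vectors play the role of shared subject courses
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
shared_course = U[:, 0]

# The recovered course should track the simulated one (sign is arbitrary)
r = abs(np.corrcoef(shared_course, h)[0, 1])
print(round(r, 2))
```

The real linked ICA additionally weighs each modality with ARD priors and returns per-modality spatial maps alongside the shared subject course; this sketch only conveys the "one loading vector, several modalities" idea.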
- 06:23OK, so a few years ago Alberto already
- 06:26published a paper using this method and
- 06:28showed what a powerful tool it is.
- 06:30So here he applied linked independent
- 06:33component analysis to all structural
- 06:35imaging modalities in the HCP data set.
- 06:37And he found a set of different
- 06:40independent components, many of which
- 06:42related to the whole range of
- 06:44behavioral measures that we have in HCP.
- 06:46And here for example,
- 06:47you can see one very significant
- 06:49independent component.
- 06:51And so these maps are the spatial
- 06:53maps per modality, as you can see.
- 06:56So the interesting thing here was that
- 06:58all modalities that he fed into the
- 07:00model contributed to this component
- 07:02and it was also very significantly
- 07:04related to a whole range of different
- 07:06cognitive behavioral measures such
- 07:08as like cognitive functioning,
- 07:10mental health, things like that.
- 07:12So this is just to briefly show you
- 07:14that it works nicely across
- 07:16structural imaging modalities and
- 07:18it's a great tool to elucidate brain
- 07:21behavior relationships across modalities.
- 07:26But then, as I said,
- 07:27I'm interested in studying the
- 07:29neurobiology of autism and so I'm more
- 07:32interested to apply this method in
- 07:34a clinical context and characterize
- 07:36the neurophenotype of autism cross-modally.
- 07:39And luckily we have a very comprehensive
- 07:42and deeply phenotyped data set available
- 07:45within the EU-AIMS / AIMS-2-TRIALS consortium.
- 07:49So this is the largest consortium
- 07:52in Europe designed to discover
- 07:54biomarkers and drugs in autism.
- 07:56And this is a data set available with
- 07:59over 700 autistic and non autistic
- 08:01individuals and at the same time
- 08:03many different imaging modalities.
- 08:05So structural MRI DTI is you can
- 08:08see resting stata from our task
- 08:10fMRI we also have EG and then like
- 08:13a whole range of different clinical
- 08:15and cognitive measures available.
- 08:18So it's really one of the most
- 08:21multimodal datasets and largest
- 08:22ones available in autism.
- 08:24So it's actually ideally suited to do
- 08:27multimodal imaging analysis in autism.
- 08:32OK. So the first study I would like to
- 08:35touch on and here we asked the question,
- 08:37what does the multimodal neural
- 08:39signature of autism look like?
- 08:41And here we didn't have
- 08:43any specific hypothesis.
- 08:44We just looked at like the
- 08:47global cross modal pattern.
- 08:49And this study was led by one
- 08:50of our PhD students at Donders,
- 08:53Leonard Oblong.
- 08:55And so here we use this EU-AIMS data set
- 08:57that I just showed you and integrated
- 09:00imaging data across three different
- 09:02modalities, which were structural MRI,
- 09:04resting-state fMRI, and DTI.
- 09:07All analyses and features were
- 09:10also restricted to specific
- 09:12cortical and subcortical ROIs,
- 09:14which had previously been
- 09:16implicated in autism, such as,
- 09:18for example, the postcentral gyrus,
- 09:19amygdala, and fusiform gyrus.
- 09:22And then we did the unimodal
- 09:24feature extraction, which was,
- 09:25as you can see,
- 09:26VBM for the structural domain,
- 09:28then connectopic mapping for
- 09:30resting-state fMRI and probabilistic
- 09:33tractography for DTI, and then
- 09:35went on and applied linked ICA
- 09:37to merge these different modalities,
- 09:39and then eventually you would
- 09:40obtain the subject courses that
- 09:42are shared across the modalities
- 09:43and these can then be studied in
- 09:45association with behavior for example.
- 09:49OK. So on the bottom you can see the
- 09:53different independent components.
- 09:55These are 22 in total.
- 09:58Each color stands for one
- 10:00modality, and so depending on
- 10:01the component that we look at,
- 10:03you can see that they are
- 10:06more or less multimodal.
- 10:07Of these 22, the ones of interest were either
- 10:10related to some behavioral measures
- 10:13like adaptive daily living skills,
- 10:17autism-associated features,
- 10:18or they showed a significant
- 10:20group difference.
- 10:25And so the most interesting one among these
- 10:29independent components was this one here,
- 10:32because it showed a significant
- 10:33group difference between autistic
- 10:34and non autistic individuals.
- 10:36where the autistic group, as you can see here,
- 10:37had a lower contribution on this component.
- 10:41And next you can
- 10:43look at the spatial profile and the
- 10:46individual modality contributions,
- 10:47and as you can see, VBM contributed
- 10:49to a very small extent and this
- 10:52was mostly driven by resting-state
- 10:54fMRI within the fusiform gyrus.
- 10:58So, taken together in this study,
- 11:01the most interesting result was not in
- 11:04a particularly multimodal component,
- 11:07but it was still interesting to
- 11:09see this group difference within
- 11:12the fusiform gyrus and also some
- 11:15nominally significant associations.
- 11:17But yeah.
- 11:20The reason why we were excited to
- 11:23see this strong implication of the
- 11:26fusiform gyrus is because it has been
- 11:29atypically implicated in autism before.
- 11:32So there are many unimodal studies
- 11:34that show that autistic individuals,
- 11:36for example, show hyperactivation
- 11:37while doing face-matching tasks,
- 11:40or they would show a delayed
- 11:44N170 response to faces
- 11:47when doing EEG acquisition, or,
- 11:50structurally, they show, for example,
- 11:52atypical asymmetry or increased or
- 11:55decreased volume depending on the region.
- 11:58So it has atypically been implicated
- 12:02in autism.
- 12:03And another important thing is
- 12:05that it's a key region associated
- 12:07with face processing as we know.
- 12:10And so you can see here that it's kind
- 12:12of a mosaic with different
- 12:15functionally specialized regions.
- 12:17So there are these
- 12:19circumscribed patches
- 12:21that are responsible for processing places,
- 12:23shapes,
- 12:24words, and faces,
- 12:26but it's the key region that's
- 12:29active when you process faces.
- 12:31And what's also important here is that
- 12:34atypical face processing has been shown to
- 12:37be one of the most commonly cited social
- 12:40difficulties in autistic individuals,
- 12:42who orient much less to faces.
- 12:45But they also have difficulties understanding
- 12:48or like recognizing facial emotions.
- 12:51OK.
- 12:51So this is just a brief overview
- 12:54or context of the fusiform gyrus.
- 12:56So the next study makes more sense
- 12:58because based on Leonard's results.
- 13:00And based on these unimodal prior
- 13:03results or literature in autism
- 13:05and I decided to apply this linked
- 13:08ICA method to the fusiform gyrus
- 13:10specifically and asked what does
- 13:12the multimodal neural signature
- 13:14of face processing look like in
- 13:16the fusiform gyrus in autism?
- 13:20So here I also merged
- 13:23different imaging modalities,
- 13:24so also structural MRI and
- 13:26resting-state fMRI as Leonard did,
- 13:28but additionally included two
- 13:30functional modalities that were specifically
- 13:32associated with face processing,
- 13:34so EEG and task fMRI.
- 13:38So for EEG there was ERP data available,
- 13:43recorded while subjects looked
- 13:46at upright or inverted faces, and
- 13:48for task fMRI this was acquired when
- 13:52subjects performed the Hariri face-
- 13:55matching paradigm in the scanner.
- 13:57And I told you that this EU-AIMS sample
- 14:00is actually quite big, but then
- 14:02when we take the intersecting sample
- 14:04across all the different modalities
- 14:06that's available for all the
- 14:08individuals and also with good quality,
- 14:10this actually boils down to a
- 14:12sample of around 200 individuals.
- 14:16OK, so we can look at the
- 14:20different methodological steps.
- 14:21First of all, I restricted all
- 14:23analysis to the left and the right
- 14:25fusiform gyrus separately because
- 14:27we know face processing is a
- 14:29right-lateralized cognitive function and
- 14:31it actually makes sense to study the
- 14:35hemispheric contributions individually.
- 14:37Then for the structural domain,
- 14:39I extracted gray matter
- 14:42volumes based on VBM.
- 14:43And for task fMRI we used the
- 14:47contrast maps for the faces
- 14:50greater than shapes condition from
- 14:53this Hariri face-matching task.
- 14:56And for resting-state fMRI I computed
- 15:00connectivity gradients or connectopic maps.
- 15:04And this is actually the same approach that
- 15:07Michael introduced before in the hippocampus.
- 15:10So you correlate each voxel within
- 15:12the fusiform gyrus with the voxels
- 15:14and the rest of the cortex.
- 15:16Then you can compute the similarity
- 15:20matrix and derive Laplacian eigen maps.
- 15:24And this then reflects the connectivity
- 15:26gradient within the fusiform gyrus
- 15:28and it actually makes sense to use
- 15:31a spatial model like this because
- 15:33of this fine grained functional
- 15:35heterogeneity within the fusiform gyrus.
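The connectopic mapping steps just described (voxel-wise correlation fingerprints, a similarity matrix, Laplacian eigenmaps) can be sketched in numpy on simulated time series with a known one-dimensional gradient; the sizes and signals below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_t, n_roi, n_ctx = 200, 50, 400

# Simulate: ROI voxels mix two cortical signals along a hidden 1-D gradient
a, b = rng.normal(size=(2, n_t))
grad = np.linspace(0, 1, n_roi)
roi_ts = np.outer(a, 1 - grad) + np.outer(b, grad) + 0.1 * rng.normal(size=(n_t, n_roi))
mix = rng.normal(size=(2, n_ctx))
ctx_ts = np.vstack([a, b]).T @ mix + 0.1 * rng.normal(size=(n_t, n_ctx))

def zscore(X):
    return (X - X.mean(0)) / X.std(0)

# Connectivity fingerprints: ROI-voxel x cortex-voxel correlations
F = zscore(roi_ts).T @ zscore(ctx_ts) / n_t   # (n_roi, n_ctx)
S = np.corrcoef(F)                            # fingerprint similarity matrix
W = (S + 1) / 2                               # non-negative affinity
L = np.diag(W.sum(1)) - W                     # graph Laplacian
vals, vecs = np.linalg.eigh(L)                # ascending eigenvalues
gradient_map = vecs[:, 1]                     # first non-trivial eigenmap

# The eigenmap should recover the ordering of the simulated gradient
r = abs(np.corrcoef(gradient_map, grad)[0, 1])
print(round(r, 2))
```

Real connectopic mapping pipelines work on many thousands of voxels and typically use an eta-squared similarity and additional smoothing, but the eigenmap-of-a-similarity-graph core is the same.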
- 15:39OK, and then for EEG,
- 15:42we did source reconstruction
- 15:44within the fusiform gyrus,
- 15:46so we reconstructed the time
- 15:48series from different locations
- 15:49within the fusiform gyrus.
- 15:56Then as a next step,
- 15:57I applied normative modeling to
- 15:59each imaging modality separately.
- 16:01And so we have the true experts on
- 16:03normative modeling from Donders
- 16:05in the audience, Charlotte and Saige.
- 16:07And So what I did here was I computed
- 16:10the mapping between the brain
- 16:13features and age, sex and site.
- 16:15And with this we derived Z-values
- 16:18which quantify how much each modality
- 16:22differs or deviates from a normative
- 16:25pattern, at the voxel level or, for
- 16:28the time series, at the time-point level.
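A minimal sketch of this normative-modeling step, assuming a simple linear model of age and site (real normative models, e.g. the PCNtoolkit from the Donders group, use more sophisticated Bayesian regression); all numbers are simulated:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
age = rng.uniform(6, 30, n)
site = rng.integers(0, 2, n)              # two acquisition sites
# Toy brain feature: linear age effect plus a site offset plus noise
y = 0.8 * age + 2.0 * site + rng.normal(0, 1.5, n)

# Normative model: regress the feature on age and site in a reference group
X = np.column_stack([np.ones(n), age, site])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma = (y - X @ beta).std()

def z_score(age_new, site_new, y_new):
    """Deviation (Z) score: distance from the normative prediction."""
    x = np.array([1.0, age_new, site_new])
    return (y_new - x @ beta) / sigma

z_typical = z_score(20.0, 1, 0.8 * 20 + 2.0)        # on the normative line
z_deviant = z_score(20.0, 1, 0.8 * 20 + 2.0 + 6.0)  # far above it
print(round(float(z_typical), 1), round(float(z_deviant), 1))
```

The same idea is applied per voxel (or per time point for the EEG time series), so every subject gets a deviation map per modality, which is what goes into the fusion.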
- 16:32And then next these Z values or deviations
- 16:36were fed into the linked ICA model.
- 16:40So this is where you merge the different
- 16:42modalities using linked ICA and as I said,
- 16:45as a result we get how much each
- 16:47modality contributes and also how
- 16:48much each subject contributes.
- 16:50And it is this cross-subject variability
- 16:52that we can then also use to study
- 16:56in association with behavior or
- 16:58some clinical features of interest.
- 17:03OK, so here you can see the
- 17:06resulting independent components.
- 17:07These are the multimodal ones.
- 17:09So linked ICA would also
- 17:12theoretically give you unimodal ones.
- 17:13But here I'm interested in
- 17:15the multimodal components,
- 17:16which are the ones where one modality
- 17:19does not contribute more than 80%.
- 17:21As you can see,
- 17:22they're also divided into right and
- 17:24left hemispheres separately and sorted
- 17:26by how multimodal they actually are.
- 17:29We can say that the right and
- 17:31the left hemisphere had
- 17:33roughly equal contributions here.
- 17:35And overall, EEG had the largest contribution,
- 17:38whereas VBM had the smallest contributions.
- 17:41Next, we can also check what the
- 17:45functional meaning or implications
- 17:47of these multimodal components are.
- 17:50So, for example,
- 17:52we can check how these multimodal
- 17:55components relate together.
- 17:58So not individually,
- 17:59but like in the spirit of
- 18:01multimodal analysis,
- 18:02how they relate together to different
- 18:04cognitive constructs that are
- 18:06relevant in the context of social
- 18:09functioning and face processing.
- 18:10And so a way to do this is by
- 18:13canonical correlation analysis, which
- 18:16Basically identifies a multivariate
- 18:19association between brain related
- 18:22features and like a set of social
- 18:25communicative features that we choose.
- 18:28And as you can see here,
- 18:29this resulted in a significant
- 18:32association between
- 18:33the two sets of variables.
- 18:36You can also see that they
- 18:38load differently onto
- 18:41this result, so some independent
- 18:43components loaded more than others.
- 18:46The same applied to the social
- 18:48cognitive features.
- 18:49And what was also interesting was that
- 18:51when we reran this with other
- 18:54measures related to restricted
- 18:56repetitive behaviors and sensory
- 18:58processing, like non-social features,
- 19:01this was no longer significant.
- 19:02So it points to some specificity
- 19:05in the social communicative
- 19:07domain and face processing.
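Canonical correlation analysis itself can be sketched in a few lines of numpy: the canonical correlations are the singular values of the product of orthonormal bases of the two centered data blocks. The data here are simulated with one shared brain-behavior factor (the dimensions are invented, not the study's):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
latent = rng.normal(size=n)  # shared brain-behavior factor

# Component loadings (brain side) and social-cognitive scores (behavior side)
brain = np.outer(latent, rng.normal(size=5)) + rng.normal(size=(n, 5))
behav = np.outer(latent, rng.normal(size=4)) + rng.normal(size=(n, 4))

def cca_first_mode(X, Y):
    """First canonical correlation via orthonormal bases + SVD."""
    def basis(A):
        Ac = A - A.mean(0)
        U, s, Vt = np.linalg.svd(Ac, full_matrices=False)
        return U  # orthonormal basis of the centered column space
    K = basis(X).T @ basis(Y)
    return np.linalg.svd(K, compute_uv=False)[0]

r = cca_first_mode(brain, behav)
print(round(r, 2))
```

With a genuine shared factor the first canonical correlation comes out high; in practice one would also permutation-test it and inspect the loadings of each variable on the mode, as done in the talk.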
- 19:11OK. And then we can check whether any
- 19:13component significantly differed between
- 19:15autistic and non-autistic individuals.
- 19:17And there was one component that
- 19:19significantly differed where autistic
- 19:21individuals had significantly lower
- 19:23contributions than non-autistic individuals.
- 19:26This was a component that was driven
- 19:29by task fMRI, resting state fMRI,
- 19:31EEG and also to a small extent VBM.
- 19:34So actually all modalities that we
- 19:36fed into the model contributed, just
- 19:39to varying degrees.
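At its simplest, such a group comparison is a two-sample test on the shared subject loadings of a component. A toy numpy sketch (group sizes, effect size, and direction are invented for illustration, not the study's values):

```python
import math
import numpy as np

rng = np.random.default_rng(4)
# Simulated subject loadings on one linked-ICA component
aut = rng.normal(-0.3, 1.0, 120)  # autistic group, lower loadings
non = rng.normal(0.3, 1.0, 120)   # non-autistic group

# Welch two-sample t statistic
se = math.sqrt(aut.var(ddof=1) / aut.size + non.var(ddof=1) / non.size)
t = (aut.mean() - non.mean()) / se
# Two-sided p-value via the normal approximation (fine at this sample size)
p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(t) / math.sqrt(2.0))))
print(t < 0, p < 0.05)
```

In the actual analyses one would additionally correct for the number of components tested and report an effect size alongside the p-value.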
- 19:44Here also the right hemisphere had a larger
- 19:47contribution which is interesting in the
- 19:49context of the lateralized face
- 19:51processing effects that we would expect.
- 19:53And you can also see the different
- 19:56spatial maps for each modality and
- 19:58the different regions within the
- 20:01fusiform gyrus that load positively
- 20:03or negatively onto this component.
- 20:05And then, so here in this case where we
- 20:08see the significant group difference,
- 20:12we can then say that in the
- 20:14positive, yellowish regions,
- 20:15autistic individuals
- 20:18had lower loadings,
- 20:20whereas in the bluish regions
- 20:21higher loadings onto this component.
- 20:23And then depending on the modality we
- 20:26could interpret this accordingly that
- 20:28they had like larger or like smaller
- 20:30volume or higher or lower activation.
- 20:33Hmm.
- 20:35And then we can further characterize
- 20:38this component by looking at the
- 20:41different modality contributions.
- 20:43For example, if you look at the temporal
- 20:45profile of EEG and the different time
- 20:48points that most strongly load onto
- 20:51this component, we can see that this
- 20:53was the case at around 170, 280, and
- 20:58450 milliseconds.
- 20:59And this was quite interesting
- 21:01because we know that for example the
- 21:03face-sensitive N170 ERP component occurs,
- 21:06which is associated with face processing
- 21:09and expertise with social stimuli, and
- 21:14between 280 and 500 milliseconds, for
- 21:17example, we see the P300 and its
- 21:20subcomponents P3a and P3b, and these also
- 21:22have been associated with learning,
- 21:24novelty detection things like that.
- 21:27So yeah, I'm not a big expert, but it
- 21:29was interesting to see that this
- 21:32inter-subject variability and
- 21:33these group differences occurred
- 21:35at time points that
- 21:36are actually meaningful.
- 21:39And then for the remaining modalities,
- 21:43we can further look into
- 21:45the spatial profiling,
- 21:46characterize these, for example
- 21:47overlay them with an atlas
- 21:48like the Harvard-Oxford atlas.
- 21:50And then we see, OK,
- 21:51most of the group differences occurred
- 21:53mostly like in more posterior regions
- 21:56or like posterior occipital regions.
- 21:59You could also overlay it with a
- 22:01functional atlas.
- 22:02So there's this visual functional atlas
- 22:06which was created
- 22:07by running different functional
- 22:09localizer tasks; this
- 22:11atlas characterizes
- 22:14the functional heterogeneity within
- 22:16the fusiform gyrus and describes
- 22:18these different
- 22:19category-specific patches, and
- 22:21we can see that there is some overlap.
- 22:24So, for example, autistic individuals
- 22:27show decreased volume
- 22:29in the more retinotopic areas of
- 22:31the fusiform gyrus, also in the right
- 22:34fusiform face area,
- 22:36but they show increased values for
- 22:39the resting-state modality, for example.
- 22:42So this probably shows us that it's
- 22:45more the posterior regions,
- 22:47the ones that are involved more in
- 22:49early visual processing, that are involved,
- 22:51but also the fusiform face area,
- 22:53which is of course very
- 22:54interesting in this context.
- 22:58OK.
- 22:58So in summary,
- 23:00we can say that we successfully
- 23:02merged data across different
- 23:05neuroimaging modalities to characterize a
- 23:08multimodal neural phenotype of autism.
- 23:11Multimodal aspects related to the
- 23:13fusiform gyrus and face processing
- 23:16are related to different behavioral
- 23:18phenotypes and can explain variance
- 23:21in social functioning and face
- 23:24processing in autism.
- 23:25We can also say that multimodal
- 23:27approaches are useful in general
- 23:29because theoretically they should bring
- 23:32us closer to biological validity.
- 23:34As we know that these changes
- 23:37occur across different biological systems,
- 23:40it should also increase our signal
- 23:42to noise ratio,
- 23:43so we might be more sensitive to
- 23:45detect effects as we saw earlier
- 23:49in Josephina's talk.
- 23:50I think she also showed that the
- 23:53multimodal feature classification
- 23:54outperformed the unimodal one, if I
- 23:58remember correctly. So this is one example.
- 24:00But then here we could also compare the
- 24:03unimodal with the multimodal analysis to,
- 24:06as Randy said, to actually
- 24:08quantify the added value
- 24:12of doing these multimodal analyses.
- 24:15Because theoretically, you know,
- 24:17unimodal ones are easier, faster,
- 24:20and might be more economical.
- 24:23But still,
- 24:24usually we collect all these
- 24:25imaging modalities,
- 24:26so if we have the possibility,
- 24:28we should aim for doing this,
- 24:29especially in our big data.
- 24:32Error and eventually this should
- 24:34help us to better characterize
- 24:37the neurobiology of autism or
- 24:40any other neurodevelopmental
- 24:42condition. And yeah, so: go merge.
- 24:54And I would like to thank, of course,
- 24:55everyone who contributed to this work
- 24:57from the University of Zurich and Donders and
- 24:59the EU-AIMS consortium. And thank you, guys.
- 25:03Last question.
- 25:07So this is really cool.
- 25:09This is important.
- 25:10I wonder if this approach allows you to
- 25:14gain insights into, like, mechanistically
- 25:17the pathologies, or if you see it more
- 25:20as a kind of: these are combined to
- 25:25get better [predictions]? No?
- 25:29Yeah. So the question is
- 25:31what it's most useful for,
- 25:32more like for a mechanistic insights or
- 25:34more for increasing our predictive power.
- 25:36I'd say both. So definitely, you
- 25:40know, when you do unimodal analyses,
- 25:42they can be useful of course too,
- 25:43and you get more the salient
- 25:46features of the one imaging modality.
- 25:49But, you know, it could be completely
- 25:51different systems involved that,
- 25:53you know, you
- 25:55would not detect when you look at only one,
- 25:58and, more likely, they
- 26:01interact in combination when it
- 26:03comes to the different disorders.
- 26:04So yeah, if we then also combine it with,
- 26:06for example, some
- 26:08genetic analysis,
- 26:09you could do like gene expression decoding,
- 26:11you know in these regions that
- 26:13are cross modally implicated,
- 26:14we could get some more mechanistic
- 26:17insights and then as we saw earlier it
- 26:19can also increase our predictive power.
- 26:23So yeah.
- 26:27Very interesting analysis,
- 26:28very interesting results.
- 26:30Can you help me get the
- 26:31head around one thing?
- 26:34In the linked ICA, is a component that's only
- 26:39[driven by one modality] then still influenced
- 26:43by the other modalities?
- 26:46Yeah, that's a good question.
- 26:47So you could compare it
- 26:49with unimodal ICA to see if
- 26:50you get the same pattern.
- 26:52But that's the way that linked ICA works.
- 26:55It uses this
- 26:58Bayesian model order selection,
- 27:00so it has these ARD priors,
- 27:04automatic relevance determination.
- 27:06So it actually gives each modality a
- 27:09weighting,
- 27:10and then can eliminate modalities that are
- 27:13not informative for that one component.
- 27:16So it can also identify,
- 27:18you know, structured
- 27:22single-modality signals,
- 27:24and see if they are the same ones that
- 27:26would come up if you do single-modality
- 27:29ICA. We can check whether that would be the case.
- 27:37You separate out everything from the signal
- 27:39that also relates to the resting state,
- 27:43and this is like what's left over there?
- 27:45So this one would be a unimodal
- 27:47component and I mean then it's not
- 27:49driven by the other modalities.
- 27:51So it can then also be, for example,
- 27:53noise associated with this one component.
- 27:55You know you can check is it related
- 27:58to head motion or other confounds.
- 28:00So it can capture these single-modality
- 28:02structured signal or noise components
- 28:04that are unrelated to the other
- 28:06modalities. Yeah.
- 28:14For the study, you showed the different
- 28:16aspects of the fusiform gyrus
- 28:18[inaudible] data related to this one
- 28:21component that wasn't [inaudible].
- 28:24That's what I was trying to figure
- 28:27out with these last slides.
- 28:29It's complicated, yeah,
- 28:30but it's like the regions that are
- 28:32cross-modally implicated, and then you
- 28:34can characterize these further.
- 28:35You know, do they overlap more with
- 28:38the fusiform face area, or, you know,
- 28:40do we see more object or place
- 28:42related regions, which could
- 28:44indicate that autistic individuals
- 28:46have an atypical strategy when
- 28:48processing faces? Or is it more the
- 28:50left versus the right, you know,
- 28:51we would expect more right-hemisphere
- 28:54involvement than the opposite hemisphere.
- 28:56So you can make sense of these
- 28:58different patches,
- 28:59but individually by modality then.
- 29:05Or only through [inaudible]? Thank you. Yes, well,
- 29:09here it's these connectopic maps only.
- 29:13Yeah.
- 29:17OK.
- 29:20OK.