Rick Betzel “Edge-centric connectomics”
March 08, 2023
- ID: 9611
Transcript
- 00:07 So, really excited to be here today and to talk about some work from my lab that's kind of unfolded over the last two or three years. I'm going to try to tell you a complicated story, maybe not that complicated, but one that's trying to address a very specific question. At the very end I'll try to present a resolution to that question, but I think the more interesting part is what happens in between, the journey to the resolution.
- 00:32 So I'm going to do that right now. The talk is titled "Edge-centric connectomics."
- 00:37 So this is kind of a silly thing to say in this room, because everybody here knows this, but brain networks are really composed of two things, nodes and edges. Nodes of course represent populations of neurons, sensors, or areas, and the edges represent the functional or anatomical connections between pairs of those nodes.
- 00:56 And I'm going to argue that network neuroscience, maybe neuroscience in general, has really been interested in properties of the nodes, whether it's the number of connections they make, the community to which a node is assigned, or the centrality of a node with respect to a process. Most of our favorite measures are really related to the nodes themselves.
- 01:21 That parallels what has been going on in neuroscience for the last century, going back at least to Brodmann, chopping up the brain on the basis of cytoarchitectonic properties, but also including more recent imaging-based approaches for characterizing the properties of areas and territories.
- 01:39 I think that's great, but I think it means we're possibly leaving something on the table, and we might benefit from a shift in perspective, for instance one that prioritizes features of the network other than the nodes themselves. And there's precedent for how to do this, maybe not so much in the neuroscience or human imaging literature, but if you go back to the network science papers, this is probably the most famous one.
- 02:06 There are examples of how to shift that perspective, to go from nodes to edges, and essentially what this paper presents is a really, really simple strategy for taking our familiar node-node networks, areas connected by structural or functional connections, and flipping them so that the new nodes are the edges in the original network.
- 02:27 [Brief interruption while a dead microphone battery is replaced.]
- 03:27 So, as I was saying, this approach essentially transforms node-node networks, the ones we're used to dealing with, into a kind of higher-order network in which the new nodes are, in fact, the edges of the original network.
- 03:40 I won't go into the details of how to do this; I'll present just a high-level schematic of how it works. In this particular approach, the strategy is: let's grab two edges, E1 and E2. They must have a shared stub, so one of the endpoints must be common to both. This leaves two unpaired, or unmatched, endpoints. Calculate the overlap in their connectivity profiles, and that gives you one number. In this case the union contains three edges and they overlap by one, so you get an overlap measure of 1/3.
- 04:13 Do this for all pairs of edges and now you get a new connectivity matrix. The rows and columns correspond to edges in the original network, and the entries correspond to this weighted overlap of connectivity profiles. So very simple.
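To make that construction concrete, here is a minimal sketch in Python/NumPy (my own illustration, not code from the talk) of an edge graph built from a binary adjacency matrix, scoring each stub-sharing pair of edges by the Jaccard overlap of the neighborhoods of their unmatched endpoints; the exact neighborhood convention is an assumption on my part.

```python
import numpy as np
from itertools import combinations

def edge_graph(A):
    """Edge-by-edge similarity from a binary adjacency matrix A (sketch).

    Two edges are compared only if they share a node (a stub); their score is
    the Jaccard overlap of the neighborhoods of the two unmatched endpoints.
    """
    i_idx, j_idx = np.triu_indices_from(A, k=1)
    edges = [(i, j) for i, j in zip(i_idx, j_idx) if A[i, j]]
    neigh = [set(np.flatnonzero(A[i])) for i in range(A.shape[0])]
    W = np.zeros((len(edges), len(edges)))
    for a, b in combinations(range(len(edges)), 2):
        e1, e2 = set(edges[a]), set(edges[b])
        shared = e1 & e2
        if not shared:                                     # no common stub: skip
            continue
        u, v = (e1 - shared).pop(), (e2 - shared).pop()    # unmatched endpoints
        W[a, b] = W[b, a] = len(neigh[u] & neigh[v]) / len(neigh[u] | neigh[v])
    return edges, W
```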
- 04:26 There are some advantages to this approach. For instance, if you cluster this network, you're clustering edges, not areas or parcels, and that means it's the edges that inherit community or module assignments. From the perspective of any node, it inherits the communities of all of its edges, so the assignments can be overlapping, which is kind of cool. There's at least one other approach to doing this, but in the end it generates something awfully similar.
- 04:54 So why don't we just take these off the shelf and start applying them to brain data? There are some challenges. First, these two approaches really only work for sparse networks, meaning most of the connections are absent. They also don't do a good job of dealing with signed connections, and we like correlation matrices, which have negative correlations, so that's kind of untested territory. And they deal with graphs themselves: they transform static networks into edge-edge networks, and so you lose all temporal information. So maybe that's not ideal.
- 05:36 So back in the before times, around 2019 or so, we sat down and started thinking about what an edge-centric approach for neuroimaging and brain data might look like. There were a bunch of people involved in this project, and we decided to start with something we were already eminently familiar with, and that is functional connectivity defined as a correlation. Again, I probably don't need to say this, but I'll spell it out anyway: the idea is to record from two parts of the brain and calculate the shared variance. The correlation coefficient becomes a weight in the matrix. There's our functional connectivity.
- 06:24 So I actually want to try to unpack that a little bit. This is really simple stats unpacking, but I'm going to do it anyway. What do we actually do when we calculate a correlation? We take those time courses from two parts of the brain. We start by z-scoring them, zero mean, unit variance, and then we calculate a bunch of products: at each instant in time we calculate the product of the two time series. This gives us one number, an instantaneous co-fluctuation. It's signed, and its amplitude tells us whether those two, the blue and the red time series, are moving in the same direction with respect to their means, and by how much.
- 07:02 We repeat this for all frames, and it gives us a co-fluctuation value at every instant in time. The average of that is the correlation. That's our functional connectivity.
- 07:16 What if we did something a little devious? What if we just omitted that last step, the averaging step? Well, goodbye correlation, goodbye functional connectivity, but arguably we preserve something that's quite useful, and that's this co-fluctuation time series; we actually preserve the temporal information.
- 07:35 And this co-fluctuation time series has some interesting properties. It tells us about the amplitudes, how big the co-fluctuations are. It tells us about the sign, whether those two time series are going up and down together. And it tells us exactly when in time those co-fluctuations are happening. So, for instance, there's a big co-fluctuation here. Why is it big, and why is it positive? The red and the blue time series both have big z-score values at that time. Here's a negative co-fluctuation. Why is it negative? One is going up, the other is going down.
- 08:13 You get one of these for every pair of brain regions, and every time you calculate functional connectivity as a correlation, be it full or lagged or partial, you're implicitly doing exactly this step. You might be averaging at the end, but you are calculating the co-fluctuation time series.
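Just to make that concrete, here is a minimal sketch in Python/NumPy (my own illustration, not the lab's code) of the co-fluctuation, or edge, time series for two regional signals, with a check that its temporal average is exactly the Pearson correlation:

```python
import numpy as np

def cofluctuation(x, y):
    """Edge (co-fluctuation) time series for two regional time courses.

    z-score each series, then take the element-wise product at every frame.
    Averaging over frames recovers the Pearson correlation exactly, so
    skipping the average keeps the temporal information the correlation
    throws away.
    """
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    return zx * zy

# toy check: the mean of the co-fluctuation series equals the Pearson r
rng = np.random.default_rng(0)
x, y = rng.standard_normal(500), rng.standard_normal(500)
print(np.isclose(cofluctuation(x, y).mean(), np.corrcoef(x, y)[0, 1]))  # True
```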
- 08:30 And the claim I'm going to unpack over the next few slides is that these co-fluctuation time series have some interesting and potentially useful properties.
- 08:42 So what do we get when we calculate this? Well, rather than doing it for a single pair of nodes, we do it for all pairs of nodes, for all edges. So the time series I showed you on the previous slide is like one slice of this matrix; the rows here are edges and the columns are time. But if we slice this vertically, what do we get? Well, now we're aggregating co-fluctuations across all edges, all pairs of brain regions, and we're getting time-resolved co-fluctuations, time-resolved networks. There's no sliding window, there's no kernel, no convolution. We just get instantaneous co-fluctuations; we get those networks for free.
- 09:23 There are also some other interesting properties. Remember, the average of the co-fluctuations is a correlation coefficient. Collapse across time and I get a vector of correlation coefficients; rearrange it into the upper triangle of a square matrix, and there's that FC again. So this is an exact decomposition of functional connectivity into time-resolved co-fluctuations. We're getting networks for free, and it's giving us these frame-wise estimates of those numbers.
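A compact sketch of the same idea for all node pairs at once (again my own illustration in Python/NumPy, not the lab's code): the full edge time series, plus a helper that reshapes any single frame back into a node-by-node network. Averaging the edge time series over frames and reshaping gives exactly the static FC matrix.

```python
import numpy as np

def edge_time_series(X):
    """X: time-by-node matrix of regional time courses.

    Returns the time-by-edge co-fluctuation matrix (one column per node pair)
    and the upper-triangle index pairs. Its temporal mean, reshaped into a
    square matrix, is exactly the usual functional connectivity matrix.
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    iu, ju = np.triu_indices(X.shape[1], k=1)
    return Z[:, iu] * Z[:, ju], (iu, ju)

def frame_to_network(frame, n_nodes, pairs):
    """Reshape one time point of the edge time series into an n-by-n network."""
    W = np.zeros((n_nodes, n_nodes))
    W[pairs] = frame
    return W + W.T
```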
- 09:52 So we started looking at these co-fluctuation time series, and we noticed what we think are some kind of interesting properties. They all seem to have these periods of quietude punctuated by big bursts; you can see some here. Now, remember, the co-fluctuation time series are products of activity, and the average of them is functional connectivity. That means those high-amplitude frames kind of tip the scale; they tug that average a little bit, they contribute more to that average pattern.
- 10:22 And so we asked: do those bumps, those high-amplitude co-fluctuations, tend to occur synchronously across edges, as something like brain-wide events, or do they occur asynchronously, in a kind of uncorrelated way?
- 10:40 Here's that same whole-brain matrix. If you're looking at it, maybe you already know the answer. There are these kind of vertical bands, these striations. Those are frames, instants in time, when lots of edges simultaneously have big co-fluctuations, positive or negative. You can see this when you look at the global amplitude; I'm highlighting a couple of the putative events. The distribution of those amplitudes is heavy-tailed.
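One simple way to quantify that global amplitude is sketched here in Python/NumPy as an illustration; root-sum-square across edges is a common choice, though it may not be the exact statistic used in the talk.

```python
import numpy as np

def global_amplitude(ets):
    """Per-frame co-fluctuation amplitude: root-sum-square across all edges.

    ets is the time-by-edge matrix from edge_time_series() above. Frames with
    unusually large values are the putative 'events'; across frames the
    distribution of this amplitude is typically heavy-tailed.
    """
    return np.sqrt((ets ** 2).sum(axis=1))
```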
- 11:13 And just to really drive this point home: here is the network structure during some of those high-amplitude time points versus the low-amplitude ones. Strikingly different.
- 11:25 And this suggested to us that the network dynamics, the way the network evolves over time, is kind of bursty. It goes between these periods of high amplitude and relatively low amplitude. I want to be clear, we're not the first people to think about this; other people have written a lot about it, and I got taken to task on Twitter for exactly this point, and fair enough, it's true. But what I'm saying here is that there are some advantages to our approach: it's an exact decomposition, it's parameter-free, it's whole-brain. There are some reasons why we might like it.
- 12:00 So one of the implications of these potentially bursty co-fluctuations becomes clearer when we look at the tails of the amplitude distribution. Essentially, what we're doing is taking the frames and filtering them: we retain just the highest-amplitude and the lowest-amplitude frames. The top is on the left and the bottom is on the right. If you just look at it, and I'm asking you to just eyeball it, the left-hand side looks like functional connectivity; the right-hand side is kind of pale, the co-fluctuations are very weak.
- 12:39 And in fact, if you calculate the correlation of each side of this figure with the time-averaged, static functional connectivity, the high-amplitude frames are much more strongly related than the bottom ones. They also have higher modularity, so there's stronger modular, system-level organization in the high-amplitude frames than in the low-amplitude ones.
- 13:01 This used just a small percentage of the frames, and you can do it with really any percentile you like. And it suggests to us that functional connectivity, and brain systems, can reasonably be explained on the basis of these relatively rare, network-wide co-fluctuations.
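As an illustration of that filtering step, here's a small Python/NumPy sketch (mine, not the lab's code, with the retained percentage as a free parameter) that estimates FC from only the highest- or lowest-amplitude frames so the two can be compared against the full-time-average FC:

```python
import numpy as np

def fc_from_extreme_frames(ets, frac=0.05, top=True):
    """Average the edge time series over only the top (or bottom) `frac` of
    frames, ranked by global co-fluctuation amplitude.

    Comparing the result with ets.mean(axis=0) (the static FC) shows how much
    of the time-averaged pattern the rare, high-amplitude frames carry.
    """
    amp = np.sqrt((ets ** 2).sum(axis=1))
    k = max(1, int(frac * len(amp)))
    order = np.argsort(amp)
    keep = order[-k:] if top else order[:k]
    return ets[keep].mean(axis=0)

# e.g. r_top = np.corrcoef(fc_from_extreme_frames(ets), ets.mean(axis=0))[0, 1]
```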
- 13:16 So we started asking, well, what other properties do these events have? This is kind of a hodgepodge of results, and you'll get a mile-high view of it. We looked at what's happening in terms of brain activity during the events. It's really dominated by a single mode of activity. Depending on where you're coming from, this might be the extrinsic-versus-intrinsic division, it might be the first principal gradient, it might be the default mode, but it's a very recognizable pattern of activity.
- 13:47 The events themselves are synchronized during movie watching, so they're not entirely intrinsically driven; they are partially dependent upon the stimuli. If you use the events, versus the low-amplitude frames, to reconstruct networks, they lead to stronger brain-behavior correlations and enhanced brain fingerprints.
- 14:08 The fingerprinting is with Midnight Scan Club data. We ask: can we identify individuals across scans? The expectation is that within individuals we'd see a stronger correspondence, and between individuals the similarity would be attenuated, and we see this. The high-amplitude frames do better than the bottom ones; these are subject-by-subject similarity matrices.
- 14:32this enhances brain behavior correlations.
- 14:35This is CP data.
- 14:3810 behavioral factors and that colored
- 14:42points here represent the excuse me.
- 14:52Sometimes it could take a breath.
- 14:57Yeah, the colored points represent the
- 15:00correlations obtained from the high
- 15:02amplitude and the grave and below amplitude.
- 15:04We follow this up with a few other studies.
- 15:07Events themselves are not monolithic
- 15:09and of correspondence distinct states.
- 15:11The propensity for event to occur within a
- 15:14within a scan session is is that correlated
- 15:18with endogenous fluctuations and hormones.
- 15:20We've been a biophysical models
- 15:22that partially explain where the
- 15:24events are coming from.
- 15:26This model take structural connectivity
- 15:29couples brain regions together the treat each
- 15:32as oscillator stimulates the new data and we.
- 15:35Events in these data,
- 15:36when we destroy the structures and
- 15:39specifically the modules events go away.
- 15:41It was telling us that the underlying
- 15:44modular structure of SC might play a
- 15:46role in the emergence of the events.
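For flavor, here is a very rough sketch of a coupled-oscillator simulation on a structural connectivity matrix, written in Python/NumPy as a generic stand-in for the biophysical model mentioned here; the actual model in that study is likely different in its details.

```python
import numpy as np

def simulate_coupled_oscillators(SC, T=2000, dt=0.05, k=1.0, noise=0.1, seed=0):
    """Kuramoto-style phase oscillators coupled through SC (illustrative only).

    Returns a time-by-node matrix of simulated signals (sine of the phases).
    Rewiring SC to destroy its modules and re-simulating is the kind of
    comparison described in the talk.
    """
    rng = np.random.default_rng(seed)
    n = SC.shape[0]
    omega = rng.normal(1.0, 0.1, n)              # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, n)
    out = np.zeros((T, n))
    for t in range(T):
        coupling = (SC * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        theta = theta + dt * (omega + k * coupling) \
            + noise * np.sqrt(dt) * rng.standard_normal(n)
        out[t] = np.sin(theta)
    return out
```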
- 15:49 And here's something we're still kind of puzzled by. Going back to some of the movie-watching data, we found that the timing of events is also very stereotyped. These red lines correspond to the endings of movie clips, and that's when everybody has an event; there's a big burst at the end of the movies. We're trying to figure out what's going on there.
- 16:12 So I'm going to end with another couple of slides, but I want to make a couple of points and offer some caveats. I've really focused on the high-amplitude frames, these big events, but they're really only part of the story. For instance, the identifiability, the fingerprinting: it actually doesn't tend to peak at the highest-amplitude bin but at the next bin down. Other people have shown this as well. I think this has a lot to do with how the events themselves are related to one another. The highest-amplitude frames are very stereotyped events; their co-fluctuation patterns are very similar to one another.
- 16:52 So when you look at that highest bin, you're essentially getting lots of the same pattern. You go down a bin and you get a mixture, something that looks a little bit more like functional connectivity.
- 17:01 And I don't think that global events are the full story either. So far we've been looking at the whole-brain co-fluctuation signal, but you can calculate co-fluctuation time series for each system, and they are differentially coupled to that whole-brain signal. Those system-level co-fluctuations also have their own, potentially interesting, correlates too.
- 17:24 And then there's maybe an elephant in the room, in some ways. Since we published our first paper, there have been at least a handful of papers claiming that some of the apparent temporal structure we're seeing may not be meaningful. What they mean is that if you take a static correlation matrix, which has no temporal structure, and use it to generate simulated data, the simulated data inherit some of the properties that we see in the real data.
- 17:53 I think these are really good studies, and they present a possible mechanism for where these events come from. To me it's not a particularly satisfying explanation, and we can talk about this offline, maybe, but it's interesting nonetheless. And I want to present some new data that suggests this might not be the case.
- 18:14 This is not human imaging data; this is light-sheet calcium imaging data from zebrafish, single-cell data, and what we do with it is something like this. In addition to the fluorescence traces, the fish are effectively moving their eyes and swimming around during the recordings, and we can ask this question: what if we take activity, or rather whatever is driving the static functional connectivity, and our time-varying correlations, the edge time series, put those together into a model, with modes derived from activity and connectivity, and use that to try to predict spontaneous behavior?
- 18:56 If the model really only uses the activity to explain behavior, then maybe it is true that the time-varying fluctuations, the edge time series, aren't playing a big role here, that they're really not temporally important or locked to anything of significance.
- 19:17 We did this. Here we're trying to explain fictive turns, and we find that we can explain about 25% of the variance when we include both time-varying connectivity and activity in the same model. So we're doing OK.
- 19:33 Perfect. Now let me shuffle the time series. What we're left with is a kind of floor: even if there's no temporal structure related to the behavior, we still get some correlation, about 10% of the variance. But now, what happens if we shuffle activity? Then we're basically predicting with connectivity alone. And we do the same thing with connectivity: shuffle that and leave activity intact.
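Here's a small sketch of what that shuffle control could look like in Python with scikit-learn; it illustrates the logic rather than the actual model or validation scheme used in the study, and the feature names are hypothetical.

```python
import numpy as np
from sklearn.linear_model import RidgeCV

def shuffled_fit(activity, connectivity, behavior, shuffle=None, seed=0):
    """Fit behavior from activity + edge-time-series features, optionally
    destroying the temporal structure of one block by circularly shifting it.

    Returns the in-sample R^2 (kept simple for brevity; a real analysis would
    cross-validate). `shuffle` can be None, 'activity', or 'connectivity'.
    """
    rng = np.random.default_rng(seed)
    blocks = {'activity': activity, 'connectivity': connectivity}
    if shuffle is not None:
        X = blocks[shuffle]
        blocks[shuffle] = np.roll(X, rng.integers(1, len(X)), axis=0)
    X = np.hstack([blocks['activity'], blocks['connectivity']])
    model = RidgeCV().fit(X, behavior)
    return model.score(X, behavior)
```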
- 19:58 So when we shuffle activity, when we destroy the activity structure, we don't hurt our correlation very much. On the other hand, when we destroy the connectivity, we get a bigger decrement. That suggests that, at least for this particular measure of spontaneous behavior, turning, the edge time series contribute something, something not immediately accounted for by activity.
- 20:23 But this is only part of the story. There's another measure, eye movement, where it's the other way around: if you destroy activity, the correlation goes way, way down, and your model basically falls apart, whereas if you destroy the temporal structure in the time-varying connectivity, you don't hurt it much. So it's really context dependent: some measures are better predicted by activity, some by time-varying connectivity.
- 20:48 And we also used the same approach to pit different time-varying connectivity measures against one another, and, lo and behold, some of the better-performing measures are these co-fluctuation time series, these edge time series; they outperform sliding windows by a pretty good margin.
- 21:07 OK, now, as I promised, let me circle back to the question we started with: we wanted to develop an edge-centric approach for human imaging, basically some way of transforming node data into these edge-edge graphs, and it turns out we can do it. Basically, once we have our edge time series, we calculate their correlation structure, or some other measure of similarity, and that gives us an edge-by-edge matrix, fully weighted and signed, which we can cluster.
- 21:35 When we do that, we get our edge-level clusters, which are fundamentally overlapping, and we can calculate things like the degree to which a particular region has multiple affiliations or not. So we can do all the things we set out to do, although the end punch line is maybe a little weaker than the story along the way.
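The core of that transformation is short; here's a minimal sketch in Python/NumPy (mine, as an illustration of the idea rather than the exact pipeline):

```python
import numpy as np

def edge_by_edge_matrix(ets):
    """Edge-by-edge similarity: correlate every pair of edge time series.

    ets is time-by-edges, as from edge_time_series() above. The result is
    fully weighted and signed; clustering it yields edge-level communities,
    which map back onto nodes as overlapping community assignments. Note the
    memory cost: the output is (n_edges x n_edges).
    """
    return np.corrcoef(ets.T)   # treat each edge's time series as a variable
```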
- 21:56 And I'll end with this slide. My claim here is not that we should all be doing edge-centric things, and the claim is not that you should all be calculating co-fluctuation time series, but rather that this should be viewed as complementary to the tools we already have. Right? We're interested in brains, we're interested in how they're linked to behavior and how they work. Activations tell us something; connectivity, our traditional networks, tell us something else. And hopefully there's something more, something unique, not accounted for by networks or activations, that lives at the level of edges and their movements, that this new approach can come back and tell us.
- 22:33 And at that point I'll stop and open it up for questions.
- 22:44 [Audience question, largely inaudible, about dependence on parcellation.]
- 22:56 Yes, the question is about the dependence, presumably, on parcellation for the measures that we're interested in. In the first papers there's very little dependence, and so far we haven't done a lot of the brain-behavior association things. I'm sure there is, as everybody else here has been talking about, some reasonable dependence there, and it will vary, but we haven't seen it so far in the things that we really care about.
- 23:18 [Audience follow-up, partly inaudible, about the spatial scale, or level, of the analysis.]
- 23:27 Can you repeat the question? The question was about, for instance, the zebrafish, where we actually have fine-scale, single-cell data, and whether the approach works at that scale.
- 23:44 Yeah, so you're bringing up one important point; I'll touch on that first. When you build edge time series, or co-fluctuation time series, and you have N parcels, or voxels, or whatever, you get on the order of N squared edges. So it becomes a real memory challenge, especially if you're working with voxels. We found some sneaky ways of circumventing that issue: holding little bits in memory, doing some operations on them, holding something else in memory, and so on. But it becomes really challenging at the voxel level, or even at the single-cell level; that's not trivial to do.
- 24:19 And to your question, we haven't done an exhaustive scale-dependence analysis with these data; we haven't measured that explicitly. I think that's an important point.
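One way to keep the memory footprint down, sketched here in Python/NumPy as an illustration of the general chunking idea (not necessarily the speaker's own workaround), is to stream over blocks of edges so the full edge-by-time matrix is never held at once:

```python
import numpy as np

def fc_and_amplitude_chunked(Z, chunk=20000):
    """Accumulate static FC and per-frame global amplitude over edge chunks.

    Z is a time-by-node matrix of z-scored signals. Only `chunk` edge time
    series are materialized at any one time, which keeps memory manageable
    for voxel- or cell-level data.
    """
    iu, ju = np.triu_indices(Z.shape[1], k=1)
    fc = np.zeros(len(iu))
    amp_sq = np.zeros(Z.shape[0])
    for start in range(0, len(iu), chunk):
        sl = slice(start, start + chunk)
        block = Z[:, iu[sl]] * Z[:, ju[sl]]   # time-by-chunk edge block
        fc[sl] = block.mean(axis=0)
        amp_sq += (block ** 2).sum(axis=1)
    return fc, np.sqrt(amp_sq)
```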
- 24:31 Let's go to the next question.
- 24:34 [Audience question, partly inaudible: the zebrafish brain is very, very different...]
- 24:44 Yeah, that's right, they're tiny little...
- 24:49 [Question continues: could you divide the data up into different anatomical compartments and look at whether the predictions change across them?]
- 24:58 It would be very easy to do. We have all of the anatomical labels for roughly where those cells live, and you could do this; I think it would be an interesting question. And I should add, we have preliminary data along the same lines using human movie-watching data. It's a little more complicated, because it's not spontaneous behavior; the question there is whether you can explain responses to particular features in the movies using activity versus connectivity, versus putting them together. And the results, again, are mixed, but there are some features where it's activity dominated and others where it's connectivity dominated. Julia?
- 25:39 [Audience question, partly inaudible, about whether the co-fluctuations have been looked at as a function of frequency, given that different signals oscillate in different ranges.]
- 26:00 Like the frequency dependence of this? We have not. It turns out people invented this way before us; we didn't invent anything here, it's just correlations. There's a Christian Beckmann paper from around 2016 where they calculate co-fluctuations; they just don't recognize or pursue the idea that it's connectivity over time, or this decomposition. So we haven't done it. I think it's something that's interesting, but even with our own internal bandwidth we weren't really able to pursue it. I think it's an important question.
- 26:34 There's an online question, which is: how good is this method at catching moment-to-moment dynamics, and what kind of parcellation best suits this analysis?
- 26:46 How good is it at catching moment-to-moment dynamics? Well, we end up building networks at whatever frame rate you acquired your data, but it is noisy: you're basically using single frames to estimate co-fluctuations. Again, this is not a great answer to your question, but given the bandwidth we have internally for pursuing these edge-centric projects, we haven't done a proper benchmarking study of that specific attribute. Right here.
- 27:24 Two questions. First, how did you vectorize your edge time series?
- 27:33 Is it on this slide, or... oh boy, do I have to go back? There's a rule in my old lab that every slide you go back, you owe somebody a beer. It's really disconcerting. So I know which one you mean; I'll have to go back. Sorry, Todd. I can find it really easily. Like this?
- 27:57 Yeah, so at each instant it's all pairs of nodes. Basically every edge in the network is a column, so every column here represents the upper triangle of a square, node-by-node matrix. So if you have 200 nodes, that's 19,900 elements. Yeah, yeah.
- 28:23 [Second question, partly inaudible: are the features you get from this approach orthogonal to static connectivity and to activation, and will you be able to pull all of these things apart?]
- 28:34 The million-dollar question. I mean, to me it's a very...
- 28:37 Yeah, I think this is the real question, and it's something I'll probably get yelled at about after this; I know there are people in the crowd who don't share my views about whether there is meaningful temporal information when it's all just static connectivity, or activity, or something like that. I think we took a first step in that zebrafish model, where you put them together, and one of them has unique explanatory power that the other does not, and vice versa. But putting them all together, the activity, the connectivity, the edge stuff, into a single model is kind of challenging. I'd welcome good ideas for how to do this, yeah.
- 29:19 Yeah, so, great talk as always. The thing that jumped into my mind, which I might have missed: given that a lot of these networks are being specifically focused on these high-amplitude fluctuation points, and you're showing that this happens typically at, say, the end of movies, I'm wondering how different motion-correction strategies, nuisance regression, all of these things, kind of play into changing how these networks end up looking, how the algorithms play out.
- 29:54 Yeah, we haven't done an exhaustive search of all possible motion-correction strategies. The person to talk to, who seems to have disappeared, is Josh Faskowitz, who did all of the processing for this, so we can corner him after he's done skiing. The one interesting thing I'll mention is that for the high-amplitude frames there's not a very clear relationship between motion and the amplitude of the connectivity. But one thing we do find is that the high-amplitude frames tend to be the ones with the lowest instantaneous motion, whether it's DVARS or framewise displacement, however you measure it. Maybe it's just that the signal is coming through at those points, I don't know, but we see that relationship.
- 30:33 Whereas the highest-motion frames tend to be the troughs, so it's the extreme the other way. When you see a high-amplitude frame, it's pretty good signal, almost certainly not motion contaminated, whereas the lowest-amplitude frames are almost certainly motion contaminated.