
Neuroskeptic

January 27, 2021

A longstanding science blogger for Discover magazine and Twitter influencer, Neuroskeptic, sits down in a pub to chat with Daniel about the “generalizability crisis,” blogging, and social media in this episode of Science et al.

ID
6135

Transcript

  • 00:10Welcome to Science et al, a podcast about
  • 00:12everything science, sponsored by the Yale
  • 00:14School of Medicine. I'm Daniel Barron.
  • 00:16I'm your host and in this episode I'm
  • 00:19speaking with Neuroskeptic. Neuroskeptic is
  • 00:21a longstanding, or longtime, science
  • 00:23blogger for Discover magazine.
  • 00:25I guess I should say he was a
  • 00:27solo blogger, like he had his own
  • 00:30website, and then Discover
  • 00:32Magazine picked up his blog.
  • 00:34And I actually just tried to find out
  • 00:36how long he'd been on Discover magazine
  • 00:39and they're only showing about 30
  • 00:41of his most recent pieces there.
  • 00:44But needless to say,
  • 00:45Neuroskeptic is a big deal.
  • 00:47I don't know any scientists who
  • 00:49have a Twitter account that don't
  • 00:51follow Neuroskeptic,
  • 00:52and he has 190,000 followers on Twitter.
  • 00:54And he has both the most insightful and
  • 00:57the most hilarious tweets about science.
  • 00:59And so this conversation with
  • 01:02Neuroskeptic was a long time in the making.
  • 01:04I, as a medical student, had started
  • 01:08writing for Scientific American and I
  • 01:11had written a piece at the beginning
  • 01:14of my psychiatry residency about
  • 01:17the tension between neuroscience
  • 01:19and specifically psychotherapy
  • 01:21and psychoanalysis,
  • 01:22and we ended up having a
  • 01:25brief polemic about it,
  • 01:27which I thought was just completely
  • 01:30delightful and at a conference
  • 01:33I guess it was probably
  • 01:35almost a year after that, a
  • 01:37mutual friend invited us both
  • 01:38to dinner, and Neuroskeptic
  • 01:40and I grabbed a drink afterwards
  • 01:42and had such a wonderful time
  • 01:43discussing writing and science.
  • 01:45And just, you know, kind of the works.
  • 01:48How do you move from academia
  • 01:49and how do you write for the
  • 01:52popular audience as a scientist
  • 01:54still trying to be a scientist
  • 01:56and engaging academia?
  • 01:57he's a wealth of information,
  • 01:58and since then he's been extremely helpful
  • 02:00with a few different projects I'm working on.
  • 02:03And so I was really fortunate to ping him
  • 02:06when I was in London researching a book.
  • 02:09And so two tidbits of information will be
  • 02:11important if you're going to stay with us.
  • 02:14In our conversation there.
  • 02:15So first of all, it is a little bit noisy.
  • 02:18In this episode,
  • 02:19our conversation took place in a pub down
  • 02:22the street from where I was giving a talk,
  • 02:24and the pub wasn't open yet,
  • 02:26but it was completely empty,
  • 02:28and so we kind of knocked on the door and the
  • 02:31bartender very kindly let us sit in a corner.
  • 02:34And so I set up my little recording
  • 02:36equipment and so the background sounds
  • 02:38that you hear during this episode.
  • 02:41are the bartender,
  • 02:41You know,
  • 02:42getting ready for the day.
  • 02:44I should also mention that we recorded
  • 02:46this episode the first week in February,
  • 02:48so this was just as the
  • 02:51coronavirus was beginning to make the
  • 02:53news in a really serious way.
  • 02:56And so we kind of comment on
  • 02:59this during our conversation.
  • 03:01And so the second kind of category
  • 03:04of background information is
  • 03:05about the generalizability crisis,
  • 03:07and so you've probably heard of the
  • 03:10reproducibility or replicability crisis,
  • 03:12which is the well established
  • 03:14observation that scientific studies,
  • 03:16if someone tries to replicate or
  • 03:19to perform the exact same study
  • 03:21using the same methods, the same.
  • 03:25Like experimental design and materials,
  • 03:27they often don't get the same result and so
  • 03:31this is a little bit different from that.
  • 03:34So Neuroskeptic and I are discussing
  • 03:36a paper that one of our colleagues,
  • 03:39Tal Yarkoni, had recently published on
  • 03:41one of the preprint servers.
  • 03:44There's a link out to
  • 03:46it in the episode description.
  • 03:50So the generalizability crisis isn't
  • 03:52dissimilar from the reproducibility
  • 03:54or replicability crisis.
  • 03:56The generalizability crisis would
  • 03:58be an example of one study using
  • 04:02a very specific behavioral task,
  • 04:04for example,
  • 04:05and then from a very specific type
  • 04:08of memory task trying to draw a
  • 04:12generalizable conclusion about memory.
  • 04:15So basically,
  • 04:16this idea that a very specific instance
  • 04:20of a behavior doesn't generalize to
  • 04:21that category of behavior itself,
  • 04:23and so it was a wonderful paper by
  • 04:26Tal, and Neuroskeptic and I really
  • 04:28enjoyed discussing it.
  • 04:29I hope you enjoy this episode,
  • 04:32and here we go.
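The replicable-but-not-generalizable gap described above can be sketched with a toy simulation (a minimal sketch; the per-movie effect sizes, sample sizes, and movie pool are all hypothetical numbers invented for illustration, not anything from the episode):

```python
import random
import statistics

# Toy model (hypothetical numbers): each movie has its own fixed "true"
# effect size, varying a lot from movie to movie.
def true_effect(movie_id):
    return random.Random(movie_id).gauss(0.0, 0.5)

def run_study(movie_ids, n_trials=50):
    """Mean measured effect across the movies used in one study."""
    scores = [random.gauss(true_effect(m), 1.0)
              for m in movie_ids for _ in range(n_trials)]
    return statistics.mean(scores)

random.seed(0)

# "Replication": rerun the exact same two movies five times.
same_two = [run_study([1, 2]) for _ in range(5)]

# "Generalization": each rerun samples two fresh movies from a large pool.
fresh = [run_study(random.sample(range(1000), 2)) for _ in range(5)]

print("same stimuli:", [round(x, 2) for x in same_two])
print("fresh stimuli:", [round(x, 2) for x in fresh])
```

With these invented numbers the same-stimuli results tend to cluster tightly while the fresh-stimuli results scatter, because the movie-to-movie variance never shows up when you keep reusing the same two movies.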
  • 04:43OK, so I'm here this afternoon
  • 04:45with Neuroskeptic. We decided
  • 04:47to meet in a very nice pub in London,
  • 04:49which is perfect in order to discuss
  • 04:52a little bit about science. And so,
  • 04:54Neuroskeptic, a lot of people that
  • 04:56I've spoken with kind of targeted
  • 04:58different facets of science. I spoke with
  • 05:00some science journalists to see how
  • 05:02they view the progress of science,
  • 05:04how it's reported and understood by
  • 05:06the public, and you've been in a
  • 05:08position for a long time to be able
  • 05:11to assess like what is good science.
  • 05:13You're very outspoken and very rigorous
  • 05:15in terms of how you evaluate publications.
  • 05:18So how do you do it?
  • 05:20How do you decide which papers
  • 05:22to write about, and what are your criteria?
  • 05:26I mean. What is good science?
  • 05:30I think is really a very very,
  • 05:32very big question.
  • 05:34So my own, my blog and my
  • 05:38papers that I tweet about.
  • 05:40That is very much my own personal.
  • 05:43My own personal interest,
  • 05:45so I wouldn't say that I'm.
  • 05:47In any way.
  • 05:51Sort of focusing on on good science
  • 05:53in a sort of objective sense.
  • 05:55I try to focus on science
  • 05:57that's interesting to me.
  • 05:59So like an ethical good, like a
  • 06:02personal good, a good for Neuroskeptic.
  • 06:06More of. Good in the sense of,
  • 06:11I think that this is interesting
  • 06:13and I think that this will be.
  • 06:16Interesting to other people. Who are?
  • 06:20Working in the field of neuroscience,
  • 06:23I think. But I mean,
  • 06:25that's just that's a subjective.
  • 06:29A subjective judgment as to
  • 06:31whether something is interesting.
  • 06:34And I think what's what is notable about
  • 06:38about science is that an awful lot of.
  • 06:42An awful lot depends on research
  • 06:45being perceived as interesting,
  • 06:47so you could say, from one perspective,
  • 06:50you could say that science is good.
  • 06:53If it is true, right?
  • 06:56If it's basically it's accurate,
  • 06:58so rigorous, and if the
  • 07:00conclusions are valid, right?
  • 07:05But.
  • 07:08If you exclusively take that
  • 07:10that view. If you say the best
  • 07:13science is the most rigorous.
  • 07:18And that I think it's perfectly
  • 07:20sort of justifiable view in
  • 07:22some ways, but that kind of.
  • 07:26Doesn't.
  • 07:29It doesn't get you to the level of
  • 07:32sort of interesting or highly cited
  • 07:34or highly influential science,
  • 07:36which is what's rewarded
  • 07:38under the current
  • 07:41system, in which, you know, researchers
  • 07:43publish, and then get cited,
  • 07:45and then impact factor.
  • 07:47And then that's what drives grants.
  • 07:51And so there's this.
  • 07:53You can say that science is good science
  • 07:56if it's technically rigorous,
  • 07:58and if the results turn
  • 08:02out to be replicable. But then.
  • 08:05There's this sort of.
  • 08:07There's almost like an orthogonal dimension,
  • 08:09which is sort of completely unrelated to
  • 08:12that, which is how interesting is this.
  • 08:16So you could have science,
  • 08:19which is really technically just terrible.
  • 08:24You know, maybe it's done with the
  • 08:27tiny sample and it's using like.
  • 08:30Very old fashioned and discredited methods,
  • 08:34even.
  • 08:36And that could be very interesting,
  • 08:38though in the sense that it's
  • 08:40addressing a topic that people find
  • 08:42interesting and is making claims
  • 08:44that people think are interesting,
  • 08:46sure.
  • 08:48So the interestingness
  • 08:51and the technical
  • 08:53rigorousness are, I would say, almost
  • 08:56completely two separate things.
  • 08:59So it's almost as if you're putting
  • 09:01your finger on an evolutionary
  • 09:03pressure in science, right?
  • 09:05So what is fit science, right?
  • 09:08So it's like you're.
  • 09:10Obviously if you're a scientist and you want
  • 09:13to continue producing science to survive,
  • 09:16you must have rigorous science.
  • 09:18You also must have interesting science,
  • 09:20otherwise your funding agencies
  • 09:22aren't going to fund it.
  • 09:24That's actually a conversation I
  • 09:25was having with an anthropologist friend of mine.
  • 09:28And we were talking about how fields
  • 09:31progress overtime and how the development
  • 09:33of the NIH in the US really sculpted
  • 09:36a very specific type of science.
  • 09:39As to be like clinically useful.
  • 09:42And that was like a
  • 09:43definition of interesting.
  • 09:44So do you see that? Do you
  • 09:46feel like there are differences
  • 09:47in countries in terms of not
  • 09:49just what you find interesting,
  • 09:51which I assume remains the same no
  • 09:53matter what country you know the
  • 09:54authors are from like the sorts of
  • 09:56science that countries produce.
  • 09:59Good question. I mean, I would say
  • 10:03Probably not to a great extent
  • 10:06because I think science has become.
  • 10:09today has become, does seem to have become,
  • 10:12um, more globalized.
  • 10:16I mean, funding agencies are still
  • 10:18based in particular countries,
  • 10:20but the journals are very much international.
  • 10:23So if you want to publish in, let's say,
  • 10:27the American JAMA, you know the.
  • 10:30Or the British Medical Journal.
  • 10:34You don't have to be from that country,
  • 10:36but you do have to get approved,
  • 10:38so research will be coming
  • 10:40in from all over the world,
  • 10:43but it's being judged
  • 10:46by the same standard.
  • 10:49So I think too large extent,
  • 10:51the same standard of kind
  • 10:53of interest has now become.
  • 10:55Fairly universal,
  • 10:56but probably not entirely so.
  • 10:59I think there are still some
  • 11:03country specific priorities.
  • 11:05In terms of funding.
  • 11:10I mean so. It is not perhaps the best
  • 11:15example, but a few years ago, well,
  • 11:17maybe a decade ago in the UK there was a.
  • 11:23There was a lot of interest in.
  • 11:27In Psychotherapy's for sort of
  • 11:29common psychiatric disorders.
  • 11:30And there was an initiative called
  • 11:33IAPT, which I think was Improving Access,
  • 11:37Improving Access to Psychological Therapies.
  • 11:41This is a massive thing.
  • 11:44And was basically about sort of
  • 11:47increasing the number of therapists
  • 11:49and then increasing the access so
  • 11:52more people could access therapy.
  • 11:55And it was like a countrywide
  • 11:57thing and that I think.
  • 11:59probably spurred quite a bit of research,
  • 12:05Which was probably very much
  • 12:07specific to this country.
  • 12:10Was that a result of like a political
  • 12:13movement or some lobbying? Or... I think
  • 12:15it was a mixture. Yeah,
  • 12:17I think there was definitely a political
  • 12:19aspect to it. I mean, there had long
  • 12:22been a sort of perception that
  • 12:25People were relying on
  • 12:27antidepressants and other medication.
  • 12:29Doctors are prescribing it too much.
  • 12:32Um? That's.
  • 12:39It was easier to get access to medication,
  • 12:41but it was very hard to get.
  • 12:45access to therapy because it had
  • 12:47like long waiting lists and so on.
  • 12:50And I think that became a political.
  • 12:52I don't know. I don't know.
  • 12:53I don't really know the kind
  • 12:55of all the details of it.
  • 12:58But I think it definitely
  • 12:59was a political initiative,
  • 13:00and then IAPT was brought in.
  • 13:03I think that did lead to a kind of.
  • 13:09Quite a lot of research.
  • 13:13based on IAPT, sort of molded by IAPT.
  • 13:18But then I don't know whether
  • 13:21that research was ever kind of
  • 13:23Of interest to anyone outside the country. I
  • 13:27don't know. I remember around the same
  • 13:29time there is a lot of questioning
  • 13:32about psychopharm in the US. I mean,
  • 13:34what was that book, Listening to
  • 13:36Prozac or something like that?
  • 13:37Pretty famous book that I
  • 13:39think you know in the 90s
  • 13:41where, oh, there were quite a
  • 13:43few of them. Yeah, it had
  • 13:45another P in it, something and Prozac.
  • 13:50Got a lot of traction,
  • 13:51got a lot of clinicians
  • 13:53thinking about whether they,
  • 13:55whether they should stop
  • 13:56prescribing or use other modalities.
  • 13:58So I mean maybe another example is like
  • 14:01the Human Genome Project where countries
  • 14:03kind of establish these priorities, and
  • 14:06thereby sculpt science.
  • 14:08So I guess it'd be interesting
  • 14:12in like an economic way or governmental way.
  • 14:16And now I mean just so.
  • 14:20I was reading today that there have been.
  • 14:2350 papers published about the
  • 14:25coronavirus since it was discovered, and it
  • 14:27was only discovered like a month ago,
  • 14:29so these are like fully peer reviewed.
  • 14:33We have this data.
  • 14:35And I think that's astonishing, like,
  • 14:39They've collected the data and
  • 14:41and written up and submitted it
  • 14:43and then obviously the journals
  • 14:45must have like fast tracked.
  • 14:47Finding reviewers.
  • 14:52And that's I guess is an
  • 14:54example of a national,
  • 14:55and I think most of that is from China.
  • 14:57So that's an example of a
  • 14:59kind of a national priority,
  • 15:00but that one presumably will not last.
  • 15:04You know, more than a few months,
  • 15:06hopefully the outbreak is, you
  • 15:08know, contained, but that,
  • 15:09I guess, is an example of how those
  • 15:12priorities can happen very quickly.
  • 15:15Yeah, the HIV and AIDS
  • 15:17Crisis was something similar.
  • 15:18I'd spoken with a journalist who
  • 15:21had been the lead anchor, her NBC,
  • 15:24who described that period of
  • 15:26time when people were campaigning
  • 15:29at the FDA and NIH to do more.
  • 15:32research and pass these drugs,
  • 15:34and it's pretty fascinating.
  • 15:39Back to neuroscience.
  • 15:40I know it's both of our
  • 15:42shared interest here.
  • 15:45I'm curious how you can keep
  • 15:48abreast of the entire field like so.
  • 15:50How do you choose which things
  • 15:52to write about? Like what?
  • 15:54What do you think is interesting?
  • 15:56Yeah, I mean, I definitely can't keep
  • 15:59abreast of everything, by any means.
  • 16:01I mean, I do sort of focus on.
  • 16:04What I find to be interesting
  • 16:06and as to why do I find certain
  • 16:09things interesting, I mean.
  • 16:12I don't know. I mean,
  • 16:13I've never... I'm not sure what the
  • 16:16answer is. Is there just a certain
  • 16:17thing, like, you're at your computer,
  • 16:19and do you chuckle to
  • 16:21yourself if you read a paper that's
  • 16:23really fun? I feel like you must.
  • 16:25I chuckle when I read something. Yeah, so
  • 16:28like oh this is funny.
  • 16:29And then like the pieces that
  • 16:31you really want to expound upon.
  • 16:33Or, you know, interact with in some way.
  • 16:35Then you do find something
  • 16:36challenging them or.
  • 16:39And there are different.
  • 16:40There are different reasons
  • 16:41why something is interesting,
  • 16:42so it could be interesting 'cause.
  • 16:44It's saying something very new.
  • 16:46Um? But it also could be this is.
  • 16:51A study which is challenging what's
  • 16:55already sort of challenging or
  • 16:58debating what's already known.
  • 17:01Or it can just be interesting in
  • 17:03terms of the sort of the perspective
  • 17:06adds like it's taking it kind of.
  • 17:08Kind of roundabout or new look at things,
  • 17:11What's an
  • 17:12example you think would be fun
  • 17:15to talk about? OK, so actually a
  • 17:17good example would be the last
  • 17:19paper I wrote about, which was the...
  • 17:22So basically there's a lot of.
  • 17:24Interest in these rare cases of
  • 17:27people who are born and they seem
  • 17:29to be like complete normal and they
  • 17:32don't have any obvious deficits.
  • 17:35But then. For some reason they have
  • 17:38an MRI scan or another type of
  • 17:40brain scan and they turn out to
  • 17:42have like apparently a huge hole,
  • 17:44sort of in the brain and this is
  • 17:46caused by hydrocephalus and it's
  • 17:48basically created a kind of fluid
  • 17:50hole like in the middle of where
  • 17:52their brain normally would be and
  • 17:54on the scan it looks like this
  • 17:56just like a huge hole and like a
  • 17:58tiny bit of brain sort of squished
  • 18:00around the edge of the skull right?
  • 18:02But they are able to function very well.
  • 18:06And there have been three or
  • 18:07four of those cases,
  • 18:09and they were quite famous.
  • 18:12And I think.
  • 18:14They often get called like
  • 18:16the the man with no brain,
  • 18:18which is obviously not
  • 18:19true, 'cause they do have one, but
  • 18:21there's something going
  • 18:22on, the brain's still there,
  • 18:24but it certainly is dramatic when
  • 18:26you see it and there was an article,
  • 18:29famously called is your brain
  • 18:30really necessary about this?
  • 18:32And basically asking,
  • 18:33like, if someone can function so well with
  • 18:36so much of their brain missing,
  • 18:38like, to what extent is the brain
  • 18:41actually needed?
  • 18:42And of course, yeah,
  • 18:44there's a lot we don't
  • 18:46know about these cases.
  • 18:48And also it's not as if.
  • 18:53It's not like their brain is
  • 18:56actually missing necessarily.
  • 18:57I mean, because the
  • 19:01way the fluid sort of presses out
  • 19:04from the middle and it sort of
  • 19:06compresses the other areas, there
  • 19:08probably is quite a lot
  • 19:10of tissue lost,
  • 19:11but it doesn't mean that any individual
  • 19:13part of the brain is not there.
  • 19:16Maybe even the functions are remapped.
  • 19:18The functions could have been remapped, yes,
  • 19:20like after a stroke or something.
  • 19:22So the paper I wrote about was actually about
  • 19:25a similar case, but in a rat. So
  • 19:27they discovered a rat and they
  • 19:29were doing some study on like
  • 19:31completely unrelated like some gene.
  • 19:33They knocked out some gene and they
  • 19:35wanted to study the effect in rats.
  • 19:38And so they had some knockout
  • 19:40rats and control rats,
  • 19:41and most of them were fine.
  • 19:45All of them seem to be fine,
  • 19:46but then when they put them in the MRI
  • 19:49scanner they found out that one of them
  • 19:52has a huge kind of apparently a hole.
  • 19:54Exactly, yeah,
  • 19:55and it probably wasn't related to the
  • 19:58gene, whether... it may just have
  • 20:00happened by chance.
  • 20:01It may have been related to the
  • 20:03gene knockout, but probably not
  • 20:04'cause all the other ones,
  • 20:06the knockout rats, were fine.
  • 20:08But this rat basically
  • 20:10showed no real deficits.
  • 20:11It could sort of move around,
  • 20:13and it could do like a learning task.
  • 20:18I think it showed some anxiety,
  • 20:20like slightly above the normal levels,
  • 20:22but otherwise it was pretty much fine,
  • 20:24so it's kind of like the human cases.
  • 20:27Interesting, so the human
  • 20:28case is not just a one-off.
  • 20:30Yeah, that's really interesting.
  • 20:31Yeah, it's so it could have
  • 20:33all of the same genes as its,
  • 20:36I don't know, siblings.
  • 20:37But yeah, sometimes I guess little
  • 20:39bit of tissue just gets stuck and
  • 20:41makes a clog or something. Yeah,
  • 20:43I'm not sure anyone knows why it happens.
  • 20:46Obviously, normally, I think
  • 20:47hydrocephalus is fairly common,
  • 20:48but here we're talking about like
  • 20:51very extreme cases where it gets
  • 20:55to the extent that, yeah,
  • 20:56you look on the scan and you say.
  • 20:59Where's the brain? Or you can just
  • 21:01about sort of see it, like, wait a
  • 21:03second. Let's check the scanner.
  • 21:07But so I like that paper because.
  • 21:11There's something there's a lot of interest.
  • 21:14There's something sort of
  • 21:16inherently interesting about the,
  • 21:18just like the idea, I guess of a person
  • 21:22or an animal like going around and not
  • 21:26realizing that it has this sort of.
  • 21:29dramatic abnormality.
  • 21:30And it is dramatic
  • 21:32when you look at it, right,
  • 21:34but then in functional terms,
  • 21:35it doesn't seem to be actually
  • 21:39affecting them at all.
  • 21:41It's almost like that I was
  • 21:43taught that you only use like 10%
  • 21:45of your brain or something like
  • 21:47that, which you know of course
  • 21:49not true, but it almost
  • 21:51makes you feel like that.
  • 21:52Do we actually know?
  • 21:55Yeah, like that those cases they
  • 21:58actually make you think well.
  • 22:01Maybe there's some truth in that.
  • 22:04And but they're also of interest I guess too.
  • 22:08Philosophers so there have been.
  • 22:13I think, particularly the first...
  • 22:14I think it was like the 1980s when the first
  • 22:17one was discovered, there was quite
  • 22:19a lot of like discussion about it,
  • 22:21and some people were saying, well, maybe.
  • 22:25If this person doesn't have a brain,
  • 22:27that's evidence for the soul,
  • 22:29because then the soul is what's
  • 22:31controlling their body.
  • 22:32I didn't know you were going there.
  • 22:35That's where people took it right?
  • 22:37And again, it was never actually true
  • 22:39that they didn't have brain 'cause they
  • 22:41did have a branch like abnormal one,
  • 22:43but it was still a brain.
  • 22:47And this is actually this paper
  • 22:48about the rat was called life
  • 22:50with no brain, which again.
  • 22:53Isn't true. But that's what attracted
  • 22:57me to the paper to be fair, so it
  • 23:00worked. A sort of
  • 23:01template clickbait
  • 23:02headline. That also raises an interesting point:
  • 23:04I've wondered sometimes if
  • 23:06more scientists should be more
  • 23:07clickbaity, you know, to encourage public
  • 23:10engagement, and especially in the US,
  • 23:12where the majority of scientists are funded
  • 23:15by the government in some fashion, right?
  • 23:18So to keep the taxpayer, you know, and in
  • 23:20this case the government, interested, like
  • 23:23to encourage that sort of science.
  • 23:26It sounds like this paper wasn't
  • 23:28unrigorous, at the same time.
  • 23:29I haven't heard you say that. It sounds
  • 23:31like it was genuinely interesting. It was
  • 23:33definitely. I mean, it
  • 23:34was one of these.
  • 23:37In the title obviously was was misleading,
  • 23:39yes, but the actual paper was
  • 23:41was fine as far as I could see.
  • 23:44Obviously it's just like an n-of-1
  • 23:46paper, it's just an example of a
  • 23:48case report of one particular rat,
  • 23:51but that's interesting. So whether
  • 23:54scientists should be more clickbaity,
  • 23:55I mean, I think a lot of them are.
  • 24:00Certain fields.
  • 24:05Seems to be more kind of encouraging
  • 24:07of that than others. Such as.
  • 24:11So there are certain areas of
  • 24:13social psychology, for instance,
  • 24:15where it's very common to have a.
  • 24:20a title which is like, a little...
  • 24:22it'll start off with like a little joke
  • 24:25and a colon and then like a long
  • 24:27explanation of the joke, or,
  • 24:30yeah, what actually happened
  • 24:32in the paper.
  • 24:39Search something something.
  • 24:40I mean there are lots of examples,
  • 24:42but there's like there's a
  • 24:44paper called Boom Headshot.
  • 24:45And then, like colon,
  • 24:48and there's something like.
  • 24:50The effects of violent video
  • 24:54game shooting on children's.
  • 24:57attitudes to firearms,
  • 24:58that kind of thing, yes.
  • 25:00And it's about saying that
  • 25:02if you get children to shoot.
  • 25:04Like monsters in the head in a video
  • 25:06game that it makes them more likely to.
  • 25:08Anne.
  • 25:11Play with a real gun or something.
  • 25:14Yeah, I remember these
  • 25:15are from both sides of that.
  • 25:17Some say yes, some
  • 25:18say no. Isn't that controversial paper?
  • 25:22But certainly there's like many,
  • 25:23many papers like that which have
  • 25:25that kind of. The title is kind
  • 25:29of attention grabbing, yeah?
  • 25:33There's nothing wrong with it.
  • 25:35I mean, I think yeah,
  • 25:36you want to make your title titles
  • 25:37job is to attract attention.
  • 25:39Really, as long as it's doing it in it,
  • 25:41as long as it's not inaccurate like yeah,
  • 25:43it's in life with no brain.
  • 25:46Is just wrong, right?
  • 25:48You sure, I'm surprised that
  • 25:50the general allowed it, but.
  • 25:52Making your work sound interesting.
  • 25:54Yeah, it's definitely you should.
  • 25:55People should be doing that.
  • 25:58So I feel like what you've done as
  • 26:00you've acted as a liaison for you know
  • 26:03a lot of good science and then also.
  • 26:08Debunker bad sign I that's not the
  • 26:10right term, but I think you understand.
  • 26:12I mean so I I wonder how to get
  • 26:14more scientists themselves involved
  • 26:15in doing that same sort of work.
  • 26:19So that's yeah,
  • 26:21that's that's something that a lot of.
  • 26:25Well, I've wondered about other people have.
  • 26:27I've asked me and I guess my
  • 26:31answer is it's certainly true.
  • 26:34I mean, many scientists unlike are aware of.
  • 26:39Bad science in their field and are
  • 26:42already like aware and able to point
  • 26:44to like the specific flaws with it.
  • 26:46So it's I think it's not a matter of.
  • 26:51Getting scientists to.
  • 26:56Kind of to the bunk stuff so
  • 26:58much as it is getting wins.
  • 27:01Do it publicly interest.
  • 27:02So yeah, so, oh like have a
  • 27:05form for debate
  • 27:06then yeah, I actually
  • 27:07think it's a pub.
  • 27:09Peer has done a lot for that.
  • 27:11I think I'm not familiar with that.
  • 27:14What is soap up?
  • 27:15Here is a site it's called public.com
  • 27:18and basically you go there and.
  • 27:20You can, there's like a.
  • 27:23Sort of organized by paper,
  • 27:25so that every every paper.
  • 27:28That's on pub Med.
  • 27:30Like you can have like a discussion about it.
  • 27:34So you can go to like the page
  • 27:36for a particular paper and
  • 27:38there might be comments on it.
  • 27:41Or you can start the comments.
  • 27:44But it's it's.
  • 27:46It's an anonymous forum or no.
  • 27:48It's anonymous by default.
  • 27:49I think some people are able to
  • 27:52post under name if you if you wish,
  • 27:55but it's not anonymous by default,
  • 27:58and I think there isn't a lot for making.
  • 28:03Making people.
  • 28:05More willing to to publicly.
  • 28:08Debunk things. Because I think.
  • 28:12As I say, a lot of people have.
  • 28:14I perfectly aware of when you have to.
  • 28:18Shoddy or terrible science?
  • 28:21And there are there are many stories
  • 28:24I've heard of people who have.
  • 28:26Who tried to replicate a previous study?
  • 28:30Anne failed to replicate.
  • 28:34And then. They just didn't let.
  • 28:37They just moved on something else,
  • 28:39but they didn't publish that or filed right?
  • 28:42Or yes. It's also changed now,
  • 28:46but there's more of a.
  • 28:48Moving acceptance of.
  • 28:51Replication studies and also
  • 28:54more understanding that it's
  • 28:56important that often studies are.
  • 29:00Are not going to be replicated like.
  • 29:04There's more understanding now that if you
  • 29:06fail to replicate something, it might be.
  • 29:11It cause that study was wrong in
  • 29:13the 1st place rather than on your
  • 29:16because you have like screwed up in
  • 29:18some way. Sure you are there.
  • 29:20Conditions have to be just
  • 29:22so every time. So what do
  • 29:24you think about the generalizability crisis?
  • 29:29So this is something which I'm still
  • 29:33still thinking about. Have to say.
  • 29:36I mean, I think it's a huge,
  • 29:40very important like issue, but.
  • 29:43I still haven't personally sort of work
  • 29:46through it in my mind to know exactly
  • 29:48what to make about it, but I think.
  • 29:53I think the kind of fundamental point that.
  • 29:59Replik ability. So the point is,
  • 30:02I say is with applicability is not enough.
  • 30:05Basically, you're saying you can have it.
  • 30:07You could have an effect which is
  • 30:09very replicable if you get the
  • 30:11conditions to be exactly right.
  • 30:15But that doesn't mean that it's.
  • 30:19It will generalize to other conditions,
  • 30:21and if we talk about an effect and we
  • 30:23and we say for instance, I don't know.
  • 30:32Well, the example that was used.
  • 30:36Was if you talk about a memory.
  • 30:38If you talk about something that
  • 30:40just happened to you, your memory.
  • 30:44Your ability to recognize.
  • 30:47Like details of the scene.
  • 30:52Is reduced, so it's I think they
  • 30:55called it verbal overshadowing.
  • 30:56Yeah, so if you talk about
  • 30:59something then that kind of
  • 31:00makes it into a verbal memory
  • 31:03and we process
  • 31:04it and go down
  • 31:05and then that actually makes it
  • 31:07less makes the visual details less
  • 31:09accessible and which has a lot of
  • 31:12practical implications for like,
  • 31:14eyewitnesses and so forth.
  • 31:18And the original study of this.
  • 31:22It seems had. Two stimuli essentially.
  • 31:27So the simulator movies and they
  • 31:29only show participants one of two
  • 31:32different movies about for remembering
  • 31:34remembering this correctly.
  • 31:36And the question is,
  • 31:38will that then generalize to other stimuli?
  • 31:41And because we can say,
  • 31:44I mean we would normally talk about this,
  • 31:47we say. We say something like.
  • 31:53If you view if you view a movie and then
  • 31:56talk about the events that happened,
  • 31:58that will impair your ability to recognize.
  • 32:01The details right?
  • 32:03So we say a movie if you view a movie,
  • 32:06but we actually only tested
  • 32:08two particular movies right?
  • 32:10And there are an infinite
  • 32:12number of movies that we could
  • 32:14have situations. There's
  • 32:15infinite number of movies
  • 32:17that you could have used.
  • 32:21And if you only use two and two is obviously
  • 32:24two is kind of obviously a low number,
  • 32:27but let's say you used 100, right?
  • 32:29But then that's still like you've
  • 32:32used 100 different movies.
  • 32:34But there's an infinite number
  • 32:35of movies you could have used.
  • 32:38So is it justified to assume that that
  • 32:43will generalize to other movies?
  • 32:47I think the question I have is:
  • 32:50intuitively, it seems like there must
  • 32:53come a point where you can generalize.
  • 32:56Intuitively, it seems you can say,
  • 33:00well, I have used enough
  • 33:03stimuli that I am covering, like,
  • 33:06the whole range of stimuli that
  • 33:10would normally be encountered
  • 33:13in real life. And OK, there are weird
  • 33:16stimuli, an infinite number of weird
  • 33:19stimuli, that you could cook up,
  • 33:22but intuitively it seems like there
  • 33:24should be a way to cover the spectrum
  • 33:29of real stimuli that are likely to occur.
  • 33:32But it's actually very hard, I think,
  • 33:35to formalize that and say,
  • 33:38this is really
  • 33:42a generalizable
  • 33:44finding because I've used
  • 33:48a wide enough variety of stimuli.
  • 33:53And it seems like we should be able
  • 33:55to say that, but is there a way
  • 33:58that we can put a number on it,
  • 34:03for instance? So, I mean, with replicability
  • 34:05you could put a number on it.
  • 34:08You can say, oh, look,
  • 34:10I've replicated this study five times
  • 34:13with 5000 people in each study.
  • 34:16So then you can put like a p-value
  • 34:18or Bayes factor on it,
  • 34:20and you say that's how certain
  • 34:23I am that this effect is real
  • 34:26under these conditions.
  • 34:27You can actually put a number on it.
  • 34:30But with generalizability,
  • 34:32is there any way to put a number on that?
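The stimulus-sampling worry can be made concrete with a small simulation. This is a rough sketch only, with entirely made-up numbers (the effect size, the movie-to-movie spread, and the noise level below are assumptions, not values from any real study): treat each movie as a random draw from the population of possible movies, and ask how precisely the average effect over that whole population is pinned down. The stimulus-sampling part of the uncertainty shrinks only as you add movies, not as you add participants per movie.

```python
import random
import statistics

random.seed(0)

TRUE_MEAN = 0.5   # assumed population-average effect across all possible movies
TAU = 0.3         # assumed movie-to-movie variability of the true effect
SIGMA = 1.0       # assumed participant-level noise within a movie

def estimate_effect(n_movies, n_participants):
    """Estimate the average effect over the population of movies,
    treating each movie as a random draw from that population."""
    movie_means = []
    for _ in range(n_movies):
        # Each movie has its own true effect, sampled around the population mean
        movie_effect = random.gauss(TRUE_MEAN, TAU)
        scores = [random.gauss(movie_effect, SIGMA) for _ in range(n_participants)]
        movie_means.append(statistics.mean(scores))
    estimate = statistics.mean(movie_means)
    # Standard error over movies: this stimulus-sampling uncertainty shrinks
    # only as n_movies grows, no matter how many participants you run per movie
    std_error = statistics.stdev(movie_means) / n_movies ** 0.5
    return estimate, std_error

est2, se2 = estimate_effect(n_movies=2, n_participants=500)
est100, se100 = estimate_effect(n_movies=100, n_participants=500)
print(f"2 movies:   {est2:.2f} +/- {se2:.2f}")
print(f"100 movies: {est100:.2f} +/- {se100:.2f}")
```

On these assumptions, the uncertainty with only two movies is dominated by movie-to-movie variation, which is exactly the sense in which two stimuli cannot license a claim about movies in general.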
  • 34:36Well,
  • 34:36maybe. It's making me wonder whether
  • 34:38the concept of generalizability
  • 34:40itself is poorly framed, right?
  • 34:41So the idea of generalizability
  • 34:43is that I've discovered something
  • 34:45that's true in some absolute sense,
  • 34:47like if I hold this glass
  • 34:50and let it go over the floor,
  • 34:52it will fall and crack,
  • 34:54you know, on the ground.
  • 34:56So like I could say that that's
  • 34:58a generalizable thing, right?
  • 35:00And so I wonder if our statistics
  • 35:03aren't set up to do that anyway.
  • 35:06Like, we're not trying to measure
  • 35:08causality in almost all of our analyses,
  • 35:11and maybe generalizability is something
  • 35:13people are interpreting as being
  • 35:15causal somehow when it's really not.
  • 35:19I think the concept of
  • 35:22generalizability, in theory,
  • 35:27could be applied to purely
  • 35:31correlational claims.
  • 35:33But in practice,
  • 35:34it almost always is causal claims
  • 35:42that are being talked about.
  • 35:47And I think the funny thing
  • 35:54about making a generalizable claim...
  • 36:04actually, this is one of the things I've been
  • 36:10thinking about in response to the
  • 36:12generalizability crisis preprint:
  • 36:13it is something which we do
  • 36:16every day and not just in science.
  • 36:18So you gave the example of,
  • 36:20if I hold a glass and drop it,
  • 36:23it will break. Right, and I would say that
  • 36:26that is a claim which will generalize
  • 36:30to most glasses.
  • 36:32I mean, yeah, sure, yeah,
  • 36:34maybe there are some, like, plastic glasses.
  • 36:37OK, well, that's not glass, so I would say...
  • 36:41Look, but
  • 36:42this is the thing, right?
  • 36:44'Cause actually some people would
  • 36:46say that it was a glass,
  • 36:49and then the claim wouldn't
  • 36:51generalize to that group,
  • 36:52but also, if you
  • 36:54drop a glass from a very small height,
  • 36:57it's not gonna break.
  • 37:00And if you drop a glass
  • 37:02onto a pillow,
  • 37:03it's probably not going
  • 37:05to break, right? So if I say,
  • 37:11Don't drop that glass, it'll break.
  • 37:14I'm making a claim which is,
  • 37:17on the face of it,
  • 37:19a very general claim.
  • 37:21But I think in reality
  • 37:23I'm not making a general claim.
  • 37:26I'm making a kind of claim
  • 37:29about things that are plausible,
  • 37:31things that are likely to happen, right?
  • 37:37So it's a probabilistic claim under
  • 37:38certain circumstances, under
  • 37:40realistic circumstances.
  • 37:40If I say, don't drop a glass, it'll break,
  • 37:44I'm not thinking about someone dropping
  • 37:47it from a height of 1 centimeter.
  • 37:49Sure, 'cause why would it?
  • 37:53Because that wouldn't happen,
  • 37:54or at least it's less likely to happen, or it
  • 37:57wouldn't be a notable event if it did happen.
  • 38:00So in some way I'm implicitly referring
  • 38:02to a kind of space of events
  • 38:05that actually would happen, right?
  • 38:07And that's the space I can generalize over.
  • 38:10I think that's also true, maybe, in science.
  • 38:15Many claims are generalizable,
  • 38:18but only over the narrow sort of
  • 38:23space of realistic,
  • 38:29non-contrived, or non-abnormal scenarios.
  • 38:32Because, as I said, it's very much
  • 38:37a very intuitive argument;
  • 38:39it's not rigorous.
  • 38:41Well, as I just said, you can't say
  • 38:45for sure that the claim
  • 38:47is going to generalize.
  • 38:49So I think there is perhaps a crisis,
  • 38:51certainly in the fact that we can never be
  • 38:54sure that we've got a generalizable claim.
  • 38:59I've wondered also if some of
  • 39:01the questions that we ask of our
  • 39:04statistics are impossible to answer.
  • 39:06And so, like, something
  • 39:08I think about as a clinician
  • 39:11is the mapping of diseases or symptoms
  • 39:14all the way back to genes, right?
  • 39:17So there are so many intermediary
  • 39:19steps like these causal steps,
  • 39:21genes, molecules, circuits,
  • 39:23blah blah all the way to the expression of
  • 39:26behavior that I then interpret as a symptom.
  • 39:30And I wonder sometimes whether
  • 39:32we're asking too much of our tools.
  • 39:36Like, can this statistical
  • 39:38test really find a true
  • 39:40correlation that is generalizable
  • 39:43between one symptom or one diagnosis and
  • 39:48a million genes or something?
  • 39:49Yeah, a million SNPs.
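To make the scale of that problem concrete, here is an aside with the standard back-of-the-envelope numbers (nothing computed in the episode): testing a million SNPs at a naive p < 0.05 level would flag tens of thousands of SNPs by chance alone, which is why genome-wide association studies use a Bonferroni-style corrected per-SNP threshold.

```python
# Back-of-the-envelope multiple-testing arithmetic for a million-SNP scan.
n_snps = 1_000_000
alpha = 0.05  # naive per-test significance level

# False positives expected by chance alone if every SNP is tested at alpha
expected_false_positives = n_snps * alpha

# Bonferroni-corrected per-SNP threshold: the usual 5e-8
# "genome-wide significance" level
genome_wide_threshold = alpha / n_snps

print(expected_false_positives)  # 50000.0
print(genome_wide_threshold)     # 5e-08
```

This is only the multiple-testing side of the difficulty; it says nothing about whether any surviving association would generalize beyond the sample studied.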
  • 39:50I mean,
  • 39:51I think that actually comes
  • 39:54back to, I guess, the question of
  • 39:57the plastic glass. Because if you're
  • 40:00talking about a construct,
  • 40:01like a psychiatric disorder,
  • 40:03which is something that can't
  • 40:05be sort of directly measured,
  • 40:06but is a concept that we have,
  • 40:11then whether we can ever
  • 40:14generalize about that
  • 40:15will depend on whether that construct
  • 40:20is actually itself a valid one.
  • 40:25That would mean it has to be very
  • 40:28carefully defined. So if you decided
  • 40:35that everything that holds a
  • 40:37liquid for drinking is a glass, right,
  • 40:39then this would also count as a glass,
  • 40:42and if you had that kind of a very broad
  • 40:46view of glasses, then most glasses would
  • 40:48not break when you drop them, right?
  • 40:51Because most of them are not made of
  • 40:55glass; they're made of plastic or metal,
  • 40:56and they're not going to break.
  • 41:00So you would
  • 41:01think it was
  • 41:02falsified if you tried to reproduce that?
  • 41:04Yeah,
  • 41:05then you would think that you
  • 41:07had got a non-generalizable
  • 41:09claim that glasses break
  • 41:10when we drop them, even though, in fact,
  • 41:16in the kind of original meaning,
  • 41:18it was perhaps
  • 41:20valid.
  • 41:22Well, thanks for chatting with
  • 41:24me, I really appreciate your
  • 41:26time, especially meeting
  • 41:27like this.
  • 41:36Well, we hope you enjoyed that episode.
  • 41:39Thanks again to Neuro Skeptic for
  • 41:41being on the podcast, and to that
  • 41:44very kind bartender for letting
  • 41:46us record in his empty pub.
  • 41:48You can find Neuro Skeptic on
  • 41:50Twitter at Neuro underscore Skeptic.
  • 41:52Again, that's at Neuro underscore Skeptic.
  • 41:54You can also find him of course at
  • 41:58discovermagazine.com where many of
  • 42:00his recent blog posts are listed.
  • 42:02Thanks to the Yale School of Medicine
  • 42:04for sponsoring the podcast, to age and
  • 42:06bottom burger for producing the podcast,
  • 42:08and to Ryan McEvoy for his awesome
  • 42:10help with sound editing, and special thanks
  • 42:11to you for listening.
  • 42:13My name is Daniel Barron, I've
  • 42:15been your host, and I'll see you
  • 42:17next time here on Science et al.