The Ethics of Live-Brain Research

January 11, 2024

December 6, 2023

Stephen R. Latham, JD, PhD

Director, Interdisciplinary Center for Bioethics

Yale University


Transcript

  • 00:00No, no slides; going old school.
  • 00:04Oh, my goodness. All right.
  • 00:08So welcome. I'm so glad you're here.
  • 00:11And welcome to the folks who are
  • 00:12here on the ZOOM meeting as well.
  • 00:14So I think most
  • 00:15of you know, and I'm delighted you're here,
  • 00:17The Program for Biomedical Ethics
  • 00:19and the Yale Pediatric Ethics Program,
  • 00:21we host an evening seminar twice a month.
  • 00:23There'll be one more before Christmas.
  • 00:25That's next week, in fact.
  • 00:26But tonight we have something
  • 00:27special and some of you are familiar
  • 00:28with this work and some are not.
  • 00:30I sent a note to you guys a couple
  • 00:32weeks ago about it a little bit, but
  • 00:34I had asked my colleague Steve Latham
  • 00:36to come and speak to us this fall.
  • 00:38Steve Latham was educated as an
  • 00:43attorney and a bioethicist,
  • 00:44at Harvard as an undergraduate as well as
  • 00:45at Harvard Medical School, and has a PhD.
  • 00:49No, not medical school.
  • 00:51I can't believe I let that slip.
  • 00:53No, it was,
  • 00:54it was the law school, I remember.
  • 00:56So he is an attorney and a philosopher
  • 00:59and is importantly very well known
  • 01:02in the world of health care ethics,
  • 01:05has been very much involved in
  • 01:06the American Society of Bioethics.
  • 01:07He's a fellow of the Hastings
  • 01:10Center and for us at Yale.
  • 01:12He is also for several years now
  • 01:13he's been the director of the
  • 01:15Interdisciplinary Center for Bioethics.
  • 01:17So for you guys who have an interest
  • 01:18in this stuff but may not be
  • 01:19familiar with this: we have
  • 01:20our sister program on the main campus and
  • 01:22they host all sorts of various things,
  • 01:25among others, a really amazing summer
  • 01:27program which you can look into.
  • 01:29But if you just
  • 01:30look up Yale Bioethics,
  • 01:31you'll see a link for us and
  • 01:32you'll also see a link for the
  • 01:34interdisciplinary Center for Bioethics.
  • 01:35Lots of fascinating
  • 01:37offerings over there.
  • 01:38And so we do things back and forth,
  • 01:39we help each other out.
  • 01:40Steve is a very well-known teacher,
  • 01:44specifically on the main campus.
  • 01:45He's taught a course in
  • 01:46bioethics there for years,
  • 01:47which has been either the most highly
  • 01:50subscribed course or one of the two or three
  • 01:52most highly subscribed courses for many years now.
  • 01:57And so Steve's come over here,
  • 01:59I'd say at least once a year.
  • 02:00He does one of these sessions
  • 02:01for us and they're always great
  • 02:03because he's a very gifted teacher,
  • 02:04as you'll see.
  • 02:06He also has IRB experience and through
  • 02:10his work in research ethics,
  • 02:12he got involved in a project
  • 02:13about live brain research which
  • 02:15is absolutely fascinating.
  • 02:16And some of you are familiar with this stuff.
  • 02:19Some of you historians who are steeped
  • 02:21in the French Revolution might find
  • 02:22some of this stuff fascinating.
  • 02:23I mean there's various
  • 02:25ways you can get into this.
  • 02:26But I wanted so much to
  • 02:28hear Steve speak about his
  • 02:29work on live Brain research. And so
  • 02:32I'm going to tell you guys some numbers,
  • 02:34which is the texting code for
  • 02:36CME credit for this: it's 40966.
  • 02:40If you're out of town, please call collect.
  • 02:44Text 40966 to get CME credit.
  • 02:47We're going to do this the way we
  • 02:48usually do it, which is that Steve's
  • 02:50going to speak for about 45 minutes,
  • 02:52give or take a little bit.
  • 02:54And then after that we are going to
  • 02:56have a conversation and I'll moderate
  • 02:57and you guys can ask Steve questions,
  • 02:59you can fight with each other.
  • 03:00We can do all sorts of things until 6:30,
  • 03:03at which point I will be wrapping it up,
  • 03:05no matter how fascinating
  • 03:06the conversation is.
  • 03:07We'll close it off at 6:30.
  • 03:09But until then,
  • 03:09it's going to be a great evening.
  • 03:10I thank you so much for coming.
  • 03:12And I turn this over to my
  • 03:13friend and my colleague,
  • 03:14the director of the Interdisciplinary
  • 03:16Center for Bioethics at Yale,
  • 03:18Steve Latham.
  • 03:25Oh, thank you so much, Mark.
  • 03:26It's great to be here. I'm turning on my timer,
  • 03:30so I make sure I don't go too
  • 03:32long without giving you guys
  • 03:33a chance for some questions.
  • 03:34So I have to start with a couple
  • 03:38of mandatory disclosures.
  • 03:40I'm not a neurologist.
  • 03:42I'm not a neuroimaging expert.
  • 03:44I'm not even a neuroethicist.
  • 03:48So if you're here to learn about
  • 03:51a specialty area, you won't.
  • 03:53If you're here because you know everything
  • 03:55about neurology and neuroethics,
  • 03:57you will be hearing a non-specialist
  • 04:00dribbling his philosophical ideas
  • 04:01over your field in a way that
  • 04:04will undoubtedly be
  • 04:06disappointing to you.
  • 04:07So you've been warned.
  • 04:08Secondly,
  • 04:09more importantly,
  • 04:10I have a conflict of interest, which
  • 04:13is that I'm an ethics
  • 04:15advisor to and an equity holder
  • 04:18of Bexorg LLC, which is a Yale
  • 04:21spin-off company that originated
  • 04:23in the Sestan neural lab at the
  • 04:26medical school that is trying to
  • 04:29commercialize some of the techniques
  • 04:31that I'll be describing in the
  • 04:34last and largest part of my talk.
  • 04:36So I want to make money off this.
  • 04:40So don't believe a word I say. OK.
  • 04:44So having given you those warnings,
  • 04:47let me tell you how this talk is organized.
  • 04:50OK, first, I mean,
  • 04:52I'm going to kind of do a very,
  • 04:55very fast overview of what I think
  • 04:58of as main issues in the general
  • 05:00field of neuroethics that have to
  • 05:03do with research on live brains,
  • 05:06including research on brains
  • 05:07that are in live humans, OK.
  • 05:11And then I will turn to the Bexorg work,
  • 05:14which is actually on brains that
  • 05:16are not in live people anymore,
  • 05:20but that are nonetheless live
  • 05:22in ways that I will describe.
  • 05:25OK, so the plot of the talk is this.
  • 05:27First I'm gonna give an opening
  • 05:29section which basically says neuro
  • 05:31research on live brains is just like
  • 05:33all the other research on live folks,
  • 05:36raising all the same basic issues.
  • 05:39And I'll explain what I mean by that.
  • 05:42And then there's a Part B,
  • 05:44which is except when it isn't,
  • 05:48OK,
  • 05:49which is going to be some of the
  • 05:51things that make neuro ethics and
  • 05:52live brain research and neuro a
  • 05:54little bit different from other
  • 05:56kinds of medical ethics.
  • 05:58And then I'm going to turn to
  • 06:01the work that the Ceston lab and
  • 06:03Bexorg LLC have been doing on
  • 06:06live brains that are ex vivo.
  • 06:10So why is a lot of live-brain
  • 06:16neuro work mostly like other work?
  • 06:18Why are the ethics issues
  • 06:21associated with neuro live brain
  • 06:23work mostly like ethics issues
  • 06:24in lots of other research areas?
  • 06:27I'll give you a few examples.
  • 06:28Number one, animal ethics issues, right?
  • 06:31Neuro uses regularly mice and rats,
  • 06:35zebrafish for certain kinds of work
  • 06:39where you require a sophisticated brain,
  • 06:41non human primates for work,
  • 06:44where you need a really good model of how
  • 06:46the human brain responds to trauma pigs.
  • 06:49So pigs do a lot of a lot of head
  • 06:51trauma work is done on pigs.
  • 06:52And you know
  • 06:59all of the animal ethics issues
  • 07:01that are raised in all other kinds
  • 07:04of animal ethics research for for
  • 07:06medicine are raised in the neuro
  • 07:08context and they're not that different.
  • 07:11If neuro is maybe a little bit harder,
  • 07:15it's because some of the things
  • 07:17that neuro folks are trying to
  • 07:19map in their live brain research
  • 07:21require a more sophisticated brain.
  • 07:23So you can't do it with a zebrafish.
  • 07:25You have to do it with you have to do
  • 07:27your Parkinson's model or your stroke
  • 07:29model in a non human primate or you have
  • 07:32to do your brain trauma model in a pig.
  • 07:40I have to say, and I've written
  • 07:43a very little about this,
  • 07:45I think we do way too much animal research.
  • 07:48I'm much more hesitant about animal
  • 07:50research than the average person who
  • 07:53approves of animal research at all.
  • 07:57I am very worried about the FDA standard
  • 07:59that says we need results in animals
  • 08:01before we go on to human trials.
  • 08:04Because, guess what, it doesn't work.
  • 08:05Because most human trials for drugs
  • 08:08come to the FDA with excellent
  • 08:10animal evidence and then they fail,
  • 08:12even at phase one,
  • 08:13and almost none of
  • 08:14them get through phase three.
  • 08:16And that means the animal
  • 08:17models are not working.
  • 08:18And they're all kinds of good reasons
  • 08:20why the animal models aren't working.
  • 08:22We do it all on mice of the same
  • 08:25gender from the same little colony,
  • 08:27and mice aren't like us in
  • 08:29many ways anyway and so on.
  • 08:31And I think we waste a lot of
  • 08:33animals and I think it's really an
  • 08:35area where we need massive reform.
  • 08:37But none the less I think that
  • 08:40in neuroscience some use of
  • 08:42animals is justified.
  • 08:43And I'll say a little bit more about
  • 08:45that later when we get to talking
  • 08:47about the postmortem brain issue
  • 08:51with the Sestan Lab and Bexorg.
  • 08:55Second,
  • 08:57another category in which a lot of
  • 09:00research in neurosciences on live
  • 09:02brains resembles other kinds of humans,
  • 09:04other kinds of research is human subjects,
  • 09:07right.
  • 09:08We've got some very well known
  • 09:10standards for informed consent
  • 09:12with human subjects research,
  • 09:14I would say, you know,
  • 09:16so it's not that different
  • 09:18for live brain research.
  • 09:20Human subjects consent to what?
  • 09:22They're supposed to understand
  • 09:24what will be happening to them.
  • 09:26They're supposed to understand
  • 09:27risks and benefits.
  • 09:28An IRB is supposed to have made
  • 09:32balancing choices about risks and
  • 09:34benefits and is supposed to have
  • 09:36approved the informed consent process.
  • 09:38I was going to say it's not brain surgery,
  • 09:40except that in many cases it is.
  • 09:42So I'm going to say it's
  • 09:44not rocket surgery. OK,
  • 09:49So there are many examples of
  • 09:53these totally mainstream ethics
  • 09:55issues in live brain surgery,
  • 09:57but we shouldn't treat them
  • 09:59any differently than we treat parallel
  • 10:01issues in cancer surgery.
  • 10:03So for example, right now Lori Bruce,
  • 10:06my colleague at the Interdisciplinary Center,
  • 10:08and a colleague of ours from
  • 10:12Oxford University, Brian Earp,
  • 10:14have been working on this problem with
  • 10:17a lot of psychedelic studies in the
  • 10:20brains of living people who have PTSD.
  • 10:23There seems to have been a kind
  • 10:26of systematic problem of not
  • 10:29reporting unanticipated
  • 10:34adverse events in association with those
  • 10:37trials. Harms are not being reported
  • 10:39and then when they're not reported,
  • 10:40they're not being disclosed to new subjects
  • 10:44in the informed consent forms.
  • 10:48But the way you address that is exactly
  • 10:49the same way that you would address a
  • 10:51similar problem in a cancer trial, right?
  • 10:53You try to reform the kinds of things
  • 10:55that people report and you try to reform
  • 10:57the kinds of things that people ask
  • 10:59about in the course of informed consent.
  • 11:03Another couple of big live-brain
  • 11:06study issues arise here,
  • 11:08but they arise
  • 11:11in other areas as well.
  • 11:13The problem of predictive diagnosis.
  • 11:17Very often people who study brains,
  • 11:21if they're studying them using
  • 11:23genetic tools or even imaging tools,
  • 11:25they can come up with predictions about
  • 11:28brain diseases that someone might be,
  • 11:30that that someone is more apt to suffer
  • 11:34from than someone in the general population,
  • 11:37even though they're asymptomatic
  • 11:38at the moment.
  • 11:40This raises lots of issues; in
  • 11:42particular, it can be quite
  • 11:44traumatic for people to be informed that
  • 11:47they are at risk for a certain disease
  • 11:50where there's no treatment and no cure.
  • 11:52But again,
  • 11:55this is not a problem that's unique
  • 11:56to brain science.
  • 11:57It's a problem that exists across
  • 12:00the board and it shows up particularly
  • 12:03in every field that uses imaging.
  • 12:06And there's a parallel issue here
  • 12:08which has to do with incidental findings.
  • 12:11You can find some traits that
  • 12:15people have that are not actionable.
  • 12:17You can find other traits that people have
  • 12:20that you don't quite know what they mean.
  • 12:23There's something abnormal
  • 12:25in the person's body,
  • 12:26but you have no idea what it means.
  • 12:28And then you're going down the road
  • 12:30toward lots of additional tests and
  • 12:32a lot of uncertainty and a lot of
  • 12:34worry because something cropped up on
  • 12:37an image that you were doing for one
  • 12:39reason that might be relevant to other stuff.
  • 12:42And very often we don't adequately
  • 12:44inform people about the risk that
  • 12:46some kinds of imaging or some kinds of
  • 12:49tests might reveal things
  • 12:51that we really don't know what to do about.
  • 12:55But again,
  • 12:56that's not a brain science specific problem.
  • 12:58That's a problem that we get in
  • 13:01lots of different kinds of medical
  • 13:03research and medical practice.
  • 13:06And there is growing wisdom on this.
  • 13:07Lots has been written about how to
  • 13:11handle the incidental findings issue and
  • 13:14we can talk about this in discussion,
  • 13:15but I commend you to
  • 13:17that growing literature.
  • 13:18The point is that it's a thing that
  • 13:20shows up in a lot of brain studies
  • 13:23because we do a lot of imaging of
  • 13:25brains and you could come up with
  • 13:27something that looks a little unusual,
  • 13:28doesn't quite look like a tumor.
  • 13:30Is that a blockage? Right.
  • 13:32And you know what?
  • 13:33You do imaging on anyone,
  • 13:36and you will come up with all
  • 13:38kinds of unusual
  • 13:38findings that are not causing
  • 13:40any symptoms and that you will
  • 13:42need to explain and work on.
  • 13:43And it's burdensome and it's expensive.
  • 13:46And if you haven't informed people
  • 13:48of the risk of that in advance,
  • 13:50you're doing something wrong. OK.
  • 13:57Another thing that live
  • 14:02brain research and here I'm thinking
  • 14:04particularly of emergency research
  • 14:06has in common with many other
  • 14:09fields is the problem of holding
  • 14:11out the prospect of life saving
  • 14:16but creating a a very low
  • 14:20quality of life, right.
  • 14:21So when we learned, for example,
  • 14:24that cooling off a traumatically
  • 14:27injured brain in the ED could reduce
  • 14:30brain swelling and could save
  • 14:32some people from dying from the
  • 14:34swelling pressure against the skull,
  • 14:36lots
  • 14:38of lives were saved,
  • 14:39but the lives that were saved
  • 14:41were lives of people who had just
  • 14:43gone through a traumatic brain
  • 14:44injury and many of them were saved
  • 14:47only to be profoundly disabled,
  • 14:49including mentally disabled, right.
  • 14:52So in every kind of cutting edge
  • 14:54piece of research we have this issue
  • 14:57of whether a new technology might be
  • 15:01advantageous in life-saving terms,
  • 15:04but might leave the
  • 15:07patients very unhappy and also in
  • 15:09a place where they need very
  • 15:12expensive care or where they put huge
  • 15:14demands on their families and so on.
  • 15:16But again,
  • 15:17this is not unique to brain sciences.
  • 15:22Another thing that's not unique
  • 15:24to brain sciences,
  • 15:25the treatment versus enhancement distinction.
  • 15:32You know, you can give someone Aricept
  • 15:34who's got Alzheimer's and that will
  • 15:37improve their short term memory.
  • 15:39It'll help them hold on to a few more
  • 15:42facts for the time that they're on it.
  • 15:44But I'm sure no one in this
  • 15:46room knows anything about this.
  • 15:47But it has been rumoured that the
  • 15:50occasional medical student has taken
  • 15:52Aricept just before an exam because
  • 15:56it is actually attention enhancing
  • 15:58and allows you to really focus even
  • 16:01when you're tired and so on, right?
  • 16:04So many of the very same kinds of
  • 16:07interventions that treat people can
  • 16:09also enhance people:
  • 16:13they're perfectly fine,
  • 16:15but the treatment causes them
  • 16:17to perform a little bit more
  • 16:19than perfectly fine, right?
  • 16:21And we have endless debate about the
  • 16:25treatment enhancement line, right?
  • 16:27We give growth hormone to children
  • 16:30of unusually short stature,
  • 16:33but we don't give growth
  • 16:35hormone to ordinary kids,
  • 16:36so they can grow up to be 6 foot
  • 16:386 basketball players, right?
  • 16:40The very same treatment that
  • 16:42treats a thing that we characterize
  • 16:45as a condition also enhances.
  • 16:48Well,
  • 16:48this is coming up in brain science
  • 16:50a lot because a lot of what brain
  • 16:52science is focused on is trying
  • 16:54to relieve medical problems for
  • 16:56people who are losing their memories
  • 16:58or are losing capacity to focus
  • 17:01all these kinds of things.
  • 17:02And the treatments for those things
  • 17:07very often could also be used for
  • 17:11enhancement therapies.
  • 17:13There are a couple of people who claim
  • 17:16that the treatment-enhancement thing in
  • 17:18brain science is different from treatment
  • 17:21enhancement in terms of the genetics of
  • 17:25muscle firing or other things because,
  • 17:28and this is John Harris at
  • 17:31Manchester University in the UK.
  • 17:33Because if you enhance someone's intellectual
  • 17:37capacities, you make them smarter.
  • 17:40They will become better people,
  • 17:43They will make better ethical decisions.
  • 17:46Because smarter people
  • 17:47understand arguments better,
  • 17:49take in better information, and so on.
  • 17:51And I want to say, John,
  • 17:52have you never seen a Bond film?
  • 17:56Do you not know that there are evil villains?
  • 18:02Have you never read Hannah Arendt? Anyway.
  • 18:06So I don't believe the argument that
  • 18:09the treatment enhancement distinction is
  • 18:11meaningfully different in brain science
  • 18:13than it is in other kinds of science.
  • 18:17And finally, in the long category of
  • 18:21things that aren't different about brain
  • 18:23science compared to things that are,
  • 18:28there are access and justice issues:
  • 18:33in brain science, as in
  • 18:38genetics, as in surgery,
  • 18:41as in assisted reproduction,
  • 18:43every time you invent a new thing,
  • 18:47it's expensive and it always raises the
  • 18:51question of who will have access to it.
  • 18:54And that question is not
  • 18:56different across different fields.
  • 18:58It's really a question of how we will
  • 19:01ensure equitable access to any new kind of
  • 19:07intervention that we come up with.
  • 19:10So those are all really important neuro
  • 19:13ethics, live brain science issues,
  • 19:17but they aren't unique to live brain science.
  • 19:21OK, so Part 2, this is what's
  • 19:26not usual in brain science.
  • 19:29What does brain science have in terms of
  • 19:32research that other fields don't have?
  • 19:36One is this issue of neuro privacy,
  • 19:41the idea that some kinds of
  • 19:45brain interventions might give
  • 19:47us eventually dangerous kinds
  • 19:50of access to people's thoughts?
  • 19:56You may have seen, I don't know,
  • 19:57about four months ago,
  • 19:58a new technology that implanted
  • 20:00electrodes in the brain of a woman who
  • 20:02had previously not been able to speak.
  • 20:04And when she thought about sentences,
  • 20:06her thoughts were translated into
  • 20:10words that were vocalized by, you know, AI.
  • 20:14And I thought that was amazing, right?
  • 20:17And then I thought,
  • 20:20can she turn it off?
  • 20:24Does everything that she thinks
  • 20:26come up on this machine as long as
  • 20:29the researchers have it on right?
  • 20:31So this gives you a sense of
  • 20:33where neuro privacy issues might go.
  • 20:35We're not super close to
  • 20:37worrying about this really,
  • 20:38because for example, F MRI images,
  • 20:41it just sort of tells you which
  • 20:42parts of the brain are getting
  • 20:44more oxygen and things it doesn't.
  • 20:45It doesn't tell you what anyone's thinking,
  • 20:47but there are lie detectors and there
  • 20:49are much better kind of neuroscience
  • 20:51informed lie detectors than the
  • 20:52ones that just keep track of your
  • 20:54pulse and your breath rate, right.
  • 20:56And we know that we can have people
  • 21:00on exploratory kind of brain imaging
  • 21:03machines and find out what kinds of
  • 21:06things they're thinking or what parts of
  • 21:08their brain are implicated in thinking.
  • 21:11There's a really famous ethics
  • 21:15problem called the trolley problem
  • 21:17that probably many of you know.
  • 21:18Yeah, I see people nodding like,
  • 21:20oh God, not again.
  • 21:23So the trolley problem is:
  • 21:28a trolley's coming along.
  • 21:29You happen to be standing near
  • 21:31a lever for no good reason,
  • 21:33and the trolley is going
  • 21:35to plow into five people.
  • 21:37But if you push the lever,
  • 21:39it'll be shifted to a different
  • 21:42track and plow into one person.
  • 21:44And the question is,
  • 21:46do you push the lever?
  • 21:48And most people, not everyone,
  • 21:50but most people, when asked,
  • 21:52are willing to push the lever because
  • 21:54they would rather sacrifice one innocent
  • 21:56person than five innocent people,
  • 21:58given that they know nothing else
  • 22:01about who the people are and so on.
  • 22:03Some people say no because they
  • 22:05don't want to be part of the
  • 22:07causal chain that kills anyone.
  • 22:08They feel like I'm just going to keep
  • 22:10away from that lever and it's not my fault.
  • 22:12Whereas if I push the lever,
  • 22:13I killed that one person,
  • 22:16even though I also saved the five.
  • 22:18OK.
  • 22:18But then there's another version
  • 22:20of the same problem,
  • 22:21which is the bridge problem,
  • 22:25where the trolley is coming along,
  • 22:27is going to hit 5 people and there's
  • 22:29a bridge over the trolley track and
  • 22:32you're standing there next to a big guy
  • 22:36and you realize if I give this guy a shove,
  • 22:39he's going to fall down in
  • 22:41front of the trolley and block it.
  • 22:42I mean the trolley's going to run into him and
  • 22:45get all tangled up in his body and stop.
  • 22:47So I will have sacrificed one
  • 22:50innocent person and saved 5.
  • 22:51So it's the same as the previous problem.
  • 22:55Instead of pulling a lever,
  • 22:58I push a guy.
  • 23:00And apart from that,
  • 23:01I'm to blame for the death of
  • 23:04the one and I've rescued 5.
  • 23:07And when you ask people about that,
  • 23:09most people say I wouldn't do that.
  • 23:13And all the people who say, oh,
  • 23:14I'd push the lever in the first case say, oh,
  • 23:15I wouldn't push the guy in the second case.
  • 23:17And Josh Greene at Harvard did brain imaging
  • 23:23studies and proved that when we humans
  • 23:26think about those two kinds of cases,
  • 23:29we use different physical parts
  • 23:31of our brain to think about them.
  • 23:34The first one,
  • 23:35because you're thinking about a lever,
  • 23:37and it's very abstract and calculative.
  • 23:39We deal with our prefrontal cortex,
  • 23:41but when we're thinking about
  • 23:44pushing someone to his death,
  • 23:46a much deeper,
  • 23:48older,
  • 23:48more reptilian part of the
  • 23:51brain is implicated.
  • 23:53We think about that problem in a
  • 23:55different part of our brain and we go
  • 23:58that's just wrong in a in a really deep,
  • 24:02responsive,
  • 24:03automatic way.
  • 24:06That was too long a story to make
  • 24:09the point that we are already
  • 24:11able to use imaging to determine
  • 24:13which parts of someone's brain are
  • 24:15engaged when they think about what.
  • 24:17And it's not far from there
  • 24:20to worries that
  • 24:22eventually we will be able
  • 24:24to understand what people are
  • 24:26actually thinking just by seeing
  • 24:28what portions of their brain
  • 24:30they're using to do that thinking,
  • 24:32and similar problems.
  • 24:34So that is a unique kind
  • 24:36of ethics of live brain research problem
  • 24:39that we don't get in other areas.
  • 24:41Second we have the problem of the moral
  • 24:44status of the thing that's growing in
  • 24:46the dish and here I'm thinking about
  • 24:52brain organoids, for example.
  • 24:55There were quite a few
  • 24:57articles and a lot of
  • 24:59bioethics attention just a few weeks ago on
  • 25:01these mixes of brain tissues and
  • 25:06mechanical AI elements, which in
  • 25:11a recent publication have been
  • 25:13shown to be able to play Pong.
  • 25:17OK, they can play Pong,
  • 25:19which some of you,
  • 25:20if you're old enough,
  • 25:21might remember as a really primitive
  • 25:23video game from your youth.
  • 25:27So the question is, if this thing
  • 25:30can see an object coming and
  • 25:33strategize about how to meet it and
  • 25:35hit it in order to send it back
  • 25:37toward the goal post at the other
  • 25:39end of this hypothetical game field,
  • 25:46is it sentient? It seems to
  • 25:48be responding to
  • 25:52external stimuli.
  • 25:55That's what we do when we see.
  • 25:56That's what we do when we feel and hear.
  • 25:58Is this thing sentient?
  • 26:00If it's sentient, is it thinking?
  • 26:01Is it deciding what to do?
  • 26:03If it's deciding what to do?
  • 26:05Is it rational?
  • 26:07Do we need to respect it as
  • 26:09a moral character there?
  • 26:10And it's in a little dish,
  • 26:11even though we made it and we
  • 26:13turned it on yesterday, right?
  • 26:15That doesn't come up in other
  • 26:17fields other than brain sciences,
  • 26:19except that AI people are worried about
  • 26:22something very strongly parallel.
  • 26:24People are very worried about the
  • 26:26acquisition of certain kinds of
  • 26:29human like traits by AI or by brain
  • 26:32organoids that might give them
  • 26:34a certain kind of moral status,
  • 26:36which would then limit us in the way
  • 26:39that we could treat them in the dish.
  • 26:40We might have to start thinking
  • 26:42about them more like mice, say,
  • 26:45than we think about just tissue samples.
  • 26:51Another thing, I'll say another
  • 26:54difference, is there's a possibility
  • 26:56in some kinds of live brain research,
  • 26:58and we've had some dramatic examples of this,
  • 27:03of the research causing what we might
  • 27:06think of as changes in personal identity.
  • 27:13So for example, my colleague Joe Fins,
  • 27:16who has been a visitor at my
  • 27:18bioethics center and because of those
  • 27:20visits is now affiliated here.
  • 27:22He's a neuroscientist from
  • 27:25Weill Cornell Medicine,
  • 27:27but he's affiliated with the law school here.
  • 27:29He works with law students on issues
  • 27:32about various diminished states
  • 27:34of consciousness, that you know,
  • 27:37the difference between permanent vegetative
  • 27:39state and persistent vegetative state
  • 27:41and minimal consciousness and so on.
  • 27:45Anyway, Joe has done some work on deep
  • 27:48brain stimulation for Parkinson's that he's
  • 27:51published some stuff about very recently.
  • 27:53And deep brain stimulation for Parkinson's
  • 27:56in many patients is fantastic in terms of
  • 28:00restoring their physical ability to move.
  • 28:04You put electrodes in the brain.
  • 28:05You stimulate some of the
  • 28:07neurons in the brain
  • 28:11to be charging and working.
  • 28:13And this results in dramatically
  • 28:15improving the physical
  • 28:18mobility of Parkinson's patients.
  • 28:21And there's also some similar work that Joe
  • 28:24has been doing that shows it
  • 28:27takes people from minimally conscious
  • 28:29states to slightly more conscious
  • 28:32states because of stimulating the
  • 28:34brain to do its work electronically.
  • 28:36But there are cases,
  • 28:38one from Scandinavia, and I'm
  • 28:42guessing it's almost 10 years ago now,
  • 28:44that was the first documented
  • 28:45case where someone had Parkinson's
  • 28:47and the deep brain stimulation
  • 28:51cured their mobility problem,
  • 28:54but induced in them a really
  • 28:58uncontrollable manic personality.
  • 29:00So much so that the person needed
  • 29:03to be civilly committed. So he
  • 29:06could walk around and move.
  • 29:08He had all this restored mobility,
  • 29:10but he had to be committed because he
  • 29:13was uncontrollably manic and therefore
  • 29:15a danger to himself and others.
  • 29:18And you could imagine.
  • 29:19And here I should give credit to
  • 29:21Adina Roskies, who writes about this
  • 29:23in her neuroethics article in the
  • 29:26Stanford Encyclopedia of Philosophy.
  • 29:29But you could imagine circumstances in which
  • 29:33somebody like this objects to their state.
  • 29:37They have capacity to
  • 29:38understand the situation.
  • 29:39And when they're manic
  • 29:43and confined because they're a
  • 29:45danger to themselves and others,
  • 29:47they could say stop the treatment.
  • 29:49I don't want this treatment anymore.
  • 29:50It's causing me to be confined.
  • 29:53And then they go back to the state
  • 29:56where they have highly compromised
  • 29:59mobility and they're depressed about
  • 30:01that and they say give me the treatment,
  • 30:05I hate this, right.
  • 30:06And that's an extreme version of a
  • 30:09problem that we've seen in
  • 30:11all kinds of deep brain stimulation:
  • 30:14everything from the level where
  • 30:17mood can be dramatically changed
  • 30:20just by the presence of deep brain
  • 30:23stimulation, to serious personality
  • 30:28components, more dramatically
  • 30:31impactful than just mood changes, that can
  • 30:34be changed by brain intervention.
  • 30:36So that is a unique problem that
  • 30:39brain research faces also.
  • 30:42And finally I will mention, well,
  • 30:45now I have two more things to mention.
  • 30:47One is my colleague in the
  • 30:48philosophy department here at Yale,
  • 30:50Laurie Paul, has this idea
  • 30:55of the transformative experience,
  • 30:57and her examples are things like
  • 30:58getting married or buying a house.
  • 31:00What she says is you can talk
  • 31:04to as many people as you want.
  • 31:05You can watch as many movies and
  • 31:07read as many novels as you want.
  • 31:08But guess what?
  • 31:10You don't know what it's like to
  • 31:13be married till you get married.
  • 31:15You don't know what it's like
  • 31:17to buy your first house till you
  • 31:19buy your first house.
  • 31:20So you can't consent to that.
  • 31:25No one can adequately tell you
  • 31:27what it's going to be like for you.
  • 31:28It's a transformative experience.
  • 31:30It's going to be radically different from
  • 31:32anything that you've ever thought about,
  • 31:35and it might even change who you are.
  • 31:38OK,
  • 31:38it turns out that there might
  • 31:40be quite a few of those.
  • 31:41And it also turns out that many kinds of
  • 31:46brain interventions might be those
  • 31:50kinds of transformative experiences,
  • 31:53and that might actually pose some
  • 31:55interesting ethical problems about
  • 31:56the nature of informed consent.
  • 32:01Finally, there's the issue in
  • 32:03live brain science of the whole
  • 32:06idea of the neurodiversity movement,
  • 32:09which raises lots of questions about
  • 32:11what is actually the condition
  • 32:12that should be treated right.
  • 32:14There are people with autism who
  • 32:16claim that autism is not a disease.
  • 32:20It's not the kind of thing we
  • 32:21should be trying to eradicate.
  • 32:22It's not the kind of thing
  • 32:23that I require treatment for.
  • 32:24It's just a different way of
  • 32:26perceiving the world, in some cases,
  • 32:28quite a radically different
  • 32:29way of perceiving the world.
  • 32:30But in many cases,
  • 32:32it allows people with autism
  • 32:35extraordinary kinds of access to
  • 32:37all kinds of sensory input that
  • 32:39other humans just don't get right.
  • 32:42And the whole neurodiversity issue is
  • 32:44a particular challenge in brain science
  • 32:47in a way that is not in other areas.
  • 32:53OK, so that was my, I don't know,
  • 32:58my 15-minute overview of all
  • 33:01of neuroethics and which bits
  • 33:04of it are unique to live brain
  • 33:07work and which bits of it aren't.
  • 33:11So now I'm going to talk
  • 33:13about mostly pig brains.
  • 33:21So in 19, it was 19. That's how old I am. In
  • 33:272018, I think it was, I got a panicked
  • 33:31phone call from Nenad Sestan and a
  • 33:34couple of his senior doctoral and
  • 33:38postdoctoral students in his neural
  • 33:40lab here at Yale Med. He said,
  • 33:45we have this project where we're trying
  • 33:47to keep brain tissue alive so
  • 33:49that we can use it in our research,
  • 33:52and what we've been doing is we've
  • 33:54been hooking up these brains to
  • 33:56a perfusion machine that pushes
  • 33:58a perfusate through the brain.
  • 34:00And just for caution's sake,
  • 34:04we've been putting EEGs on them.
  • 34:07And yesterday we saw some organized
  • 34:11activity showing up on the EEG,
  • 34:13so we shut down the entire experiment
  • 34:14and we heard you were a bioethicist
  • 34:16and we wanted to talk to you about
  • 34:20it. So two things were great
  • 34:23about that phone call.
  • 34:24One is that it started me in a relationship
  • 34:27with the Sestan lab that has lasted to now,
  • 34:30which has been one of the more interesting
  • 34:32and rewarding parts of my professional life.
  • 34:34And two, the EEG signals they
  • 34:36were getting were artifactual.
  • 34:38They were just coming off other
  • 34:39parts of their electrical equipment,
  • 34:41and the brain had not actually woken up.
  • 34:43So first we got to go, OK,
  • 34:46but then we got to think, right?
  • 34:50Because here's what they were doing,
  • 34:53in an effort to preserve functional
  • 34:56tissue for as long as possible
  • 35:00for use in the lab, right?
  • 35:05They wanted to have tissue that they
  • 35:07could use in research that did
  • 35:09not have a lot of warm ischemic time
  • 35:11and a lot of cellular damage and so on.
  • 35:13So they thought, can we come up
  • 35:15with a preservation method that
  • 35:16we can use in the lab?
  • 35:18And in an effort to try to preserve tissue,
  • 35:20a couple of the postdocs in the
  • 35:23lab came up with a perfusion
  • 35:26machine and a perfusate
  • 35:30that they could hook surgically
  • 35:33to the vasculature of a pig brain
  • 35:36and pump the fluid through it,
  • 35:38supplying it with oxygen, supplying it with
  • 35:43glucose and so on, and then
  • 35:48cycle that through again and again,
  • 35:50filter out cellular waste that's
  • 35:52taken out of the brain and
  • 35:54just keep going with the brain.
  • 35:56And at this point,
  • 35:57they were able to keep brains going
  • 36:00for about six hours or so post mortem.
  • 36:04So they would go to a
  • 36:07slaughterhouse near New Haven where
  • 36:09pigs were being killed for sausage.
  • 36:12And they would buy a head
  • 36:15from the slaughterhouse dude,
  • 36:16and take the head on ice back to the lab,
  • 36:19exsanguinated.
  • 36:19And it would take them about a
  • 36:22full hour from the pig's death
  • 36:25to attach the now ex vivo brain.
  • 36:28The whole head had been removed,
  • 36:31had no blood and therefore
  • 36:34no oxygen for an hour,
  • 36:37and at about the one hour point
  • 36:38they could finish up the surgery
  • 36:40and have the thing
  • 36:41hooked up to the machine.
  • 36:42And then they started to perfuse it.
  • 36:43And what they found was that a lot of
  • 36:48the ischemic damage was successfully
  • 36:50reversed by exposure to this fluid,
  • 36:53and the brains began to function
  • 36:58normally
  • 37:01metabolically, taking in appropriate
  • 37:03levels of oxygen and glucose,
  • 37:06releasing appropriate
  • 37:07levels of cellular waste.
  • 37:09A lot of what happens with
  • 37:11ischemic damage is that cells
  • 37:13begin to swell with waste,
  • 37:14and then they burst.
  • 37:16But being exposed to this
  • 37:18perfusate actually cleared a
  • 37:19bunch of that waste and reversed
  • 37:21ischemic damage across all the
  • 37:23different kinds of cell types
  • 37:25that you can find in a pig brain.
  • 37:29So the paper came out in 2019,
  • 37:31in Nature, basically showing that
  • 37:33they've been able to reverse ischemic
  • 37:36damage post mortem and sustain the
  • 37:39brain metabolically for six hours.
  • 37:45Why six hours?
  • 37:46Because their control brains,
  • 37:48which weren't being perfused,
  • 37:50had turned into yogurt at six hours,
  • 37:53so they had nothing more to
  • 37:55compare their study brains to.
  • 37:56So that's sort of where they
  • 37:58had to stop because the control
  • 38:00just didn't work anymore.
  • 38:01Today, it's not hours, it's days.
  • 38:06It's more than a week in some cases,
  • 38:09that they can keep a pig brain
  • 38:11metabolically functioning
  • 38:12on this perfusion machine.
  • 38:17Do they wake up? You may ask.
  • 38:20No, they do not, and they cannot.
  • 38:23And the reason that they cannot is that
  • 38:25the perfusate contains no sodium and
  • 38:28neurons cannot fire without any sodium.
  • 38:32When you take a neuron out of one of these
  • 38:36brains and rinse off the perfusate and
  • 38:39give it the right kind of atmosphere,
  • 38:42it'll it'll fire again.
  • 38:44They're alive.
  • 38:46They're metabolizing glucose and oxygen.
  • 38:48They're releasing cellular waste.
  • 38:50They're just not firing.
  • 38:51So it's as if we had a perfectly
  • 38:56functioning Ford Montego that we
  • 38:59were studying because we wanted to
  • 39:02learn things about Ford Montegos.
  • 39:04We just don't turn it on.
  • 39:07OK, so the pig brains are in there and
  • 39:12they're being exposed
  • 39:16to this perfusate and they are living
  • 39:19at the metabolic and cellular level,
  • 39:22not just for hours anymore,
  • 39:23but for days and days.
  • 39:24We've done this now with hundreds
  • 39:26and hundreds of pig brains.
  • 39:28They're all EEG monitored.
  • 39:29We've never seen any evidence of
  • 39:31organized electrical activity of the
  • 39:33sort that would be associated with
  • 39:35consciousness or experience or anything.
  • 39:37And besides that,
  • 39:38they're just brains in a vat;
  • 39:40they're not connected to a nervous
  • 39:42system or eyeballs or ears.
  • 39:44So there's nothing coming in
  • 39:46to them to stimulate anything,
  • 39:48any kind of activity.
  • 39:54So what kinds of issues does this raise,
  • 39:57what ethical issues does this raise?
  • 39:59Because this is different from doing
  • 40:01this kind of research in a living animal.
  • 40:06You know, I think you and I will
  • 40:09agree when we have the brain in
  • 40:11the lab and we're perfusing it.
  • 40:13And it's been going there for six hours
  • 40:17and then it's been there for two days.
  • 40:19The pig is dead by cardiovascular criteria,
  • 40:24if nothing else, right?
  • 40:25The pig is dead.
  • 40:27It's been ground up for sausage.
  • 40:29So the kinds of questions
  • 40:31that we have are more like
  • 40:33the moral
  • 40:35status of what's in the dish
  • 40:37kind of questions that I raised earlier
  • 40:39about what's going on
  • 40:40with this brain.
  • 40:41There are of course the
  • 40:43normal animal ethics issues.
  • 40:45Pigs are smart social animals.
  • 40:48Should we be using them
  • 40:51for this kind of research?
  • 40:53And I think,
  • 40:59I work on this, but I answer in the
  • 41:01affirmative, and the reason that I
  • 41:03answer in the affirmative is that
  • 41:05there is a huge amount of promise:
  • 41:08if you can keep a large mammalian
  • 41:12brain functioning on a machine that
  • 41:16is a much better platform for testing
  • 41:18brain disease related drugs than
  • 41:21mice, which are the current standard.
  • 41:24And it has also permitted
  • 41:29some kinds of unprecedented and
  • 41:32soon-to-be-published brain mapping.
  • 41:36If you have a brain
  • 41:38that you can observe while it's
  • 41:41metabolically functioning for days,
  • 41:44you can put markers in that brain
  • 41:46and you can map the brain connectome
  • 41:49in a cell-by-cell way
  • 41:51that has just never been done before.
  • 41:53And that will be incredibly
  • 41:55awesome for brain science.
  • 41:56Not only clinical,
  • 41:58but just basic brain science.
  • 42:00So there's lots of cool stuff about this.
  • 42:03So on the animal ethics thing,
  • 42:04I think it is worth
  • 42:06actually sacrificing pigs.
  • 42:07The bonus is that we can say for the vast,
  • 42:13vast majority of the hundreds
  • 42:15and hundreds of pigs we've used,
  • 42:16we can say that these pigs didn't die for
  • 42:19the research because they died for sausage,
  • 42:21right?
  • 42:21So we're getting the brains from
  • 42:23pigs that were going to be killed
  • 42:26anyway that indeed were killed,
  • 42:27right?
  • 42:32Second,
  • 42:35does this do anything to brain death?
  • 42:40Most states in the country use
  • 42:43the Uniform Determination of
  • 42:45Death Act to decide who's dead,
  • 42:47and they have two standards.
  • 42:48They have a cardiopulmonary standard.
  • 42:50They have a brain death standard.
  • 42:52The cardiopulmonary standard says
  • 42:53complete and irreversible cessation
  • 42:55of cardiopulmonary function.
  • 42:58The brain death standard says complete and
  • 43:00irreversible cessation of all brain function,
  • 43:02including in the brain stem. OK.
  • 43:09Irreversible might actually be a
  • 43:13little bit undermined by this work.
  • 43:16Because if you think about taking a
  • 43:18brain that has not had any oxygen,
  • 43:20not even any blood for an hour and
  • 43:25getting it to turn on and function at
  • 43:28the metabolic and cellular levels,
  • 43:30that's at a minimum.
  • 43:31That means that a lot of stuff
  • 43:34that was thought to be irreversible
  • 43:37a couple of years ago is now
  • 43:40very possibly reversible.
  • 43:42What we've discovered
  • 43:44is a great deal more robustness
  • 43:47(is that a word? robustity?) to
  • 43:51ischemic damage than anybody thought.
  • 44:00Luckily for us, our work is hardly
  • 44:02the only thing that's getting in the
  • 44:04way of our current standards for
  • 44:06brain death and of that law that
  • 44:08determines what brain death is.
  • 44:10There's a million problems
  • 44:12with brain death right now.
  • 44:14First of all, some groups,
  • 44:16for religious and cultural reasons,
  • 44:18object to the very idea of brain death.
  • 44:20Like many Orthodox Jews think that
  • 44:22in the Torah and Talmudic
  • 44:25commentaries, life is so fully
  • 44:28associated with breath that they
  • 44:30don't want to accept the brain death
  • 44:31criteria and only want to accept
  • 44:33cardiopulmonary criteria for death.
  • 44:36Apart from that, many other people just
  • 44:39don't trust a diagnosis of brain death,
  • 44:42partly for these very understandable
  • 44:43reasons that someone who's brain dead,
  • 44:45who's still hooked up to a vent,
  • 44:48doesn't look dead.
  • 44:49Their chest is rising and falling.
  • 44:50Their heart is beating.
  • 44:52Their skin doesn't look gray or, you know,
  • 44:56I mean, they look alive, and you
  • 44:59sort of have to trust a doctor who
  • 45:03says that your loved one is dead.
  • 45:06And many people in some cases for
  • 45:09very good historical reasons,
  • 45:10don't trust the doctors on
  • 45:12that kind of thing.
  • 45:15And in addition to that,
  • 45:18there have been several court challenges
  • 45:20around the United States that have
  • 45:23basically effectively shown that
  • 45:24many of the tests that are organized
  • 45:27by specialty societies to test
  • 45:29for the presence of brain death
  • 45:32don't actually test for complete and
  • 45:35irreversible cessation of all brain function,
  • 45:38including the brain stem.
  • 45:41So many people have shown that
  • 45:45particularly a lot of glandular
  • 45:47function keeps on going after many
  • 45:49people are actually declared brain
  • 45:50dead and there's a bunch of isolated
  • 45:53neuronal firing that's going on.
  • 45:54So it doesn't mean that all
  • 45:56the cells are dead.
  • 45:57Is that continued function or not?
  • 46:00Right.
  • 46:00So there's a lot of controversy
  • 46:03about brain death and about the
  • 46:09UDDA, the Uniform Determination
  • 46:10of Death Act, and a lot of people
  • 46:12saying it needs to be revisited.
  • 46:14And our research, I think,
  • 46:16only adds to that pretty
  • 46:21serious contemporary problem for
  • 46:22bioethics and policy.
  • 46:32But there's another issue,
  • 46:34which is that sometimes brains that
  • 46:36you get to use in research come
  • 46:40from organizations that have simply
  • 46:42gotten someone's blanket consent
  • 46:45to donate their entire body for
  • 46:47medical research after they die.
  • 46:49And this is not a very well regulated area.
  • 46:54Actually,
  • 46:55just very recently,
  • 46:57you might profitably watch
  • 46:58John Oliver on this.
  • 46:59He just had a piece come out on
  • 47:01this, like, this week.
  • 47:03There's a lot of problems, because the
  • 47:07quality of consent necessary to have
  • 47:10your body used after you're dead,
  • 47:13it's not up to the standards of informed consent.
  • 47:15You don't have to fully understand everything
  • 47:16that's going to be done to your body.
  • 47:18You just sort of say,
  • 47:18I am yielding my body to research
  • 47:21and the people who receive the body
  • 47:23are not well regulated and some of
  • 47:25them are totally upright and doing
  • 47:27things that you might expect if
  • 47:28you donate your body to research.
  • 47:30And some of them are doing
  • 47:32things that you wouldn't expect.
  • 47:34And for me,
  • 47:35if somebody donated their whole
  • 47:37body to research,
  • 47:38I would like it if they had
  • 47:41been told before they did that,
  • 47:43that their brain might be kept
  • 47:46metabolically functioning for a
  • 47:48week or two postmortem. Right now,
  • 47:52if we were to get a brain to do this,
  • 47:55get a human brain to do this kind of
  • 47:58research from one of these outfits
  • 47:59that takes these things,
  • 48:01takes donations of bodies for research,
  • 48:04I don't think we'd be getting the kind
  • 48:06of consent that I wish that we were getting.
  • 48:09But that might require
  • 48:10changes in laws and so on.
  • 48:12I will say
  • 48:15the move to human brains from pig
  • 48:18brains is definitely happening
  • 48:21in our line of research,
  • 48:24and in some ways that's great.
  • 48:27For example, if you're getting
  • 48:29brains from people who consent,
  • 48:31that's my time up,
  • 48:32but I'll be done in just a second.
  • 48:35If you're getting brains
  • 48:36from people who consent,
  • 48:39you don't have any of the animal
  • 48:40ethics issues that you used to have,
  • 48:42right, 'cause people can understand,
  • 48:43oh, you know, you're going to
  • 48:44do this to my brain. That's OK.
  • 48:46And they donate post mortem,
  • 48:50no humans will be killed for the research,
  • 48:52even though it might be the case
  • 48:54that the occasional pig might be
  • 48:56killed for the research and so on.
  • 48:58So there's ways in which moving into
  • 49:01the human world is an ethical relief.
  • 49:04If you're going to try to do this research,
  • 49:07what kinds of research can you do with a
  • 49:10human brain that's being sustained on this,
  • 49:12on this machinery?
  • 49:16There's a functional
  • 49:20blood-brain barrier.
  • 49:21You can look at the way brains metabolize
  • 49:24different kinds of potential drugs.
  • 49:27You can test drugs in the brains of
  • 49:29people who actually had the disease
  • 49:31that you're testing the drug for.
  • 49:34It's just a really amazing model.
  • 49:36And as I said, you could do
  • 49:39amazing brain mapping of a kind
  • 49:41that's not been possible before.
  • 49:46So then the last thing that I will mention,
  • 49:49which is the thing that most
  • 49:51people really think about when
  • 49:52they first hear about this,
  • 49:53is this problem of consciousness.
  • 49:56I've told you that our perfusate
  • 49:59doesn't permit neuronal firing.
  • 50:02I would happily donate my brain
  • 50:04to this research project because
  • 50:06I know that it won't wake up.
  • 50:09OK, I'm entirely, 110% confident of that.
  • 50:14Why won't it wake up?
  • 50:16Because we don't want it to.
  • 50:18And by the way,
  • 50:19Yale owns the intellectual property
  • 50:21to the perfusate and the machine,
  • 50:23and they're not going to license it to
  • 50:26anybody who wants the brain to wake up.
  • 50:30But if you tinkered with
  • 50:32the perfusion formula
  • 50:36and put in some sodium and changed
  • 50:40it in a couple of other ways,
  • 50:44is it possible that a perfused
  • 50:47brain could wake up in the machine?
  • 50:52We don't know.
  • 50:52But yeah, probably.
  • 50:54It might well be possible. Now,
  • 50:56it might be that in the total absence
  • 50:58of any sensory stimulation whatsoever,
  • 51:01it would have no desire to wake up.
  • 51:03It would not be motivated to wake up.
  • 51:04Doesn't hear anything.
  • 51:05Doesn't see anything. Right.
  • 51:06But let's say it did wake up,
  • 51:08then what would that be like?
  • 51:09Would it be wondering where it was?
  • 51:11Would it be saying, oh damn,
  • 51:13I should never have signed that consent form?
  • 51:16Would it be having a dream?
  • 51:17Would it be having a panic attack?
  • 51:19What is a panic attack like
  • 51:21if you don't have a body,
  • 51:22what is a panic attack like if you
  • 51:26don't have breath and muscles, right?
  • 51:32And you might ask,
  • 51:33in connection with the animal ethics thing,
  • 51:34how would a pig brain waking up in
  • 51:37the vat compare to the brain of a
  • 51:40pig that was used to model traumatic
  • 51:42brain injury for human beings?
  • 51:45I sort of feel like we're already
  • 51:47doing much worse things to pigs
  • 51:49than this would constitute doing,
  • 51:50but I'm really also not sure of
  • 51:52the moral status of that argument.
  • 51:54I think it kind of assumes too much
  • 51:56about what we're doing already being OK.
  • 52:02For my part, I think that I cannot
  • 52:06imagine the research study that would
  • 52:09justify inducing consciousness either
  • 52:11in animals or in human brains that
  • 52:13are hooked up to these machines.
  • 52:16The kinds of things I think about, when
  • 52:17I think about who would want to do that:
  • 52:19there's some characters
  • 52:22out there who want to do brain
  • 52:24transplants so they can move someone's
  • 52:26brain into a nondisabled body.
  • 52:28I think that's an illegitimate
  • 52:30medical goal and I would oppose using
  • 52:32this kind of technology to support that,
  • 52:34for those reasons.
  • 52:36Then there are transhumanists.
  • 52:37I hear from them all the time.
  • 52:40They want their, because they're so,
  • 52:42I guess because they're so important,
  • 52:44They want their brains to be kept alive
  • 52:47on machines for a really long time until
  • 52:49we can supply them with robotic bodies.
  • 52:52So they can just go on forever.
  • 52:57And I don't think that's
  • 52:59a legitimate goal either.
  • 53:00And then maybe if somebody were
  • 53:04trying to create brain-machine interfaces,
  • 53:08maybe that would be a reason to
  • 53:11allow a brain in that state to wake up.
  • 53:13But I actually think you'd probably
  • 53:15do that much more effectively
  • 53:17with living human volunteers.
  • 53:18So I don't see any reasons at the
  • 53:21moment to allow consciousness to occur.
  • 53:27So I guess my quick conclusion, and I'm late.
  • 53:32Neuroethics is its own field.
  • 53:35It needs the space and the specialty
  • 53:37journals that it has carved out.
  • 53:39It needs experts.
  • 53:41I am not one, but many of the
  • 53:46problems that neuroethics faces
  • 53:47are entirely common with problems
  • 53:50that non specialty bioethicists
  • 53:52like myself face every day.
  • 53:55And we can inform those debates.
  • 53:57Some few are on their own and really
  • 53:59need specialty people who understand
  • 54:01the science in order to address
  • 54:03them. And on the problem of live
  • 54:08brain research in ex vivo brains,
  • 54:10I think the main one that we face
  • 54:14is the question of whether it would
  • 54:16ever be legitimate in the future
  • 54:20to permit a brain to experience
  • 54:23or even to risk experiencing
  • 54:26consciousness in this state.
  • 54:28What kind of consent would
  • 54:29we need for that and so on.
  • 54:33Questions. Right. Mark. Yeah.
  • 54:42So if you want, you can have a seat.
  • 54:44You can stand however
  • 54:45you want to do this. And
  • 54:46I'll look here, I've got, this is,
  • 54:48this is on. Michael, does this
  • 54:50work? Is that working?
  • 54:51Yeah. So that works.
  • 54:53So I invite your questions and folks,
  • 54:56Karen will eventually bring me a laptop
  • 54:58up and we'll work with that.
  • 55:00First, I just wanted to let you
  • 55:02know that I heard from the guys
  • 55:03in my club during the talk,
  • 55:06the big guys, and here's the thing:
  • 55:09we are large guys, we're affectionate,
  • 55:12we like to hug. So just be aware,
  • 55:14should any of you choose to throw
  • 55:16any of us off the bridge. OK,
  • 55:20Just be aware you're coming
  • 55:22with us. OK. All right.
  • 55:24Just so just so we're just
  • 55:25so we're clear on that.
  • 55:26I thought that was a fascinating
  • 55:27talk, a lovely overview,
  • 55:29as well as of course very tantalizing
  • 55:32talk on the cutting edge
  • 55:34stuff that's been going on the last
  • 55:36couple years over in the main campus.
  • 55:38So there's, there's much to talk about.
  • 55:39So let's start. Please wait one
  • 55:43second. I nearly forgot.
  • 55:44We got to do this properly so
  • 55:45that the people on Zoom can
  • 55:48hear and also there we go.
  • 55:51Thank you. So please do wait until you have
  • 55:53a mic so the folks on Zoom can hear and
  • 55:55also so that everyone in the room can
  • 55:57hear you. Hold it up nice and close,
  • 55:58otherwise it doesn't work. Really. Yeah.
  • 55:59So interesting talk.
  • 56:00I just want to ask you a quick question.
  • 56:03So I don't know how many billion
  • 56:05neurons a pig brain has.
  • 56:07The human has about 100 billion,
  • 56:10of which there's enormous
  • 56:11variability in susceptibility
  • 56:14to oxygen deprivation, right.
  • 56:16So you, you made the assertion
  • 56:20that these neurons were alive.
  • 56:23And it's unclear to me how you made that
  • 56:26assertion and how much you're sampling,
  • 56:28because you're saying there's no neuronal
  • 56:31activity because there's no sodium.
  • 56:32And yet you're seeing,
  • 56:34you're saying you can
  • 56:35measure that they're alive.
  • 56:39What do you mean by that?
  • 56:40Yeah, a couple of ways.
  • 56:43When you take them out of the brain,
  • 56:45they twitch. How many?
  • 56:48How many cells do you look at?
  • 56:51It is not 100% of the cells that
  • 56:54recover from ischemic damage.
  • 56:56And how many cells?
  • 56:59I don't know. You have to read the
  • 57:02Nature cover story that ran in 2019.
  • 57:04That was the first one to talk about
  • 57:06the restoration of metabolic and
  • 57:08cellular function in the pig brains.
  • 57:10The exact numbers are in that article,
  • 57:12but I don't know them.
  • 57:13But my general impression of the
  • 57:16result is that it was a small minority
  • 57:19of cells that had not survived
  • 57:22the hour and had not been revived.
  • 57:26And one of the major ways that
  • 57:28you can tell this is by comparing
  • 57:31control brains
  • 57:35in live pigs, measuring input and
  • 57:37output in the vasculature of live pigs,
  • 57:40to the brains in the machinery and
  • 57:43looking at the amount of oxygen and
  • 57:46glucose that is taken in and used and
  • 57:49the amount that comes out at the other
  • 57:50end after having gone through the brain.
  • 57:52Right. So you're looking at what goes
  • 57:55into the brain and what comes out of
  • 57:58the brain in terms of food and waste.
  • 58:00And my understanding of what they
  • 58:03saw in the first paper was that very
  • 58:06rapidly the brain began behaving in a way
  • 58:10that was completely characteristic of a
  • 58:13traumatized brain in a living animal, right.
  • 58:16So it was acting like an injured brain in
  • 58:18terms of how much material it was taking
  • 58:20in and how much waste it was giving out,
  • 58:22more than the usual amount of cellular
  • 58:24waste because it's trying to clear out
  • 58:25some of the stuff from the damage.
  • 58:27And then by the end of the multiple
  • 58:30hours of perfusion,
  • 58:31it was looking more like a normal
  • 58:34brain in terms of how the whole
  • 58:36brain is metabolizing material.
  • 58:40I think there's much better and much
  • 58:42more detailed information now that
  • 58:44hasn't yet been published because
  • 58:46these days the lab is keeping the
  • 58:48brains going for a week at a time
  • 58:50rather than for hours at a time.
  • 58:56Please.
  • 59:04Hi. Thank you for the talk.
  • 59:06I'm curious, like, what
  • 59:08kind of evidence or response
  • 59:10might convince you that a brain
  • 59:12or a system has become conscious?
  • 59:14And then maybe even a more pressing
  • 59:15question, like, what kind of evidence
  • 59:17would convince you that we should start,
  • 59:19like to be safer, to be ethical,
  • 59:21like we should act as if
  • 59:22it's conscious, just to be safe?
  • 59:30I think the baseline would be some
  • 59:33kind of network neuronal firing.
  • 59:37We know that in people who have
  • 59:39been diagnosed brain death, we see
  • 59:41isolated neurons firing here and there.
  • 59:43That doesn't give rise to experience.
  • 59:50And we have, you know, a pretty
  • 59:53good understanding of
  • 59:55the way that network firing
  • 59:59looks in certain kinds of situations.
  • 01:00:01And I think we'd need to
  • 01:00:03detect something like that.
  • 01:00:05But again, in the research that's
  • 01:00:09currently being done at Yale,
  • 01:00:11the perfusate doesn't allow
  • 01:00:13any neuronal firing at all.
  • 01:00:16It allows the neurons to feed
  • 01:00:19themselves and take in oxygen and
  • 01:00:21expel cellular waste and so on,
  • 01:00:23but it doesn't allow them to fire.
  • 01:00:25So in that circumstance,
  • 01:00:28there's not anything there.
  • 01:00:30But if you,
  • 01:00:32if you were to set up the perfusate
  • 01:00:35so that neuronal firing was possible,
  • 01:00:36then you'd be looking for some kind
  • 01:00:40of organized network firing activity.
  • 01:00:43What would count as the place
  • 01:00:44where you would say, oh,
  • 01:00:45now is the time to shut this off.
  • 01:00:48I don't know enough to say,
  • 01:00:50but I actually think that there are
  • 01:00:52some neuroscientists out there who
  • 01:00:54know an awful lot about what different
  • 01:00:56kinds of states of awareness or
  • 01:00:58experience or sensation look like.
  • 01:01:02So I want to push back on
  • 01:01:07a fundamental premise of this whole thing.
  • 01:01:10Just so I'm clear on this, we as a society,
  • 01:01:14we as a species, long ago decided,
  • 01:01:17and we as a society still accept, with
  • 01:01:20a notable minority who feel otherwise,
  • 01:01:25we as a society accept the fact
  • 01:01:26that one can take a pig and put it
  • 01:01:30in a little tiny pen forever, right?
  • 01:01:32And then when it's finally done
  • 01:01:34growing to a certain size,
  • 01:01:35we can then slaughter it, OK, and eat it.
  • 01:01:38Now many people,
  • 01:01:39including some good friends of mine,
  • 01:01:41would say, well that's the wrong thing to do.
  • 01:01:43But the fact of the matter is this is
  • 01:01:45widely practiced throughout the world,
  • 01:01:47has been throughout human history.
  • 01:01:49So I'm actually a little bit,
  • 01:01:52it doesn't quite pass for me the
  • 01:01:54straight-face test to say,
  • 01:01:55hang on a second,
  • 01:01:56we're going to take this pig, we're going
  • 01:01:57to treat it in, for lack of a better word,
  • 01:01:59these incredibly cruel circumstances
  • 01:02:01throughout its entire existence,
  • 01:02:03but geez, we're not going to let
  • 01:02:04that brain wake up for a couple days,
  • 01:02:05because that would really be mean. To me,
  • 01:02:08what might be happening to
  • 01:02:09that brain for that six hours
  • 01:02:11or seven days now has got to
  • 01:02:15be infinitesimally trivial
  • 01:02:18compared to the experience that
  • 01:02:20Pig had throughout his life.
  • 01:02:22So I'm fascinated that there
  • 01:02:24has been such resistance.
  • 01:02:25I'm amazed, given that we took the
  • 01:02:28salt out of the perfusate, that there
  • 01:02:31isn't more interest in seeing if they
  • 01:02:32can create that. And to me, to say
  • 01:02:34that, well, this is because to
  • 01:02:36do so would be unkind to the pig,
  • 01:02:38to me that strains credulity
  • 01:02:40given the rest of the pig's life. And full
  • 01:02:43disclosure, I've eaten my share of pigs,
  • 01:02:45OK, as most people in the room have.
  • 01:02:50But is there honestly a feeling
  • 01:02:52that we can't do this,
  • 01:02:53because it would be unethical
  • 01:02:54for this pig brain to experience something,
  • 01:02:57totally disregarding everything it's
  • 01:02:58experienced for its entire existence?
  • 01:03:02Two things. One is there are a lot of
  • 01:03:07people very close to all the members
  • 01:03:09of the research community who are
  • 01:03:10really worried about the premise.
  • 01:03:12Like maybe we shouldn't be treating
  • 01:03:13those pigs that way at all.
  • 01:03:15Maybe that is fundamentally wrong.
  • 01:03:16And if that's fundamentally wrong, you know,
  • 01:03:19why should our little world be different?
  • 01:03:21Why should we use the horribleness
  • 01:03:23of agribusiness to justify our
  • 01:03:26own horribleness, right?
  • 01:03:27That's one kind of response.
  • 01:03:32Another kind of response is anything
  • 01:03:35you do with a pig brain today,
  • 01:03:37you might do with a human brain tomorrow.
  • 01:03:40And there is therefore some reasonable
  • 01:03:47prudential argument for not facilitating,
  • 01:03:52not creating the technology that might
  • 01:03:55allow us to do things that we find
  • 01:03:58fairly appalling with human brains.
  • 01:04:01Right. That kind of slippery
  • 01:04:03slope argument is out there as well.
  • 01:04:08And beyond those things,
  • 01:04:16maybe this isn't an ethical statement at all.
  • 01:04:18This is just kind of a statement of fact.
  • 01:04:23As far as I know,
  • 01:04:24there's nothing that the Sestan lab, or
  • 01:04:27the Bexorg spin-off company
  • 01:04:30that's trying to commercialize some
  • 01:04:32of the Sestan lab's techniques,
  • 01:04:34wants to do. There's nothing we want to
  • 01:04:35do with pig brains that would be
  • 01:04:37assisted by having them be awake.
  • 01:04:41So if we can do our science
  • 01:04:45without even coming close to
  • 01:04:48stepping on an ethical landmine,
  • 01:04:51that's what we want to do.
  • 01:04:54Got it? I
  • 01:04:55have a question from someone in
  • 01:04:56the Zoom world who says, could
  • 01:04:59you characterize the controls?
  • 01:05:01I assume we're talking about
  • 01:05:02the pig experiments here again.
  • 01:05:02Could you characterize the control
  • 01:05:04conditions in a little more detail?
  • 01:05:07What are the control conditions?
  • 01:05:08You're referring to the control conditions
  • 01:05:10for the first paper, where we were
  • 01:05:15comparing a brain that was
  • 01:05:18being perfused with our perfusate to
  • 01:05:21a brain that was not being perfused,
  • 01:05:24and just comparing the conditions and the
  • 01:05:27ischemic damage that happened over time.
  • 01:05:30And we also had considerable measurements
  • 01:05:34about the activities of brains.
  • 01:05:36And you would not believe how well
  • 01:05:40mapped out the contours of pig
  • 01:05:42brains have been by agribusiness.
  • 01:05:45They know what EEGs in pig brains
  • 01:05:48look like when pigs are hungry.
  • 01:05:50They know what EEGs look like
  • 01:05:52in pigs that have been hurt
  • 01:05:53by being hit in the head.
  • 01:05:55They know what pig brains' EEG readouts look
  • 01:05:58like when pigs want to have sex, right?
  • 01:06:00It's incredibly well mapped.
  • 01:06:02So we had access to all that
  • 01:06:04kind of literature,
  • 01:06:05but we also had control brains that
  • 01:06:07were just being allowed to degrade.
  • 01:06:10And the reason,
  • 01:06:11my understanding of the reason for
  • 01:06:13why we stopped in the first article,
  • 01:06:15why we stopped reporting about
  • 01:06:17what was going on at hour six,
  • 01:06:19was that the control brain
  • 01:06:20at that point was just useless
  • 01:06:23and not creating any data.
  • 01:06:24It had just turned to mush.
  • 01:06:27So we're
  • 01:06:28looking at serial, presumably
  • 01:06:30serial histology, and watching
  • 01:06:31the necrosis progress until
  • 01:06:32the point it's liquefied,
  • 01:06:34or looking at the conditions of
  • 01:06:37cells at different points in time
  • 01:06:40and looking at input and output from
  • 01:06:44the perfused brains. Yeah, Thank you.
  • 01:06:49So I think we can all think of at least
  • 01:06:5310 people who would pay 50 to
  • 01:06:56$100 billion for this technology, right?
  • 01:06:58And they will. I mean, the patent's only
  • 01:07:01going to last for like 20 years, right?
  • 01:07:03And it'll be at 18 now or something like that.
  • 01:07:05Yeah. But leaving aside those folks,
  • 01:07:09it seems to me that there are a
  • 01:07:11number of very legitimate uses I
  • 01:07:15can think of for this technology in,
  • 01:07:20you know, life saving medical care
  • 01:07:22for humans. So you know the number
  • 01:07:24one killer in the developed world
  • 01:07:27is heart disease, right?
  • 01:07:28And you have potentially here a technology
  • 01:07:32that could restore life to a brain after
  • 01:07:36a catastrophic cardiac event, right?
  • 01:07:39And allow, you know,
  • 01:07:44intervention and treatment
  • 01:07:45potentially even like you know,
  • 01:07:48heart transplant to repair the cardiac injury
  • 01:07:52and the person could go on with their life,
  • 01:07:54you know, in their body if you
  • 01:07:57could just restart the brain.
  • 01:07:58I mean, I can
  • 01:08:01multiply the examples of the ways
  • 01:08:03that this could be used clinically.
  • 01:08:06The perfusate itself, since it is very
  • 01:08:08good at keeping very complex organs alive,
  • 01:08:11is like a substitute blood, and
  • 01:08:12it might well help with transplant.
  • 01:08:15You might be able to take organs from
  • 01:08:17someone who's a donor and hook them
  • 01:08:19up to a perfusion machine that's
  • 01:08:21been customized for that kind of
  • 01:08:23organ and fly it across the country,
  • 01:08:25fly it around the world, and use
  • 01:08:27it for transplant. This exists:
  • 01:08:28you do that for kidneys,
  • 01:08:30you do that for livers, and so on.
  • 01:08:32And that's how these guys derived
  • 01:08:34the solution, too.
  • 01:08:36It's a derivation of the
  • 01:08:39previous solution that they had for
  • 01:08:41kidneys and livers and so on, that has
  • 01:08:45been around for a long time. I'll have
  • 01:08:46to take your word for that.
  • 01:08:50Although I do know that we had a
  • 01:08:52study of whole body perfusion with our
  • 01:08:55perfusate that compared it to ECMO and
  • 01:08:57it performed better than ECMO at support
  • 01:09:01of organs across a whole deceased pig.
  • 01:09:07But right it might have
  • 01:09:09applications for for transplant.
  • 01:09:11And also the perfusate is acellular, so it
  • 01:09:14carries oxygen without any red blood cells.
  • 01:09:17And that means if somebody's had a stroke,
  • 01:09:20which is, you know, in effect
  • 01:09:22a sieve in a capillary in the brain,
  • 01:09:25the perfusate might be able to carry
  • 01:09:27oxygen through the sieve to the
  • 01:09:28downstream parts of the brain.
  • 01:09:30There are many possible medical applications,
  • 01:09:34but those don't require consciousness
  • 01:09:38and they're being licensed out.
  • 01:09:40Yale owns the intellectual property
  • 01:09:42to the perfusate and machinery, and
  • 01:09:45the labs here and also Bexorg are
  • 01:09:48partnering with research universities
  • 01:09:50around the world to try to work on
  • 01:09:52developing those clinical applications.
  • 01:09:54Including, you know,
  • 01:09:55trying to bring someone back
  • 01:09:57who's not breathed 'cause they had
  • 01:09:59a heart attack in a swimming pool
  • 01:10:01a half hour ago. That kind of
  • 01:10:03stuff is definitely on the table.
  • 01:10:05But, but specifically,
  • 01:10:07restoration of consciousness after 10
  • 01:10:10minutes or an hour of death would be,
  • 01:10:14you know, a medical miracle, right?
  • 01:10:18Like that.
  • 01:10:18That seems to me like something
  • 01:10:20worth investigating.
  • 01:10:29Yeah. I think the way that you
  • 01:10:32would investigate it would be via
  • 01:10:34emergency room applications because
  • 01:10:38we know that it can do some reversal
  • 01:10:42of ischemic damage in the brain
  • 01:10:44and restore metabolic function.
  • 01:10:46And I think the way you would
  • 01:10:49do it is you would try it out in
  • 01:10:51people who otherwise would die
  • 01:10:53and see if it restores function.
  • 01:10:56And then you run into the thing
  • 01:10:57that I already mentioned in the talk,
  • 01:10:59which is that you could have a
  • 01:11:01therapy that restores function but
  • 01:11:03doesn't restore great quality of life.
  • 01:11:05I'm not sure what the
  • 01:11:08additional gain would be to show that
  • 01:11:10it wakes up the brain in the vat.
  • 01:11:13Why
  • 01:11:13would you want to do that before
  • 01:11:16you try to apply it to someone whose
  • 01:11:18other option is death in the ER?
  • 01:11:24Yeah, no, sorry. My question
  • 01:11:24was just making sure that you were saying...
  • 01:11:26A little bit louder,
  • 01:11:27a little bit closer, if you would.
  • 01:11:28My question was just making
  • 01:11:29sure that you were saying that
  • 01:11:31you would go, like, straight
  • 01:11:33from the brain in the vat, not awake,
  • 01:11:36to testing on humans to see if that
  • 01:11:40would wake. And you don't see
  • 01:11:42a benefit to, like, trying
  • 01:11:45that out on, like, a pig brain before
  • 01:11:47heading over to... 'cause, like,
  • 01:11:50I guess I don't know,
  • 01:11:52I don't know enough.
  • 01:11:53But wouldn't you be able to
  • 01:11:55sort of tell if you aren't
  • 01:11:58regaining function that is
  • 01:12:00conducive to any quality of life
  • 01:12:03in a brain like that before you got to
  • 01:12:07doing that in, like, live humans?
  • 01:12:11I'd say I don't know the answer to that.
  • 01:12:16So yeah, I don't know how much,
  • 01:12:22how well... Again, the brain is
  • 01:12:27disembodied on the perfusion machine.
  • 01:12:30It is not getting any sensory information.
  • 01:12:34It's not clear to me that the awake
  • 01:12:37brain in the machine is a decent
  • 01:12:40model for somebody who's lying there
  • 01:12:42in a whole body with arms and legs
  • 01:12:45and eyes and ears and a nose, right?
  • 01:12:48It's not clear to me that whatever you
  • 01:12:52would see in the perfusion context
  • 01:12:54would be a decent map of what you
  • 01:12:57would see in the emergency department
  • 01:12:59if you restored metabolic function to
  • 01:13:00the brain of someone who was injured.
  • 01:13:05It's not clear to me how great
  • 01:13:08the value would be of adding
  • 01:13:11that step before trying it out,
  • 01:13:14especially with patients who
  • 01:13:16otherwise we would give up on.
  • 01:13:20Steve, a question, and this
  • 01:13:23is on some of the physiology
  • 01:13:26going on here, I think in
  • 01:13:27reference to your description
  • 01:13:28perhaps of some of the potential:
  • 01:13:29how is the connectivity of the
  • 01:13:30brain mapped without neurons firing?
  • 01:13:38Because neurons are connected
  • 01:13:40in permanent pathways,
  • 01:13:41even if they're not firing.
  • 01:13:43There is something like you can
  • 01:13:45imagine the Google Maps of the brain,
  • 01:13:47and there are certain kinds of markers,
  • 01:13:51viral markers mainly,
  • 01:13:52that if you put them in a brain
  • 01:13:55that is metabolically functioning,
  • 01:13:57will spread from cell to cell in the
  • 01:14:00pathway that those cells are connected to.
  • 01:14:02So, for example, you can
  • 01:14:06do that,
  • 01:14:09not in the same way and not at this
  • 01:14:12single-cell-to-single-cell level.
  • 01:14:14But I believe that in the next six
  • 01:14:18months you will see a publication
  • 01:14:20that shows the kind of brain
  • 01:14:22mapping I'm talking about, which is not
  • 01:14:24currently available anywhere.
  • 01:14:28Thank you. Yeah, this gentleman.
  • 01:14:29It does seem to me that there
  • 01:14:31are some potential clinical
  • 01:14:33applications of this technology that
  • 01:14:35would require inducing consciousness.
  • 01:14:40It seems to me that we might
  • 01:14:41be able to make some advances.
  • 01:14:43And I'm not a psychiatrist,
  • 01:14:45I'm not a neurologist.
  • 01:14:45But it does seem to me that we can make some
  • 01:14:47advances in the field of,
  • 01:14:48for example, mental health care
  • 01:14:50if we are able to
  • 01:14:54somehow experiment on these brains in a vat,
  • 01:14:57so to speak, that are conscious
  • 01:15:00now without any sensory input.
  • 01:15:04But could we? Could we not?
  • 01:15:08I mean, yeah, I think so.
  • 01:15:10When these people say to me,
  • 01:15:11I want to upload my brain into a computer,
  • 01:15:15part of my response to them is,
  • 01:15:18what's your plan for Tuesday
  • 01:15:20morning with no body?
  • 01:15:22Yeah, right. I really,
  • 01:15:25I feel like the disembodied...
  • 01:15:27Like, one of the things
  • 01:15:28that I mean when I say,
  • 01:15:30hey, the pig is dead,
  • 01:15:32even though we've got its brain
  • 01:15:35metabolically going in our lab,
  • 01:15:37that pig is dead, right?
  • 01:15:39I think we are much more
  • 01:15:44embodied
  • 01:15:47than people who are very
  • 01:15:50concerned with brains often reckon on,
  • 01:15:55anyway. But I interrupted you.
  • 01:15:56Yeah. No, you know,
  • 01:15:59I don't know if we need sensory
  • 01:16:00input per se to look at
  • 01:16:02those things. Because,
  • 01:16:03I mean, in theory, right,
  • 01:16:04a conscious disembodied
  • 01:16:06brain will still carry with
  • 01:16:07it its memories, right.
  • 01:16:09And its experiences. And I don't
  • 01:16:11know I'm making this up, right.
  • 01:16:13This is all science fiction.
  • 01:16:14We have to, I don't know,
  • 01:16:15talk to them about them,
  • 01:16:17figure out what they were,
  • 01:16:19stimulate particular memories,
  • 01:16:21all in a completely disembodied model.
  • 01:16:24I don't, I don't really see it.
  • 01:16:28So depending
  • 01:16:29on how much of the brain you
  • 01:16:30take out, of course it's
  • 01:16:32quite possible to get
  • 01:16:34stimuli into a brain just through
  • 01:16:38cranial nerves, for example,
  • 01:16:41or auditory. I mean there are,
  • 01:16:42there are people who are building
  • 01:16:45brain organoids that they're trying
  • 01:16:47to connect to sensory organs.
  • 01:16:53There are interfaces that input
  • 01:16:55visual stimuli into the brain, right?
  • 01:16:56The Pong-playing
  • 01:16:58brain-computer interface that just
  • 01:16:59was written about last month, or
  • 01:17:00cochlear implants that
  • 01:17:01send electrical signals
  • 01:17:02directly to the auditory nerve, and,
  • 01:17:04you know, you think and you move a
  • 01:17:07joystick so you can drive your wheelchair,
  • 01:17:09those kinds of things.
  • 01:17:10There's a lot of
  • 01:17:12human brain-machine interface available.
  • 01:17:18No, there's sensory input.
  • 01:17:19What I'm saying is there's
  • 01:17:21actually sensory input available as well.
  • 01:17:25Yeah. And you'd have to explain to
  • 01:17:26me why it would be better to do that
  • 01:17:28in a model that was disembodied than,
  • 01:17:30you know, a volunteer human subject.
  • 01:17:32That's a fair point that
  • 01:17:35I agree with. But
  • 01:17:37I do think that you know
  • 01:17:39there is potential benefit
  • 01:17:41across that event horizon
  • 01:17:42of consciousness as well.
  • 01:17:44And if we're, you know,
  • 01:17:45we're just doing the math
  • 01:17:46in our head right now, we say, all right,
  • 01:17:47that benefit's not worth it.
  • 01:17:51I think that's a reasonable
  • 01:17:52conclusion to draw,
  • 01:17:54but that could be, that could be.
  • 01:17:59I've got one question here.
  • 01:18:02This is a follow-on from the
  • 01:18:03first question I asked you which
  • 01:18:05was that when you spoke of a
  • 01:18:07traumatized brain in a living animal,
  • 01:18:09this is a quote or a traumatized,
  • 01:18:12traumatized brain in a living animal
  • 01:18:14is to control a brain or an animal.
  • 01:18:18I'm not quite sure what that question is.
  • 01:18:20I have to apologize to the
  • 01:18:21questioner. Well,
  • 01:18:22I think when I was talking
  • 01:18:24about traumatized brains,
  • 01:18:24I was trying to compare...
  • 01:18:31suppose we were to bring a pig
  • 01:18:34brain to consciousness in the lab.
  • 01:18:39What could the negative experience
  • 01:18:42of that pig brain possibly amount to?
  • 01:18:45And I was making a point that's sort of
  • 01:18:48parallel to the point that you made about
  • 01:18:50the fact that we all eat pork in other
  • 01:18:53parts of our medical research complex.
  • 01:18:56We bash pigs in the head in order to make
  • 01:19:00judgments about the effects of brain trauma.
  • 01:19:03And I can't imagine that the ill
  • 01:19:06effects on a pig of having its brain
  • 01:19:08come to life in a vat in the lab,
  • 01:19:11could be worse than the ill effects on
  • 01:19:14the pig who has his head bashed in as
  • 01:19:16part of a brain trauma experiment,
  • 01:19:18or a pig who is part of a cancer study
  • 01:19:21where death is the end point and so on.
  • 01:19:23Right. We do lots of things to pigs
  • 01:19:27that strike me as worse than what
  • 01:19:30we would be doing to the pig brain
  • 01:19:32in our research studies. So...
  • 01:19:34But let me ask you this...
  • 01:19:37That's not enough to say that we could
  • 01:19:38go ahead and do our research studies,
  • 01:19:40because it might be really wrong that
  • 01:19:42we're doing all those terrible things.
  • 01:19:43But let me ask this question. I get that.
  • 01:19:46But to me it's a cannonball and a
  • 01:19:48BB, to some extent, to say we're really
  • 01:19:50going to worry about this BB
  • 01:19:52but we're going to ignore the cannonball.
  • 01:19:53But the other aspect of this is that
  • 01:19:55there's a lot of concern about the pigs,
  • 01:19:58because pigs, as you point out,
  • 01:20:00are very intelligent,
  • 01:20:01very social animals, etcetera.
  • 01:20:02And from an ethicist point of view.
  • 01:20:05And again, I know you don't call
  • 01:20:07yourself an animal ethicist,
  • 01:20:07but this is a question
  • 01:20:08we can all ask ourselves
  • 01:20:09is: does that make this work more or less
  • 01:20:12permissible than if this were the brain
  • 01:20:15of a mouse, or the brain of
  • 01:20:18something much smaller and
  • 01:20:21less intelligent, less social?
  • 01:20:24Yeah, I mean, this is a
  • 01:20:25controversial view in philosophy.
  • 01:20:27But I accept what is widely accepted in
  • 01:20:32some parts of animal ethics,
  • 01:20:34including by Shelley Kagan here at Yale,
  • 01:20:38and in a lot of the debate about
  • 01:20:42the use of animals in medical research:
  • 01:20:45the refine, reduce, and replace
  • 01:20:49kind of mantra that IACUCs,
  • 01:20:53institutional animal care and
  • 01:20:54use committees, obey when they're trying
  • 01:20:57to figure out what kinds of animals
  • 01:20:58you can use for what kind of research.
  • 01:21:01They will always say that it's preferable
  • 01:21:04to climb down the species ladder.
  • 01:21:07If you can get the result in your
  • 01:21:09study from a zebrafish,
  • 01:21:10don't get it from a mouse.
  • 01:21:11If you can get it from a mouse,
  • 01:21:13don't get it from a marmoset, right.
  • 01:21:16I get
  • 01:21:16it if they say that. What I'm asking
  • 01:21:18is the ethical justification for it.
  • 01:21:21Because the thought is that
  • 01:21:23the amount of suffering you're
  • 01:21:25inducing in the animal
  • 01:21:28varies with their mental capacity.
  • 01:21:33Humans exposed to a certain
  • 01:21:35kind of experimental
  • 01:21:39intervention might have enhanced capacity
  • 01:21:42to suffer during that intervention.
  • 01:21:45Even Peter Singer, right, Mr.
  • 01:21:47Animal Rights Guy,
  • 01:21:48one of the primary
  • 01:21:52inventors of modern-era animal
  • 01:21:55rights rhetoric and philosophy,
  • 01:21:57says: if there are five 90-pound animals
  • 01:22:02in a boat, four of them are people
  • 01:22:04and one of them is a large dog,
  • 01:22:06and only four can survive with
  • 01:22:08the provisions in the boat,
  • 01:22:10you should throw the dog overboard,
  • 01:22:13not one of the people. Why?
  • 01:22:15Because he's being a good utilitarian.
  • 01:22:17He thinks the dog, like us,
  • 01:22:20will roam around
  • 01:22:21for a long time in the cold
  • 01:22:22water until it gets too cold,
  • 01:22:24and then it's going to drown and die.
  • 01:22:25But the dog is going,
  • 01:22:28I'm getting tired and having
  • 01:22:30various physical bodily sensations,
  • 01:22:31but he's not thinking I'm going to
  • 01:22:34miss my granddaughter's birthday.
  • 01:22:35Oh, how will my wife find out
  • 01:22:37what happened to me? Right.
  • 01:22:39We are capable of much more
  • 01:22:41sophisticated levels of suffering
  • 01:22:42in that kind of circumstance.
  • 01:22:45So toss the dog, OK?
  • 01:22:47That's what the Peter Singer example is.
  • 01:22:49Well,
  • 01:22:49something like that is going
  • 01:22:50on when you say it's worse to
  • 01:22:52experiment on a pig than it is
  • 01:22:54to experiment on a mouse.
  • 01:22:55We just sort of feel that pigs
  • 01:22:58have more sophisticated capacity
  • 01:23:01for suffering than mice have.
  • 01:23:04You're basing that assumption
  • 01:23:09on what?
  • 01:23:13Yeah. Well, but
  • 01:23:17yeah, on evidence of the mental
  • 01:23:20capacities of different kinds of animals
  • 01:23:23as demonstrated in their behaviors,
  • 01:23:25their degree of sociality.
  • 01:23:28Oysters, for example.
  • 01:23:32Yeah.
  • 01:23:35So hang on.
  • 01:23:36Well, that... Yeah, go
  • 01:23:38ahead.
  • 01:23:41The microphone is going to
  • 01:23:42be among many other things.
  • 01:23:43But at this point, let me
  • 01:23:44just say this will be the last one.
  • 01:23:46So the question was about a comment you've
  • 01:23:48made, and this
  • 01:23:49will be the last one.
  • 01:23:50So let me just say: it has to do with
  • 01:23:52comparing the suffering of different
  • 01:23:54species, and what the
  • 01:23:56evidence for that is, and you were about
  • 01:23:59to discuss the evidence for that.
  • 01:24:02As far as I understand it,
  • 01:24:03and this is not my field,
  • 01:24:05I'm quoting people who are more
  • 01:24:08expert in animal ethics than I am.
  • 01:24:10But it has to do with brain size,
  • 01:24:12brain capacity, evidence of sociality,
  • 01:24:14evidence of the degree of memory
  • 01:24:17the different animals possess when
  • 01:24:19they're tested in different ways.
  • 01:24:22There are, there are lots of people
  • 01:24:24who spend a lot of time worrying
  • 01:24:26about the comparative mental
  • 01:24:28capacities of different animals now.
  • 01:24:30A capacity to suffer, sure.
  • 01:24:32And I will say,
  • 01:24:34it's been a huge trend in recent
  • 01:24:37years for people to come out with
  • 01:24:40lots of surprising evidence about how
  • 01:24:43extraordinary the capacities of animals are,
  • 01:24:45how extraordinarily greater than
  • 01:24:46we used to think the capacities
  • 01:24:49of certain animals are.
  • 01:24:50Like birds, for example,
  • 01:24:52can remember the faces of people
  • 01:24:54who've been mean to them,
  • 01:24:56and fish can learn to avoid painful stimuli,
  • 01:25:00and they can learn from experience
  • 01:25:01and change their swimming patterns
  • 01:25:03in response to things.
  • 01:25:05So there's been a huge theme in recent
  • 01:25:08animal ethics literature about how
  • 01:25:10much more mental capacity different
  • 01:25:13kinds of animals have than we have
  • 01:25:16commonly given them credit for.
  • 01:25:18But that is not at all the same thing
  • 01:25:21as to say their mental capacities
  • 01:25:23are so identical that they all
  • 01:25:25deserve the same level of moral
  • 01:25:27regard or the same level of rights.
  • 01:25:29There are people like Tom Regan who
  • 01:25:31think that any animal that is the
  • 01:25:33subject of a life, that leads a life
  • 01:25:35and makes choices about what it's
  • 01:25:37going to do next has an absolute
  • 01:25:40right not to be interfered with by us.
  • 01:25:43But most animal ethicists acknowledge
  • 01:25:45some kinds of degrees of priority
  • 01:25:48between different kinds of animals,
  • 01:25:51and we spend a lot of time
  • 01:25:53agonizing and discussing the
  • 01:25:54different degrees of moral status
  • 01:25:57and priority among our own species.
  • 01:25:59People with dementia,
  • 01:26:02newborns, fetuses, right.
  • 01:26:06This is absolutely fascinating stuff.
  • 01:26:07Stephen, we're going to follow
  • 01:26:08this work with interest, and I
  • 01:26:10really appreciate your coming and
  • 01:26:12sharing this with us and working
  • 01:26:13through this. This has been terrific.
  • 01:26:15Thank you so much.
  • 01:26:15Thank you so much for having me.
  • 01:26:16And thank you for all the great questions.
  • 01:26:22All right. We'll see you next week.