Mistrust and Misinformation in Medicine—Causes and Consequences
November 08, 2022
Dhruv Khullar, MD, MPP
Transcript
- 00:00My name is Mark Mercurio.
- 00:01I'm the director of the
- 00:03program for Biomedical Ethics.
- 00:04And on behalf of myself,
- 00:06our associate directors,
- 00:07Jack Hughes and Sarah Hall,
- 00:09and our manager,
- 00:10Karen Cole, welcome.
- 00:11This is, as I mentioned a few minutes ago,
- 00:13our first hybrid session for the year.
- 00:15And I am very grateful to those
- 00:17of you who came here in person so
- 00:19we can have a live studio audience
- 00:20for Drew and also grateful for
- 00:22those of you who run the zoom call.
- 00:24So thank you very much for those of
- 00:27you who are new to our format here and
- 00:29new to our seminars we do this about.
- 00:31Twice a month and there is a schedule you
- 00:33can access at biomedical ethics at Yale,
- 00:36on the Internet,
- 00:37and we have a mailing list.
- 00:39You can certainly reach out to me or
- 00:41to Karen to get on the mailing list.
- 00:44We
- 00:45are very happy for your participation,
- 00:47and I'm particularly happy about tonight.
- 00:49And I know Jack is too.
- 00:52A few years ago now,
- 00:53more than a few years ago,
- 00:54Jack and I together led a
- 00:57trip called the FASPE trip.
- 00:59This is the Fellowships at Auschwitz
- 01:01for the Study of Professional Ethics,
- 01:03which is a bit of a longer story,
- 01:05but it's a marvelous opportunity for
- 01:06a select group of medical students
- 01:08from all over the country to spend 2
- 01:10weeks overseas studying professional
- 01:11ethics as well as the history of the
- 01:14physicians complicity and participation
- 01:15in the atrocities of the Nazi era.
- 01:18And this is a marvelous program that Jack
- 01:20and I have been involved with for years.
- 01:21And in fact one of the academic leaders
- 01:23of this is our own Nancy Angoff,
- 01:25who's also here tonight.
- 01:26And some years ago,
- 01:28we were very fortunate to make
- 01:30the trip with a truly outstanding
- 01:31medical student named Drew Khullar.
- 01:33And just watch how things go.
- 01:35When Drew was a medical student here at Yale,
- 01:37he had gone to college here at Yale.
- 01:39So a lot of you are wondering
- 01:40whether anybody who went to college
- 01:41at Yale would ever amount to anything.
- 01:42And the answer is, yeah, Drew did.
- 01:46So we had a marvelous experience with
- 01:48Drew and we weren't at all surprised
- 01:50that he went on to do great things.
- 01:52He did his residency up at the
- 01:54Mass general and he's been on the
- 01:56faculty at Cornell for some time now.
- 01:58And it was not long ago I was just
- 02:00telling Drew that I just happened to
- 02:01stumble across something he had written
- 02:02in the New York Times a few years ago.
- 02:04And what a great pleasure that
- 02:05was just to happen,
- 02:06to see one of your former students' names
- 02:08on the byline at the New York Times.
- 02:10And Drew has done quite a bit of
- 02:12writing since and policy work,
- 02:13So for a more formal introduction
- 02:15of one of the most
- 02:16impressive medical students I've had
- 02:18the pleasure to teach: Dhruv Khullar is
- 02:20a physician and assistant professor
- 02:22of health policy and economics
- 02:23at Weill Cornell Medical College.
- 02:25He's also a writer at The New Yorker,
- 02:27where he writes about medicine,
- 02:28healthcare, and politics.
- 02:29He currently serves as the Director of
- 02:31Policy Dissemination at the Physicians
- 02:34Foundation Center for Physician
- 02:35Practice and Leadership and was
- 02:37recently a senior research fellow at
- 02:39New York City Health and Hospitals.
- 02:41His research focuses on value based care,
- 02:44health disparities, and medical innovation.
- 02:46And has been published in JAMA and
- 02:48the New England Journal of Medicine.
- 02:50Dhruv,
- 02:51as I mentioned,
- 02:51went to college and medical school
- 02:53here at Yale,
- 02:54did medical training at Massachusetts General
- 02:56Hospital and at Harvard Medical School.
- 02:58He also received a master's in public
- 03:00policy from the Harvard Kennedy School,
- 03:03where he was a fellow at the
- 03:04Center for Public Leadership.
- 03:06He's been recognized by LinkedIn as among the
- 03:09top ten healthcare professionals under 35,
- 03:12by the National Minority Quality Forum
- 03:14as a 40 under 40 leader in health,
- 03:17and by FASPE, the organization
- 03:18I just mentioned, with the 2019
- 03:21Distinguished Fellow Award
- 03:22for Ethical Leadership.
- 03:23It is with great pleasure that
- 03:26I introduce you to Drew Khullar,
- 03:28who's gonna speak to us for a bit
- 03:30about misinformation in the
- 03:32medical setting, and then we'll have
- 03:34a conversation. The way this typically
- 03:35works, and the way it's going to
- 03:36work tonight, because again,
- 03:37this is our first hybrid session,
- 03:39is Drew will speak for about 45 minutes,
- 03:42plus or minus, and after that we will
- 03:45sit here and I'll moderate a Q&A session.
- 03:47If you have a question or a comment,
- 03:49please raise your hand.
- 03:50Karen will be walking around
- 03:51with a microphone.
- 03:52Please wait for the microphone,
- 03:54obviously as a courtesy to
- 03:54everybody else so they can hear you.
- 03:56But also now for the folks
- 03:57who are here on Zoom,
- 03:58so they can hear you and we
- 04:01can have a conversation,
- 04:02you can ask him anything you want now.
- 04:05And I'll also be looking at the
- 04:06Zoom callers for questions there.
- 04:08So for the folks on Zoom, I'll
- 04:09ask you to please send your questions
- 04:11through the Q&A portion and I'll be
- 04:13going through those while Drew is
- 04:15speaking during the second portion
- 04:16to pick out questions for him.
- 04:18So I'll introduce to you now Dr. Khullar.
- 04:48Thanks so much, Mark.
- 04:49It's so good to be back at Yale.
- 04:52I can't tell you how much it means
- 04:54to see old professors and to just
- 04:56get off the Metro-North at Union
- 04:59Station, and, you know, coming back
- 05:01here always makes you feel kind of very
- 05:04viscerally what a special place Yale is.
- 05:06And I think while you're here
- 05:07sometimes you can lose sight
- 05:09of that, and when you leave you
- 05:11don't appreciate
- 05:12that as much, but then coming back
- 05:14really drives it home.
- 05:16So thank you for the opportunity to
- 05:18come back and speak with you all.
- 05:19So I wanted to talk about misinformation
- 05:22and particularly misinformation in the
- 05:24medical setting and in public health.
- 05:26And it's something that obviously
- 05:27is on the minds of a lot of people,
- 05:29particularly during the COVID-19
- 05:31pandemic with the rise of social media.
- 05:34But the more I thought about it,
- 05:36the more it seemed that you can't really
- 05:38understand the story of misinformation
- 05:40without understanding the story of
- 05:42mistrust and distrust in society and
- 05:44what has happened over the past few decades,
- 05:47because it's clear that misinformation.
- 05:50And disinformation really thrive and
- 05:53flourish in settings where trust is lacking,
- 05:57where people don't trust institutions, where
- 05:59they don't trust the people around them.
- 06:01There becomes kind of a vacuum that is
- 06:04filled with poor quality information.
- 06:06So I want to start the talk by
- 06:08talking about mistrust,
- 06:08and then we'll kind of transition over
- 06:11to some issues around misinformation.
- 06:14This is an ethics seminar,
- 06:15so I'm going to disclose my conflicts
- 06:18of interest up front.
- 06:20So I have grants,
- 06:21none of which are related
- 06:24to misinformation or mistrust,
- 06:25and unfortunately, no commercial conflicts.
- 06:29OK, So what are we going to talk about today?
- 06:31I want to talk about three main things.
- 06:33The first is to describe trust and
- 06:36mistrust in institutions in the United
- 06:38States over the past few decades.
- 06:40The second is related, but is this idea of
- 06:44trying to understand the environment
- 06:46in which medical misinformation can
- 06:48flourish and why it's spreading,
- 06:49in part because of, as we'll see,
- 06:50the massive decrement in trust in nearly
- 06:53every institution in the United States,
- 06:55and the third is to discuss possible avenues
- 06:58to minimize the harm of misinformation.
- 07:00And I view this section as very
- 07:02much a work in progress because
- 07:04I don't know how to do it,
- 07:05and I don't think anyone knows how to
- 07:08do it in a really effective way yet.
- 07:10But I think there are some ideas that
- 07:12are intriguing, and hopefully
- 07:14over the next few years
- 07:15we'll come up with evidence-based
- 07:18strategies that actually
- 07:19minimize the harm that's caused by
- 07:22poor-quality health information.
- 07:24So we're at a medical school.
- 07:25I want to start with a medical mystery.
- 07:28This one starts in the early
- 07:301980s in Chicago.
- 07:34And so on September 29th,
- 07:361982,
- 07:36a 12-year-old girl named Mary Kellerman
- 07:38woke up with a cold in
- 07:40a Chicago suburb.
- 07:43Her parents gave her two
- 07:45capsules of Tylenol,
- 07:49and the next morning she had died.
- 07:53That same day, a 27-year-old
- 07:55postal worker named Adam Janus
- 07:58developed discomfort in his chest.
- 08:01He passed away.
- 08:02People thought that it had been some kind
- 08:05of freak early heart attack, and his
- 08:07family comes to his home.
- 08:09They're mourning, they develop,
- 08:11you know, headaches, they're crying.
- 08:13Both his brother and his sister-in-law take
- 08:16Tylenol from the same bottle that Adam used.
- 08:20And by the end of the week both
- 08:22of them had also passed away.
- 08:25So over the next couple of days,
- 08:27there are three more mysterious deaths
- 08:29of this nature in the Chicago area,
- 08:32each of them linked to someone taking
- 08:35Tylenol from a recently
- 08:38purchased bottle.
- 08:40So a few facts come to the forefront.
- 08:42The first is that Tylenol was
- 08:45laced with cyanide.
- 08:46As you can imagine,
- 08:48this causes widespread disarray and
- 08:51alarm across the United States.
- 08:54There are more than 100,000 stories in
- 08:57newspapers about the Tylenol crisis.
- 09:00In the coming weeks,
- 09:01there's hundreds of hours of
- 09:03television dedicated to this story.
- 09:04More than 90% of Americans hear about
- 09:07cyanide laced Tylenol that's killing people.
- 09:10In Chicago,
- 09:11and by some estimates,
- 09:12this story gets more coverage than
- 09:15any story in the United States
- 09:18since the JFK assassination.
- 09:20OK,
- 09:20so Johnson and Johnson has a
- 09:22crisis on its hands.
- 09:23Obviously,
- 09:23it's a tremendous tragedy
- 09:25that they're dealing with,
- 09:27but they also have a financial
- 09:29calculus at play.
- 09:30The first thing to note is that
- 09:32Tylenol accounts for nearly 20% of all
- 09:34of Johnson and Johnson's profits in 1982.
- 09:37Tylenol accounts for nearly 40%
- 09:39of the analgesics market share
- 09:41in the United States.
- 09:42So if Tylenol were its own company,
- 09:45its profits would place it in the
- 09:47top half of Fortune 500 companies.
- 09:49So there's a massive amount of money
- 09:51at stake in this crisis.
- 09:55So a few facts emerge over
- 09:57the next couple of days.
- 09:58The first is that the tampering
- 09:59is not occurring at the
- 10:01Johnson and Johnson facility.
- 10:02There's nothing wrong with what
- 10:03they're doing at the facilities.
- 10:05It seems like someone is
- 10:06sneaking into these pharmacies,
- 10:08opening up the capsules,
- 10:10putting cyanide inside, and
- 10:12that's the way that this poison is
- 10:14spreading throughout the Chicago
- 10:16area and there are no cases
- 10:18anywhere else in the United States.
- 10:20So Jim Burke,
- 10:21who's the chairman of Johnson and Johnson,
- 10:24he has a few decisions to make,
- 10:25and his behavior in the next few
- 10:28days and weeks and months is often
- 10:31thought of as a model for how to
- 10:33regain trust that has been lost.
- 10:36And so he actively engages the media.
- 10:38He's out there talking to people
- 10:39as much as he can.
- 10:41Johnson and Johnson establishes
- 10:42a hotline for customers and
- 10:44news organizations to call in
- 10:45to get up-to-date information.
- 10:47They recall every bottle of
- 10:48Tylenol in the entire country,
- 10:50not just in the Chicago area.
- 10:52And they introduce new tamper resistant
- 10:54packaging for all their products,
- 10:57not just for Tylenol.
- 11:01I don't know if this is gonna
- 11:02work, but I'm gonna try it.
- 11:07Can you guys see this on zoom?
- 11:10I guess they can't answer it.
- 11:15We can see it.
- 11:21That's always good advice.
- 11:22There we go. OK, there we go.
- 11:25Yeah. OK, great.
- 12:49Alright, so we'll stop there.
- 12:53OK. So Jim Burke is out front,
- 12:56he's doing these things.
- 12:57And as I mentioned,
- 12:59it's very successful.
- 13:00So it becomes a model for crisis management.
- 13:04Tylenol sales, as you might imagine,
- 13:06plunged to 7% of the analgesic
- 13:08market after the crisis, and then
- 13:10within a year it's back to baseline.
- 13:12So basically they are exceeding
- 13:14pre-crisis Tylenol sales within a year.
- 13:16The following year Congress passes,
- 13:17in part because of the work
- 13:20that he and Johnson and Johnson
- 13:22had done, the Tylenol bill,
- 13:23making it a federal offense to
- 13:25tamper with consumer products.
- 13:27A few years later,
- 13:28FDA establishes federal guidelines for
- 13:31manufacturers of all products that are
- 13:33sold in this way to be tamper proof.
- 13:35So it's really a
- 13:37success story in this way.
- 13:39But what I wanna focus on and what I'm,
- 13:41you know, curious about,
- 13:42is there a way to get rid of this here?
- 13:49You know, I think what I want to focus
- 13:51on is trying to understand, you know,
- 13:54could this have happened today,
- 13:55or at least how different would it be
- 13:58for an organization or corporation
- 14:00to go through something like this
- 14:02and regain the trust of the public?
- 14:04And there's a few reasons that I think
- 14:06it would be far more difficult
- 14:08than it was then. The first is,
- 14:10as we'll talk about in a minute,
- 14:11the crisis of trust in institutions.
- 14:13And I should say, not just a crisis
- 14:15of trust on the part of the public,
- 14:17but I think a crisis of trustworthiness
- 14:19on the part of corporations
- 14:20and some institutions as well,
- 14:22Johnson and Johnson itself recently
- 14:24reached a $5 billion settlement
- 14:26for its role in the opioid crisis.
- 14:29So this isn't just a unidirectional
- 14:31why doesn't the public trust these
- 14:35institutions or these corporations?
- 14:36There's something going on in
- 14:38terms of trustworthiness as well.
- 14:40The second point is around
- 14:42deepening political polarization.
- 14:43And not just polarization,
- 14:45but negative tribalism.
- 14:46Negative polarization,
- 14:47meaning you don't just like your
- 14:49team and want to root for your team.
- 14:50You hate the other team,
- 14:52and part of what you want to
- 14:53do is see the other team lose,
- 14:55not just see your own team win.
- 14:58The third factor, I think,
- 14:59is economic inequality and
- 15:00worsening economic inequality.
- 15:01The feeling that a lot of
- 15:03people are being left behind,
- 15:05that their wages are stagnant,
- 15:06that the gains in the economy
- 15:08are realized by a very small
- 15:11sliver of society.
- 15:12And that understandably creates a level
- 15:15of resentment against people who are
- 15:18perceived to be gaining a lot more,
- 15:21and perhaps unjustly.
- 15:23The fourth factor I think is the
- 15:25splintering of the media ecosystem.
- 15:26So Jim Burke back in the day
- 15:28would go and speak with three
- 15:30or four or five news anchors.
- 15:32And today that's completely different in the
- 15:35era of Twitter and Facebook and, you know,
- 15:38thousands of new media organizations
- 15:41and also the cratering of old
- 15:44traditional legacy institutions that
- 15:46were based in local communities,
- 15:49a lot of newspapers that were
- 15:51kind of the heart and soul,
- 15:53helping people understand what
- 15:55was going on around them.
- 15:57They have gone out of business
- 15:58over the past one or two decades.
- 16:00Obviously,
- 16:01the rise of social media,
- 16:02we'll talk a little bit about that.
- 16:03All this creates an environment
- 16:05in which misinformation and
- 16:06disinformation can thrive.
- 16:11OK, so let's talk a little bit
- 16:13about trust and step back and think
- 16:15about why trust is so important.
- 16:17So the first thing to recognize,
- 16:19I think, is that trust has elements
- 16:21of both risk and vulnerability.
- 16:22Now, if you have nothing at risk,
- 16:24whether it's your reputation or your mental
- 16:26peace or your health or your livelihood,
- 16:28there's no reason to trust.
- 16:30You kind of
- 16:32have to have some level,
- 16:33something at stake, in order to trust
- 16:35that someone will treat you in a
- 16:37way that you want to be treated.
- 16:39Trust is also voluntary;
- 16:42if it's involuntary,
- 16:44it's really just dependency.
- 16:45So trust has to be given in a
- 16:48voluntary manner and it's prospective.
- 16:49It's an assessment of the way that
- 16:51things might go in the future as
- 16:53opposed to satisfaction or discontent.
- 16:55Those are appraisals of how
- 16:56things went in the past.
- 16:57And then finally,
- 16:58of course trust is malleable.
- 17:00You know, it
- 17:01doesn't take a lot to destroy
- 17:03trust, and it takes a lot to gain
- 17:06it back after it's been destroyed.
- 17:08OK, so here's a skeptical view by
- 17:11Dr. W. Edwards Deming, the statistician.
- 17:13He says, "In God
- 17:14we trust; all others must bring data."
- 17:17And I think as scientists, as doctors,
- 17:18as clinicians, actually we
- 17:20tend to think a lot in this way, in that,
- 17:23you know, we want to see the data
- 17:25before we believe something.
- 17:26But that's not actually the way I
- 17:28think most of society operates.
- 17:29That's not even the way that
- 17:32we really operate.
- 17:33So I think it's much closer to Kenneth Arrow,
- 17:36the Nobel Prize-winning economist's conception,
- 17:38which is that virtually
- 17:40every commercial transaction,
- 17:42and I would say commercial,
- 17:43social, medical transaction,
- 17:45has within itself an element of trust.
- 17:48Much of the economic backwardness
- 17:50in the world can be explained by
- 17:52the lack of mutual confidence.
- 17:53So think about just something as
- 17:55simple as going to get a haircut and
- 17:57how much trust is embedded in that.
- 17:59You don't even think about the millions
- 18:02of little things that you are trusting in.
- 18:04Maybe not millions,
- 18:04but at least dozens, right?
- 18:05So you put your credit card
- 18:07information into the Uber app.
- 18:09You trust that it will be secure and private.
- 18:11You call the Uber, you open the door,
- 18:14and some complete stranger asks
- 18:15you to get in their car.
- 18:17You get in their car.
- 18:18You trust that they're
- 18:19not going to kidnap you.
- 18:20You then trust that the Department
- 18:22of Motor Vehicles has given them a
- 18:24license and they are a sensible driver,
- 18:26that they're not going to run red lights,
- 18:28they're not going to get into an accident.
- 18:30You get to the hair salon.
- 18:32You meet the barber.
- 18:34You've never met this barber before.
- 18:35He's going to take scissors and wield
- 18:37them within millimeters of your head.
- 18:39You think he's not gonna
- 18:40cut off your ear,
- 18:41he's not gonna harm you in any way.
- 18:43And then you give him some
- 18:44cotton paper from your pocket,
- 18:47and it's worthless except
- 18:48that everyone believes that it's money.
- 18:51And the entire kind of economic system
- 18:54depends on trust in the US dollar, right?
- 18:57So I think we just don't even
- 19:00think about and assume all the ways in
- 19:02which trust is really central to the
- 19:05functioning of society and why it's so
- 19:08devastating that trust has declined
- 19:09in so many different areas.
- 19:12OK, so, you know, I think that,
- 19:15you know, trust can feel mushy,
- 19:17it can feel complicated.
- 19:18But I think it really comes down
- 19:20to three questions and the ability
- 19:21to answer three questions for people.
- 19:23And I think about this in
- 19:25the patient care setting.
- 19:25I think about this more broadly
- 19:27in the public health world.
- 19:29So the first is this idea of
- 19:30competence. You want to answer:
- 19:31do you know what you're doing?
- 19:33It's very hard to trust someone if you
- 19:35don't think that they're competent,
- 19:36that they know what they're doing.
- 19:38The second is a question of transparency,
- 19:40and it's this idea of: will you tell me
- 19:42what you're doing?
- 19:43you might know what you're doing,
- 19:44but if you're completely opaque about it,
- 19:46it's also very hard to trust someone.
- 19:48And the third is,
- 19:49are you doing it to help me,
- 19:50or are you doing it to help yourself?
- 19:51It's a question of motive.
- 19:52So I think competence,
- 19:54transparency, and motive.
- 19:55You know,
- 19:56I've recently done some
- 19:57writing about vaccine hesitancy
- 19:59and why people are so hesitant.
- 20:00And there's all sorts of reasons,
- 20:02of course,
- 20:02but one thing that came
- 20:04up a lot was this idea of
- 20:05pharmaceutical corporations making
- 20:06a lot of money and pushing out
- 20:09these vaccines on people, and,
- 20:10you know, that is something that is
- 20:12on the minds of a lot of people.
- 20:13And so this idea of motive I think
- 20:16still is really important to think
- 20:18about as we're communicating
- 20:19about science and public health.
- 20:24So let's talk about some very depressing
- 20:26slides about trust in institutions.
- 20:28The first one here is trust
- 20:30in the federal government.
- 20:31This is some polling from Pew.
- 20:33It basically asks, you know,
- 20:35do you trust the
- 20:37government to do what's right,
- 20:38at least most of the time?
- 20:40So as you can see in the
- 20:42Eisenhower administration,
- 20:43the Kennedy administration,
- 20:44even the Johnson administration back then,
- 20:46trust for the federal government to do
- 20:48what's right most of the time was 70 to 80%.
- 20:51That has declined precipitously
- 20:53over the past 50 years.
- 20:57There's obviously a partisan aspect to this.
- 20:59So as you can see,
- 21:00you know when your own party is in power,
- 21:03you're more likely to trust the government.
- 21:04When the other party is in power,
- 21:05you're less likely to trust the government.
- 21:07The red squiggles are higher
- 21:08during the Reagan administration.
- 21:10The blue squiggles are
- 21:11higher during the Obama administration.
- 21:13That all makes sense.
- 21:14But I think what's important to note,
- 21:16and what's particularly damaging here,
- 21:17is that trust in the past when the other
- 21:21party was in power was higher than trust
- 21:24now when your own party is in power.
- 21:26So Democrats under Eisenhower had way
- 21:29higher levels of trust than Democrats
- 21:31do under the Obama administration.
- 21:34Right at the lowest point of Republican
- 21:36trust in the Johnson administration,
- 21:38right.
- 21:38Remember Vietnam here,
- 21:39if you remember kind of the
- 21:41turmoil of that period,
- 21:42the lowest point of trust among Republicans
- 21:45in the Johnson administration was still
- 21:47higher than Republicans' trust in the
- 21:49government during the Trump administration.
- 21:51So this is something that has
- 21:52a partisan valence to it,
- 21:54but it's a broad secular trend.
- 21:58OK, let's look at trust in the media.
- 22:00Trust in the media in 1976 was 72%.
- 22:03This is not
- 22:06a very high bar:
- 22:07How much trust and confidence do you
- 22:08have in the mass media when it
- 22:10comes to reporting the news fully,
- 22:11accurately and fairly,
- 22:12at least a great deal or
- 22:15at least a fair amount.
- 22:17So just a fair amount is
- 22:18what we're looking at.
- 22:19Not a super high bar.
- 22:22So in '76 it was 72%.
- 22:25That's not totally surprising.
- 22:27This was after Watergate, kind of,
- 22:28you know, the high point I think
- 22:30of American journalism. Today
- 22:32that is down to about a third.
- 22:35So only about 1/3 of Americans have even
- 22:37a fair amount of faith that the mass
- 22:40media is doing fair and accurate reporting.
- 22:42Again,
- 22:43there's a huge partisan divide here.
- 22:46And so what's interesting to note is,
- 22:48you know, even 20 years ago,
- 22:49if you look back to 1997,
- 22:511999, you know,
- 22:5459% of Democrats said that they have a
- 22:56fair amount of trust in the mass media,
- 22:5752% of Republicans, not a
- 22:59huge discrepancy. Over time,
- 23:01as you can see,
- 23:02Democrats have actually developed
- 23:04more trust in the mass media, and
- 23:06Republican trust has totally cratered.
- 23:0811% of Republicans have trust
- 23:11in the mass media.
- 23:12And it's important to note that
- 23:14independents are closer to Republicans
- 23:15than they are to Democrats.
- 23:19Alright, getting closer to home, science.
- 23:21So science is, it's a good story, right?
- 23:24So look back in '75:
- 23:27you know, do you have quite a lot
- 23:28of trust or a great deal of trust and
- 23:30confidence in science?
- 23:3170% of US adults said yes;
- 23:34now it's 64%, so not a huge decrement.
- 23:36This is on par with some of the most
- 23:38trusted institutions in the United States,
- 23:40small businesses and the military.
- 23:43This is a little bit less satisfying.
- 23:46So as you can see again,
- 23:48what has happened here is that Democrats
- 23:51and liberals now have more trust than
- 23:53they did 50 years ago, 40 years ago.
- 23:56Independents have declined modestly.
- 23:58Republicans have totally cratered:
- 24:00a 30-point decrement in their trust in science.
- 24:03And I think what's important
- 24:04to note is back in 1975,
- 24:05Republicans had more trust in
- 24:07science than the Democrats:
- 24:0972% of Republicans said
- 24:11they trusted science, only 67
- 24:13percent of Democrats. You know,
- 24:16there's an interesting story
- 24:17to be told here in that,
- 24:18you know,
- 24:19around that time there's a distinction I
- 24:22think some sociologists have made between
- 24:24production science and impact science.
- 24:26So back in the post war era,
- 24:27there's this huge swell
- 24:29of production science,
- 24:30production scientists,
- 24:31things that are thought to fuel
- 24:34the economy that, you know,
- 24:36influence our military might that,
- 24:40you know, are able to develop
- 24:42new technologies that might be
- 24:43helpful and interesting.
- 24:46That over time shifted somewhat,
- 24:47I mean,
- 24:48that's still happening, to impact science.
- 24:50What are the consequences of these things,
- 24:52the most obvious one being
- 24:54something like greenhouse gases,
- 24:55but other things like pesticides,
- 24:57carcinogens,
- 24:57the unintended consequences of science.
- 24:59As that happened,
- 25:01it ushered in also more regulation
- 25:04and things that are kind of
- 25:07anathema to conservative ideals.
- 25:10And so that I think
- 25:11is part of the story here.
- 25:12In any case, huge partisan divide.
- 25:15All right. Medicine.
- 25:17OK.
- 25:17So confidence in medical leaders,
- 25:20this is particularly distressing.
- 25:21So in the 60s,
- 25:23about 75% of Americans had great confidence.
- 25:27Today, that's about 1/3 great
- 25:30confidence in medical leaders.
- 25:33How about doctors in your country?
- 25:34So this is an international perspective.
- 25:37Switzerland, 83% of the Swiss,
- 25:39they're very trusting,
- 25:40strongly agree or agree that
- 25:42their doctors can be trusted.
- 25:43Britain, you know, a comparator country,
- 25:4676%. The US was down at number 24,
- 25:49so fewer than six in ten Americans
- 25:52think that medical professionals
- 25:53can be trusted.
- 25:54I want to throw in a caveat here,
- 25:56which I think is a reason for
- 25:58hope and potentially a way that we
- 26:00can kind of rebuild some of this.
- 26:02The US actually has very high levels
- 26:04of satisfaction with your own doctor.
- 26:06So you may distrust
- 26:07the medical profession in general,
- 26:09you may distrust healthcare,
- 26:10but Americans have among the highest levels
- 26:13of satisfaction with their own doctor.
- 26:15So it's kind of like people hate Congress
- 26:17but they love their own congressmen.
- 26:18It's like a similar type of thing
- 26:20going on here where people do trust
- 26:22in the relationship they have with
- 26:23their own clinicians.
- 26:24And I think this gets back to
- 26:26the power of relationships.
- 26:27So both parties, without partisan
- 26:29differences, have very high levels
- 26:31of trust in healthcare workers
- 26:32that they personally know.
- 26:35And even though doctors
- 26:37in the United States generally
- 26:38may not have as high levels of
- 26:40trust as in some other countries,
- 26:41public trust in doctors and nurses
- 26:43is still much higher than in the
- 26:45rest of the healthcare system,
- 26:46and they are among the most trusted
- 26:48professions in the United States.
- 26:51So Kaiser last year did a poll,
- 26:53and this is a poll just of people who
- 26:54are on the fence about vaccination,
- 26:56about COVID vaccination.
- 26:57Where are you going to turn to figure
- 26:59out whether or not to get vaccinated?
- 27:01By far,
- 27:02your own personal doctor or your own nurse,
- 27:04some healthcare provider.
- 27:05About eight in ten of those people
- 27:07who have some level of
- 27:09vaccine hesitancy said that they
- 27:10would turn to their own clinician.
- 27:12So I just want to keep this in the
- 27:14back of our mind as we're thinking
- 27:15about ways to try to rebuild trust
- 27:17or counteract some of the medical
- 27:19misinformation that's spreading
- 27:20across the country.
- 27:24Alright, so kind of a summary slide here,
- 27:26thinking about the consequences of
- 27:29mistrust in American institutions.
- 27:31Why is it so problematic?
- 27:32I think at heart what it does is
- 27:35it creates a legitimacy vacuum.
- 27:37There are no objective
- 27:38sources of truth any longer.
- 27:41There's a real fracturing of narratives.
- 27:42So because there's not a kind of
- 27:45centralized or several, you know,
- 27:47institutions that people say, OK,
- 27:49when I don't know what's going on,
- 27:50I can really trust this.
- 27:51There's a huge fracturing of narratives:
- 27:53people
- 27:53select their media sources
- 27:55that tell them what they want,
- 27:57or they simply, you know,
- 27:59do what the people around them are doing.
- 28:01And those people may be influenced
- 28:03by by kind of pernicious sources.
- 28:06And then there's lack of shared purpose.
- 28:07There's no coherent shared story
- 28:09that people can feel good about that
- 28:11they're part of some collective thing.
- 28:13We're all part of the same project,
- 28:14and I think all these things
- 28:16create an epistemic environment,
- 28:17an environment of knowledge,
- 28:19and the transmission of information in
- 28:21which misinformation can really thrive.
- 28:24There's two authors that I just
- 28:25want to highlight here who I
- 28:27have a lot of respect for.
- 28:29And what they're talking about here
- 28:30is not just that misinformation
- 28:32or bad information is out there,
- 28:34but actually the processes by
- 28:36which a society comes to figure out
- 28:38what is true and what is false.
- 28:41Those processes are what have been degraded.
- 28:43And so Jonathan Haidt says here,
- 28:45in The Righteous Mind:
- 28:46if you put individuals together in the
- 28:48right way, such that some individuals
- 28:50can use their reasoning powers to
- 28:52disconfirm the claims of others,
- 28:53and all individuals feel some
- 28:55common bond or shared fate that
- 28:57allows them to interact civilly,
- 28:59you can create a group that ends
- 29:01up producing
- 29:02good reasoning as an emergent
- 29:03property of the social system.
- 29:05So he's really saying it's not your
- 29:07own powers of reasoning but really
- 29:10this idea of almost peer review:
- 29:12you know,
- 29:13getting the best people to
- 29:15poke holes in other people's theories,
- 29:17and that's how we move forward.
- 29:20Another Jonathan, this time Jonathan Rauch,
- 29:22who I also really like.
- 29:24He has this book, The
- 29:25Constitution of Knowledge.
- 29:25If you haven't read it,
- 29:27I would highly recommend it.
- 29:29He says the reality-based community is
- 29:30the social network, and he means
- 29:32social networks not in the online sense
- 29:34but in-person social networks,
- 29:37which adheres to the rules and norms
- 29:40of liberal science, liberal in the
- 29:43classical, not the political, sense:
- 29:46objectivity, factuality, rationality.
- 29:47They live not just within
- 29:49individual minds and practices,
- 29:50but on the network.
- 29:51So again,
- 29:52getting at this idea that, you know,
- 29:54we often think that we make
- 29:55decisions by ourselves,
- 29:56that we understand the world
- 29:58because we are researching things
- 30:00in an individualistic way.
- 30:02But actually it is the network,
- 30:04the social milieu,
- 30:06in which we test different
- 30:09theories and iterate upon
- 30:12what we know and what we don't know,
- 30:14that gets us to a better place.
- 30:16And this type of thing, he says,
- 30:18you know, applies not just to
- 30:19science but to scholarship,
- 30:20journalism, government, law.
- 30:21These are kind of the reality-based
- 30:23communities that he's talking about.
- 30:26Alright, so that's,
- 30:27I think, the baseline:
- 30:30we're in a really bad place with trust
- 30:32in almost all institutions,
- 30:34a decline that's developed over the past
- 30:36few decades, and now we have
- 30:38this rise of medical misinformation,
- 30:40in part fueled by social media.
- 30:47OK, first thing to note is that
- 30:50misinformation, we talk about it a lot now,
- 30:52but it's actually a very, very old problem.
- 30:54There's probably been medical
- 30:56misinformation for as long as there's
- 30:58been medicine in some capacity.
- 30:59I came across this Bureau of
- 31:03Investigation that the AMA opened in 1906,
- 31:06and this Bureau was really dedicated
- 31:08to quote, exposing quacks,
- 31:10analyzing suspicious nostrums,
- 31:13and alerting both the medical
- 31:14profession and the public.
- 31:15to unscrupulous promoters.
- 31:17So the language is a little antiquated,
- 31:19but the general themes are things that
- 31:21that we want to be doing, I think.
- 31:22And this Bureau was in existence
- 31:24for six or seven decades, really.
- 31:25And it published articles about
- 31:27deceptive medical claims,
- 31:29responded to questions from the public,
- 31:31worked with federal agencies like the
- 31:33FDA and the Federal Trade Commission,
- 31:35and it served as a resource
- 31:37for journalists who were writing
- 31:39about complex scientific issues.
- 31:41They could call up this Bureau
- 31:42and get someone on the
- 31:44phone. I was going back through
- 31:46some interesting articles
- 31:47that were published in JAMA.
- 31:49This one was JAMA, 1919.
- 31:52So a few years after this Bureau opened,
- 31:53they published this article,
- 31:54just to give you a sense of
- 31:56what types of things
- 31:57they were doing at that time.
- 31:58So this article series was called
- 32:00"The Propaganda for Reform."
- 32:01This was pre-World War Two,
- 32:03so propaganda just meant,
- 32:04you know, information, not
- 32:06what it means now.
- 32:07This is Murray's Infallible
- 32:10System Tonic; they did some
- 32:13evaluations, and analysis found mercury
- 32:15and licorice.
- 32:16And this guy got a fine of $25.
- 32:18Doctor Bedlington's Six Prairie
- 32:21Herbs, sold in Minneapolis:
- 32:22government chemists reported that
- 32:24the odor and taste suggested that
- 32:26of highly diluted and sweetened
- 32:28whiskey. Also a fine, of $10, here.
- 32:30And this is the real ivermectin of the day.
- 32:33So Kilmer's Swamp Root:
- 32:35this cures Bright's disease,
- 32:37acute nephritis, cancer of the liver,
- 32:38acute and chronic gonorrhea,
- 32:40my favorite part, etcetera.
- 32:42So fill in the
- 32:44disease that you want to be cured of here.
- 32:46He got a fine of $150.00.
- 32:48So anyway,
- 32:49the point is people have been dealing with
- 32:52this in one way or another for a very,
- 32:55very long time.
- 32:56But of course it's a different
- 32:58story with social media.
- 33:00Social media really has intensified,
- 33:02I think,
- 33:03the reach,
- 33:04the speed and the consequences
- 33:06of misinformation,
- 33:08the rate at which it
- 33:11accelerates information out into
- 33:12the world, with very little check
- 33:14on it before it goes out
- 33:16into the eyes and ears of millions of
- 33:18people, is like nothing we've seen before.
- 33:20So you know,
- 33:2170% of Americans use some form
- 33:24of social media.
- 33:25But I think that's quite a bit less
- 33:27relevant than the idea that social
- 33:29media actually has a huge influence
- 33:31on our media and our politics
- 33:33and our academic institutions.
- 33:34And so basically the information
- 33:36and the agenda setting that comes
- 33:38out of social media actually
- 33:40trickles down to everyone,
- 33:42whether or not you use social
- 33:43media in some way or another.
- 33:48So I want to be clear,
- 33:50this is going to come off as kind
- 33:52of a social media bashing lecture,
- 33:54but I don't think it's all bad.
- 33:57I think there are things that
- 33:58are important and that we need
- 34:00to be careful about when we're
- 34:02trying to combat misinformation.
- 34:03It does democratize discourse.
- 34:05People who did not have a say,
- 34:07who weren't able to express their opinions,
- 34:09who couldn't get around the
- 34:11gatekeepers of traditional media:
- 34:12this is a way that all those
- 34:14things can happen.
- 34:15People can get their views
- 34:16out there in a way they
- 34:18never had the opportunity to before.
- 34:20On the other hand,
- 34:21I think it's important to recognize
- 34:23that very few accounts can often
- 34:26dominate an entire conversation.
- 34:27So there was an analysis in 2021:
- 34:29only 12 accounts,
- 34:31the "Disinformation Dozen,"
- 34:32were responsible for 65% of
- 34:34anti-vaccine information on Twitter,
- 34:36Facebook and Instagram that year.
- 34:37So just imagine: we
- 34:40think we are democratizing discourse,
- 34:42and in some ways we are, but at the same
- 34:44time, because of the virality of
- 34:46the way that information spreads,
- 34:48this doesn't look very democratic either.
- 34:50This is 12 people, or 12 accounts,
- 34:52that are able to have a huge,
- 34:54huge effect on an entire conversation
- 34:58around vaccine misinformation.
- 35:02So I think the hope is, and I think you
- 35:04know I have shared this hope and I still
- 35:07share it to some extent, that social
- 35:09media will be the new town square.
- 35:12It will be an online space or online spaces
- 35:15where people can connect with one another,
- 35:17they can share their views and they
- 35:19can debate issues of importance and
- 35:21these are all good things that we
- 35:23hope that social media could be.
- 35:25But I think the reality is much darker,
- 35:27and the reality right now is that
- 35:30the business model of a lot of these
- 35:32platforms relies on advertising
- 35:33and the selling of your data.
- 35:36It rewards outrage at the expense of nuance.
- 35:39It increases political polarization.
- 35:41It increases confirmation bias.
- 35:42You can always find what you want
- 35:44and what your side is saying.
- 35:46There's a tremendous amount
- 35:47of information overload.
- 35:48And so even if there's a lot of good
- 35:50information out there and there's
- 35:51some bad information out there,
- 35:53it's very hard to figure out what
- 35:54is what for a lot of people.
- 35:55And there's a tremendous
- 35:57amount of harassment,
- 35:58including of physicians,
- 35:59as we'll see in a couple slides,
- 36:01of people who share unpopular
- 36:02opinions, unpopular
- 36:03in any sense that you want to take it,
- 36:05but particularly in this case
- 36:08physicians and other clinicians who
- 36:09share pro-vaccine messaging; they
- 36:11have been subject to tremendous
- 36:13amounts of harassment online,
- 36:14sometimes in person as well.
- 36:17So I think we all kind of have
- 36:19a general sense, but just to
- 36:20put a fine point on it:
- 36:22during the COVID pandemic
- 36:24misinformation has been tightly linked
- 36:26to the use of unproven treatments.
- 36:28It has been linked to non adherence to
- 36:30public health mitigation measures and
- 36:32to high levels of vaccine hesitancy.
- 36:34I came across a study recently
- 36:36which was, you know,
- 36:38incredibly distressing:
- 36:39higher immunization rates,
- 36:41not 100% but just higher,
- 36:43could have prevented between a third
- 36:45and half of all US COVID deaths
- 36:48over the past year and a half.
- 36:52Alright, So what to do about it?
- 36:53Here's my comprehensive list of
- 36:55things that I'm confident will work.
- 37:00But here are some things for discussion.
- 37:04The first is algorithmic adjustment,
- 37:06and we'll talk about that in the next slide.
- 37:08The second is misinformation research
- 37:11and a surveillance system, to try to
- 37:14respond quickly to misinformation
- 37:16when it's breaking out.
- 37:17The third is the training
- 37:19of medical professionals and
- 37:20other public health officials.
- 37:21And the fourth is really
- 37:22engaging communities.
- 37:23And that gets back to this idea that trust
- 37:26is fundamentally relationship-based,
- 37:28and so,
- 37:29while it's possible to gain
- 37:31trust at a distance,
- 37:32I think we're on surest footing when
- 37:34we're working within our own communities
- 37:36and with people we know and meet.
- 37:40So algorithmic adjustment.
- 37:41So as we know, in the past, and
- 37:45certainly for many platforms currently,
- 37:47their algorithms tend to promote
- 37:49more extreme and sensational content.
- 37:52That's not surprising because that is
- 37:54what we subconsciously or consciously
- 37:56seek as people and it promotes user
- 37:59engagement and the goal is to get
- 38:01you on that platform as long as
- 38:04possible and get you coming back.
- 38:05There was some interesting work done
- 38:07by Zeynep Tufekci a couple years ago.
- 38:10She's a sociologist.
- 38:11Some of you may know her. What she would
- 38:13do is she would basically start going
- 38:15to YouTube and putting in a topic
- 38:17that she wanted to know more about.
- 38:19And what she found was, when she
- 38:20put in a topic,
- 38:21the next video that YouTube served
- 38:23up was always more extreme than
- 38:24the one that she had started with.
- 38:26So she starts with vegetarianism,
- 38:28which leads her to veganism,
- 38:29which leads her to the Paleo diet,
- 38:31which leads her all the way to
- 38:33some crazy diets.
- 38:34She enters jogging,
- 38:37wanting to learn jogging techniques,
- 38:38and within a few videos that goes
- 38:39down to ultramarathons.
- 38:41You put in Trump,
- 38:43you get to some right-wing conspiracy
- 38:45theories; you put in Sanders,
- 38:46you go the other way.
- 38:48So this is what has been going
- 38:50on on a lot of platforms, and
- 38:52particularly YouTube in this case.
- 38:54And so the broad idea is to design
- 38:56algorithms that try to reduce the
- 38:58visibility of misinformation and to
- 39:00elevate high quality information.
- 39:02And YouTube actually recently partnered
- 39:04with the National Academy of Medicine
- 39:07to start to lay out some fundamental
- 39:09principles of what platforms should be
- 39:12doing to elevate credible health sources.
- 39:15These aren't groundbreaking principles.
- 39:16You want to be science based,
- 39:17objective,
- 39:18transparent and accountable.
- 39:19But I think it's a step,
- 39:20at least at YouTube,
- 39:21I think has recognized that
- 39:22a lot of this is happening.
- 39:24They're working with the National
- 39:26Academy and trying to move forward.
- 39:28I think there's a lot more work
- 39:29that needs to be done here,
- 39:30but it's on the encouraging side.
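The "algorithmic adjustment" idea, reranking so that engagement alone no longer decides visibility, can be sketched in a few lines. Everything below is an illustrative assumption, a toy scoring scheme with made-up posts, not any platform's actual ranking system:

```python
# Illustrative sketch of "algorithmic adjustment": rerank a feed so that raw
# engagement no longer decides visibility by itself. The scoring scheme,
# weights, and posts are hypothetical, not any platform's real algorithm.

def rank_feed(posts, credibility_weight=0.7):
    """Order posts by a blend of engagement and source credibility.

    Each post has 'engagement' and 'credibility' scores in [0, 1]; a
    credibility score might come from a vetted list of health sources.
    credibility_weight=0.0 reduces to a pure engagement ranker.
    """
    def score(post):
        return (credibility_weight * post["credibility"]
                + (1 - credibility_weight) * post["engagement"])
    return sorted(posts, key=score, reverse=True)

feed = [
    {"id": "viral-rumor", "engagement": 0.95, "credibility": 0.1},
    {"id": "health-dept-post", "engagement": 0.40, "credibility": 0.9},
]

# Engagement-only ranking puts the rumor first...
print([p["id"] for p in rank_feed(feed, credibility_weight=0.0)])
# ...while weighting credibility elevates the vetted source.
print([p["id"] for p in rank_feed(feed, credibility_weight=0.7)])
```

The hard design question, which the National Academy principles only gesture at, is who assigns the credibility scores and how transparently.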
- 39:32So one part of this is elevating credible
- 39:34information and the other part of this
- 39:37is flagging and removing misinformation.
- 39:38And this often runs into a lot
- 39:41of arguments around free speech,
- 39:43if not the letter of the law,
- 39:46you know
- 39:46(these are in general private entities;
- 39:48they can have whatever terms-of-use
- 39:50agreements they want to enforce),
- 39:52then the spirit of free speech.
- 39:54And that's what I think a lot of
- 39:56people often say: look,
- 39:57people should be able to say what
- 39:59they want to say.
- 40:00The best ideas win out.
- 40:01This is a marketplace of ideas.
- 40:02I think for a number of reasons
- 40:04that we can talk about,
- 40:05that doesn't totally hold water.
- 40:07But that's kind of the argument.
- 40:08In any case,
- 40:09I do think there needs to
- 40:10be a lot more thinking,
- 40:11a lot of discussions about
- 40:13the threshold at which we're
- 40:15going to flag a claim as misinformation.
- 40:17Science changes quickly.
- 40:19There's often a lot less certainty
- 40:21around all sorts of things
- 40:22than we would like there to be.
- 40:24And so that does need to
- 40:26be a topic of discussion.
- 40:28But the easiest thing,
- 40:28or the most straightforward thing I think,
- 40:30is to start with material that is clearly
- 40:33pernicious and false and demonstrably so,
- 40:35and that even after warnings
- 40:37people refuse to take down.
- 40:40So I think that's a place to start.
- 40:42But again, this
- 40:43type of thing, I think,
- 40:45is still in its early days.
- 40:48The second topic is misinformation
- 40:50research and surveillance.
- 40:51And so I think, you know,
- 40:53we hear a lot about misinformation
- 40:54that is fueled by social media.
- 40:56But this is an incredibly new phenomenon,
- 40:58you know, probably in the last five years,
- 41:00certainly not more than the last 10 years,
- 41:02it's really taken off.
- 41:03And so, you know,
- 41:05it's often the academic thing to say,
- 41:06you know, we need more research.
- 41:07That's always the
- 41:08last line in your paper.
- 41:10But I think in this case,
- 41:11you actually do need more research
- 41:12and we need to figure out exactly
- 41:15how misinformation spreads.
- 41:17We need to figure out what the health
- 41:18and social consequences are,
- 41:19but importantly we need to figure out how
- 41:22they vary by things like the platform,
- 41:24the country and the demographic group.
- 41:27You know,
- 41:28misinformation does not have one set of
- 41:32effects for people with very high versus
- 41:34very low education levels, or for people
- 41:37in one country versus another, or on
- 41:39Twitter versus Facebook.
- 41:42And so these types of things I
- 41:43think we still need to tease out.
- 41:45You know,
- 41:46there are some things that people
- 41:48have shown can be effective.
- 41:49These aren't, you know,
- 41:52widely used things, but one,
- 41:54in studies, is exposing people to
- 41:56cross-attitudinal news outlets,
- 41:57meaning, you know, if you're liberal,
- 42:00seeing more of the conservative
- 42:02side of things, and vice versa.
- 42:04In other,
- 42:05you know,
- 42:06prior research that has not been as effective,
- 42:08but at least some studies around social media
- 42:10have shown that this may be the case.
- 42:13And then prompting users
- 42:14simply to think about accuracy
- 42:17seems to decrease their
- 42:18intention to share things,
- 42:20either things they haven't read or things
- 42:21they're on the fence
- 42:24about whether it's true or not.
- 42:25I do want to highlight
- 42:27this organization,
- 42:27the Social Science Research Council.
- 42:29They recently
- 42:31launched this project, called the
- 42:33Mercury Project, last year, and the whole
- 42:35idea is to fund a global consortium
- 42:37of researchers to really examine the
- 42:39causal effects of misinformation
- 42:41and potential interventions.
- 42:42So, you know, the hope is that
- 42:44in a few years
- 42:44we have a much
- 42:46stronger evidence base to figure
- 42:48out what to deploy
- 42:49to minimize the consequences
- 42:50and the harm of misinformation
- 42:52in the medical setting.
- 42:56You know, one of my colleagues,
- 42:58David Scales at Cornell,
- 42:59he's done a lot of work on
- 43:01misinformation and thought a lot about
- 43:04a misinformation surveillance system.
- 43:06And this is really modeled off a surveillance
- 43:08system for emerging infectious diseases.
- 43:11And so the idea would be that you
- 43:13would observe in real time deviations
- 43:15from routine levels of misinformation,
- 43:17just like we do for infectious diseases.
- 43:19You would be able to characterize
- 43:21and describe how the disease,
- 43:23in this case misinformation,
- 43:24is spreading through networks.
- 43:26You would identify
- 43:28superspreaders and superspreading events,
- 43:30and ideally then employ
- 43:32evidence-based methods for neutralizing
- 43:34those misleading messages.
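The surveillance idea, watching for deviations from routine levels of misinformation the way syndromic surveillance watches case counts, can be sketched with a simple rolling-baseline outlier check. The window, threshold, and mention counts below are illustrative assumptions, not part of any deployed system:

```python
# Minimal sketch of misinformation "surveillance": flag days whose mention
# counts for a claim deviate sharply from the recent baseline, analogous to
# how infectious-disease surveillance flags unusual case counts. The window,
# z-score threshold, and data are illustrative, not from a real system.
import statistics

def flag_outbreaks(daily_counts, window=7, z_threshold=3.0):
    """Return indices of days whose count is a high outlier vs the prior window."""
    flagged = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mean = statistics.mean(baseline)
        sd = statistics.stdev(baseline) or 1.0  # guard against a flat baseline
        if (daily_counts[i] - mean) / sd > z_threshold:
            flagged.append(i)
    return flagged

# Routine chatter around ~100 mentions/day, then a sudden spike.
mentions = [98, 103, 95, 110, 101, 99, 104, 97, 102, 850]
print(flag_outbreaks(mentions))  # the spike on the last day is flagged
```

A real system would also need the characterization step the talk describes: tracing how the flagged claim moves through networks, not just that its volume jumped.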
- 43:36So I want to walk through kind of
- 43:39one example of how this might work.
- 43:41You know,
- 43:42it certainly hasn't been deployed yet,
- 43:43but this would be kind of the idea.
- 43:47So here's the situation.
- 43:49The CDC comes out with a
- 43:51report in September of 2020.
- 43:53It's a report of 314 people who were
- 43:56part of the study in July 2020:
- 44:00154 of them had COVID-19,
- 44:02160 of them were controls, and
- 44:04they're trying to figure out
- 44:05what were the things that
- 44:07made some people more likely to get
- 44:08COVID and other people not.
- 44:10They come up with
- 44:12these two things, which are
- 44:15not incredibly groundbreaking:
- 44:16having close contact with
- 44:18someone with COVID-19,
- 44:20you're more likely to get it, and
- 44:21going to bars and restaurants,
- 44:22indoor settings where you can't
- 44:24wear your mask as much and
- 44:26there may not be great ventilation.
- 44:27As part of the study,
- 44:28they also ask people: how much of
- 44:30the time are you wearing your mask?
- 44:32Are you a mask wearer or not? And
- 44:34the people who got infected said,
- 44:36you know, 70% of them said, look,
- 44:38we almost always wear our mask;
- 44:4015% say we mostly wear our mask.
- 44:42So together 85% say we
- 44:44mostly wear our mask.
- 44:45Obviously,
- 44:46in close contact with someone in your home,
- 44:48you're not wearing your mask at that time;
- 44:50obviously when you're eating at a restaurant,
- 44:51you're not wearing a mask.
- 44:52So that's kind of the deal here.
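The sound way to read a case-control design like this one is to compare exposure between cases and controls, not to quote the case percentage alone. A small worked example; the cell counts are assumptions reconstructed from the rounded percentages discussed here, and the control-group figures in particular are illustrative, not the report's exact tabulations:

```python
# Why "85% of infected people wore masks" doesn't show masks fail: in a
# case-control study you compare exposure among cases with exposure among
# controls. The cell counts below are illustrative assumptions built from
# rounded percentages, not the report's exact numbers.

def odds_ratio(cases_exposed, cases_total, controls_exposed, controls_total):
    """Odds of exposure among cases divided by odds among controls."""
    case_odds = cases_exposed / (cases_total - cases_exposed)
    control_odds = controls_exposed / (controls_total - controls_exposed)
    return case_odds / control_odds

# Assumed mask wearing: ~85% of 154 cases, and a similar ~89% of 160 controls.
# An odds ratio at or below 1 means mask wearing doesn't distinguish cases
# from controls, so it cannot be read as evidence that masks fail.
print(round(odds_ratio(131, 154, 142, 160), 2))  # 0.72

# Assumed restaurant dining: ~41% of cases vs ~28% of controls. An odds
# ratio well above 1 means this exposure does distinguish the groups.
print(round(odds_ratio(63, 154, 45, 160), 2))  # 1.77
```

The point of the sketch: if nearly everyone, infected or not, wears a mask, the case percentage alone is a base-rate artifact, which is exactly the misreading that follows.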
- 44:55OK, so The Federalist is an
- 44:58online conservative magazine.
- 44:59It's not incredibly widely read,
- 45:01but it is cited a lot by
- 45:04right-of-center radio hosts and cable hosts.
- 45:06They interpret this study as saying masks
- 45:09and face coverings are not effective
- 45:11in preventing the spread of COVID-19.
- 45:13This was on October 12th, 2020:
- 45:1585% of people who got
- 45:17infected wore their masks;
- 45:18therefore, voila, masks are not effective.
- 45:23The next day, Tucker Carlson, on his show,
- 45:25which reaches 4 million viewers every night:
- 45:27almost everyone,
- 45:2885%, who got the coronavirus in July was
- 45:31wearing a mask, and they were infected anyway.
- 45:33So clearly this doesn't work
- 45:35the way they tell us it works.
- 45:37Two days later, Donald Trump picks it up:
- 45:40did you see the CDC,
- 45:42that 85% of people wearing
- 45:43the masks catch it?
- 45:45OK,
- 45:45I always have trouble imitating
- 45:47his sentence structure,
- 45:48but that's what he says, and
- 45:50this town hall reaches 13 million people.
- 45:52OK,
- 45:52so we go
- 45:53from this tiny Federalist magazine
- 45:55online to 20 million people in the
- 45:58matter of three days, because
- 46:00of the way this was picked up.
- 46:03So the idea here,
- 46:03and this is David's
- 46:05work that I'm presenting here,
- 46:06is to come up with an epidemiologic
- 46:09model to counter misinformation.
- 46:12So he says, OK, how would this work?
- 46:13Various sources would provide data feeds.
- 46:16These sources would be things like Google,
- 46:18Facebook, and
- 46:18other platform-based monitoring tools.
- 46:21They would provide data feeds to
- 46:23"infodemiologists."
- 46:24This is his version of epidemiologists.
- 46:26Not the most catchy term,
- 46:27but infodemiologists
- 46:29integrate this information;
- 46:31they have their own on-the-ground
- 46:34people that they're talking
- 46:35to, hearing what they're hearing.
- 46:36This is modeled off a program that exists,
- 46:39the program for monitoring infectious
- 46:40diseases, whose
- 46:42clinician members share information
- 46:43through a sentinel network.
- 46:45You would spot something like the
- 46:47mischaracterization in The Federalist,
- 46:49this paper here, and you would
- 46:51see that, and you would
- 46:53see that it's starting to spread.
- 46:54You'd use preemptive messaging,
- 46:55basically get the study's authors out
- 46:57there reiterating their findings,
- 46:59dismissing this type of misreading
- 47:01of the information, and you'd
- 47:03disseminate that information
- 47:04to community-based
- 47:05infodemiologists and fact-checkers.
- 47:06This would all ideally happen before
- 47:08the President of the United States,
- 47:10you know, says it on national
- 47:12television to 20 million people.
- 47:13That's kind of the idea. You know,
- 47:15whether or not something like this
- 47:17works I think depends a lot on whether
- 47:19you have evidence based strategies to
- 47:21actually counteract that type of thing.
- 47:23But at least knowing about it early on,
- 47:24having a surveillance system I think is an
- 47:27interesting idea and potentially promising.
- 47:30OK.
- 47:31So let's talk about training
- 47:32medical professionals.
- 47:33As I mentioned before,
- 47:34physicians and nurses are
- 47:36still among the most trusted
- 47:38professions in the United States.
- 47:39We obviously receive very little
- 47:42training in techniques
- 47:43on how to address false claims,
- 47:46either online or in person.
- 47:48That is starting to change.
- 47:50Duke University,
- 47:51I think, has one of the
- 47:52most interesting programs.
- 47:52They have a program specifically
- 47:54dedicated to training clinicians
- 47:56on how to address misinformation,
- 47:58how to engage with patients
- 48:00who have beliefs that are informed
- 48:02by poor-quality information. And so I
- 48:04think this is a type of model that
- 48:06could be scaled
- 48:08to other institutions as well.
- 48:12One thing I want to note is that it's
- 48:14also important to think about how to
- 48:17support clinicians who speak out.
- 48:19I was shocked when I saw
- 48:20some of these numbers.
- 48:21But, you know,
- 48:22one in four physicians,
- 48:23according to some surveys,
- 48:24have been attacked on social media.
- 48:26The most common reason is vaccine advocacy.
- 48:29One in six female physicians have been
- 48:31sexually harassed on social media.
- 48:33And 2/3 of physicians who are
- 48:35interviewed in media outlets about
- 48:37COVID-19 report being attacked online.
- 48:39Even if you say, you know,
- 48:41these are high numbers, and
- 48:41we take half of that,
- 48:43that's still incredibly alarming.
- 48:44And so there are now these groups
- 48:47forming that are basically groups
- 48:49of clinicians who can come to your
- 48:52aid if you're being attacked online;
- 48:53they kind of descend and they're
- 48:56supposed to try to
- 48:57support you online and kind of
- 48:59develop a little bit of a cocoon.
- 49:00That is something that is happening as well.
- 49:04And the last point here is
- 49:06around engaging communities.
- 49:06And I think this is, you know,
- 49:08a really fundamental point
- 49:10for all of healthcare,
- 49:11but also for misinformation
- 49:13in particular, and,
- 49:15you know,
- 49:15really engaging with community leaders.
- 49:17And that's particularly true of
- 49:20communities that have been systematically
- 49:23targeted by misinformation.
- 49:25You know, in New York, Rockland County
- 49:27has a large Jewish community,
- 49:29an Orthodox Jewish community.
- 49:31It has been targeted by
- 49:33vaccine misinformation
- 49:34for years, and now we had our first case of
- 49:37paralytic polio in the United States
- 49:40in years.
- 49:42And so this is an example: you
- 49:45can see over the past decade
- 49:48what type of flyers,
- 49:49what type of information has been going into
- 49:51this community, with the end result being,
- 49:53you know, the first case of
- 49:54paralytic polio in a long time.
- 49:56And so working with community
- 49:58leaders who have respect,
- 50:00who have relationships within
- 50:01those communities, I think
- 50:02is a really important
- 50:04part of countering misinformation.
- 50:05It doesn't have this kind of national scale
- 50:08that we always want with everything.
- 50:10But I think it
- 50:11has more promise than
- 50:14some of those other things as well.
- 50:16You know,
- 50:16I want to
- 50:17note one thing here
- 50:19around health systems and their
- 50:21role in engaging communities.
- 50:23Where I trained, at MGH, during the pandemic,
- 50:25they started virtual town halls where
- 50:27they were just answering questions about
- 50:29basic things that community members,
- 50:31patients were interested in,
- 50:33wanted more information.
- 50:34Out and they've continued those overtime
- 50:37and some of them are now hybrid town halls.
- 50:40But that type of thing I think lets
- 50:42you develop the type of relationship
- 50:44with the local community that when
- 50:47someone hears something online but
- 50:48they know a doctor or a nurse or
- 50:51a clinician at a health system or
- 50:52they've been going to these town
- 50:53halls for some time and what they hear
- 50:55is totally different from what we're saying,
- 50:57I think it provides an opportunity
- 50:58to engage them,
- 50:59but also provides an opportunity
- 51:01to gain the trust that you need
- 51:03to pre-bunk claims, you know,
- 51:05not debunk but pre-bunk:
- 51:06before people are even exposed
- 51:08to certain things,
- 51:09tell them about the techniques
- 51:11of people who are, you know,
- 51:13misinterpreting studies,
- 51:14for example;
- 51:14tell them about the techniques that
- 51:16are used to try to misrepresent
- 51:18data that's out there.
- 51:22OK. So we covered a lot of ground, I think.
- 51:24But I just wanted to kind of
- 51:26summarize by saying trust has fallen
- 51:28precipitously in the past few decades.
- 51:30That has created an information
- 51:32environment and ecosystem that
- 51:33is ripe for misinformation,
- 51:35which has been supercharged
- 51:37by social media and
- 51:38has all sorts of harmful social,
- 51:40economic and health consequences.
- 51:42And there's a few strategies that I think
- 51:45we should start thinking about to try
- 51:48to combat some of that misinformation.
- 51:49Again, I want to thank you for
- 51:51allowing me to come back.
- 51:52And it's always a real pleasure
- 51:53to be here at Yale.
- 52:06Thank you so much.
- 52:07Dhruv, that was terrific.
- 52:08Now just to let you folks
- 52:10know what we're going to do,
- 52:11we're going to keep going
- 52:12until 1:00 o'clock.
- 52:13So if you've been watching the clock, I
- 52:15think that's been like that since
- 52:172003, but anyway,
- 52:20while I was here
- 52:21it was still like that.
- 52:22Yeah. So time stands still
- 52:24here at the Alliance center.
- 52:26What we're gonna do now is
- 52:27Dhruv and I will have a seat,
- 52:29and I ask you to raise your hand.
- 52:31I'll moderate a conversation.
- 52:32And for the folks who are
- 52:33watching on zoom, if you can,
- 52:35please send in questions that you
- 52:36might have, or comments, in the
- 52:38Q&A portion, and I'll look at that and
- 52:40I'll read some of those to Dhruv as well.
- 52:44So we're going to let me get
- 52:45hooked up with the mic here.
- 52:46Ohh, just to let you know.
- 52:47By the way,
- 52:49before I forget, we
- 52:50have plenty of extra sandwiches, so please.
- 52:53Take one home for your loved one or for
- 52:56your lunch tomorrow or something like that.
- 52:59And for those of you who are zooming
- 53:00in, join us in two weeks, when you
- 53:02can get a free sandwich on top of
- 53:03everything else and all this fellowship.
- 53:05Have a seat, Dhruv.
- 53:06We're going to get started.
- 53:21All right, am I? Am I working?
- 53:25Yep. Terrific. Ben. Karen,
- 53:27if you've got the mic ready, Ben,
- 53:30I think has something he wants to ask.
- 53:38What's that?
- 53:41So, thank you for a really
- 53:43chilling and great talk.
- 53:45You know at the beginning you pointed
- 53:48to a constellation of potential
- 53:50causes and I mean you know there
- 53:52I can see many of them are very
- 53:54plausible during your talk you
- 53:56really drilled in on social media.
- 54:00I'm curious, do you think that is
- 54:02the the primary cause of this?
- 54:04The levels of mistrust?
- 54:05It seems like, you know,
- 54:07it precedes the development of
- 54:10social media, and I guess I'm
- 54:12left wondering if
- 54:14that's a marginal contributor
- 54:16and we don't address the, like,
- 54:19fundamental underlying causes.
- 54:23Playing with social media may be just
- 54:25sort of making marginal differences
- 54:27in the, like, fundamental
- 54:29problem. Yeah, I mean, I
- 54:30think it's a great point. So.
- 54:32So I'm not trying to argue
- 54:35that social media has contributed
- 54:37to mistrust, necessarily.
- 54:39I think, as you're saying,
- 54:40all those trends preceded social
- 54:42media by decades in some cases.
- 54:44I do think it's the case that a
- 54:47lot of the misinformation and the
- 54:48speed at which misinformation can
- 54:50spread is fueled by social media.
- 54:53And so I think the environment
- 54:55that we have
- 54:56because of these forces
- 54:58whether it's political polarization,
- 54:59economic instability,
- 55:02you know, secular distrust
- 55:05in institutions — that I think
- 55:08creates an environment in which
- 55:10people don't know what to think,
- 55:13they don't know who to trust
- 55:14and they turn to
- 55:16alternative sources for
- 55:19information. And you could imagine a world
- 55:21without social media in which you would just
- 55:24have newspapers and magazines,
- 55:25and maybe even newspapers and magazines
- 55:28that peddled misinformation
- 55:30and disinformation, and you could
- 55:32still go to your own outlet.
- 55:34I don't think that would have the virality
- 55:37of spread that social media allows.
- 55:41I think someone would still
- 55:43need to go to this,
- 55:46you know, this particular outlet
- 55:48to get that information,
- 55:49whereas I think social media kind
- 55:51of supercharges that. And
- 55:53even if you don't see it on social media,
- 55:55you know, Tucker Carlson sees
- 55:57it on social media, or Trump or
- 55:59someone else, who then is able to,
- 56:03you know, use it to their
- 56:05political benefit or whatever it might be.
- 56:07But I do take your point in that social
- 56:09media is certainly not everything.
- 56:10I mean this talk was focused on social media,
- 56:12but you know, even in this example, right?
- 56:15I mean, it was Fox News,
- 56:17a very mainstream organization, that
- 56:19ultimately took it from what it was
- 56:21to millions and millions of people.
- 56:24So I think it's part of the story.
- 56:27I think we would be in a much better
- 56:29place if we could reduce the level
- 56:30of misinformation on social media.
- 56:32But I think you're right.
- 56:33I mean, the
- 56:33issues around mistrust and distrust, and
- 56:38other forms of misinformation,
- 56:39I think would still persist.
- 56:43Jack, please. — Dhruv, that was great. A
- 56:47number of people have attributed at
- 56:51least part of this increase in mistrust
- 56:54and distaste with the other team to
- 56:57the differences in socioeconomic
- 57:00status, and the widening gap
- 57:02between the upper levels and
- 57:06the bottom 50%, and that's —
- 57:12it's plausible as a contributor, if people
- 57:16are annoyed with the liberal
- 57:21elite who also make a whole lot more money.
- 57:25And then on the other side,
- 57:26people have noted the well,
- 57:28one study has noted that the last
- 57:31acceptable prejudice on the part of
- 57:34the upper class is for those who are
- 57:37uneducated or less educated,
- 57:39and that that prejudice is
- 57:43noticeable and even —
- 57:47and certainly detected by the folks
- 57:50on the lower end of the social scale.
- 57:53So I don't know — do you find that
- 57:56that resonates? Do you think —
- 57:58I think that's absolutely
- 58:00true. I mean, I think
- 58:02I mean, if
- 58:04you look at, you know, just economic
- 58:05data, all the wage gains over the past
- 58:07couple of decades, and the
- 58:09wealth gains, have gone to not just
- 58:11the, you know, top 50% but really the
- 58:13top 1% or 5% or 10%, certainly.
- 58:17And so that creates an environment
- 58:20in which you are already living
- 58:22in kind of different worlds with
- 58:24different lives and different access
- 58:27to opportunity including education and
- 58:30I think it's a very understandable
- 58:32set of concerns that
- 58:35people have against
- 58:39certain types of elites who then try to
- 58:42impose their views and values on others,
- 58:44and I think in some cases I agree
- 58:46with those, and those are mine.
- 58:48But at the same time,
- 58:49I don't think there's a whole lot of
- 58:53empathy in the way in which we
- 59:00talk about exactly what you're saying,
- 59:01and I think it's true even —
- 59:04And again I think, you know,
- 59:06I think it's something like, you know,
- 59:08I made, I guess, an ivermectin joke.
- 59:10And you know, even when, you know,
- 59:14Ashish Jha or other people in positions
- 59:16of power call it a horse dewormer and,
- 59:19you know, kind of dismiss it, they're
- 59:22basically belittling the people
- 59:26whose friends may have experimented with it,
- 59:29or who feel like maybe
- 59:33there's something here.
- 59:34I feel like we talk about things in
- 59:37a way that alienates a lot of people.
- 59:39And part of the empathy that we
- 59:42need to show is not just for people
- 59:45who who are on our team, obviously,
- 59:46but really there's like radical empathy,
- 59:49which is very hard to muster for people
- 59:52who who are both different from us
- 59:54but also see us as kind of the enemy.
- 59:59OK. Karen, could you give it to the gentleman
- 01:00:01there in the white shirt, and
- 01:00:02while I'm waiting for that I'm gonna
- 01:00:04go ahead and read one of the questions
- 01:00:05from the zoom audience.
- 01:00:08It seems that there is a cycle of mistrust
- 01:00:10between the lay public and governmental
- 01:00:12and or public health authorities,
- 01:00:14both mistrust and the other with
- 01:00:16respect to public health leaders and
- 01:00:17unwillingness to speak about matters
- 01:00:19with nuance because of the concern that
- 01:00:21it may be mishandled by the public.
- 01:00:23Leading to a feedback loop of
- 01:00:26spiraling mistrust.
- 01:00:27Do we underappreciate the degree to
- 01:00:30which our leaders and public figures
- 01:00:32are contributors and not just simply
- 01:00:34victims of the culture of mistrust?
- 01:00:36I think absolutely. I mean,
- 01:00:38as I said,
- 01:00:40I don't think it's unidirectional.
- 01:00:41I think a lot of the mistrust that's
- 01:00:43out there is earned, and it has been.
- 01:00:48There have been a number of failures
- 01:00:49both in the public health community
- 01:00:51but also more more generally over
- 01:00:53the past however many years,
- 01:00:54and that has contributed to earned mistrust.
- 01:00:57It's not just that — I don't think
- 01:00:59people in positions of power or
- 01:01:01institutions in positions of
- 01:01:03power are the victims, necessarily.
- 01:01:05I think they have also done their part in
- 01:01:08squandering that trust, and, you know,
- 01:01:11one obvious example of that is,
- 01:01:14you know early in the pandemic,
- 01:01:16the Surgeon general of the United States.
- 01:01:17Whatever his motivation was,
- 01:01:19maybe it was to preserve
- 01:01:20masks for medical workers,
- 01:01:22tweets out in all caps,
- 01:01:23that masks don't work and he doesn't
- 01:01:25want the general public to use them.
- 01:01:27And that seems to be an example —
- 01:01:30obviously that's going to sow
- 01:01:32some level of discord and distrust,
- 01:01:34but also not giving it to the
- 01:01:36public straight, right?
- 01:01:37That's an issue of transparency.
- 01:01:39It seems like probably the
- 01:01:40motivation there was look,
- 01:01:41we have a limited number of masks we want
- 01:01:44healthcare workers to have access to them.
- 01:01:46But that
- 01:01:47wasn't the message
- 01:01:48that was communicated;
- 01:01:49it was: masks don't work,
- 01:01:50don't use them.
- 01:01:52And so that is one example
- 01:01:55of many, I think,
- 01:01:57where there's an instance of earned
- 01:02:00mistrust on the part of people in power.
- 01:02:02So, following that along, you
- 01:02:04mentioned the Surgeon General.
- 01:02:06So it makes me think
- 01:02:08of the teaching — of
- 01:02:09the conversation that we've had over
- 01:02:11the years about institutions and the
- 01:02:13importance of institutions, within the
- 01:02:15spectrum of when we discussed the mid-20th
- 01:02:17century breakdown in Europe, the
- 01:02:20institutions themselves coming apart. And
- 01:02:23specifically, I think that, you know,
- 01:02:26where trust has been placed with
- 01:02:29regard to information was,
- 01:02:30you know, with the.
- 01:02:32The anchors from the leading networks,
- 01:02:34for example, Uncle Walter,
- 01:02:36Walter Cronkite was the guy that most
- 01:02:38Americans trusted back in the day,
- 01:02:39so we're told.
- 01:02:41But also, bringing it more
- 01:02:44closely to home, the institutions
- 01:02:46such as Yale School of Medicine
- 01:02:48or Cornell School of Medicine,
- 01:02:50right — these are institutions
- 01:02:52that at one time meant something in
- 01:02:54terms of being able to be trusted.
- 01:02:56But have have the institutions
- 01:02:59including the medical schools
- 01:03:00lost credibility do you think?
- 01:03:02And is there something specific —
- 01:03:05if so, is there something specific that we,
- 01:03:07the medical schools,
- 01:03:08the medical institutions can do to
- 01:03:11improve our own credibility or the
- 01:03:12level of trust that that we're afforded?
- 01:03:15Yeah, it's a good question.
- 01:03:16I mean, I don't, you know,
- 01:03:19pretend to have the answer.
- 01:03:20I mean, I think a good place to
- 01:03:22start is community engagement and
- 01:03:24understanding what the community
- 01:03:27around you needs, and part of that is,
- 01:03:29I think, having listening sessions,
- 01:03:32talking to local community leaders,
- 01:03:34understanding how you can serve the
- 01:03:36needs of the people around you.
- 01:03:37And I think, you know,
- 01:03:39I highlighted something that MGH is doing.
- 01:03:42There's other medical schools that kind of,
- 01:03:44I think some of the newer medical
- 01:03:46schools I've heard like Dell and Kaiser,
- 01:03:48they have a really huge component of
- 01:03:51community engagement which I think
- 01:03:53probably generates some level of goodwill.
- 01:03:55So you know,
- 01:03:56I think that's a place to start.
- 01:03:58You know, I don't think —
- 01:03:59so I don't think it's just
- 01:04:01a communication issue.
- 01:04:02I think a lot of people who
- 01:04:05are in positions of power.
- 01:04:08feel like, you
- 01:04:08know, if we only communicated
- 01:04:09our values better to people,
- 01:04:12If we only found a way to you know
- 01:04:15frame this in a better way, then
- 01:04:17people would believe us or trust us.
- 01:04:19I think in most cases it's an
- 01:04:22underlying value issue —
- 01:04:23you know it's it's where
- 01:04:24are you spending your time,
- 01:04:25where are you spending your budget?
- 01:04:26Are you allowing access to the
- 01:04:28people in your communities?
- 01:04:30Are you serving the
- 01:04:33community in a robust way?
- 01:04:35Are the people in your school
- 01:04:36who are being recruited — are they
- 01:04:39people who have had, you
- 01:04:43know, opportunities through various
- 01:04:44mechanisms to participate if,
- 01:04:46you know, for whatever
- 01:04:48reason, they weren't able to? So
- 01:04:50I think it's looking
- 01:04:51at those types of things,
- 01:04:52as opposed to just putting the varnish
- 01:04:55on communication in some way.
- 01:04:58Thank you. Yes, Sir.
- 01:04:59Thank you so much. First of all,
- 01:05:01Professor Howard Forman speaks
- 01:05:03very highly of you and says hi.
- 01:05:05I had two quick questions.
- 01:05:07The first is that it feels like it's
- 01:05:09important to address misinformation
- 01:05:10if it comes from a tweet on Twitter,
- 01:05:13not in like a long form piece
- 01:05:15necessarily because the audience is
- 01:05:16different. But it's really hard to
- 01:05:18say a true statement in 280 characters
- 01:05:21versus a lie in 280 characters. And
- 01:05:23it's hard to capture all the nuance
- 01:05:25I wanted to get your thoughts on,
- 01:05:26like how to create and spread truth
- 01:05:29in in a short amount of space at a
- 01:05:31time when people's attention spans
- 01:05:33are a lot lower than than before.
- 01:05:35And the 2nd is that it feels like a
- 01:05:37part of the spread of misinformation
- 01:05:39has to do with an emotional response,
- 01:05:41like fear mongering
- 01:05:42or anger. And it's harder to create
- 01:05:45or elicit that emotional response
- 01:05:46for something that's dry and factual.
- 01:05:48So I wanted to hear your thoughts
- 01:05:50about that as well. Thank you.
- 01:05:52Yeah. Thank you.
- 01:05:53You know, truth is certainly,
- 01:05:55I think, at a disadvantage.
- 01:05:57You know, as you said it's
- 01:05:58it's much more appealing when
- 01:06:00something is new or interesting or
- 01:06:02sounds like it couldn't be true,
- 01:06:04probably because it isn't true.
- 01:06:05That's something that that tends to spread.
- 01:06:07I think there is good research to
- 01:06:08suggest that actually incorrect
- 01:06:10or inaccurate statements spread
- 01:06:12faster on social media than
- 01:06:14do factual statements.
- 01:06:16You know, I don't know if I
- 01:06:18have a solution to that.
- 01:06:20I mean, one part of it certainly is
- 01:06:23maybe there are some things around
- 01:06:25algorithms that you can do —
- 01:06:28once you see something taking
- 01:06:31off, to down-weight that in some way.
- 01:06:35You know, I think we would just be
- 01:06:37in a much healthier place if we used
- 01:06:40social media half as much as we do,
- 01:06:42or 80% less.
- 01:06:43And if everyone just did that —
- 01:06:45you know,
- 01:06:46that's not a real policy prescription.
- 01:06:48But you can see over time how people's use of
- 01:06:51different types of social media does change,
- 01:06:53right?
- 01:06:53So younger people don't use Facebook as much,
- 01:06:56right? And some people now use
- 01:06:59Twitter much less than they used to.
- 01:07:02Whatever it might be,
- 01:07:04we know that.
- 01:07:05Instagram has substantial mental health
- 01:07:08effects for particularly teenagers.
- 01:07:10So.
- 01:07:12No,
- 01:07:12I think we often tend, as people who
- 01:07:14are interested in public policy,
- 01:07:16to reach for a policy-oriented tool.
- 01:07:18And maybe there is some of that,
- 01:07:20but I think we sometimes underweight
- 01:07:23cultural forces and the power of cultural
- 01:07:26norms spreading such that maybe,
- 01:07:28you know,
- 01:07:29in five years and 10 years actually
- 01:07:31people are less on certain types
- 01:07:33of social media and more on
- 01:07:35other types of social media.
- 01:07:37You know, there are good spaces online too,
- 01:07:39you know. I think my favorite example is
- 01:07:42Wikipedia — that's an amazing and absurd place.
- 01:07:45Like how could this happen?
- 01:07:46You know,
- 01:07:47how could people spend so
- 01:07:49much time curating this
- 01:07:50wonderful knowledge-filled thing?
- 01:07:54But it happens.
- 01:07:55And you know,
- 01:07:56I don't spend a lot of time on Reddit,
- 01:07:57but from what I understand, on Reddit
- 01:08:00it's not just the things that are liked the most;
- 01:08:03it's actually the delta between
- 01:08:04the likes and the dislikes.
- 01:08:05And so, you know,
- 01:08:07if something is outrageous in some way,
- 01:08:09it's not just one group of people
- 01:08:11that can send it to the top.
- 01:08:12It's the delta between the two.
- 01:08:14So if you're, you know,
- 01:08:15hated by half the site and
- 01:08:17loved by half of it,
- 01:08:18you don't get very far.
- 01:08:19So maybe there are things like that,
- 01:08:22but I don't know. I use
- 01:08:24social media 80% less now
- 01:08:26than I did two years ago, and
- 01:08:28it's been nice for me.
- 01:08:30So I recommend that for other people too.
- 01:08:33Yes,
- 01:08:36if you could give it to the —
- 01:08:36there's a woman here.
- 01:08:37Please, Karen. While we're
- 01:08:38waiting for that to get passed out,
- 01:08:40I like this question very much coming
- 01:08:41in from zoom because it reminds me of
- 01:08:43a of a quote I heard in high school.
- 01:08:44I think it was Thomas Carlyle,
- 01:08:46somebody who said, well, you know,
- 01:08:48in terms of fixing the world,
- 01:08:49he said make an honest man of
- 01:08:51yourself and you can be sure there's
- 01:08:53one less rascal in the world.
- 01:08:54So this is the question,
- 01:08:56which is how do we ourselves
- 01:08:59avoid promulgating misinformation,
- 01:09:00as many did with Purdue Pharma and HRT,
- 01:09:04for example? So how do we?
- 01:09:06The folks in this room and on this zoom call,
- 01:09:08how do we keep from being part
- 01:09:09of the problem?
- 01:09:10Yeah.
- 01:09:12Well, it's a good question.
- 01:09:13I mean, you know, I don't know —
- 01:09:17I don't know if I would call, like, HRT
- 01:09:21misinformation, right.
- 01:09:22I mean, misinformation is — you know,
- 01:09:24at the time, you have to understand
- 01:09:26the evidence base at the time and
- 01:09:28figure out, you know, how
- 01:09:30strong that evidence base is, and if
- 01:09:32the evidence points in one way and
- 01:09:34you support that claim, maybe you
- 01:09:35should do your best to caveat it and,
- 01:09:37you know, hint at the uncertainty with it.
- 01:09:39But I don't view that necessarily
- 01:09:42as misinformation. You know, I don't
- 01:09:45know — I mean, I think a lot of what I
- 01:09:48try to do, at least, is to try to
- 01:09:51foreground my uncertainty with things, and
- 01:09:53that can be unsatisfying, and it's
- 01:09:56not very good social media either, right?
- 01:09:59I mean, social media is fast and
- 01:10:02snarky and sarcastic
- 01:10:04and powerful, and so there is
- 01:10:07this challenge of how do you both,
- 01:10:10you know,
- 01:10:11present the nuances of an issue but also
- 01:10:13get something to kind of spread.
- 01:10:15And
- 01:10:15I mentioned Shirley —
- 01:10:16I think he actually does
- 01:10:18a fantastic job of this, and he
- 01:10:19was able to do a fantastic job
- 01:10:22educating people through, you know,
- 01:10:23not just 280 characters but
- 01:10:25these long threads in which, you know,
- 01:10:28he would really pick apart an
- 01:10:30issue and then kind of get
- 01:10:31into the nuances of it. Good.
- 01:10:33So, two thoughts on
- 01:10:34that real quick, Dhruv.
- 01:10:35So one is that you mentioned that
- 01:10:38we've got to be able to
- 01:10:40Really assess how good the evidence
- 01:10:41is before we say that treatment X is
- 01:10:44actually the best approach to disease Y,
- 01:10:46that we should be able to evaluate the data.
- 01:10:48But if we're being honest with ourselves,
- 01:10:49I think we don't really dive
- 01:10:51that deeply into the data for
- 01:10:53everything we treat, because
- 01:10:55there's a time factor there.
- 01:10:56So we rely, we trust others.
- 01:10:58So, you know, I'm a pediatrician;
- 01:11:01when a statement comes out from the
- 01:11:04American Academy of Pediatrics —
- 01:11:06when that comes out,
- 01:11:07here's a three-page guideline or
- 01:11:09three-page consensus statement, and
- 01:11:12it's followed by a 30 or 40 page
- 01:11:14technical report, and I'm sure
- 01:11:16somebody's reading those technical
- 01:11:17reports, but you know, it's a long
- 01:11:19day and so you read the guidelines
- 01:11:20and maybe you get to the technical
- 01:11:22report if that's a specific subject
- 01:11:24you're interested in,
- 01:11:24but nobody's reading all of them
- 01:11:26for everything that they do.
- 01:11:27And so we are all, to some extent,
- 01:11:30having to trust other sources
- 01:11:32as we go through this stuff. I think
- 01:11:34we need to be really careful
- 01:11:37about which sources we trust.
- 01:11:38I think that's right and that
- 01:11:39raises the question of like
- 01:11:40do you need to tweet it out?
- 01:11:42Like, do you have to share that if
- 01:11:44you haven't read it
- 01:11:45thoroughly and understood the issue?
- 01:11:46And I think the impulse is, yeah, of
- 01:11:48course I need to — like, everyone
- 01:11:49needs to hear my thoughts. But
- 01:11:52that's not
- 01:11:54true, right? I mean, I think
- 01:11:55there's one version of this that is
- 01:11:56kind of getting to the bottom
- 01:11:58of the issue for yourself and your
- 01:11:59colleagues and having that discussion.
- 01:12:01And the other part of this is,
- 01:12:03you know, do you or do you
- 01:12:05not need to share it before you
- 01:12:07truly understand an issue. But
- 01:12:08yeah, I take your point. I think it's —
- 01:12:09thank you. Yes, yes, please.
- 01:12:11Thanks so much, Dhruv, for the
- 01:12:13talk; it was really interesting.
- 01:12:14I have a question more about, like,
- 01:12:17uncertainty, and specifically about mistrust
- 01:12:20and misinformation kind of burgeoning
- 01:12:23during times of deep uncertainty,
- 01:12:25especially with COVID-19.
- 01:12:26We obviously now know a lot more,
- 01:12:28but there's still a lot of uncertainty
- 01:12:30and there's gradations of misinformation
- 01:12:32around various products or even how
- 01:12:34you know treatment and prevention
- 01:12:35and things like that and so.
- 01:12:37One of the things is, you know,
- 01:12:39we saw like TV outlets and
- 01:12:41companies hiring doctors, you know,
- 01:12:44to be that voice that trusted
- 01:12:46messenger to the public in trying
- 01:12:48to dispel misinformation.
- 01:12:50But a lot of the messages have come
- 01:12:51across as very black and white,
- 01:12:53like, yes, you know, certainty about
- 01:12:55certain things — you
- 01:12:56know about certain things but
- 01:12:58not about, like, the gray areas.
- 01:12:59And there doesn't seem to be a lot of,
- 01:13:01at least in social media from doctors in
- 01:13:04very high positions or other clinicians,
- 01:13:06kind of a.
- 01:13:07A like a phrase of like I don't know
- 01:13:09or like we just don't know yet.
- 01:13:11And it seems to be a lot of, you know,
- 01:13:13reliance on confidence as a message.
- 01:13:15And so I'm curious about that and
- 01:13:17how that fosters misinformation or
- 01:13:18that the research is around how to
- 01:13:21actually message medical uncertainty,
- 01:13:23not just in times of like COVID-19,
- 01:13:24but even beyond that when we just
- 01:13:27you're still learning things.
- 01:13:28You
- 01:13:28know, I think a lot about this idea
- 01:13:32of information vacuums and when
- 01:13:35when there's an information vacuum.
- 01:13:39Something's gonna fill it.
- 01:13:40And so, you know,
- 01:13:42I think it's important to — you know,
- 01:13:45Some advice that I've gotten
- 01:13:47is it's important to get out
- 01:13:49into that space if you want to.
- 01:13:50If you want to be someone who's
- 01:13:52communicating to the public,
- 01:13:53but also lead with your uncertainty and
- 01:13:55be comfortable with uncertainty in the
- 01:13:57sense of you may not have everything right.
- 01:14:00You may not know exactly,
- 01:14:03you know, what the R-naught
- 01:14:04of this new variant is,
- 01:14:05or you know what people should be doing.
- 01:14:08But you know, I'm sometimes torn
- 01:14:12about it, because part of me thinks,
- 01:14:14you know, we should be doing
- 01:14:15less of trying to do public
- 01:14:19health through tweets —
- 01:14:21that's not entirely helpful.
- 01:14:23On the other hand,
- 01:14:24you know,
- 01:14:25if public health leaders and
- 01:14:28medical leaders totally
- 01:14:30abdicate that domain,
- 01:14:32other things are going to fill it,
- 01:14:34and presumably that's going to be
- 01:14:35less effective and lower-quality
- 01:14:37information, you know?
- 01:14:39I do think that there's a
- 01:14:41market for people, obviously.
- 01:14:44Obviously it seems like everyone's
- 01:14:46on social media and Twitter
- 01:14:49is kind of where the game is at.
- 01:14:51But actually a very small percentage
- 01:14:53of the US population is on
- 01:14:54Twitter, or actively on Twitter.
- 01:14:56And so there is also this market of
- 01:14:59local news, long form journalism,
- 01:15:02other sources of information.
- 01:15:04And so, you know, I think we should just be —
- 01:15:07you know, as you mentioned,
- 01:15:08Ben — I think we should be
- 01:15:10thoughtful about all the different
- 01:15:11ways in which we can contribute
- 01:15:13to the conversation.
- 01:15:14And it doesn't just have to be
- 01:15:16via tweet or via social media.
- 01:15:19Thank you. Yes, Sir.
- 01:15:21Thanks for the talk. It's really great.
- 01:15:24I wanted to ask you about
- 01:15:28whether different categories of information —
- 01:15:30like, whether the category of information
- 01:15:35is related to, like, how much
- 01:15:39someone is likely to mistrust you.
- 01:15:43You know, a statement about —
- 01:15:44So, like, you know, it's pretty obvious
- 01:15:47that there are a lot of people who will
- 01:15:50believe that a vaccine is bad for you,
- 01:15:52when we know that for the vast
- 01:15:54majority of people, it's helpful.
- 01:15:55Um, and so I'm an oncology fellow.
- 01:15:59So like.
- 01:16:00I am like, amazed sometimes that, like,
- 01:16:03we'll prescribe chemotherapy or something
- 01:16:05that's like potentially very toxic,
- 01:16:07and the patient will trust us to give them
- 01:16:09that, but they won't get the vaccine.
- 01:16:12And so like that,
- 01:16:13that kind of makes me wonder,
- 01:16:15is there something about specific
- 01:16:16types of information that are
- 01:16:18easier to mistrust than others?
- 01:16:21Yeah, I mean, I think the type of
- 01:16:23information is part of it.
- 01:16:25My thought is that the stronger
- 01:16:29influence is a social one — what
- 01:16:32the social norms are and what your
- 01:16:34community believes and thinks about it.
- 01:16:37Very few people — well, a lot of people want to
- 01:16:39do their own homework, and some people do.
- 01:16:42But the vast majority of our
- 01:16:44opinions on issues of importance are
- 01:16:46actually shaped by our communities,
- 01:16:48our families, the people around us.
- 01:16:50And so in the case of
- 01:16:51chemotherapy —
- 01:16:51you know, it obviously can be
- 01:16:55incredibly toxic and challenging to
- 01:16:57go through, but it is widely accepted;
- 01:16:59there's not a polarizing divide
- 01:17:02among Democrats and Republicans
- 01:17:05or conservatives and liberals.
- 01:17:08It's something that is obviously
- 01:17:09given after you have cancer as
- 01:17:11opposed to a preventative measure.
- 01:17:13Why should I put something in my body if
- 01:17:14I don't know I'm going to get COVID, right?
- 01:17:16And so I do think there are
- 01:17:19certainly differences there, but —
- 01:17:21Yeah,
- 01:17:21I think the type of topic or
- 01:17:24information is important, but I
- 01:17:26think more important is just what
- 01:17:28the people around you think and believe.
- 01:17:31And that's why it's so pernicious
- 01:17:33that we now have basically wholly
- 01:17:36separate information ecosystems where
- 01:17:38certain groups of people believe X
- 01:17:40and certain believe Y, and
- 01:17:43they may not even be talking to each other.
- 01:17:47Thanks. So we're gonna have —
- 01:17:50it's still not 1:00 o'clock,
- 01:17:51so we'll keep going,
- 01:17:51but actually we
- 01:17:53will stop at 6:30.
- 01:17:55And I appreciate you folks staying.
- 01:17:56It's gonna be two more questions.
- 01:17:58The gentleman right there.
- 01:17:59And the last question,
- 01:17:59Karen, will be right up here in
- 01:18:01the — I'm going to go with peach.
- 01:18:03There we go.
- 01:18:04I got a cruller. Thanks.
- 01:18:06Thanks so much for talking.
- 01:18:07And I know you were a Zuckerman
- 01:18:08fellow at Harvard and I was too.
- 01:18:10So you know, just shout out to that.
- 01:18:12First off, two-part question.
- 01:18:15First, you talked about Twitter and
- 01:18:17social media being a town square, but
- 01:18:20usually town squares are public, and
- 01:18:22these are privately held companies,
- 01:18:24and so I'm curious as
- 01:18:26to what you think about
- 01:18:27regulating private company activity and
- 01:18:29the social media they're in.
- 01:18:32And then, two, I want to ask:
- 01:18:34let's say regulation is successful.
- 01:18:37What happens if a,
- 01:18:39you know, a different administration comes
- 01:18:41in and isn't so bound to rationality
- 01:18:44and starts policing those elements —
- 01:18:46that would never happen.
- 01:18:52I don't know. I think I'm
- 01:18:53skeptical that regulation will solve
- 01:18:56this problem, in part because I think
- 01:18:59it's very hard to regulate your way
- 01:19:02around information and misinformation,
- 01:19:04and I don't know that the —
- 01:19:08you know, the senators know that much about,
- 01:19:12you know, like, how to go about it.
- 01:19:14One of them didn't even know what
- 01:19:15Twitter is, you know. It's like —
- 01:19:17and so maybe there's a role for it.
- 01:19:19Maybe there are, you know,
- 01:19:21guardrails that they want to erect.
- 01:19:24Maybe there's changes to, for instance,
- 01:19:27you know, a lot of these platforms,
- 01:19:28they are not publishers
- 01:19:30in the traditional sense;
- 01:19:32you know, they cannot be sued for
- 01:19:33anything that's on the platform,
- 01:19:34unlike the New York Times or The New Yorker
- 01:19:36or something like that. And so, maybe.
- 01:19:39But I don't know —
- 01:19:43I don't know how much regulation will help.
- 01:19:45My sense is that a lot of
- 01:19:48the avenues to a better,
- 01:19:51healthier social media environment
- 01:19:53are in part cultural, in part
- 01:19:55social, and in part pressure on these
- 01:19:58companies to do things differently.
- 01:20:00Obviously, I think Elon Musk is
- 01:20:01going to buy Twitter soon.
- 01:20:02So I think things are who knows
- 01:20:04what's going to happen there.
- 01:20:06But you know, I'd be curious to
- 01:20:08hear other people's thoughts about,
- 01:20:10you know are there targeted
- 01:20:13regulatory policies that would be,
- 01:20:16you know,
- 01:20:16very beneficial to the social media.
- 01:20:18That's my — I'm skeptical of it,
- 01:20:20but maybe it's possible.
- 01:20:22So we have time for one
- 01:20:24more question and then I'll just
- 01:20:26wrap it up very quickly please.
- 01:20:28So, thank you
- 01:20:29for speaking. I think you had an
- 01:20:30article in The New Yorker in January
- 01:20:32where you described the pandemic
- 01:20:34as sort of like a liminal period
- 01:20:35or like a transitionary period.
- 01:20:36And how we've become a lot more aware,
- 01:20:39at least the conversation about mistrust
- 01:20:40and misinformation is growing during
- 01:20:42the pandemic and how we may be entering
- 01:20:44a post liminal period at some point
- 01:20:46when we kind of exit the pandemic.
- 01:20:48And you see that as a
- 01:20:49time for change, I think like a
- 01:20:51the monkey pox conversation that
- 01:20:52came a few months after that and
- 01:20:55that we saw the mistrust and
- 01:20:56misinformation surrounding that topic.
- 01:20:58Do you still believe
- 01:20:59like a time of uncertainty,
- 01:21:00a time where there's like a
- 01:21:02pandemic or even an epidemic?
- 01:21:03Is the right time to start a foster
- 01:21:05trust, or do you see that as a
- 01:21:06time for more of damage control
- 01:21:07on the behalf of medical professionals?
- 01:21:10That's a good question.
- 01:21:11I mean, I think times of uncertainty,
- 01:21:12limbo periods, liminal states,
- 01:21:15are times in which a
- 01:21:17lot of progress can and should be made.
- 01:21:19And when there is
- 01:21:21uncertainty and chaos in a society,
- 01:21:24as there has been,
- 01:21:25that's the time to push things in the
- 01:21:28direction that you want to see them go.
- 01:21:31I think there's a lot more we
- 01:21:33should be doing right now around pandemic
- 01:21:35preparedness, and it seems like we're
- 01:21:37turning our
- 01:21:40view away from it for some reason.
- 01:21:42There should be massive amounts
- 01:21:44of investment in next-generation
- 01:21:45vaccines, investing in public health
- 01:21:48departments, and improving ventilation
- 01:21:50systems, all the things that we know work.
- 01:21:52And you know,
- 01:21:53it happened after 1918 and it
- 01:21:54seems like it's happening now.
- 01:21:56People want to forget about what happened,
- 01:21:59but that would be a mistake.
- 01:22:01You know,
- 01:22:01I think in each of the last three decades
- 01:22:04there's been a coronavirus that's mutated
- 01:22:06into an epidemic-type virus, and we
- 01:22:09were lucky the first two times.
- 01:22:11There's no reason to think that in 10
- 01:22:12years or 20 years it won't happen again.
- 01:22:14And so I do think while these
- 01:22:17things are fresh in people's minds,
- 01:22:19while the pieces are shifting and we're
- 01:22:21trying to figure out what comes next,
- 01:22:24is exactly the time to try to
- 01:22:27capitalize on this kind of feeling
- 01:22:29that we need to do something to
- 01:22:31prevent it from happening again.
- 01:22:34Dr. Dhruv Khullar, thank you so much
- 01:22:36for this evening. This is terrific.
- 01:22:37We really appreciate it. Thank you.