
Hybrid Effectiveness-Implementation Trials: Testing Strategies, Assessing Outcomes, and Planning for Sustainability

December 15, 2023

Speaker: Rani Elwy, PhD

November 17th, 2023

This seminar discusses how the process of implementing evidence-based treatments covers three stages. The talk provides guidance on designing and conducting Hybrid Type II and III effectiveness-implementation trials, focusing specifically on theoretical guidance, selecting and operationalizing implementation strategies, and assessing implementation and effectiveness outcomes.


Transcript

  • 00:01<v ->Okay. Well, hello, everybody.</v>
  • 00:04Welcome to our CMIPS seminar.
  • 00:07CMIPS is the acronym for our Center for Methods
  • 00:10in Implementation and Prevention Science.
  • 00:13I'm Donna Spiegelman, the Director of the Center,
  • 00:16and this seminar is being co-sponsored
  • 00:20by the Dissemination and Implementation
  • 00:22Science Methods Core, the NIH T32 training grant,
  • 00:26Implementation Science Research in Methods,
  • 00:32and R3EDI, the Rigorous, Rapid,
  • 00:34and Relevant Evidence Adaptation and Implementation
  • 00:38to Ending the HIV Epidemic Implementation Science Hub.
  • 00:42We're very pleased to welcome our guest,
  • 00:45Dr. Rani Elwy,
  • 00:46who's Professor of Psychiatry and Human Behavior,
  • 00:50and Professor of Behavioral and Social Sciences
  • 00:53at Brown University.
  • 00:57She is a health psychologist, health services researcher,
  • 01:01and an implementation scientist
  • 01:03who examines patients' access to
  • 01:06and uptake of mental health care,
  • 01:08the effectiveness and implementation
  • 01:10of complementary and integrative health services
  • 01:13for treating mental health disorders and pain,
  • 01:17and she works on crisis and risk communication
  • 01:20between patients, families, providers, and health systems.
  • 01:25Dr. Elwy is the Founding Director of the Brown Research
  • 01:28on Implementation and Dissemination
  • 01:30to Guide Evidence Use (BRIDGE) Program,
  • 01:33which sounds like a sister program to our own here,
  • 01:37Co-Director of Implementation Science in Advance-CTR,
  • 01:40Rhode Island's Clinical and Translational Research program,
  • 01:43and Implementation Scientist
  • 01:45in the Biobehavioral Sciences Core
  • 01:47of the Providence/Boston Center for AIDS Research.
  • 01:51Dr. Elwy is a VA
  • 01:53or Veterans Affairs Implementation Scientist
  • 01:56serving as the multiple principal investigator
  • 01:59of two program grants:
  • 02:01The Bridging the Care Continuum
  • 02:03for Vulnerable Veterans Bridge QUERI Program,
  • 02:07which consists of three hybrid
  • 02:09type three effectiveness implementation trials
  • 02:12to increase the uptake of mental health
  • 02:14and substance use services and treatments
  • 02:16among veterans in 18 sites.
  • 02:21And two, the Evidence, Policy and Implementation Center,
  • 02:25a QUERI program
  • 02:26dedicated to building capacity in implementation science
  • 02:29across the entire VA healthcare system.
  • 02:32Additionally, Dr. Elwy is a Fellow
  • 02:34of the Society of Behavioral Medicine
  • 02:37and the recipient of a VA Research Center Scientist Award.
  • 02:41So, clearly, she's got her bona fides
  • 02:43in implementation science all across the spectrum.
  • 02:48Today, she's gonna talk about
  • 02:49Hybrid Effectiveness Implementation Trials:
  • 02:52Testing Strategies, Assessing Outcomes,
  • 02:55and Planning for Sustainability.
  • 02:57And, Dr. Elwy, would you prefer to give your talk
  • 03:00and then take questions,
  • 03:02or would you like to have people pop in questions
  • 03:04into the chat as you go along?
  • 03:06(indistinct)
  • 03:20<v ->Thank you for the invitation and that introduction</v>
  • 03:22and apologies that it was so long.
  • 03:25But anyway, I'm really happy to be here.
  • 03:27And as you see,
  • 03:28I actually took out the word trials and put in studies
  • 03:31and I'll explain why in a minute.
  • 03:36So I have some funding grants
  • 03:39and just wanted to provide those disclosures,
  • 03:41and also that my views are mine alone
  • 03:44and not that of the VA or the federal government.
  • 03:48So when I think about why I feel so passionate
  • 03:52about implementation science...
  • 03:54Actually I just realized that just so you know,
  • 03:56I can't actually see the chat here in the system that...
  • 04:02So yeah, so yeah, thank you very much.
  • 04:06<v Donna>You know, I'll take care of it. (indistinct)</v>
  • 04:09<v ->Thank you.</v>
  • 04:10So when I think about why I'm so passionate
  • 04:13about implementation science
  • 04:14or how I really just fell into it, two things come to mind,
  • 04:17and I'm very well aware
  • 04:19that I'm talking to a very quantitatively strong group,
  • 04:22so at least here are some statistics for you.
  • 04:25So the first is that we all know that on average
  • 04:27it takes 17 years for 14% of research
  • 04:31to make its way into practice.
  • 04:32I recently read that the statistic
  • 04:34is now approximately 15 years,
  • 04:37but that's really not enough of a change,
  • 04:40we still have a long way to go.
  • 04:42And then the other statistic is that 85% of federal research
  • 04:47is wasted every year due to the fact
  • 04:50that we do not move our evidence into practice,
  • 04:54and this equates to about $70 billion per year,
  • 04:58which is kind of shocking.
  • 05:00And it was actually those two statistics,
  • 05:03which I mentioned to my dean,
  • 05:07that made him really sort of pay attention
  • 05:09to implementation science
  • 05:10and decide to invest in our program at Brown.
  • 05:14I feel very fortunate to have grown up
  • 05:16in the VA healthcare system as an implementation scientist.
  • 05:20Our national implementation science program is called QUERI,
  • 05:23which is the Quality Enhancement Research Initiative.
  • 05:26And under the leadership of Amy Kilbourne, Dr. Amy Kilbourne,
  • 05:30we now have over 50 centers
  • 05:32who partner with 70 or more different program offices
  • 05:37and regional partners around the country.
  • 05:39One of the programs that I am PI of,
  • 05:41with three other colleagues,
  • 05:42is the BRIDGE program, which is in Bedford, Massachusetts.
  • 05:46And it's one of these five-year,
  • 05:50five-plus-million-dollar grants
  • 05:52where we have three hybrid type three trials
  • 05:55that we are doing all at once,
  • 05:57implementing three different evidence-based practices,
  • 05:59and so that's what I'm gonna be talking about today.
  • 06:02I brought my implementation science knowledge and training
  • 06:06from the VA into Brown five and a half years ago,
  • 06:09and we recently rebranded our implementation science core
  • 06:12as the Brown Research on Implementation and Dissemination
  • 06:17to Guide Evidence Use (BRIDGE) Program.
  • 06:19And as you all know here,
  • 06:21we just don't have enough training programs,
  • 06:23we don't have enough capacity building programs
  • 06:25for implementation science,
  • 06:26and so a lot of us
  • 06:27are just starting to implement our own as a result.
  • 06:32So I just wanted to talk about hybrid designs and trials,
  • 06:38but I'm actually gonna try to convince us all
  • 06:42to use the language of hybrid studies from here on out
  • 06:46and I'll explain why.
  • 06:47So we all know the original paper that came out in 2012
  • 06:50by Geoff Curran, Mark Bauer, Brian Mittman,
  • 06:54Jeff Pyne, and Cheryl Stetler, that was really seminal.
  • 06:58I was just starting my implementation science training
  • 07:00that year at the Implementation Research Institute,
  • 07:04and everyone was so excited
  • 07:06about these different hybrid designs,
  • 07:07the one, the two, the three.
  • 07:08I know that many of you already know what these are.
  • 07:11I'm just gonna be focusing today on the three
  • 07:13where really the primary aim is on testing the effectiveness
  • 07:17of the implementation strategies
  • 07:18to increase the uptake of the evidence-based practice.
  • 07:24When this paper came out,
  • 07:25Brian Mittman was one of my mentors and he said, you know,
  • 07:28they really should have said right from the beginning
  • 07:30that every randomized trial is always a hybrid one,
  • 07:35and so they've actually started
  • 07:36to really change that language.
  • 07:39And so just recently they published,
  • 07:41this was from last December,
  • 07:43reflections on 10 years of using hybrid designs,
  • 07:47hybrid trials,
  • 07:48and they tried to make a good case
  • 07:50for why it's really much more important
  • 07:53to be calling these hybrid studies.
  • 07:55And I'm really happy about this because this...
  • 08:00And I know that, again,
  • 08:01I'm talking to a very quantitatively savvy group.
  • 08:04Not all things in the real world can be randomized,
  • 08:07and so we have had to do studies
  • 08:09that I would consider a hybrid
  • 08:11but didn't fit under the definition
  • 08:13because it wasn't a randomized controlled trial.
  • 08:16So in these updated recommendations,
  • 08:18they really have three things
  • 08:20that they want people to take away.
  • 08:21Replacing the term design in favor of the word study
  • 08:26because, as I just said,
  • 08:27many people are applying hybrids in non-trial designs
  • 08:30and it is possible to conduct a hybrid study
  • 08:35to answer questions about intervention effectiveness
  • 08:37and implementation in a wide range of study designs.
  • 08:40(indistinct)
  • 08:43They offer in this paper
  • 08:44four questions to help people decide
  • 08:49which of the hybrid studies they should be conducting,
  • 08:53and I'll tell that to you in a minute,
  • 08:54and they've also really emphasized how you can build in cost.
  • 08:57And I know that we have some cost people in the audience,
  • 08:59how to really bring cost into hybrid studies,
  • 09:02because our ultimate goal from all of this work
  • 09:06is to implement and sustain our evidence-based practice,
  • 09:09cost is such a huge and driving factor for that.
  • 09:13So these are the four questions that are asked in the paper.
  • 09:17What is the nature of your effectiveness data?
  • 09:19How much do you expect
  • 09:21the intervention will need to be adapted?
  • 09:23How much do you know about implementation determinants?
  • 09:26And how ready are you to evaluate
  • 09:28real world implementation strategies?
  • 09:31If you know a lot about your effectiveness data,
  • 09:33if you feel that there needs to be some components
  • 09:37of adaptation built into your actual aims,
  • 09:42if you already have a good sense
  • 09:44of what your implementation determinants,
  • 09:46your barriers and facilitators are,
  • 09:48and if you feel that you can develop and evaluate
  • 09:52those real world implementation strategies
  • 09:54to address those determinants,
  • 09:56then you're probably ready for a hybrid three
  • 09:58or if not at least a hybrid two.
  • 10:00But if you're more on the end of I don't know,
  • 10:03then you probably wanna go to more of a hybrid one.
  • 10:06And so this paper really helps people
  • 10:08think that through more than the original one.
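To make that decision logic concrete, here is a toy sketch of the four questions as a screening function; the thresholds are illustrative only and are not taken verbatim from the Curran and colleagues paper.

```python
# A toy decision aid for the four readiness questions above; the
# thresholds are illustrative, not from the paper itself.
def suggest_hybrid_type(effectiveness_data_strong: bool,
                        adaptation_built_into_aims: bool,
                        determinants_known: bool,
                        strategies_ready_to_evaluate: bool) -> int:
    """Return 1, 2, or 3 as a rough suggested hybrid study type."""
    yes_count = sum([effectiveness_data_strong, adaptation_built_into_aims,
                     determinants_known, strategies_ready_to_evaluate])
    if yes_count == 4:
        return 3  # ready to make strategy testing the primary aim
    if yes_count >= 2:
        return 2  # co-primary effectiveness and implementation aims
    return 1      # mostly "I don't know": lead with effectiveness


print(suggest_hybrid_type(True, True, True, False))  # prints 2
```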
  • 10:13So this is gonna be the crux
  • 10:14of what I'm gonna be talking about today.
  • 10:15When I think about what we want to achieve
  • 10:18in implementation science,
  • 10:19I think of these three big buckets,
  • 10:21and I'm sure that there are other ones
  • 10:22and we can definitely have a conversation
  • 10:24about why I haven't included others.
  • 10:26But I think of testing strategies
  • 10:28because strategies are what is going to enable us
  • 10:31to get things implemented and taken up in the real world.
  • 10:37But it has a lot of different factors to it.
  • 10:39You need to specify and operationalize your strategies,
  • 10:42you have to randomize on strategies,
  • 10:44and for some people that's a very new thing,
  • 10:46you're not randomizing on the intervention,
  • 10:48you're randomizing on the strategies, and tracking.
  • 10:52You know, I often say that as implementation scientists,
  • 10:55our job is just to track,
  • 10:56track everything, track adaptations,
  • 10:59track whether they were fidelity consistent,
  • 11:01track your strategies,
  • 11:02and especially if you're doing something
  • 11:03across a lot of sites, that's a lot of tracking.
  • 11:05So we'll talk about that.
  • 11:07I also think that we really need to be focusing
  • 11:09on assessing outcomes,
  • 11:10and these always need to be guided
  • 11:12by a theory, model or framework.
  • 11:14You need to do this over multiple time points.
  • 11:16One time is not enough
  • 11:17in the scope of an implementation trial,
  • 11:20things change over time.
  • 11:22So one strategy might be leading to a better outcome,
  • 11:27but when you use other strategies,
  • 11:28maybe those outcomes aren't so good,
  • 11:30and so we really need to know those differences.
  • 11:33And one of the most important things,
  • 11:35and this has been a message I've had to give
  • 11:37to a lot of my clinical trialist efficacy researchers,
  • 11:40our measures need to be pragmatic.
  • 11:42Please don't use a 60 item measure
  • 11:45in your implementation study,
  • 11:47it's not gonna be used in the real world.
  • 11:48Please don't include a two hour
  • 11:51clinical structured interview to assess outcomes,
  • 11:54again, not gonna be used in the real world,
  • 11:56really focusing on pragmatic, easy to use,
  • 11:59transferable measures.
  • 12:01And then finally planning for sustainability.
  • 12:04If we don't plan, it will not happen,
  • 12:07and that's also a very hard message for people to hear.
  • 12:10How do I know, if I'm doing a hybrid type one,
  • 12:13that whatever I'm implementing is gonna be effective?
  • 12:15And I say, let's just pretend that it will be.
  • 12:18Let's find out what is happening during that trial
  • 12:21that actually can inform our next steps
  • 12:24and help us think through,
  • 12:26you know, who is gonna own this eventually.
  • 12:28And obviously this involves
  • 12:30a lot of engagement with the partners, community based,
  • 12:32health system based that we're gonna be needing
  • 12:35to eventually sustain our efforts.
  • 12:39When I think about the theories,
  • 12:40models and frameworks I use, I always have a process model,
  • 12:43and so in the VA we have this QUERI implementation roadmap
  • 12:47that everyone can use.
  • 12:49It's very foundational, but it works in every case,
  • 12:55so we have a pre-implementation phase,
  • 12:57an implementation phase, a sustainment phase.
  • 13:00It looks like you always go in one way around this,
  • 13:02but you can go back and forth,
  • 13:04especially in the pre-implementation
  • 13:05and implementation phases.
  • 13:07So we use this to guide our work
  • 13:09and this is really applicable
  • 13:10and we'll talk more about these details,
  • 13:12but it's something that you should consider,
  • 13:15having a process model
  • 13:16to drive the steps of your work.
  • 13:21In our Bridge QUERI Program,
  • 13:23which is testing the uptake and hopefully sustainability
  • 13:28of three evidence-based practices,
  • 13:30these are each in a hybrid type three trial.
  • 13:32So they're simultaneously happening
  • 13:34led by three different people and their teams.
  • 13:37We're working with veterans who have opioid use disorders,
  • 13:41we're working with veterans who have been recently released
  • 13:43from an incarceration setting,
  • 13:45and then we have veterans
  • 13:47who are engaging in criminal activities
  • 13:50and are going through the veteran treatment court.
  • 13:53Veteran treatment courts
  • 13:53are actually based in the community,
  • 13:55but we work with our veteran justice outreach group
  • 13:57within the VA to work with that.
  • 13:59And so, as you can see,
  • 14:00these are not easy problems to solve,
  • 14:03this is a vast amount of effort by these people.
  • 14:07So the Homeless Overdose Prevention Expansion Project, HOPE,
  • 14:12is really trying to implement opioid overdose education
  • 14:16and naloxone distribution to reduce overdoses.
  • 14:19So that's the evidence-based practice there,
  • 14:21and that is led by Dr. Amanda Midboe at Palo Alto.
  • 14:26The PIE Project is a peer support initiative.
  • 14:30It's a Post-Incarceration Engagement Project
  • 14:32where we're really helping to work with veterans
  • 14:35when they come out of jails for social support,
  • 14:37skill building, linkage to care.
  • 14:40And then we have MISSION-CJ,
  • 14:41which is a very long acronym, I'll tell you about it in a second,
  • 14:44where we are aiding veterans in case management,
  • 14:46assertive outreach, hybrid treatments, linkage support.
  • 14:52Also really trying to make sure
  • 14:53that we are examining the health equity needs
  • 14:57of our veterans
  • 14:58as well as how we can help them stay housed
  • 15:04if they're at risk for homelessness.
  • 15:06My job in all of this is I run our implementation core,
  • 15:10and here we are trying across all these three projects
  • 15:13to have similar phases, similar measures, similar designs.
  • 15:18So that's been a real learning experience for me
  • 15:22to simultaneously work with three different trials
  • 15:24at the same time to really make sure
  • 15:26that we are capturing data on a widespread basis.
  • 15:30Here are my three MPI colleagues.
  • 15:33Keith McInnes is running the PIE Project with his team,
  • 15:36David Smelson is running the MISSION-CJ Project,
  • 15:39and Amanda Midboe is running the HOPE Project.
  • 15:41It's definitely a village that's doing this.
  • 15:45And what's really exciting
  • 15:46is when your work is totally aligned with policy,
  • 15:49and I really want people to think about that
  • 15:51with the types of work that you do,
  • 15:52because policy is actually going to help you
  • 15:54with your sustainability.
  • 15:56So this is from the February State of the Union,
  • 15:59and there were three parts of this that completely aligned
  • 16:03with the work that we are doing.
  • 16:04So expanding peer support services in the VA.
  • 16:08Two of our three evidence-based practices
  • 16:10that I'll tell you about, the PIE and MISSION-CJ,
  • 16:14are being implemented by peer support specialists.
  • 16:16So we need more of them in the VA,
  • 16:18and so the federal government is funding this.
  • 16:21We're trying to reduce homelessness.
  • 16:24So this is also a focus
  • 16:26of the Biden Administration for veterans.
  • 16:29And then we're also expanding outreach
  • 16:31to justice involved veterans,
  • 16:32which is a very big part of the MISSION-CJ Project
  • 16:35as well as PIE.
  • 16:36So we can see that we have
  • 16:38a lot of policy support behind this
  • 16:40and we just also need the funding too.
  • 16:43So just a brief thing about HOPE,
  • 16:45and just to sort of maybe state the obvious to people,
  • 16:49I am not the content expert with any of these,
  • 16:51I'm the implementation scientist.
  • 16:53My colleagues who run these projects
  • 16:55also have implementation science expertise,
  • 16:56but we needed a central core
  • 16:59to sort of oversee all of these.
  • 17:00And so Amanda is working with HOPE in five sites
  • 17:05that are in California, Nevada, and Hawaii,
  • 17:08with veterans who have a diagnosis
  • 17:11of an opioid use disorder,
  • 17:12a stimulant use disorder or are being prescribed opioids.
  • 17:17The PIE Project, the Post-Incarceration Project,
  • 17:20really works intensely with veterans
  • 17:22when they're coming outta incarceration
  • 17:23and coordinates with our Health Care
  • 17:25for Reentry Veterans office and also our housing,
  • 17:30and the HUD-VASH is taking housing vouchers
  • 17:34from the federal government
  • 17:35and then pairing those with VA supportive housing
  • 17:39and other support services.
  • 17:41There are four components to PIE,
  • 17:43linkage and referral, skill building and goal setting,
  • 17:46community reintegration, and social and emotional support.
  • 17:49So right now PIE is being implemented in six sites,
  • 17:52other sites have either previously implemented it
  • 17:54or are no longer implementing it.
  • 17:58And then MISSION-CJ, which is our most complex intervention,
  • 18:00is Maintaining Independence and Sobriety
  • 18:03through Systems Integration, Outreach,
  • 18:05and Networking-Criminal Justice.
  • 18:07MISSION was developed 25 years ago by David Smelson
  • 18:10and has had a whole bunch of evidence behind it,
  • 18:14but it's pretty complex.
  • 18:15They built it on the criminology concept
  • 18:19of risk, need, responsivity,
  • 18:22where they're trying to identify
  • 18:24what each person in the criminal justice system needs
  • 18:27and how we can tailor support to them.
  • 18:29And so they have core services
  • 18:30of critical time intervention, empowering pro-social change,
  • 18:34dual recovery therapy, peer support,
  • 18:36and then they also offer
  • 18:38some vocational and educational support
  • 18:40and trauma-informed care,
  • 18:42and they are implementing this across eight sites.
  • 18:45David actually has a really massive $12 million HEAL grant
  • 18:50where he's doing this in even more sites,
  • 18:52so I really don't know how he's managing,
  • 18:54but there's a lot of MISSION implementation
  • 18:56happening around the country right now.
  • 18:59So HOPE has five sites, PIE has six sites,
  • 19:02and MISSION has eight sites.
  • 19:06My job is to say how we are gonna do all of this.
  • 19:09How can we keep similar methods
  • 19:12across everything that we're doing?
  • 19:13And so again, we've used the roadmap model, process model,
  • 19:17and we have a really strong pre-implementation phase.
  • 19:21And I will just also say, I don't know if anyone saw it,
  • 19:23but maybe four months ago, Lisa Saldana
  • 19:28came out with a paper with some colleagues
  • 19:29that showed, across a large swath of papers
  • 19:34that they reviewed, that studies
  • 19:36that had a very in-depth pre-implementation period
  • 19:41actually were more successful
  • 19:42in their implementation efforts later.
  • 19:44So a lot of people, you know,
  • 19:45they wanna just dive in and I say embrace this period.
  • 19:50Even when you have an effective evidence-based practice
  • 19:53that you wanna implement,
  • 19:54you still need to know about a new site,
  • 19:57you still need to know what the clinical workflow is,
  • 19:59you still need to know what are the anticipated barriers
  • 20:03and facilitators to implementing something.
  • 20:04And so from there, we're not gonna change
  • 20:07what our evidence-based practices are,
  • 20:08but what we're gonna do is we're gonna think
  • 20:10how can our strategies address these
  • 20:12and how can we create any adaptations
  • 20:15but without changing those core components.
  • 20:17So really, you know, let yourself be immersed in this phase.
  • 20:23Yeah, of course.
  • 20:25(audience conversing indistinctly)
  • 20:42(audience member speaking indistinctly)
  • 20:58(audience member speaking indistinctly)
  • 21:29Yeah, I don't know if the Zoom audience can hear,
  • 21:33but there's some conversation around
  • 21:38how difficult it is to get R01 funding
  • 21:41to have a substantial pre-implementation phase
  • 21:43even when you already have an evidence-based practice
  • 21:47and whether the VA is different.
  • 21:49I do think the VA is different,
  • 21:53but we have built this into R01 grants,
  • 21:55and in fact, I will say that what was really interesting
  • 21:58for my colleagues who...
  • 22:00So what I do at Brown
  • 22:01is work with a lot of efficacy researchers
  • 22:03who are building an implementation science
  • 22:05into their grants.
  • 22:06So I have several colleagues who do suicide prevention work,
  • 22:09and they were taking a suicide prevention
  • 22:11intervention called STEP into an inpatient setting.
  • 22:16And it has a lot of evidence behind it,
  • 22:18they're just moving it to a new setting and it's an R01,
  • 22:22and the program officers, the POs from NIMH, came back and said,
  • 22:28you need to do, as part of this,
  • 22:30a bunch of formative evaluation
  • 22:32with the health system leadership
  • 22:34before you can do the rest of the aims.
  • 22:38They weren't saying we're not gonna fund your project,
  • 22:40they said you need to build in a pre-aim
  • 22:43before your aim one.
  • 22:44And so I worked with the team
  • 22:46to build a formative evaluation phase
  • 22:48because STEP had never been used in an inpatient setting,
  • 22:51it had always been used outpatient,
  • 22:52and the program office said,
  • 22:55we're not gonna fund it
  • 22:56until you know more about what you're gonna go into,
  • 22:58and so that was very positive.
  • 23:00It didn't come from the reviewers,
  • 23:02it came from the program office,
  • 23:03so you know, that's a positive thing.
  • 23:05So yes, I will say you kind of have to limit yourself.
  • 23:10You could go crazy and spend a lot of time,
  • 23:12but we limited ourselves.
  • 23:14And unfortunately we started this in October of 2020
  • 23:19when the VA was still dealing with a lot of challenges
  • 23:22with COVID and virtual work
  • 23:24and also about a month and a half
  • 23:27before the vaccines were implemented.
  • 23:30So this was not a good time to do a lot of formative work,
  • 23:34I'll tell you, but anyway, we did it.
  • 23:36And then our implementation is a lot of the training of...
  • 23:40You know, because in a hybrid three,
  • 23:42the goal is to get the people at the sites
  • 23:44to do the implementation.
  • 23:45You are helping to direct that,
  • 23:47you're providing them with support through your strategies,
  • 23:49but you should not be implementing that in the hybrid three
  • 23:52because that's not real world.
  • 23:53And so we have, in the HOPE project, social workers,
  • 23:56and in the other team, PIE and MISSION-CJ,
  • 23:58we have peer support specialists
  • 24:00and we need to train them to do this,
  • 24:02and so we spend a lot of time on that in implementation.
  • 24:06And then obviously as we go through,
  • 24:08we're also assessing our outcomes
  • 24:09but they are a secondary aspect.
  • 24:12And then sustainability.
  • 24:13So we have just finished three years of our trial,
  • 24:17see I slipped, three years of our study,
  • 24:21and we've launched year four,
  • 24:24and so some of our sites
  • 24:25have gone through the implementation
  • 24:27and are moving towards sustainment assessment
  • 24:29and some are just starting.
  • 24:30So there's kind of like a...
  • 24:33Well, it's a stepped wedge design,
  • 24:34so we're not there with all of them right now.
  • 24:38So again, I'm not the statistician on the project,
  • 24:43but I just wanted to let you know
  • 24:45that we're doing cluster randomized stepped wedge trials,
  • 24:49in parentheses, studies,
  • 24:51as I try to transition to this language.
  • 24:54But our overall goal is really to estimate the effect
  • 24:57of what it's like to transition
  • 24:58to a higher intensity implementation strategy package
  • 25:02from a baseline lower intensity strategy package.
  • 25:05On each of the effectiveness outcomes that we're using,
  • 25:07we're gonna use mixed effects regression models,
  • 25:10we'll have a fixed effect
  • 25:11for the implementation strategy package.
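As a rough illustration of that analytic setup, a minimal sketch of such a model follows, assuming a hypothetical long-format dataset with one row per site and period; this is not the project's actual analysis code.

```python
# A minimal sketch, not the project's actual analysis: a mixed-effects
# model for a stepped-wedge study with a fixed effect for the
# higher-intensity strategy condition, fixed effects for calendar period
# (to separate the strategy effect from secular trends), and a random
# intercept for site. A binary uptake outcome would instead call for a
# generalized model such as GEE or a logistic GLMM.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("stepped_wedge_data.csv")  # hypothetical long-format data

model = smf.mixedlm(
    "outcome ~ high_intensity + C(period)",  # hypothetical column names
    data=df,
    groups=df["site"],  # one random intercept per site
)
result = model.fit()
print(result.summary())
```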
  • 25:14I keep saying package but we do know
  • 25:17because we're tracking these things really well
  • 25:19that not all of our implementation strategies are being used
  • 25:23despite all our best efforts.
  • 25:24So we're trying to track individual strategies
  • 25:27within each of these higher intensity
  • 25:29and lower intensity states.
  • 25:32And then also,
  • 25:34we are trying to do some balancing,
  • 25:36and, you know, we did that prior
  • 25:38by looking at the different site characteristics
  • 25:40of our different wedges.
  • 25:42And we have had sites drop out,
  • 25:43we've had sites that we've added in,
  • 25:45it has not been textbook perfect by any means.
  • 25:50<v Donna>I want to ask a few questions.</v>
  • 25:51<v ->Yeah, but I may not be able to answer. (laughs)</v>
  • 25:56<v Donna>In terms of your primary outcome of analysis,</v>
  • 26:00when you write implementation strategy,
  • 26:02do you mean the whole bundle, or are you putting in variables
  • 26:06for each one of the components individually to assess the impact
  • 26:10of each one of the components of the bundle?
  • 26:13<v ->So when we wrote the proposal,</v>
  • 26:14we thought bundle slash package.
  • 26:17But as we are doing a lot of tracking of those strategies,
  • 26:20I think that our analysis is gonna be by the individual
  • 26:24strategies, so they don't overlap.
  • 26:25We tried really hard to make sure our bundle of strategies
  • 26:28in the higher intensity do not overlap
  • 26:30with the bundle of strategies in the lower intensity.
  • 26:32So if only a few
  • 26:34of the higher intensity strategies get used,
  • 26:36we'll know that those are higher intensity,
  • 26:38but they may not be the whole package.
  • 26:42That is definitely a difficult thing
  • 26:44to get people to use all of them.
  • 26:47And I will tell you about those strategies in just a second.
  • 26:50So now I'm gonna move into testing strategies.
  • 26:51So as you remember,
  • 26:52I said that to achieve our goals
  • 26:55in implementation science,
  • 26:56we need to randomize on strategies,
  • 26:58we need to do a lot of tracking,
  • 26:59we need to do a lot of specifying and operationalizing.
  • 27:02And one of the things I really wanna get across is this
  • 27:04'nothing about me without me' motto
  • 27:07that was developed in the late 90s, early 2000s,
  • 27:10about patient-centered care,
  • 27:11and I would argue that this is absolutely necessary
  • 27:15for doing an implementation study.
  • 27:17Because as you're developing your implementation strategies,
  • 27:20you are doing this in concert with your sites,
  • 27:23with your partners, with the champions,
  • 27:26with everyone that you're gonna be working with.
  • 27:29And I think the reason this is so important is because,
  • 27:32in the end,
  • 27:33successful implementation is going to be because of trust.
  • 27:38And Alison Metz wrote a paper
  • 27:40in "Implementation Science Communications" last year
  • 27:43on sort of thinking of trust in a conceptual way
  • 27:45across implementation studies,
  • 27:47and she talks about intrapersonal trust
  • 27:50and interpersonal trust.
  • 27:51But I believe that in any shape or form,
  • 27:55this is why the pre-implementation work is so important,
  • 27:58you are building trust through those efforts.
  • 28:00And so people realize
  • 28:02that you are not just doing a research study
  • 28:04'cause you got funding,
  • 28:05you're not just trying to write a publication,
  • 28:06that you're actually trying to change care
  • 28:09and improve care and make lives better.
  • 28:11And so if people can see that
  • 28:13as you're doing your formative work,
  • 28:15then I think that that is the basis
  • 28:17for your successful implementation.
  • 28:20I like to show this slide
  • 28:22about the nine buckets of implementation strategies.
  • 28:25Even though it's the earlier paper
  • 28:26on implementation strategies by Byron Powell and colleagues,
  • 28:30it's really hard for people to conceptualize
  • 28:3173 implementation strategies,
  • 28:33but when you think about them in nine buckets,
  • 28:35I think that's much more helpful.
  • 28:37And so when you look at them,
  • 28:39you can imagine trust being part of all of this.
  • 28:41You know, when you're assessing people for readiness,
  • 28:44when you're trying to do interactive assistance,
  • 28:46when you're doing adaptations,
  • 28:48when you're trying to develop relationships
  • 28:50and training people, supporting clinicians,
  • 28:53engaging consumers, et cetera,
  • 28:54trust is such an important part of that.
  • 28:56People wanna know like,
  • 28:57why are you interested in doing this?
  • 28:58You know, what is in it for me and what is in it for you?
  • 29:03So I think relationship building is so critical
  • 29:06and the trust piece comes naturally with that.
  • 29:10So in addition to our process model
  • 29:12of the QUERI implementation roadmap,
  • 29:13we also have a model that's helping us examine
  • 29:16the various determinants
  • 29:18that we're going to be needing to consider throughout.
  • 29:21And we chose, because it's a hybrid type three
  • 29:23and we're focused on sustainability,
  • 29:25The Dynamic Sustainability Framework
  • 29:26that was developed by David Chambers,
  • 29:28Russ Glasgow, and Kurt Stange.
  • 29:30And this is saying that there are three main components
  • 29:34that we need to consider, intervention,
  • 29:36the evidence based practice, the practice setting,
  • 29:38the context of what we're implementing,
  • 29:40and then the wider ecological system,
  • 29:42which is very much a thing to think about
  • 29:44from a sustainability perspective.
  • 29:46But what's different about their suggestions is that,
  • 29:48again, it's not just a one-time assessment.
  • 29:51How does this change over time?
  • 29:53So at the pre-implementation phase,
  • 29:54it may look like one thing,
  • 29:56at the implementation phase,
  • 29:57it may have a different feeling about it,
  • 29:59and then at the sustainment phase, we might see something different again.
  • 30:02So we need to have a constant process
  • 30:05by which we're examining that.
  • 30:07And in fact, Enola Proctor recently published a paper
  • 30:10sort of looking at, you know,
  • 30:1110 years of implementation outcomes
  • 30:13according to her implementation outcome framework.
  • 30:16The critiques that she has of the literature
  • 30:18is that people are just doing one-time assessments
  • 30:20of implementation outcomes, and that's just not enough.
  • 30:24So this is just an example from HOPE
  • 30:26about how we're sort of doing this
  • 30:27across the three phases of the roadmap
  • 30:29and then guided by The Dynamic Sustainability Framework.
  • 30:32So in phase one,
  • 30:34they did 52 interviews of various people at the sites,
  • 30:38and HOPE is still in the phase two and phase three stages,
  • 30:41as are all the other projects.
  • 30:42So there's been 21 interviews
  • 30:43so far in implementation phase,
  • 30:4521 interviews so far in sustainment phase.
  • 30:48But really doing interviews with the housing,
  • 30:50supportive housing staff, the prescribers for the Naloxone,
  • 30:55other key staff, pharmacists,
  • 30:58social workers and veteran patients.
  • 31:02And we're using a rapid directed content analysis approach
  • 31:06really guided by
  • 31:08The Dynamic Sustainability Framework constructs.
  • 31:12So when we decided on this proposal,
  • 31:15we had years of research building up to this
  • 31:18and so we decided that we were going to use facilitation
  • 31:21as our implementation strategy.
  • 31:24But when I say that, it sounds so funny,
  • 31:26because facilitation is like literally like 10 things.
  • 31:28It's a bundle in itself, so it's a natural bundle of things.
  • 31:32And so we're trying to use, you know, engagement,
  • 31:35identifying champions, action planning, staff training,
  • 31:38problem solving, technical support,
  • 31:41which is different from technical assistance, I'll just say.
  • 31:44So technical support is a much more hands-on process,
  • 31:47and there's an audit and feedback process.
  • 31:48So lots of things go into facilitation,
  • 31:50it's a naturally existing high intensity bundle.
  • 31:55And then we start with the lower intensity bundle,
  • 31:59which is either education outreach or academic detailing.
  • 32:04They're very similar.
  • 32:05HOPE uses academic detailing,
  • 32:06the other projects use education outreach.
  • 32:08But this is more
  • 32:09to really have these targeted structured visits,
  • 32:12we're delivering tailored training
  • 32:14and we're doing technical assistance
  • 32:15as in contact us if you have a problem
  • 32:17as opposed to us contacting you.
  • 32:20So it's much lower intensity.
  • 32:22I would love for our results to be really strong
  • 32:26in the lower intensity
  • 32:26because that's gonna be much more sustainable.
  • 32:28But that's an empirical question so we will...
  • 32:33It really comes from the world of pharmaceuticals, I think,
  • 32:37people would show up and have like a one-on-one and say,
  • 32:40"Dr. Spiegelman, let me tell you
  • 32:42about this medicine that I have
  • 32:45that can help patients with diabetes,"
  • 32:47and they'll have like a one-on-one conversation
  • 32:48and really just tell them about it.
  • 32:49And so we've taken that and made it into, I mean,
  • 32:53and we're not dealing,
  • 32:54but other people have made that
  • 32:55into more of a one-on-one strategy just to inform.
  • 32:58It's more of an educational activity.
  • 33:03So the good news is there's lots of tracking supports
  • 33:06available for us out there that are getting published.
  • 33:09The bad news is they're a lot of work.
  • 33:13So in 2020 our colleagues in Little Rock, Arkansas,
  • 33:17who really have been the group
  • 33:19that has been defining what facilitation is,
  • 33:22doing lots of trainings, they have a manual,
  • 33:24if you need that let me know,
  • 33:25but it's probably listed in that paper.
  • 33:28But in this paper they actually, as an appendix,
  • 33:31gave an Excel tracking sheet.
  • 33:33This is how you can track facilitation and we use this,
  • 33:36we've adapted it a little bit
  • 33:37because we're also tracking the stage of implementation
  • 33:40that people are at, but we can see what type of event.
  • 33:43So when you have facilitation,
  • 33:44you have an external person who's part of the team,
  • 33:46you have an internal facilitator at the site
  • 33:48and you're working one-on-one,
  • 33:50but the internal facilitator
  • 33:51is the one who's doing the work.
  • 33:53So our peer support specialists,
  • 33:54our social workers are the ones doing the work.
  • 33:56And so we are tracking every type of communication they have,
  • 33:59we're tracking the type of personnel involved at the site,
  • 34:03we're tracking the facilitation activity codes,
  • 34:05which of the various things of facilitation are happening.
  • 34:09Really importantly,
  • 34:10we're tracking how many hours and minutes
  • 34:12each facilitation activity takes.
  • 34:15And so clearly we know what we do on the external side,
  • 34:20what our research staff does
  • 34:21when they reach out to the peer support specialist
  • 34:22or the social worker,
  • 34:24what we don't know is what happens on their side.
  • 34:25And so we have these check-in calls with them,
  • 34:28it might just be 15 minutes just to say like,
  • 34:30what did you do this week? Who did you talk to?
  • 34:33And that's really essential
  • 34:34because to ask people to complete this type of tracker
  • 34:37would be really difficult to do.
  • 34:40We actually also adapted this
  • 34:42so that we could add in some education outreach
  • 34:44and academic detailing to this
  • 34:47so that we didn't have to have more than one tracker.
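To give a sense of the shape of that adapted tracker, here is a minimal sketch with hypothetical field names; the published Excel sheet's actual columns may differ.

```python
# A minimal sketch of one facilitation tracking record; field names are
# hypothetical stand-ins for the adapted Excel tracker's columns.
from dataclasses import dataclass


@dataclass
class FacilitationEvent:
    site: str                  # which site the contact involved
    phase: str                 # pre-implementation, implementation, sustainment
    event_type: str            # e.g., check-in call, email, site visit
    personnel: list[str]       # site staff involved in the event
    activity_codes: list[str]  # which facilitation activities occurred
    minutes: int               # duration, used later for cost estimation


log: list[FacilitationEvent] = [
    FacilitationEvent(
        site="Site 1",
        phase="implementation",
        event_type="check-in call",
        personnel=["peer support specialist"],
        activity_codes=["problem solving", "audit and feedback"],
        minutes=15,
    )
]
```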
  • 34:50(audience member speaking indistinctly)
  • 34:52I'll come to that. (laughs)
  • 34:55(audience member speaking indistinctly)
  • 34:57Yeah, no, this is really, thank you for...
  • 34:59'Cause I was gonna say this
  • 35:00and what has been so essential about this
  • 35:02is because we know what type of personnel it is,
  • 35:04we know how much time they spend,
  • 35:05we can actually estimate their salary
  • 35:08and we know how much every facilitation activity took.
  • 35:11Yes, we're doing that.
  • 35:12It was actually a requirement of the project to do that.
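A toy sketch of that time-by-salary arithmetic follows; the hourly rates are invented for illustration and are not actual VA salary figures.

```python
# A toy cost calculation: minutes logged per personnel type, multiplied
# by an hourly wage rate. All rates here are invented placeholders.
HOURLY_RATE = {
    "peer support specialist": 25.0,
    "social worker": 40.0,
    "external facilitator": 55.0,
}


def activity_cost(minutes_by_role: dict[str, int]) -> float:
    """Total personnel cost of one facilitation activity, in dollars."""
    return sum(
        (minutes / 60) * HOURLY_RATE[role]
        for role, minutes in minutes_by_role.items()
    )


# e.g., a 15-minute check-in call involving two roles
print(activity_cost({"external facilitator": 15, "peer support specialist": 15}))
```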
  • 35:15So we've created our own sort of Excel dashboard,
  • 35:18it's not really a dashboard
  • 35:19because it's not updated automatically,
  • 35:22but we're sort of tracking every project.
  • 35:24This is the PIE project with its six sites
  • 35:27to see whether people are in pre-implementation
  • 35:28or implementation.
  • 35:29So we're sort of using these trackers
  • 35:31to find out where those are.
  • 35:33And then here is a snapshot
  • 35:35of how we can sort of examine
  • 35:37those different facilitation activities.
  • 35:39The pie chart on the left
  • 35:40shows us the different support staff
  • 35:43who are involved in the facilitation activities.
  • 35:45So we are working with either the social worker
  • 35:48or the peer support specialist,
  • 35:49but then they're going on and working with other people too.
  • 35:52And then we see on the right side,
  • 35:55all the different activities
  • 35:56that are happening in facilitation,
  • 35:57and you can see that some are more popular than others,
  • 36:00and so this is why we know that not all are getting used.
  • 36:04And then the bottom left shows us
  • 36:05how much time in minutes is being spent.
  • 36:09And so obviously site one and four are doing great,
  • 36:13like they're really spending a lot of time on this
  • 36:15and the other sites are spending less time,
  • 36:17that doesn't necessarily mean that they're worse or better,
  • 36:19it's just there's so many different dynamics
  • 36:22that go into any organization in any site.
  • 36:24And so, you know, this'll be something
  • 36:27that we'll have to examine when we do our analysis,
  • 36:29but it could be really important to know,
  • 36:31does more time lead to better outcomes? Who knows?
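A minimal sketch of that kind of roll-up follows, with invented log entries standing in for the real tracking data.

```python
# A minimal sketch of summarizing tracked facilitation minutes by site,
# as in the bottom-left panel described above; the entries are invented.
import pandas as pd

log = pd.DataFrame([
    {"site": "Site 1", "activity": "problem solving", "minutes": 45},
    {"site": "Site 1", "activity": "action planning", "minutes": 30},
    {"site": "Site 4", "activity": "audit and feedback", "minutes": 60},
    {"site": "Site 2", "activity": "staff training", "minutes": 10},
])

minutes_by_site = log.groupby("site")["minutes"].sum().sort_values(ascending=False)
print(minutes_by_site)  # which sites are logging the most facilitated time
```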
  • 36:37So how are we gonna assess these outcomes?
  • 36:39So again, remember, guided by a theory, model or framework,
  • 36:42multiple time points as outcomes change over time
  • 36:45and involving pragmatic measures is really key.
  • 36:48So in the left column are all the pragmatic measures
  • 36:51that we selected and/or how we are going to conceptualize
  • 36:54some of these things.
  • 36:55So at the top we have an organizational readiness
  • 36:57for implementing change scale,
  • 36:59we have the three quantitative assessments
  • 37:02of acceptability, appropriateness and feasibility,
  • 37:05which align with the Proctor model.
  • 37:07We are looking at four of the RE-AIM outcomes
  • 37:11in terms of implementation outcomes,
  • 37:13reach, adoption, fidelity to the implementation,
  • 37:16and then sustainability.
  • 37:18And so each of these is being assessed
  • 37:21in slightly different ways,
  • 37:22the measures stay the same across all three studies,
  • 37:25but the RE-AIM ones
  • 37:27are a little bit different depending on the project.
  • 37:30So these are the two scales that we're using,
  • 37:34scale packages,
  • 37:35organizational readiness for implementing change,
  • 37:38and then the implementation outcome measures.
  • 37:41Yes, oh.
  • 37:44(audience member speaking indistinctly)
  • 37:51They're really kind of basic.
  • 37:53(audience member speaking indistinctly)
  • 38:00Oh, interesting. Yeah, I haven't heard that section.
  • 38:05I did reach out to him at one point,
  • 38:06because I work with a lot of clinical trials,
  • 38:10they were like, is there a cutoff point for these scales,
  • 38:12and Brian was like,
  • 38:13"Oh no, we're nowhere near having that kind of data."
  • 38:16But I agree that they're really basic
  • 38:19and we do use them at three time points,
  • 38:22luckily they are not exactly next to each other,
  • 38:26like they're like a six months to 12 months apart.
  • 38:29(audience member speaking indistinctly)
  • 38:30Well, each of those has four questions.
  • 38:33Yeah, this has 12.
  • 38:35(audience member speaking indistinctly)
  • 38:51We've also, I think,
  • 38:53only given the feasibility to some people,
  • 38:55I don't think the feasibility
  • 38:56has been relevant for everybody.
  • 38:57Oh, there's something in the chat.
  • 38:59(audience member speaking indistinctly)
  • 39:07ORIC's big, this is 12 items.
  • 39:10Yeah, there's some statement that they're having,
  • 39:13people are having trouble hearing the questions.
  • 39:14So the questions are about just the relevance
  • 39:18and the usefulness of these implementation outcome measures
  • 39:21because people find them very repetitive
  • 39:24and not really informative,
  • 39:28and I think there's more work to be done
  • 39:30in this space for sure.
  • 39:32(audience member speaking indistinctly)
  • 39:44Oh, yeah, that's great to think about.
  • 39:48<v Speaker>You have an international audience,</v>
  • 39:49people participating from all over the world.
  • 39:52<v ->Oh, wonderful. I'm glad this time works. Thank you.</v>
  • 39:55So in our dashboard we are collecting the data,
  • 39:59so this we can do it automatically.
  • 40:01So we send out a REDCap survey to people, they complete it,
  • 40:05and it transitions into our Excel spreadsheet automatically.
  • 40:10So this is actually a very good part
  • 40:12of having these three projects
  • 40:14and being able to collect the data that way.
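A minimal sketch of that REDCap-to-spreadsheet flow through REDCap's standard export API follows; the URL and token here are placeholders, and the team's actual pipeline may differ.

```python
# A minimal sketch of pulling completed survey records out of REDCap and
# writing them to a spreadsheet. URL and token are placeholders.
import requests
import pandas as pd

payload = {
    "token": "YOUR_API_TOKEN",  # project-specific API token (placeholder)
    "content": "record",        # export records
    "format": "json",
    "type": "flat",             # one row per record
}
response = requests.post("https://redcap.example.edu/api/", data=payload)
records = pd.DataFrame(response.json())
records.to_excel("dashboard_data.xlsx", index=False)  # feeds the dashboard
```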
  • 40:16So we have sort of some assessments,
  • 40:18this is by site of where people are with their mean scores,
  • 40:22and then we can sort of compare the scores
  • 40:25across the different projects as well.
  • 40:29The challenge here is that
  • 40:31everybody's not doing the same measurement at the same time,
  • 40:33so we try to stay on top of this and sort of remind people,
  • 40:37but it's a little bit more challenging
  • 40:39than I hoped it would be.
  • 40:40Just looking at reach for HOPE, you know,
  • 40:44maybe the bottom is the best to look at.
  • 40:46We just had a technical expert panel meeting
  • 40:48on November 7th,
  • 40:49so a lot of this more recent information
  • 40:52on reach and adoption, implementation,
  • 40:54just came from that November 7th meeting.
  • 40:56So you can see that the percent
  • 40:58of eligible veterans offered it, baseline
  • 41:00versus six months post implementation,
  • 41:03has definitely gone up in the opioid education
  • 41:08and naloxone distribution.
  • 41:09So this is all very positive, no statistics here,
  • 41:13we're just tracking it at the moment.
  • 41:15We're also looking at how many people were offered it,
  • 41:17how many case managers trained,
  • 41:19and one site has just started implementation in this case.
  • 41:23<v ->Go back a second?</v> <v ->Yeah.</v>
  • 41:24<v Speaker>There's a very interesting issue</v>
  • 41:26that arises in looking at this data,
  • 41:29which is, you know,
  • 41:30we always think about privacy of individuals,
  • 41:32but actually in some of the work I've done now,
  • 41:36we found that there are privacy issues
  • 41:38and concerns by facilities
  • 41:41where like it could be embarrassing to say Palo Alto
  • 41:44that they were only at 19%
  • 41:47and they could even get in trouble or lose their funding.
  • 41:50And so I'm wondering-
  • 41:51<v ->I shoulda probably. (laughs)</v>
  • 41:55Yeah, we probably should've DM'd.
  • 41:56Maybe I'll go on, so no one's looking at that anymore.
  • 42:01Yeah, well, I mean I probably should have done it here too.
  • 42:04So yeah, I mean I think it's like, you know...
  • 42:08I think everyone knows that people are working really hard.
  • 42:13We've done a lot of qualitative work
  • 42:15to show the reasons why people don't offer it,
  • 42:18like veterans get offended that you're offering it to them,
  • 42:22that's one, you know, and other social workers said
  • 42:26that it was not within their scope of work,
  • 42:28you know, like scope of practice.
  • 42:30So we have a lot of barriers that we've identified
  • 42:33that we're trying to address,
  • 42:34and obviously some of those things
  • 42:35we need to raise to a higher level
  • 42:36to say like we go to the National Social Work Agency
  • 42:39and say, "How can you help us?"
  • 42:42Because we want social workers to be able to do this,
  • 42:44but they don't think it's in their scope of practice.
  • 42:45So yeah, so those are the types of things
  • 42:47that we are working on.
  • 42:49Then we look at how many veterans
  • 42:51have been released from jail in the PIE Project,
  • 42:54the Post-Incarceration Engagement,
  • 42:56and we can see how many we actually served in our project.
  • 42:59You know, there's six sites on the bottom,
  • 43:01so this is more just to give you an idea,
  • 43:03you don't have to look at the details,
  • 43:04but just to show how we're trying to track things.
  • 43:06And then we also look at
  • 43:09the different total encounters post release.
  • 43:12So it's just, you know, some projects are just starting,
  • 43:15as we know, so we don't have very much data in there.
  • 43:20And then in MISSION,
  • 43:22we are tracking who is trained at each of the sites.
  • 43:25And this is actually a really difficult one
  • 43:27because we're working with a community organization,
  • 43:29which is the veteran treatment court,
  • 43:30and so we're now going completely outside of the VA
  • 43:33to do this, so it's really challenging.
  • 43:36Who's trained, who's implementing
  • 43:38of those who have been trained,
  • 43:39how many veterans have been served,
  • 43:41and how many MISSION encounters have happened,
  • 43:43and MISSION encounters are pretty complex.
  • 43:45So the fact that there are over a thousand already
  • 43:49after year three is really great news.
  • 43:51<v Speaker>Can you remind us what a MISSION encounter is?</v>
  • 43:53<v ->Yeah, a MISSION encounter is,</v>
  • 43:55I'll just quickly scroll back
  • 43:57'cause I won't be able to remember it all.
  • 43:58It's several different evidence-based practices.
  • 44:04There's a critical time intervention, pro-social change,
  • 44:08dual recovery therapy, peer support sessions.
  • 44:11So lots of things happening,
  • 44:13and this is to keep veterans out of jail basically.
  • 44:19Sorry, close your eyes as I scroll back down.
  • 44:26<v Donna>It's all so interesting,</v>
  • 44:27but we have 10 minutes left.
  • 44:28<v ->Okay, yeah,</v>
  • 44:29so here are some of our effectiveness outcomes,
  • 44:31which we're not assessing yet,
  • 44:33but we're gonna look at linkage to care, overdose rates,
  • 44:35criminal recidivism, et cetera.
  • 44:39So for planning for sustainability,
  • 44:41I'll just give you a high-level overview
  • 44:44of what we're trying to do.
  • 44:45So we just started year four,
  • 44:47we've decided this is the right time
  • 44:49to really start engaging our partners.
  • 44:50We talked about this at the November 7th meeting.
  • 44:53And so just as a paper,
  • 44:55this is a really useful paper to look at
  • 44:57in terms of thinking about how you design
  • 44:59for dissemination and sustainability.
  • 45:01I really have learned a lot from this group.
  • 45:05And we're collecting a lot of qualitative data, as I said,
  • 45:08we're actually putting that into our dashboard
  • 45:12to sort of see what qualitative data emerges
  • 45:14from each of the three phases of pre-implementation,
  • 45:18implementation and sustainment,
  • 45:19all guided by The Dynamic Sustainability Framework.
  • 45:24This was a poster presented last year at the D&I conference.
  • 45:27And we're using this tool,
  • 45:29the Program Sustainability Assessment Tool,
  • 45:31which is freely available online,
  • 45:33developed by Doug Luke at Wash U.
  • 45:35We do not have people fill out this 40 item survey,
  • 45:38and this is for our program partners.
  • 45:40What we do instead is when we have our conversations
  • 45:43like we just had with the technical expert panel meeting,
  • 45:46we'll choose a few of these concepts to talk about.
  • 45:49So what are we gonna need
  • 45:51in terms of organizational capacity to keep this running?
  • 45:54What is the funding going to be like?
  • 45:56How do we adapt this to continue to make it useful?
  • 45:59What information do you still need?
  • 46:02So we are using this more in a conceptual way,
  • 46:04and I do this with a lot of my NIH funded projects too.
  • 46:08This is a very short and sweet pragmatic measure
  • 46:11called PRESS to get at sustained use.
  • 46:14So these three questions, Donna, are being asked of people.
  • 46:19This is a fairly new measure.
  • 46:22So we are trying to see,
  • 46:24when we're done with the implementation effort,
  • 46:26are people using PIE, are people using HOPE,
  • 46:28are people using MISSION
  • 46:30now that we're no longer actively implementing?
  • 46:34And then just as a cost piece that you brought up,
  • 46:36we are using the COINS,
  • 46:38which is built on the SIC. The COINS
  • 46:40is the Cost of Implementing New Strategies,
  • 46:42the SIC is the Stages of Implementation Completion,
  • 46:45and both have been developed by Lisa Saldana.
  • 46:47So we are actually taking these eight steps
  • 46:51of the SIC through our tracker,
  • 46:54we added them to our facilitation tracker,
  • 46:56and we're deciding which of our activities
  • 46:58are in pre-implementation, implementation and sustainment,
  • 47:02and then we already have that data
  • 47:04on how many hours and minutes, the personnel involved,
  • 47:07and we are capturing those costs.
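A toy sketch of that roll-up follows, summing captured costs within each roadmap phase; the rows and dollar amounts are invented for illustration.

```python
# A toy sketch of totaling tracked facilitation costs by roadmap phase;
# the rows and dollar amounts are invented for illustration.
from collections import defaultdict

tracked = [  # hypothetical (phase, cost in dollars) pairs from the tracker
    ("pre-implementation", 120.0),
    ("implementation", 310.0),
    ("implementation", 95.0),
    ("sustainment", 60.0),
]

cost_by_phase: defaultdict[str, float] = defaultdict(float)
for phase, cost in tracked:
    cost_by_phase[phase] += cost

for phase, total in cost_by_phase.items():
    print(f"{phase}: ${total:.2f}")
```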
  • 47:09I'll just say that we used the same exact approach
  • 47:11in a QUERI funded project,
  • 47:14and this paper just came out.
  • 47:16I only leave it here just to show
  • 47:18like a completed cost of implementation.
  • 47:22You know, I'm not an economist, I led this,
  • 47:25it took me out of my comfort zone,
  • 47:27but I've decided that if I can do it, anyone can do it.
  • 47:31And so really just wanna wrap up
  • 47:33and say this is a gigantic village project,
  • 47:37I think team science is everything.
  • 47:39Everyone on here has something to contribute
  • 47:41and it's absolutely not me,
  • 47:45it's all of these people who have made this possible,
  • 47:49and we'll have our final results in two years.
  • 47:52So we'll stay tuned
  • 47:52to see how effective everything has been.
  • 47:55So questions.
  • 47:57(audience member speaking indistinctly)
  • 48:05<v ->The question here is,</v>
  • 48:07can we hear about power considerations
  • 48:09for effectiveness versus implementation outcomes
  • 48:11with hybrid studies one through three.
  • 48:14Specifically, how much should we consider power
  • 48:16for implementation outcomes in hybrid two and three studies?
  • 48:19We did actually, not me,
  • 48:24we did do power analysis on our hybrid threes.
  • 48:27We estimated how many veterans
  • 48:31we would need to include across all sites,
  • 48:35so not just one site because we're aggregating data,
  • 48:40and I think we have met that bar,
  • 48:43but I would be very happy
  • 48:46to put you more in touch with our statistician if you...
  • 48:50I'm gonna capture this person's name
  • 48:53and I'll put them in touch with our statistician.
  • 48:55But yes, it's necessary to power
  • 48:59because we are going to be using these regression analyses
  • 49:02to determine which types of strategies
  • 49:05actually led to increased uptake,
  • 49:07but also the uptake is one implementation outcome,
  • 49:10but we wanna look at the other implementation outcomes too.
  • 49:15(audience member speaking indistinctly)
  • 49:23Yeah, I think I actually have it in my proposal
  • 49:27in my laptop somewhere that I could look up,
  • 49:29but I think we had to do a milestone report
  • 49:35of how many providers or people we thought we would train,
  • 49:39peer support specialists, social workers,
  • 49:41so we have that for every project,
  • 49:43how many veterans they would then serve.
  • 49:46So we have that and we kind of have it by time,
  • 49:50so after year one, year two, year three,
  • 49:52of course when we wrote this proposal,
  • 49:54COVID had not happened.
  • 49:54We just submitted this in December of 2019,
  • 49:58so we absolutely got behind on that.
  • 50:01So we didn't follow the milestones
  • 50:04despite our best efforts.
  • 50:07Yes, so the provider piece is important,
  • 50:09but then we also are learning all about their challenges
  • 50:13with talking to veterans.
  • 50:14Like the fact that we could get a provider
  • 50:17really bought into this, trained, willing to implement it,
  • 50:20but if their initial conversations
  • 50:22with veterans are not positive, then that's a challenge too.
  • 50:26So we're trying to interview veterans to learn more too.
  • 50:33(audience member speaking indistinctly)
  • 50:41Oh, I'll repeat.
  • 50:43I'll repeat the question.
  • 50:46(audience member speaking indistinctly)
  • 51:07So the question is about...
  • 51:09Thank you, Jen.
  • 51:11The question is about how do you build trust by showing up
  • 51:15when you have real world challenges of,
  • 51:19you know, not being funded on a project at that point.
  • 51:22So in this project we were funded to do this,
  • 51:25but we also had...
  • 51:29Often, by the time you get to a hybrid three,
  • 51:30we've already built relationships.
  • 51:33These sites are new to us,
  • 51:34but our program partners are not new,
  • 51:36and so we had their backing to help us.
  • 51:38But for people who are just starting out doing this,
  • 51:42I often have had calls from, you know,
  • 51:44a full professor saying,
  • 51:46"I've never done this before. How do I do it?"
  • 51:48And I literally say to them,
  • 51:50"Please go have coffee with someone."
  • 51:52Like, they'll say, "The only thing I've ever done
  • 51:55is I've gone to a clinic
  • 51:56and I've hung up flyers for my project."
  • 51:58That's the extent of their engagement, and I get it,
  • 52:03like all they needed is to recruit people from that site.
  • 52:05That site had to say,
  • 52:06"Sure, you can recruit people, but we're not gonna help you.
  • 52:09This is something you do on your own."
  • 52:10But I say, you know, yes, it takes time and effort,
  • 52:15but try to meet with a clinic head, somebody,
  • 52:19and just do not tell them what you wanna do.
  • 52:22Go meet with them and say,
  • 52:23"I would love to learn more about what matters to you.
  • 52:27What are you trying to work on? What are your priorities?
  • 52:30What keeps you up at night? What would you love to address?"
  • 52:32And they'll tell you seven things right off the bat.
  • 52:35One of those things might already align
  • 52:37with what you wanna do.
  • 52:39And at that point though, they feel like you're listening,
  • 52:42you are, and then you can say, well, I have this idea,
  • 52:44and you can start that conversation.
  • 52:46If, for example, they don't say anything
  • 52:48related to what you wanna do,
  • 52:50then I say, then step back and go,
  • 52:53well, clearly there's a mismatch.
  • 52:55You know, and they're the ones who are living,
  • 52:57breathing this day to day,
  • 52:58and maybe your idea needs to change a little bit.
  • 53:01But I love the idea of starting with asking questions
  • 53:06and showing up as opposed to
  • 53:08coming in with a fully developed specific aims page
  • 53:10and saying, I really wanna do this, yeah.
  • 53:13Is that what you meant? Yeah.
  • 53:17And someone says,
  • 53:19it'll be interesting to have an in-depth session
  • 53:21on the methodologies you...
  • 53:23Sorry, I can't read this thing, there's a little thing.
  • 53:28Oh, that you'll apply for analysis
  • 53:31of the stepped wedge design.
  • 53:33Yes, luckily that isn't me,
  • 53:37but so in our implementation core,
  • 53:39we have a qualitative core and we have a quantitative core,
  • 53:42and I meet with them.
  • 53:44I'm a stronger qualitative person,
  • 53:47that doesn't mean I haven't done quantitative analyses,
  • 53:49but our quantitative statistician is Dr. Tom Byrne,
  • 53:54he's at BU,
  • 53:56also our head economist Dr. Laura Saban is at BU,
  • 54:00and so we meet with them regularly
  • 54:03to talk through the different issues.
  • 54:04But Tom has the homelessness, opioid use,
  • 54:10and incarceration perspective
  • 54:12as well as incredible statistical knowledge.
  • 54:16So it's a great partnership.
  • 54:20(indistinct)
  • 54:22<v Donna>And so maybe we have to end,</v>
  • 54:23but I can say that Rani has provided her email address
  • 54:26as you can see here
  • 54:27so I'm sure she'll welcome further comments and questions.
  • 54:31<v ->Yes, absolutely, I'd be very happy to.</v>
  • 54:34<v ->Thank you so much.</v> <v ->Thank you.</v>
  • 54:36Thanks, everyone.
  • 54:37I really appreciate it, great to see you all.