Innovations in Use of Mixed Methods in Implementation Research
July 07, 2023
Speaker: Lawrence Palinkas
March 10, 2022, at 12 pm EST
CMIPS Qualitative Methods Innovation Program seminar. Co-sponsored by the Department of Social and Behavioral Sciences, Yale School of Public Health; the Yale Child Study Center, Yale School of Medicine; and the NIH T32 Training Grant "Implementation Science Research and Methods."
Information
- ID
 - 10111
 
Transcript
- 00:00<v ->Dr. Palinkas, and so briefly,</v>
 - 00:02I'll just share that this seminar
 - 00:05is sponsored by the Center for Methods
 - 00:08and Implementation and Prevention Science,
 - 00:10our qualitative methods innovation program
 - 00:13at the Yale School of Public Health,
 - 00:14our Department of Social and Behavioral Sciences,
 - 00:17and the Yale Child Study Center
 - 00:19and our NIH T32 training grant
 - 00:22for implementation science research methods.
 - 00:26And so this is the second seminar
 - 00:29in our qualitative methods innovation program,
 - 00:32and we're deeply grateful and lucky
 - 00:35to have Prof. Palinkas here.
 - 00:37So he's a distinguished professor of social policy
 - 00:40at the Suzanne Dworak-Peck School of Social Work
 - 00:44at the University of Southern California.
 - 00:47He holds secondary appointments in anthropology
 - 00:50and public health sciences at USC.
 - 00:53And as a medical anthropologist myself,
 - 00:58Dr. Palinkas' contributions to the field
 - 01:01of implementation science have allowed
 - 01:04for younger scholars like myself and others
 - 01:08to robustly integrate ethnographic
 - 01:11and other innovative methods to help illuminate,
 - 01:13improve and inform healthcare delivery.
 - 01:17Among many innovations, he's developed and packaged
 - 01:19the rapid assessment procedure for clinical ethnography,
 - 01:22and he worked to develop and make accessible
 - 01:27important approaches to improve the implementation
 - 01:30of brief interventions for trauma survivors,
 - 01:31for adolescents accessing mental health services
 - 01:34and for mental health services
 - 01:36that more recently are deployed
 - 01:37in acute care settings during COVID.
 - 01:40And so his current research encompasses
 - 01:42the implementation of child
 - 01:44and adolescent mental health services,
 - 01:46the sustainment of prevention programs and initiatives
 - 01:49and effects of climate change on vulnerable populations.
 - 01:53And I'm sure he'll share with us some of the new ideas
 - 01:57and projects that he has on his mind.
 - 01:59And we look forward to discussions about that
 - 02:02during and after the talk.
 - 02:04And so we're deeply appreciative of him taking the time
 - 02:07to come all the way here and spend the day with us.
 - 02:10And so, I'll hand it over to him.
 - 02:12The title of his seminar is Innovations
 - 02:14in the Use of Mixed Methods in Implementation Research.
 - 02:20<v ->Well, thank you, Ashley.</v>
 - 02:22And it is indeed a pleasure to be here.
 - 02:25In fact, last time I was here was almost 50 years ago,
 - 02:32and that was even before
 - 02:33there was a Yale School of Public Health.
 - 02:35<v ->Oh, wow.</v>
 - 02:36<v ->So it is exciting to be able to be here</v>
 - 02:41and to spend this time with you all.
 - 02:49I was asked to talk about some of the things
 - 02:50that we've been working on
 - 02:54with respect to the use of mixed methods
 - 02:58in implementation research.
 - 03:01And so what I will focus on is,
 - 03:06first, giving you a brief overview of
 - 03:10how mixed methods have been used in implementation research,
 - 03:13and then highlighting three particular projects
 - 03:17that I've been working on that illustrate
 - 03:20the use of these methods in addressing important issues
 - 03:25related to implementation of evidence-based interventions,
 - 03:30policies, and programs.
 - 03:33So let me first start by talking about
 - 03:37what mixed methods are.
 - 03:40And typically we refer to it as a particular methodology,
 - 03:45even though the word methods implies a plural.
 - 03:49But it is a methodology for collecting, analyzing,
 - 03:52and mixing both quantitative and qualitative data
 - 03:56in a single study or series of studies.
 - 03:59The idea being that when you combine
 - 04:02the two sets of methods,
 - 04:04you're able to get a much better understanding
 - 04:06of a research problem than either research approach alone.
 - 04:12It's the combining of the methods
 - 04:16that is the key element of a mixed methods study,
 - 04:19as opposed to a multi-method study.
 - 04:22It's not merely parallel play where you have somebody
 - 04:26who's doing the quantitative study
 - 04:27and somebody doing the qualitative study
 - 04:29with no interaction.
 - 04:31It's really based on the interaction.
 - 04:34So in a sense, you can think of it as a model of,
 - 04:37as well as a model for interdisciplinary
 - 04:41and even transdisciplinary research.
 - 04:44It also allows you to simultaneously answer confirmatory
 - 04:49and exploratory questions,
 - 04:51whereby you can both generate a theory
 - 04:55and verify it in the same study.
 - 04:59The elements of mixed methods depend on the structure,
 - 05:05the function, and the operation.
 - 05:07So in terms of the structure,
 - 05:09how you connect the data in a mixed method study
 - 05:12may depend on timing and the weight and authority
 - 05:16that you assign to each type of method.
 - 05:19You can collect the data simultaneously,
 - 05:22so that you're collecting both quantitative
 - 05:25and qualitative data at the same time.
 - 05:28Or sequentially, where you use one method
 - 05:31followed by the other.
 - 05:33You can also vary the priority
 - 05:35that you assign to each method,
 - 05:37so that if you're giving priority to the qualitative method,
 - 05:40it's indicated by QUAL being in capital letters.
 - 05:44And similarly,
 - 05:46if you're giving priority to the quantitative methods,
 - 05:48QUAN is in capital letters,
 - 05:52or you can give equal priority to both methods,
 - 05:55even though there are some people who think
 - 05:57that that's not really possible.
 - 06:01The other aspect of mixed methods is the iterative process
 - 06:06of data collection and analysis,
 - 06:09so that you may begin with quantitative methods
 - 06:13to collect the data and analyze it
 - 06:16leading to the collection or analysis of qualitative data,
 - 06:20which leads to further quantitative
 - 06:23data collection and analysis.
 - 06:27This chart shows you the five major uses of mixed methods
 - 06:34in implementation research.
 - 06:36Similar to the typology of mixed method designs
 - 06:41developed by Creswell and Plano Clark,
 - 06:43who wrote what is considered the bible of mixed method research.
 - 06:50There are five major types of mixed method uses
 - 06:54in implementation science.
 - 06:56Convergence, where you are corroborating data
 - 07:00from different sources to come to either similar conclusions
 - 07:06or the quantitizing of qualitative data.
 - 07:13Complementarity intends to understand a phenomenon
 - 07:19more completely by gaining breadth of understanding
 - 07:25through quantitative analysis
 - 07:26and depth of understanding through qualitative analysis.
 - 07:30Expansion is often used to help explain
 - 07:35the findings from one study.
 - 07:37So you may get a finding from a quantitative analysis
 - 07:42of a survey that produces unexpected results,
 - 07:46then follow that up with a qualitative study
 - 07:49to come to some explanation, to answer the question of why,
 - 07:53which a quantitative study alone is not designed to answer.
 - 07:59We also use mixed methods for exploration
 - 08:02and development.
 - 08:04Oftentimes, we will use qualitative methods
 - 08:07to identify the way to ask questions in a survey
 - 08:10or to develop hypotheses to be tested
 - 08:14or a framework that guides that hypothesis testing,
 - 08:19and then use the quantitative methods
 - 08:22to test the hypothesis or validate the framework.
 - 08:27And then finally, we may use it for sampling,
 - 08:30so that oftentimes on the basis of quantitative data,
 - 08:35we may select participants for qualitative study,
 - 08:39either focus groups or semi-structured interviews.
 - 08:43We can also reverse the process and use qualitative data
 - 08:47to create categories
 - 08:49that can then be compared quantitatively,
 - 08:51which I will show you later.
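As a concrete illustration of the sampling function just described, here is a minimal sketch in Python; the survey scores, sample size, and variable names are hypothetical, not drawn from any of the studies discussed in this talk.

```python
import pandas as pd

# Hypothetical scores from the quantitative phase of a sequential design.
survey = pd.DataFrame({
    "participant_id": range(1, 13),
    "sustainment_score": [12, 34, 55, 67, 23, 88, 91, 45, 19, 72, 60, 30],
})

# Purposive "extreme case" sampling: invite the highest- and
# lowest-scoring respondents to semi-structured interviews.
n = 3
interviewees = pd.concat([
    survey.nlargest(n, "sustainment_score"),
    survey.nsmallest(n, "sustainment_score"),
])
print(interviewees["participant_id"].tolist())
```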
 - 08:53Each of those functions carries with it
 - 08:58a variation of timing of data collection,
 - 09:02so it may be sequential or concurrent.
 - 09:06And the mixing of the data can occur
 - 09:10in data collection through convergence,
 - 09:15in analysis and interpretation
 - 09:20through the other functions,
 - 09:22or throughout, through the sampling.
 - 09:25And they may involve the combination of equal weights of
 - 09:30quantitative and qualitative data
 - 09:32or priority being given to one or the other.
 - 09:39Now, how to decide which function to use.
 - 09:43I usually recommend that when you're seeking answers
 - 09:46to the same question,
 - 09:47use convergence as a strategy for mixing the methods.
 - 09:53When you're seeking answers to related questions,
 - 09:56you may use it for the purpose of complementarity
 - 10:01to gain a comprehensive understanding.
 - 10:04When the findings based on one method raise questions
 - 10:08that can be answered by the other method,
 - 10:11the function is expansion.
 - 10:14When the findings based on one method are prerequisite
 - 10:18for the use of another method, such as developing a survey,
 - 10:22then that's development.
 - 10:24And when one method can be used to define
 - 10:26or identify participant samples
 - 10:28for collecting and analyzing data,
 - 10:31representing the other method, that is sampling.
 - 10:35There are three ways of mixing quantitative
 - 10:38and qualitative data.
 - 10:40You can merge the data, in which you bring together
 - 10:42the two types of data to develop your results.
 - 10:47You can connect the data, where you take data
 - 10:52from one method to generate and assist
 - 10:56in the generation of data from another method
 - 10:59to obtain your results.
 - 11:01Or you can embed the data, as is typically the case
 - 11:05in randomized controlled trials
 - 11:07where qualitative data may be used
 - 11:09to help explain the process
 - 11:12by which an intervention works or implementation occurs.
 - 11:17And the quantitative data can be used
 - 11:19to describe the outcomes.
 - 11:21<v ->How is that different than merging?</v>
 - 11:23<v ->Pardon?</v>
 - 11:24<v ->How is that different than merging?</v>
 - 11:27<v ->Okay, a good example of merging the data</v>
 - 11:29would be triangulation of quantitative and qualitative data,
 - 11:34whereas embedding the data is each dataset
 - 11:39has a different function.
 - 11:41They're asking different sets of questions,
 - 11:43whereas merging the data is asking the same question.
 - 11:45<v ->I understand. Okay.</v>
 - 11:48<v ->And in fact, as the next slide shows</v>
 - 11:50and answers your question,
 - 11:51merging the data when you're seeking answers
 - 11:53to the same question, connecting it
 - 11:57when you're answering related questions sequentially,
 - 12:01or embedding it when you're answering questions
 - 12:04that are related simultaneously.
 - 12:07So, you can use mixed methods for a variety of reasons
 - 12:11in implementation research.
 - 12:14We often use the quantitative methods, for example,
 - 12:17to measure intervention or implementation outcomes
 - 12:21and the qualitative methods, as I said earlier,
 - 12:26to measure process.
 - 12:26Or we can use the qualitative methods
 - 12:29to explore the steps of the intervention
 - 12:31and generate a conceptual model
 - 12:33along with testable hypotheses,
 - 12:36and then test those hypotheses
 - 12:37with the quantitative methods.
 - 12:40Many times we use the quantitative measures
 - 12:42to examine the content of an intervention
 - 12:45or its implementation and the qualitative methods
 - 12:49to examine the context in which it occurs.
 - 12:52We can use the quantitative methods
 - 12:55to incorporate the perspectives of the researcher
 - 12:58and the qualitative methods to incorporate the perspectives
 - 13:03of our collaborators, usually the consumers
 - 13:08of the interventions that we're implementing.
 - 13:11And then finally, we often use one set of methods
 - 13:15to address the limitations of the other.
 - 13:17So in implementation research, for example,
 - 13:21when the unit of analysis is a clinic or organization
 - 13:25and issues of power may be compromised
 - 13:28by the limited number of available clinics for analysis,
 - 13:33then validating or confirming the results
 - 13:38from a quantitative analysis using qualitative data
 - 13:42is another role that mixed methods can play.
 - 13:50So I'm gonna tell you how these methods
 - 13:52were mixed in three particular studies.
 - 13:56The first being a study that we did on the development
 - 14:00of a measure of sustainment
 - 14:03of prevention programs and initiatives,
 - 14:06a study that was funded
 - 14:08through the National Institute Drug Abuse,
 - 14:10where we merged and connected data
 - 14:14using a structure beginning with qualitative data collection
 - 14:19and analysis to develop a quantitative scale,
 - 14:23testing that quantitative scale,
 - 14:26and then evaluating predictors of sustainment
 - 14:30using qualitative comparative analysis.
 - 14:34The functions being development of a scale or instrument,
 - 14:38convergence of qualitative data from different data sets,
 - 14:43and expansion, using the qualitative data
 - 14:47to explain quantitative findings.
 - 14:50The second study is an implementation
 - 14:53effectiveness hybrid trial that targeted the use
 - 14:58of evidence-based interventions for screening
 - 15:02and brief treatment of post-traumatic stress disorder
 - 15:06and substance use disorders in patients
 - 15:10presenting in trauma centers.
 - 15:13There we embedded and merged the data
 - 15:18in a randomized pragmatic clinical trial
 - 15:20with a focus on quantitative data collection
 - 15:23and simultaneously qualitative data collection
 - 15:27for complementarity and sampling.
 - 15:30The third, I forgot to put the title in,
 - 15:33is a study looking at the impact of the COVID pandemic
 - 15:38on policy and practice implementation
 - 15:40of mental health services for children and adolescents
 - 15:44where we merged the data collecting both quantitative
 - 15:50and qualitative data for the purpose of convergence.
 - 15:56In the first study,
 - 16:02we focused on the fact that government agencies like SAMHSA,
 - 16:08the Substance Abuse and Mental Health Services Administration,
 - 16:11fund hundreds of projects that are designed
 - 16:15to deliver drug and HIV prevention programs
 - 16:20as well as mental health services like suicide prevention
 - 16:25and treatment of conduct disorders.
 - 16:29But whether these programs can be sustained,
 - 16:34even though grantees are explicitly told to include a plan
 - 16:38for sustainment in the project application,
 - 16:42is always an open question, because generally we have no way
 - 16:46of determining the likelihood of sustainment
 - 16:49or providing feedback to agencies
 - 16:53that are trying to sustain their programs.
 - 16:56So the aim of this project was to look at core components
 - 17:01of sustainment and how they relate to one another
 - 17:04across time, so that we can increase the likelihood
 - 17:08of providing useful information that will result in
 - 17:15successful sustainment of these programs.
 - 17:18In this particular project,
 - 17:20we designed a measurement system for monitoring
 - 17:23and giving feedback within SAMHSA, and then pilot tested
 - 17:28the predictability of that system
 - 17:30and its feasibility and acceptability.
 - 17:33So in this study, we essentially began
 - 17:36with a series of qualitative interviews
 - 17:40with 45 participants from 10 different SAMHSA funded programs.
 - 17:46And we collected information
 - 17:47using traditional semi-structured interviews,
 - 17:51a free list exercise, which is often used in anthropology
 - 17:57to identify semantic domains that are relevant to the people
 - 18:01that we're working with or studying.
 - 18:05And then a checklist of the consolidated framework
 - 18:11for implementation research.
 - 18:14The results from each of those forms of data collection
 - 18:18were then merged to identify relevant domains
 - 18:22of sustainment for SAMHSA funded grantees.
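To illustrate the free list exercise mentioned above, the sketch below computes a standard salience index (Smith's S), under which items named earlier in more lists count as more salient to the semantic domain. The lists and item names are invented for illustration, not the study's data.

```python
from collections import defaultdict

# Hypothetical free lists: each respondent names, in order,
# the things that matter for sustaining a program.
free_lists = [
    ["funding", "staff", "community buy-in"],
    ["community buy-in", "funding"],
    ["staff", "funding", "partnerships", "community buy-in"],
]

# Smith's S: in a list of length L, the item at rank r scores
# (L - r + 1) / L; average the score over all lists (0 if unnamed).
totals = defaultdict(float)
for lst in free_lists:
    L = len(lst)
    for r, item in enumerate(lst, start=1):
        totals[item] += (L - r + 1) / L
salience = {item: s / len(free_lists) for item, s in totals.items()}

for item, s in sorted(salience.items(), key=lambda kv: -kv[1]):
    print(f"{item}: {s:.2f}")
```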
 - 18:26We used those domains to create a scale
 - 18:30known as the sustainment measurement system scale.
 - 18:35It had 42 items: one subscale describing sustainment outcomes,
 - 18:42and then six subscales describing determinants of sustainment.
 - 18:48In the next phase of the study,
 - 18:50we then evaluated the validity and reliability of the scale
 - 18:59by collecting data from 200 SAMHSA grantees
 - 19:04representing 145 different organizations that were funded
 - 19:09across seven different SAMHSA funded programs.
 - 19:12What we found was a measure that had pretty high
 - 19:18overall inter-item reliability of 0.93,
 - 19:20with varying degrees of reliability, generally satisfactory
 - 19:26to excellent, for each of the subscales.
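Inter-item reliability of this kind is conventionally reported as Cronbach's alpha; a minimal sketch of the computation follows. The item matrix below is hypothetical; the 0.93 figure is the study's result, not this example's.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical responses: 5 respondents x 4 Likert-type items.
responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```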
 - 19:31We were also able to distinguish
 - 19:36between each of the predictors
 - 19:40as well as the outcomes of sustainability,
 - 19:44particularly the outcomes of whether the program
 - 19:47continued to exist at all, and whether it was adapted
 - 19:52or continued to exist in the same form.
 - 19:56And then in the third phase of the study,
 - 19:59we used the methodology of qualitative comparative analysis
 - 20:04to identify pathways of predictors
 - 20:09associated with sustainment.
 - 20:14And we found that as a unit, there were two combinations
 - 20:19that were significant predictors.
 - 20:21So essentially what you're doing
 - 20:23is taking the quantitative data
 - 20:27that we had collected from the 200 participants
 - 20:31in the 145 organizations,
 - 20:34and then using the structured qualitative process
 - 20:39known as QCA to identify community responsiveness
 - 20:46and organizational capacity
 - 20:49when combined with the CFIR process domain
 - 20:53or community responsiveness and organizational capacity
 - 20:57when combined with coalitions, networks, partnerships.
 - 21:01So the reason why this was of interest to us
 - 21:05is because while frameworks like the CFIR
 - 21:09can identify domains of factors
 - 21:13that are predictive of successful sustainment,
 - 21:16they don't prioritize those domains.
 - 21:19And the priority assigned to them
 - 21:21may vary from one context to the next.
 - 21:25<v Participant>Larry, can I just ask,</v>
 - 21:27I mean, wouldn't you prioritize them
 - 21:28based on the strength of their association?
 - 21:30Or maybe I'm not fully understanding.
 - 21:33<v ->Like, so the strength of association alone, you know,</v>
 - 21:36that may tell you independent of everything else,
 - 21:39this predicts your outcome.
 - 21:43But the reality is that they don't exist independently,
 - 21:48they exist in combinations.
 - 21:50And the QCA is able to mirror that
 - 21:53or to take that into account.
 - 21:55<v Participant>Thanks.</v>
 - 21:56<v ->Can you talk a little bit more about the process of QCA?</v>
 - 22:00<v ->I could.</v>
 - 22:03Essentially, it takes a series of configurations.
 - 22:12So the advantage to QCA
 - 22:15is that you can work with limited samples,
 - 22:19you know, as few as eight to 10, for example.
 - 22:24And it can take either quantitative or qualitative data.
 - 22:30The outcome can be either categorical,
 - 22:33in which case it's one form of QCA,
 - 22:40I'm blanking on the type now,
 - 22:43or it can be an interval-level measure,
 - 22:47in which case it's a fuzzy-set analysis.
 - 22:52But it essentially identifies necessary
 - 22:56and sufficient characteristics or conditions
 - 23:02by which combinations of variables
 - 23:05predict the outcome variable.
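To make that concrete, the sketch below shows only the truth-table step of a crisp-set QCA, on invented data: cases are grouped into configurations of dichotomized conditions, and configurations whose cases consistently show the outcome are treated as candidate sufficient combinations. A full QCA would go on to minimize these configurations (for example, via the Quine-McCluskey algorithm); the condition names echo the talk, but the data are hypothetical.

```python
import pandas as pd

# Hypothetical crisp-set data: one row per case (e.g., a funded program),
# with conditions and the outcome dichotomized to 0/1.
cases = pd.DataFrame({
    "responsiveness": [1, 1, 0, 1, 0, 1, 0, 0],
    "capacity":       [1, 1, 1, 0, 0, 1, 0, 1],
    "partnerships":   [0, 1, 1, 0, 1, 1, 0, 0],
    "sustained":      [1, 1, 0, 0, 0, 1, 0, 0],
})
conditions = ["responsiveness", "capacity", "partnerships"]

# Truth table: one row per observed configuration, with its case count
# and consistency (share of its cases showing the outcome).
truth_table = (cases.groupby(conditions)["sustained"]
                    .agg(n="size", consistency="mean")
                    .reset_index())

# Configurations meeting a consistency threshold are candidate
# sufficient combinations for the outcome.
print(truth_table[truth_table["consistency"] >= 0.8])
```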
 - 23:11I could give an entire lecture on QCA,
 - 23:13but since we're getting short on time here,
 - 23:16I thought I'd move on
 - 23:18to what I really wanted to spend time on,
 - 23:21which is a technique,
 - 23:25a mixed method approach to collecting information
 - 23:32and analyzing it in a much shorter period of time
 - 23:35than typically occurs in most implementation research.
 - 23:40So in the context of the next study I'm going to describe,
 - 23:44we developed a process known as a Rapid Assessment
 - 23:49Procedure-Informed Clinical Ethnography or RAPICE for short.
 - 23:56And RAPICE essentially takes two traditions
 - 24:01often used in anthropology:
 - 24:03rapid assessment procedures,
 - 24:05which is a way of collecting and analyzing information
 - 24:09in a short period of time, and clinical ethnography,
 - 24:13a traditional approach to understanding clinical issues
 - 24:16or issues of clinical significance by having clinicians
 - 24:23act as ethnographers or participant observers.
 - 24:27This was intended to meet the requirements
 - 24:30for time-efficient data collection
 - 24:33in pragmatic trials, clinical trials
 - 24:37where you want to have minimal participant burden
 - 24:42and collect qualitative data fairly quickly.
 - 24:49The key to this is that rather than being done
 - 24:52by a single individual, it's done as a team.
 - 24:56So you have ethnographically
 - 25:00trained clinicians or community members
 - 25:04acting in the role of participant observers.
 - 25:07And then you have a clinically trained social scientist
 - 25:10who acts as a mixed method consultant or analyst.
 - 25:16It's that combination that occurs in a series of steps
 - 25:21that is intended to provide some consistency
 - 25:25or rigor to the process of data collection and analysis.
 - 25:29So, why do we use RAPICE?
 - 25:32If we were to do it the way that ethnography
 - 25:36was traditionally done, it could take up to a year
 - 25:39just to become familiar with the setting,
 - 25:42learning the language, usually working alone,
 - 25:45and collecting a lot of data, not all
 - 25:47of which is particularly relevant to the kind of questions
 - 25:52that we ask in implementation science.
 - 25:55It also provides a balance between the role
 - 25:59of the participant and the role of the observer.
 - 26:02So oftentimes we find in ethnography,
 - 26:05someone playing more of a role of one versus the other
 - 26:11and having an imbalance, losing the benefit
 - 26:13of ethnographic research,
 - 26:15which is to combine perspectives:
 - 26:18that of the insider, or emic perspective,
 - 26:21and that of the outsider, or etic perspective.
 - 26:25In doing so, the advantage to RAPICE
 - 26:27is that it empowers study participants
 - 26:30which is particularly valued for underrepresented groups.
 - 26:36It is now assisting in moving the field
 - 26:40of implementation science to addressing health equity
 - 26:44in a way that it wasn't able to before
 - 26:47because those who are the survivors of disparities
 - 26:55have equal weight and carry equal representation
 - 26:58in the process of data collection and analysis.
 - 27:02We now have two versions of RAPICE.
 - 27:04One for clinical settings and one for community settings.
 - 27:09The process of doing it begins with a participant observer
 - 27:13or observers who conduct formal interviews,
 - 27:17do site visits in clinics or communities,
 - 27:21and they may interact with study participants,
 - 27:25attend meetings, observe clinical procedures,
 - 27:29and collect data through informal
 - 27:32and semi-structured interviews with participants.
 - 27:37They record that data through field notes,
 - 27:40through logs of data collection activities, field jottings,
 - 27:46and they can digitally record semi-structured interviews
 - 27:51for later transcription.
 - 27:53This information is then presented
 - 27:56to the mixed method consultant who reviews it
 - 27:59and queries the participant observers
 - 28:02to gain a better insight into the data
 - 28:05and its context.
 - 28:07It may also enable the consultant
 - 28:09to ask additional questions that the observer
 - 28:12hadn't thought to ask, for example,
 - 28:15and in an iterative fashion,
 - 28:18enable further data collection.
 - 28:21In the next phase, depending upon the context
 - 28:24and what resources you have available for mixing the methods,
 - 28:31the qualitative data can be subjected
 - 28:33to two phases of analysis.
 - 28:37The first being immersion crystallization,
 - 28:39where you get a holistic representation of the setting,
 - 28:44the activities, the phenomenon of interest,
 - 28:47followed by a more focused thematic content analysis
 - 28:51and perhaps a template analysis if you're doing comparisons
 - 28:56across settings or across groups of individuals.
 - 29:00The participant observer develops
 - 29:03a preliminary interpretation of the meaning
 - 29:06and significance of that data
 - 29:08organized in terms of a set of a priori themes
 - 29:13based on the interview guide or emergent themes
 - 29:16that come from the data collected
 - 29:19and a description of their interrelationships.
 - 29:22The mixed method consultant does something very similar.
 - 29:26And then the two,
 - 29:28the participant observers and the consultant
 - 29:30identify points of convergence and divergence,
 - 29:35and then go through a process of reaching consensus
 - 29:40in much the same way that a team approach
 - 29:42to qualitative data analysis occurs.
 - 29:46If it's not achieved, follow up interviews
 - 29:50or returns to the field site may be necessitated
 - 29:53to collect additional data.
 - 29:55If it is achieved, the consultant may recommend
 - 30:00identification of disconfirming cases
 - 30:04in which additional data collection occurs.
 - 30:09In the end, the interpretation of the study findings
 - 30:13is presented to the participants to confirm validity
 - 30:17and comprehensiveness, equivalent to member checking
 - 30:20in qualitative data analysis.
 - 30:24The qualitative data analyzed using RAPICE
 - 30:27are then integrated with quantitative data
 - 30:31to provide a comprehensive understanding
 - 30:33of implementation process and outcomes.
 - 30:37That way we can use that information
 - 30:40as I will explain later,
 - 30:42to improve the likelihood of successful outcomes.
 - 30:46So in two studies where we applied the RAPICE approach,
 - 30:52we used both the clinical ethnography
 - 30:56and the community ethnography version.
 - 31:00The first study used the clinical ethnography
 - 31:02to look at interventions
 - 31:06targeting post-traumatic stress disorder comorbidity
 - 31:09in trauma care settings.
 - 31:12And this gives you sort of a justification
 - 31:17or the rationale for why we did this study
 - 31:19because each year between one and a half
 - 31:23and two and a half million people
 - 31:25require inpatient hospitalizations due to injuries,
 - 31:31but they also carry with them
 - 31:32frequently multiple comorbidities including PTSD,
 - 31:37alcohol and drug abuse problems, depression,
 - 31:40and chronic medical conditions
 - 31:42that are endemic to this population.
 - 31:46So the aim of this study was to enhance
 - 31:49the implementation of evidence-based screening
 - 31:52and interventions for PTSD and comorbidity
 - 31:56in 25 Level I trauma centers nationwide.
 - 31:59We also wanted to impact the clinical effectiveness
 - 32:04of patient outcomes while also targeting
 - 32:08national trauma center implementation policies
 - 32:12recommended by the American College of Surgeons.
 - 32:16The focus of this study was on implementation outcomes
 - 32:20using the RE-AIM framework:
 - 32:23reach, effectiveness, adoption,
 - 32:26implementation, and maintenance.
 - 32:28And so what we did was collect both qualitative data
 - 32:33using the RAPICE methodology of having clinicians
 - 32:37act as participant observers and work with me
 - 32:43to interpret or analyze the data that they collected,
 - 32:48as well as quantitative data
 - 32:50through the National Trauma Center Behavioral Health surveys
 - 32:55to identify or create a matrix
 - 32:58of American College of Surgeons policy
 - 33:02and its implementation,
 - 33:05so that the different reach categories
 - 33:09were assessed using both quantitative and qualitative data.
 - 33:14At the same time, we were also using the qualitative data
 - 33:19that we collected through RAPICE
 - 33:21to create categories of implementation quality.
 - 33:27So the qualitative data became quantified
 - 33:31into assigned scores based on dimensions
 - 33:34of the intervention itself, leadership engagement,
 - 33:39and adherence to regulatory standards.
 - 33:44So, we had four categories of implementation quality:
 - 33:48excellent, good, fair, and poor.
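This step, in which coded qualitative dimensions become an ordinal quality rating, is an example of quantitizing. A minimal hypothetical sketch follows: the three dimensions come from the talk, but the 0-2 scoring rule and the cutoffs are invented.

```python
# Hypothetical quantitizing rule: each site is scored 0-2 on dimensions
# coded from the qualitative field data, then the total is binned into
# the four categories named in the talk.
def quality_category(intervention: int, leadership: int, adherence: int) -> str:
    total = intervention + leadership + adherence  # each dimension is 0, 1, or 2
    if total >= 5:
        return "excellent"
    if total == 4:
        return "good"
    if total >= 2:
        return "fair"
    return "poor"

sites = {"Site A": (2, 2, 1), "Site B": (1, 2, 1), "Site C": (0, 1, 0)}
for name, dims in sites.items():
    print(name, quality_category(*dims))
```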
 - 33:51When we combined the good
 - 33:55and excellent forms of implementation,
 - 34:01what we found, to state it correctly,
 - 34:28is that we got great outcomes
 - 34:33under good and excellent implementation,
 - 34:36and very poor outcomes, as indicated by the disparity
 - 34:39between the two sets of measures,
 - 34:41under conditions of fair and poor implementation.
 - 34:47The finding was that the clinical outcomes
 - 34:51associated with implementing these guidelines
 - 34:55for screening and treating PTSD and comorbid conditions
 - 35:00produced much better outcomes
 - 35:02when their implementation quality was good or excellent
 - 35:06than when it was fair or poor.
 - 35:10So finally, the third study had to do, as I said,
 - 35:15with the impact of the COVID pandemic on child
 - 35:19and adolescent mental health policy and practice implementation.
 - 35:23As you know, mental health issues
 - 35:26have become of increasing concern
 - 35:30in child and adolescent populations
 - 35:32even before the pandemic.
 - 35:35When the pandemic occurred, those concerns skyrocketed.
 - 35:40The increase was very dramatic, so that there were reports
 - 35:46that up to half of the population of children
 - 35:50and adolescents living in the United States
 - 35:53were experiencing symptoms of severe depression and anxiety.
 - 36:00Visits to emergency rooms
 - 36:02for mental health crises skyrocketed.
 - 36:05Yet the understanding of how to respond to these issues
 - 36:12by mental health service systems was very limited.
 - 36:16So the intention of this study
 - 36:18was to look at the impact of the pandemic
 - 36:21on implementation of policy and practices at the state level
 - 36:26for preventing and treating mental health problems
 - 36:29in this population,
 - 36:31and then look at the current need and demand for services
 - 36:34as well as the capacity to deliver them.
 - 36:37And how state mental health authorities
 - 36:40were addressing these needs and demands
 - 36:44with a particular focus on telehealth
 - 36:47and its use to deliver services.
 - 36:51So while the last study relied on the RE-AIM framework
 - 36:55to evaluate implementation outcomes,
 - 36:59this study utilized the consolidated framework
 - 37:02for implementation research to look at the process
 - 37:07of implementing evidence-based policies and practices.
 - 37:13We began by conducting semi-structured interviews
 - 37:17with 29 state mental health authorities
 - 37:20and representatives from 21 randomly selected states,
 - 37:25and then using a subgroup of those as participant observers
 - 37:30in their respective states.
 - 37:32So they were not only involved
 - 37:34in collecting data in their states,
 - 37:36but also assisting us in the analysis of that state data.
 - 37:43So, this is a community ethnography approach.
 - 37:47We also stratified the data according to two criteria:
 - 37:54level of unmet need for services
 - 37:57as described by a study that was done
 - 38:02two years prior to this study
 - 38:06and the positivity rate for the coronavirus
 - 38:09at the time that we conducted this study,
 - 38:11which was in the fall of 2020.
 - 38:16Part of this analysis involved
 - 38:23looking at features of the qualitative data
 - 38:29and comparing them across the categories of states
 - 38:35based on unmet need for mental health services
 - 38:38as well as coronavirus positivity.
 - 38:42And some of it was used to provide in-depth understanding
 - 38:47of the process of implementation.
 - 38:51So what you see here is, even though we had 21 states,
 - 38:57the increase in demand for services
 - 39:00was high in all of the states
 - 39:03that fell in the high positivity, high unmet need category,
 - 39:08whereas the lowest rate of increase in demand
 - 39:13occurred in states with low levels of positivity
 - 39:17and low levels of unmet need,
 - 39:20which is pretty much what you would expect.
 - 39:24In terms of capacity, we found that in states
 - 39:28with high unmet need, the decrease in capacity
 - 39:32was much greater
 - 39:36than in states with low unmet need.
 - 39:40So we found a disparity in the supply and demand
 - 39:45for mental health services through this study
 - 39:49in that states with high positivity and high unmet need
 - 39:53had the highest increase in demand
 - 39:56for mental health services,
 - 39:58but the lowest capacity for delivering those services.
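The stratified comparison described here amounts to cross-classifying states on the two criteria and comparing coded indicators across the strata; a minimal sketch with invented state-level data follows.

```python
import pandas as pd

# Hypothetical state-level data: strata from the two criteria plus
# 0/1 codes for increased demand and decreased capacity.
states = pd.DataFrame({
    "unmet_need":         ["high", "high", "low", "low", "high", "low"],
    "positivity":         ["high", "low", "high", "low", "high", "low"],
    "demand_increased":   [1, 1, 1, 0, 1, 0],
    "capacity_decreased": [1, 1, 0, 0, 1, 0],
})

# Share of states in each stratum reporting increased demand
# and decreased capacity.
summary = (states.groupby(["unmet_need", "positivity"])
                 [["demand_increased", "capacity_decreased"]]
                 .mean())
print(summary)
```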
 - 40:03When we looked at the barriers and facilitators
 - 40:07to implementation using the CFIR domains,
 - 40:11we found issues related to telehealth
 - 40:14that presented challenges
 - 40:16to the state mental health authorities,
 - 40:18such as limited access to broadband or internet
 - 40:22or the technology needed for telehealth,
 - 40:25like laptop computers, reluctance to participate,
 - 40:29especially among families because they were unfamiliar
 - 40:33with the practice or not comfortable using the technology
 - 40:37or preferred face-to-face interactions.
 - 40:40At the same time, facilitators included Medicaid waivers
 - 40:44to allow billing for services,
 - 40:47provider training for its use,
 - 40:49information for families on how to use it
 - 40:52and grant funding to provide client access,
 - 40:56either through expanding access to the internet
 - 41:01or access to the technology.
 - 41:05We also found that many providers
 - 41:09intended to continue using these telehealth
 - 41:13or virtual mental health services
 - 41:17because it resulted in fewer appointment cancellations
 - 41:21or no-shows, greater family engagement
 - 41:24and reduced time traveling to provide services.
 - 41:29So I'm just gonna end with a description
 - 41:32of some of the new things that we're doing.
 - 41:37One potential use of RAPICE
 - 41:42and other kinds of mixed methods
 - 41:45is not just documenting implementation process and outcomes,
 - 41:51but actually facilitating implementation as a strategy,
 - 41:56much like any of the other strategies
 - 41:58that we employ to ensure successful implementation.
 - 42:04So a formative evaluation, you know,
 - 42:06judges the worth of a program
 - 42:08while the program is in progress.
 - 42:11It can be conducted at any phase of a study
 - 42:14and it focuses on the process itself,
 - 42:18but it can influence the outcomes
 - 42:22if there's feedback from the process
 - 42:26of conducting the formative evaluation.
 - 42:29So its main purpose is to detect deficiencies
 - 42:34in implementation as soon as possible,
 - 42:36so that adjustments can be made to ensure better outcomes.
 - 42:42And it's, you know,
 - 42:43the kind of preliminary research that you do
 - 42:45is also considered formative,
 - 42:47but this is something completely different.
 - 42:51This is formative evaluation.
 - 42:53So this kind of evaluation can be done
 - 42:56either by members of the research team
 - 42:59who have knowledge about the intervention
 - 43:01and performance expectations,
 - 43:03or can be done by an independent observer
 - 43:06who provides so-called objective assessments.
 - 43:11But perhaps the best approach, like RAPICE,
 - 43:13is to include both in the process of evaluation.
 - 43:19This diagram gives you an idea of how that would work.
 - 43:22So in a randomized controlled trial
 - 43:25where you're evaluating an intervention
 - 43:28and its implementation,
 - 43:30With each formative evaluation,
 - 43:33you can influence and potentially improve the outcomes
 - 43:38at the next data collection point,
 - 43:40so that the outcomes are
 - 43:44optimally constructed by the time the trial ends.
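One way to picture that feedback loop: after each data collection point, process data are reviewed, deficiencies flagged, and adjustments recorded before the next wave. The sketch below is purely illustrative; the wave structure, note text, and deficiency terms are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Wave:
    """One data collection point in the trial."""
    label: str
    process_notes: list[str]
    flagged: list[str] = field(default_factory=list)
    adjustments: list[str] = field(default_factory=list)

def formative_review(wave: Wave, deficiency_terms=("low fidelity", "low reach")):
    """Flag deficiencies mentioned in process notes and record
    adjustments to carry into the next data collection point."""
    for note in wave.process_notes:
        for term in deficiency_terms:
            if term in note.lower():
                wave.flagged.append(note)
                wave.adjustments.append(f"address '{term}' before the next wave")
    return wave

wave1 = formative_review(Wave("baseline", ["Low reach among rural clinics"]))
print(wave1.adjustments)
```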
 - 43:50So there are a number of methods
 - 43:51that are out there for doing this.
 - 43:54These include semi-structured interviews with participants,
 - 43:57investigators, service providers,
 - 43:59or ethnographic field observation.
 - 44:03But we're now working on using the RAPICE technique.
 - 44:09We're planning to do that in three major projects
 - 44:14that we've got underway now.
 - 44:15The first being implementation projects on prevention,
 - 44:20treatment, harm reduction
 - 44:21and recovery from opioid use disorders.
 - 44:25The second, a research center that's focused on developing
 - 44:30and implementing a multi-level intervention
 - 44:33to increase HPV vaccination rates
 - 44:40in under-resourced communities.
 - 44:42And then the third,
 - 44:43a stepped care approach to delivering mental health services
 - 44:48in the aftermath of climate-related natural disasters
 - 44:53and extreme weather events,
 - 44:55focusing on wildfires in California and Australia
 - 45:00and typhoons in small island developing states
 - 45:04in the Pacific.
 - 45:06So, that's pretty much where we are.
 - 45:09I hope it gives you some ideas of the potentials
 - 45:13for not only using quantitative and qualitative methods,
 - 45:18but being a little creative in their use
 - 45:23to address important problems
 - 45:25related to implementation.
 - 45:27<v ->Ah, thank you so much.</v>
 - 45:32For people on the laptop, will we open it up for questions?
 - 45:44So, we'll open it up for questions,
 - 45:46and Mona, hopefully we can hear you, or whoever has questions.
 - 45:53<v ->There's nobody online.</v>
 - 45:57<v Participant>I have a question.</v>
 - 45:59So hopefully everybody online can hear the question.
 - 46:01So, thank you so much.
 - 46:02I really enjoyed hearing about the RAPICE technique.
 - 46:04It's really eyeopening.
 - 46:05It reminds me a little bit of this idea
 - 46:08of community based participatory research
 - 46:10and I wonder to what degree that idea comes in,
 - 46:12in other words, the participant observers,
 - 46:14to what degree do they set the purpose
 - 46:17for the research question versus just working
 - 46:21under the, I'm forgetting now the name,
 - 46:23the mixed methods consultant to kind of carry out
 - 46:26the designing of the interview guides
 - 46:30or analysis, et cetera.
 - 46:32<v ->So the community based version of RAPICE</v>
 - 46:36is much more explicit in that, but it does occur
 - 46:40in the clinical ethnography as well.
 - 46:44But in both instances we've engaged community members
 - 46:50or clinicians in identifying the questions to be asked,
 - 46:54the issues to be addressed
 - 46:56and participating in the analysis.
 - 47:00So, the term co-creation
 - 47:04has become very popular these days.
 - 47:08We have in a community setting adopted what's called
 - 47:11the community-partnered participatory research approach,
 - 47:15so that it's not just based in the community,
 - 47:19but that the community members are equal partners.
 - 47:24And we've used this not just in implementation studies,
 - 47:28recently we used it in New Orleans and South Louisiana
 - 47:35to look at how community-based organizations
 - 47:38in low-income neighborhoods like the Lower Ninth Ward
 - 47:43were preparing for hurricane season
 - 47:46during the COVID pandemic,
 - 47:49how COVID had impacted their ability to prepare for
 - 47:54and respond to an increased frequency
 - 47:59of more severe hurricanes.
 - 48:00That involved having a community advisory board
 - 48:05from the community to help us design the interviews,
 - 48:09identify people to interview,
 - 48:13and then participate in the analysis of the transcripts
 - 48:17from those interviews.
 - 48:20You know, as I said, one of the things
 - 48:24that we see as a real value to RAPICE
 - 48:28is that it empowers communities.
 - 48:31Rather than simply being passive participants,
 - 48:35they're actively engaged in the process.
 - 48:38<v ->I'm curious to learn a little more in RAPICE,</v>
 - 48:41how are you following the quality of field observations
 - 48:48and field notes, you know, from both ends,
 - 48:54from the mixed method consultant
 - 48:56and also the participant observers
 - 48:58that might be newly trained in ethnography
 - 49:01or like conducting interviews and writing field notes
 - 49:03and things like that.
 - 49:05What is the process?
 - 49:07<v ->So the iterative nature of that is that we,</v>
 - 49:14on a regular schedule, review field notes
 - 49:18and any data that's collected.
 - 49:21I then meet with the participant observers
 - 49:24or the consultant meets with the participant observers
 - 49:31and queries them and makes recommendations at that point
 - 49:35about the kinds of information.
 - 49:37I mean, we begin, actually, I should say
 - 49:40begin actually by training them
 - 49:42on how to do participant observation.
 - 49:45So the who, what, when, where, why observation,
 - 49:50how to collect information, how to record it in field notes,
 - 49:56what we expect to see in field notes,
 - 49:59the different types of observation and reflection.
 - 50:04And then we use the information,
 - 50:08the analyst uses the information that is provided to them
 - 50:12to ask additional questions to get a better understanding
 - 50:17of what was observed or what was heard or seen.
 - 50:20From the analyst standpoint,
 - 50:24the check is, the member checking.
 - 50:28So when we come up with a preliminary analysis,
 - 50:30we present it to a group of clinicians
 - 50:34who participated in this study,
 - 50:36who were observed for example,
 - 50:39or we presented to community members
 - 50:42to get their reflections, to get their feedback.
 - 50:47So in a member check, what the analyst does
 - 50:52is review the findings through a member checking process, essentially.
 - 51:02Any questions from the ethernet?
 - 51:05(Ashley chuckling)
 - 51:09<v ->It's like class, just a lot of black boxes.</v>
 - 51:13Okay, well, it's one o'clock so I'm mindful
 - 51:16that folks likely need to head off to their next thing.
 - 51:23But please do let us know if you're not on
 - 51:27any of our email lists or interested in learning more about
 - 51:33our qualitative methods innovation program
 - 51:36or just more about CMIPS, contact William Tootle.
 - 51:40And yeah, you can join me one more time
 - 51:44in thanking Prof. Palinkas for his wonderful talk.
 - 51:46Yeah, so thank you, everyone.
 - 52:02Thank you so much. I have so many questions. (chuckles)
 - 52:06<v ->I guess that worked out okay</v>
 - 52:08in spite of the technical challenges.
 - 52:10<v Participant>No, I think it was great. Yeah.</v>
 - 52:11<v ->I have to say that shared.</v>