Assessing Innovation-Practice Fit: A New Measure

Join us as this presenter discusses this poster live on May 26, 2021 | Track B at 12:15 PM Mountain

PRESENTER
BRYAN R. GARNER, PhD
RTI International
BACKGROUND
The “research-to-practice gap” is a major public health problem that dissemination and implementation research seeks to address. According to the theory of implementation effectiveness, implementation effectiveness (i.e., the consistency and quality of implementation) is a function of two key constructs. The first is implementation climate, defined as the extent to which implementation of an innovation is expected, supported, and rewarded. The second is the fit between the innovation and the values of the targeted practice setting. Two widely used measures for assessing implementation climate are the 6-item measure developed by Jacobs, Weiner, and Bunger (2014) and the 18-item measure developed by Ehrhart, Aarons, and Farahnak (2014). In contrast, there is no widely used measure for assessing the fit between an innovation developed as part of research and the practice setting of interest. This presentation/poster introduces the 6-item innovation-practice fit measure developed and used as part of the Substance-Treatment-Strategies for HIV care (STS4HIV) Project.
SETTING/POPULATION
Although developed for assessing the fit between evidence-based treatment interventions for substance use disorders (i.e., the innovation) and HIV service organizations (i.e., the practice setting), this measure may be adapted for any innovation-practice combination.
METHODS
In May 2020, 253 HIV service organizations from across the United States were invited to participate in a study focused on identifying the most promising evidence-based substance use disorder treatments for integration within HIV service organizations. Nine evidence-based treatments for substance use disorders were assessed. Using a standardized format, an infographic and video were developed for each evidence-based treatment.
For each evidence-based treatment, the HIV service organization’s respondent was first shown the video and infographic and then asked to rate (0=not at all; 1=to a minor extent; 2=to a moderate extent; 3=to a major extent) the extent to which the evidence-based treatment was fundable, implementable, retainable, sustainable, scalable, and timely for their HIV service organization. Innovation-practice fit was calculated for each evidence-based treatment as the sum of these six items (possible range: 0-18).
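As a minimal sketch, the scoring rule described above (six items each rated 0-3, summed into a 0-18 fit score) can be expressed in Python. The item names come from the abstract; the example ratings are hypothetical, not study data.

```python
# Six fit items from the STS4HIV innovation-practice fit measure.
ITEMS = ["fundable", "implementable", "retainable",
         "sustainable", "scalable", "timely"]

def fit_score(ratings):
    """Sum six 0-3 item ratings into a fit score (possible range 0-18)."""
    if set(ratings) != set(ITEMS):
        raise ValueError("expected exactly one rating per fit item")
    if any(not 0 <= v <= 3 for v in ratings.values()):
        raise ValueError("ratings must be on the 0-3 scale")
    return sum(ratings.values())

# Hypothetical respondent rating every item "to a moderate extent" (2).
example = {item: 2 for item in ITEMS}
print(fit_score(example))  # 12
```

A treatment's average fit score across respondents can then be compared against the scale midpoint, as in the results below.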
RESULTS
Of the 253 HIV service organizations invited to participate, 203 (80%) completed participation. The average innovation-practice fit score for motivational interviewing (11.42; SD=0.29) was significantly (p<.05) higher than those of the other eight evidence-based treatments, and motivational interviewing was the only treatment rated above the measure’s midpoint. The average innovation-practice fit score for cognitive behavioral therapy was at the midpoint (9.5; SD=0.31). The average innovation-practice fit scores for the other six evidence-based treatments assessed were below the midpoint.
CONCLUSIONS
A new measure for assessing innovation-practice fit has been developed.
POSTER


Combining Qualitative Interviewing with Systems Science to Understand How Practice Facilitators Tailor Implementation Support to Context

Join us as this presenter discusses this poster live on May 24, 2021 at 11:15 AM Mountain

PRESENTER
ERIN KENZIE, PhD
Oregon Health & Science University
BACKGROUND
A complex array of factors affects the ability of primary care clinics to successfully integrate evidence-based practices into routine care. Models like i-PARIHS (integrated Promoting Action on Research Implementation in Health Services) identify factors related to the intervention, recipients (motivation, skill), and multiple levels of context, including local (workflows, past experience), organizational (culture, structure), and external (policy drivers). To effectively support clinics, practice facilitators (individuals trained to build the capacity of primary care practices) must accurately assess clinics’ needs and identify corresponding means of implementation support. Examining how this tailoring happens is key to evaluating program outcomes and maximizing program success.
SETTING
This research is being conducted as part of the ANTECEDENT study, an AHRQ-funded EvidenceNOW unhealthy alcohol use project led by the Oregon Rural Practice-based Research Network (ORPRN). In ANTECEDENT, ORPRN practice facilitators provide technical assistance and supportive services to primary care clinics to adopt or improve evidence-based methods of addressing unhealthy alcohol use through screening, brief intervention, and medication-assisted treatment (MAT). Efforts are aligned with the state’s Medicaid quality incentive metric for SBIRT (screening, brief intervention, and referral to treatment) and are conducted in partnership with SBIRT Oregon (www.sbirtoregon.org).
METHODS
In this mixed methods evaluation, we combine qualitative interviews with causal-loop diagramming, a systems science method for describing complex interrelationships. This poster will outline how we are using causal-loop diagramming to enhance our qualitative analysis and structure our understanding of how practice facilitators respond to clinic needs. We will describe our approach for generating causal-loop diagrams illustrating practice facilitators’ mental models of practice change from qualitative interviews.
RESULTS
Preliminary results from baseline analyses will be presented as causal-loop diagrams of practice facilitators’ mental models of practice change and of how they tailor implementation support to context. Analyzing the structure and content of the diagrams yields insight into the range of perspectives held by practice facilitators. Strengths and limitations of this approach to modeling from qualitative data will be identified.
CONCLUSIONS
System dynamics, and causal-loop diagramming in particular, is well suited for enhancing qualitative analysis. Our novel approach provides a framework to specify documented or assumed cause-and-effect relationships. This approach can illustrate the mental models of practice facilitators or researchers and help improve evaluation as well as implementation outcomes.
POSTER


Implementing a Pragmatic, Multi-Site Effectiveness Trial Examining Pediatric Anxiety Treatments During COVID-19: Challenges and Lessons

Join us as this presenter discusses this poster live on May 25, 2021 | Track A at 1:00 PM Mountain

PRESENTER
HANIYA SALEEM SYEDA, MPH
Boston Medical Center
BACKGROUND
Anxiety disorders are the most prevalent mental health problems affecting children, yet very few children with anxiety receive treatment. Kids FACE FEARS (Face-to-face and Computer-Enhanced Formats Effectiveness study for Anxiety and Related Symptoms) is a hybrid effectiveness-implementation randomized trial comparing therapist-led cognitive-behavioral therapy (CBT) versus self-administered online CBT for anxious youth in pediatric health settings. Screening for anxiety and implementing CBT directly within pediatric health settings, including via online formats, can overcome many traditional barriers to care. Recruitment began 5 months before the onset of the COVID-19 pandemic, and the study is still underway. This presentation describes the challenges of continuing implementation of a pragmatic clinical trial during a pandemic.
SETTING/POPULATION
Eligible youth (ages 7-18 years; N=300) are being identified via universal screening and/or anxiety referral in large pediatric health care networks serving primarily racial/ethnic minority children across four regions: Boston, Miami, Baltimore, and Seattle. To maximize generalizability, English- and Spanish-speaking families are eligible.
METHODS
To gather information on impacts of the pandemic on implementation of the trial, regional investigators and coordinators provided weekly updates to PIs on: 1) local COVID-related IRB research regulations and changes, 2) COVID-related changes in policies and clinical care at recruitment sites, 3) changes in state legislation related to insurance coverage for telehealth delivery of behavioral healthcare, and 4) barriers to recruitment. Changes to the study design and COVID-19 related challenges were also discussed with the Scientific Steering Committee, as well as with a Parent and Family Advisory Board for feedback.
RESULTS
Adaptations were made to the study design in response to needs identified by investigators, committee members, stakeholders, and the funder. These included: 1) modifying the therapist-led study arm to allow either telehealth or office-based CBT; 2) offering a hybrid model of office-based and telehealth delivery in the therapist-led arm to account for patient preference; 3) flexibility to use secure videoconferencing software for study visits based on participant preference; and 4) adding assessment questions regarding exposure to COVID-19. Key factors that affected regional sites’ recruitment success included: 1) state regulation of clinical care settings; 2) lack of IRB restrictions on recruitment and study activities; and 3) clinics’ previous experience with telehealth systems.
CONCLUSIONS
The COVID-19 pandemic created unique challenges in continued implementation of a multi-site national pragmatic trial. To improve implementation and foster more patient-centered research, it was crucial to collaborate with all study team members and stakeholders to identify necessary adjustments to the study design.
POSTER
