Precision Implementation: Developing and Validating Predictive Models of Information Technology Tool Adoption

Join us as this presenter discusses this poster live on May 26, 2021 | Track B at 12:15 PM Mountain

PRESENTER
NATHALIE HUGUET
Oregon Health & Science University
BACKGROUND
There is strong evidence that implementation support strategies (e.g., facilitation, audit and feedback, performance benchmarking) can help clinical practices adopt and maintain evidence-based guidelines. There are, however, costs to both providing and receiving implementation support. Relatively little is known about which practices will benefit most from a particular implementation strategy, how much assistance a practice might need, or whether practices could improve on their own. New methods are needed to predict which practices may implement targeted changes with less support and which will need more. The objective of this study is to develop and validate predictive models that estimate the likelihood of adoption of an electronic health record (EHR)-related tool.
SETTING/POPULATION
We used EHR data from 351 community health centers (CHCs) in the OCHIN Network where an insurance support information technology (IT) tool was implemented on 05/01/2018. The tool was designed for clinic eligibility specialists to document the health insurance assistance they provide and to support HRSA-required reporting.
METHODS
We used LASSO-penalized logistic regression to develop and validate models predicting adoption and sustainability of the tool. Predictive performance was assessed using the area under the receiver operating characteristic (ROC) curve (AUC). Adoption was defined as any instance of tool use within the 12-month follow-up period (through 6/30/2019). Sustainability was defined as at least one tool use in the last four months of the follow-up period. Candidate predictors included geographic variables, number and type of departments/clinics, patient panel size, patient panel demographic characteristics, type and number of encounters, payer distribution, provider type, and number of encounters with eligibility specialists.
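For readers unfamiliar with the approach, the sketch below shows LASSO-penalized logistic regression with AUC-based evaluation in scikit-learn. It is illustrative only, not the authors' analysis code: the data are synthetic stand-ins (351 rows and 25 candidate predictors mirror the counts reported here), and all names are hypothetical.

```python
# Minimal sketch of LASSO-penalized logistic regression with AUC evaluation.
# Synthetic stand-in data; not the study's actual OCHIN analysis.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in for one row per CHC: 25 candidate predictors, binary adoption outcome.
X, y = make_classification(n_samples=351, n_features=25, n_informative=3,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

# The L1 (LASSO) penalty shrinks uninformative coefficients to exactly zero,
# which is how a fitted model can retain only a handful of predictors.
model = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(penalty="l1", solver="liblinear", cv=5,
                         scoring="roc_auc"),
)
model.fit(X_train, y_train)

# Discrimination summarized as the area under the ROC curve (AUC).
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.3f}")
```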
RESULTS
About 42% of CHCs adopted the tool and 25% demonstrated sustained use. Models for adoption (AUC = 0.784; 95% CI: 0.710–0.858) and sustainability (AUC = 0.829; 95% CI: 0.746–0.912) showed high classification accuracy. Of the 25 variables entered into the models, three predicted adoption (years in the EHR, total number of visits, and percent of visits that were ambulatory) and one (total number of visits) predicted sustainability.
CONCLUSIONS
EHR data can be used to predict EHR tool use. This work is the next step toward advancing the science of 'precision implementation' and toward efficiently tailoring and deploying implementation support strategies for IT innovations.

A Stakeholder Engagement Method Navigator Webtool for Clinical and Translational Science

Join us as this presenter discusses this poster live on May 25, 2021 | Track A at 1:00 PM Mountain

PRESENTER
JENNA RENO, PhD
University of Colorado Anschutz Medical Campus, School of Medicine
INTRODUCTION
Stakeholder engagement is increasingly expected by funders and valued by researchers in clinical and translational science, yet many researchers lack access to expert consultation or training in selecting appropriate stakeholder engagement methods. Scalable infrastructure could support improvements in stakeholder-engaged research, and self-directed, interactive web-based tools are an emerging solution across clinical and translational research. We undertook an iterative process of design, development, and testing of an interactive web-based tool (henceforth "webtool") to guide researchers in learning about, selecting, and using a variety of stakeholder engagement methods for grant writing, protocol planning, implementation, and evidence dissemination.
SETTING/POPULATION
Design Thinking methods were used to engage stakeholders, including investigators from the Anschutz Medical Campus and Colorado Clinical and Translational Sciences Institute communities who are interested in using stakeholder engagement methods in their research.
METHODS
The design and development of the engagement methods webtool were guided by user-centered design processes. We followed the Design Thinking stages described by Ideo.org: Empathize, Define, Ideate, Prototype, and Test. These stages are iterative; progress from one stage to another often returns to prior stages with new insights. We conducted an environmental scan and literature review, along with investigator interviews, surveys, and group discussions facilitated by engagement experts. We formally reviewed and catalogued 29 distinct engagement methods. To develop the webtool, we used an iterative design process that included a contextual inquiry approach (low-fidelity prototype user testing) and a 'Think Aloud' approach (high-fidelity prototype user testing) to produce webtool V1.0.
RESULTS
As prioritized during user testing, the Stakeholder Engagement Navigator webtool both educates investigators and guides them in selecting an engagement method based on key criteria. Insights from the Empathize and Define stages included that researchers understand stakeholder engagement is valuable and want to include it in their research design and implementation, but are not familiar with stakeholder engagement methods. Based on the environmental scan of comparable tools in the Empathize stage, we developed a modified interactive "bubble" feature to display results. The V1.0 Navigator webtool filters methods first by purpose of engagement (noted by 62% of users as the highest-priority criterion), then by budget, time per stakeholder interaction, and total number of interactions.
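To make this filtering behavior concrete, here is a hypothetical sketch of filtering an engagement-method catalogue first by purpose of engagement, then by budget, time per stakeholder interaction, and total interactions. The catalogue entries, field names, and thresholds below are assumptions for illustration only and do not reflect the Navigator's actual data or implementation.

```python
# Hypothetical sketch of purpose-first filtering; not the Navigator's actual code.
from dataclasses import dataclass


@dataclass
class EngagementMethod:
    name: str
    purposes: set                  # e.g., {"prioritize topics", "build consensus"}
    budget: str                    # "low", "medium", or "high"
    minutes_per_interaction: int   # time required per stakeholder interaction
    total_interactions: int        # number of interactions the method requires


# Illustrative entries only; the real webtool catalogues 29 distinct methods.
CATALOGUE = [
    EngagementMethod("Method A", {"prioritize topics"}, "medium", 120, 1),
    EngagementMethod("Method B", {"build consensus"}, "low", 30, 3),
    EngagementMethod("Method C", {"co-design"}, "high", 90, 5),
]

BUDGET_RANK = {"low": 0, "medium": 1, "high": 2}


def filter_methods(purpose, max_budget="high", max_minutes=None, max_interactions=None):
    """Filter first by purpose of engagement, then by budget, time, and interactions."""
    results = [m for m in CATALOGUE if purpose in m.purposes]
    results = [m for m in results if BUDGET_RANK[m.budget] <= BUDGET_RANK[max_budget]]
    if max_minutes is not None:
        results = [m for m in results if m.minutes_per_interaction <= max_minutes]
    if max_interactions is not None:
        results = [m for m in results if m.total_interactions <= max_interactions]
    return results


print([m.name for m in filter_methods("prioritize topics", max_budget="medium")])
```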
CONCLUSIONS
The Stakeholder Engagement Navigator is a user-centered, interactive webtool suitable for researchers seeking guidance on appropriate stakeholder engagement methods for clinical and translational research projects.

Virtual Ethnographic Approaches to Facilitate Community Engaged Implementation Research

Join us as this presenter discusses this poster live on May 25, 2021 | Track A at 1:00 PM Mountain

PRESENTER
LINDA SALGIN
San Ysidro Health, University of California San Diego, San Diego State University
BACKGROUND
Meaningful engagement of stakeholders is at the heart of successful program development and implementation. Community and Scientific Advisory Boards (CSABs) have frequently been used to engage diverse sets of stakeholders to inform research projects. Traditionally, CSABs meet in person; however, in light of the COVID-19 pandemic, many CSABs have moved into the virtual realm, raising questions about the quality of engagement and data collection processes. Our objective is to describe our approach to adapting ethnographic methods for assessing stakeholder engagement in virtual CSABs, along with preliminary findings.
SETTING/POPULATION
CSAB meetings were hosted via an online video conferencing platform. A total of 33 stakeholders across two CSABs participated in 16 sessions. Seven undergraduate students and two master's-level research staff were trained as direct observers.
METHODS
Documentation forms were developed to capture, for each meeting: attendees, time spent speaking and language used (English or Spanish), modality used (computer, phone, or both), and types of stakeholder interactions (e.g., interruptions, sharing or requesting information). Documenters participated in a two-hour interactive training led by three implementation scientists and attended ongoing debrief meetings after each CSAB meeting for quality assurance and process refinement. Each CSAB meeting lasted two hours and was facilitated by the Global Action Research Center, a social change organization. Documenters were assigned to observe specific CSAB sub-groups and used a combination of live and recorded meetings to complete their documentation forms.
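As a concrete illustration of what such a documentation form captures, the sketch below models one observation record as a simple data structure. All field names and the example summary helper are hypothetical and do not represent the study's actual instrument.

```python
# Hypothetical sketch of a virtual CSAB documentation form; not the study's instrument.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class InteractionEvent:
    speaker: str
    interaction_type: str    # e.g., "interruption", "sharing information", "technical issue"
    language: str            # "English" or "Spanish"
    minutes_speaking: float


@dataclass
class CSABObservationForm:
    meeting_date: str
    sub_group: str                                           # sub-group the documenter observed
    attendees: List[str] = field(default_factory=list)
    modality: Dict[str, str] = field(default_factory=dict)   # attendee -> "computer", "phone", "both"
    events: List[InteractionEvent] = field(default_factory=list)

    def share_of_events(self, interaction_type: str) -> float:
        """Proportion of logged events of a given type (e.g., technical issues)."""
        matching = [e for e in self.events if e.interaction_type == interaction_type]
        return len(matching) / len(self.events) if self.events else 0.0
```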
RESULTS
Debriefing sessions identified several challenges and led to subsequent methodological refinements. The primary challenges were accurately documenting discussion content and managing technical issues, with capturing discussion content being the most difficult. The virtual format also limited documenters' ability to observe body language and behavioral nuances, and not all breakout rooms could be recorded. Pre-assigning documenters to specific CSAB sub-groups, along with the ability to record CSAB meetings for repeated review, made documentation more feasible. Preliminary content analysis of the documentation forms indicated that the majority (60-70%) of interruptions or comments were related to technical issues.
CONCLUSIONS
As research continues to expand its use of virtual platforms, we highlight key lessons learned in adapting ethnographic methods to facilitate community engagement in virtual CSAB contexts. Assessing stakeholder engagement virtually allowed for the collection of rich ethnographic data, but these adapted methods presented unique obstacles. For teams planning virtual ethnographic work, we recommend ongoing training, including debriefing sessions, and a thorough investigation of virtual platform features before selecting a platform.