Abstract
Implementation science theories, models, and frameworks (TMF) should help users understand complex issues in translating research into practice, guide selection of appropriate implementation strategies, and evaluate implementation outcomes. They should also be sensitive to evidence from projects that apply the framework, evolve based on those experiences, and be accessible to a range of users. This paper describes these issues as they relate to the Practical, Robust Implementation and Sustainability Model (PRISM). PRISM was created to assess key multilevel contextual factors related to the reach, effectiveness, adoption, implementation, and maintenance (RE-AIM) outcomes of health interventions. We describe key aspects of PRISM and how it has been applied, evolved, and adapted across settings, time, and content areas. Since its development in 2008, PRISM has been used in over 200 publications, with increased use in recent years. It has been used for a wide variety of purposes, and more recent applications have focused on increasing its accessibility for non-researcher groups and on more rapid and iterative application in learning health systems. PRISM has been applied to address health equity issues including representation, representativeness, and co-creation activities in both US and non-US settings. We describe common types of adaptations made by implementation teams when applying PRISM to fit the resources and priorities of diverse and low-resource settings. We conclude by summarizing lessons learned and providing recommendations for future research and practice using PRISM.
Theories, models, and frameworks (TMF) are an integral part of implementation science (Strifler et al., 2018; Tabak et al., 2013), and there are well over 150 different TMFs (Rabin et al., 2014–2024). Although there is a multitude of TMFs, there is also a high level of agreement among them on several key issues: (1) the importance of considering multilevel context and that, for both interventions and implementation strategies, 'one size does not fit all' (Aarons et al., 2012; Kirk et al., 2020; Mody et al., 2023); (2) for successful implementation there is a need to adapt implementation strategies (and sometimes context and interventions) (Chambers & Norton, 2016; Moore et al., 2021; Nilsen & Bernhardsson, 2019); and (3) there are important processes and implementation outcomes that lead to multiple service and client/patient outcomes (Proctor et al., 2011) and that should be measured. We posit that frameworks should help users understand complex issues in translating research into practice, guide selection of appropriate implementation strategies, and iteratively evaluate implementation outcomes (Glasgow et al., 2022; Pfadenhauer et al., 2017). Furthermore, we agree with Kislov et al. (2019) that implementation science TMFs should also be sensitive to evidence from applications of the TMF, evolve based on those experiences, and be accessible to a range of users (Glasgow et al., 2019a, b; Holtrop et al., 2021a, 2021b).
Despite their usefulness, there are challenges in applying implementation science TMFs given their complexity, frequent lack of accessibility to non-experts, perceptions of inflexibility, and limited ability to be acted on rapidly enough to inform learning health systems (Khan et al., 2021; Kilbourne et al., 2017; Trinkley et al., 2022). As defined by the National Academy of Medicine, in a learning health system, “science, informatics, incentives, and culture are aligned for continuous improvement and innovation, with best practices seamlessly embedded in the delivery process and new knowledge captured as an integral by-product of the delivery experience” (National Academy of Medicine, 2024). Often it is not clear how to practically apply implementation science frameworks to address dynamic challenges in translation, including health disparities (Baumann & Cabassa, 2020; Chambers et al., 2013; Fort et al., 2023; Trinkley et al., 2022). Although these issues apply to all implementation science TMFs, we discuss how they have played out with one particular framework, the Practical, Robust Implementation and Sustainability Model (PRISM). PRISM is the contextually expanded version of the widely used Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework (Feldstein & Glasgow, 2008; Glasgow et al., 2019a, b; Trinkley et al., 2024). It is important to describe how frameworks evolve over time to explain why and how they have changed, and to increase the odds that researchers will use current, updated versions of the framework rather than the original version (Glasgow et al., 1999, 2019a, b; Kislov et al., 2019).
As discussed elsewhere, the RE-AIM component of PRISM has been used for planning, evaluation, and more recently iterative implementation (including assessing and guiding adaptations) (Glasgow et al., 2019a, b, 2022). It has also been used to address health equity issues, costs, and sustainment (Eisman et al., 2020; Jones Rhodes et al., 2018; Shelton et al., 2020). Recent applications of RE-AIM focus on outcomes prioritized by community and clinical partners, the overall public health or population impact, and representativeness or equity of outcomes (Fort et al., 2023). In these applications we have realized that iterative RE-AIM application, planning, and evaluation would be substantially improved if we systematically incorporated context. Thus, we increasingly integrate key PRISM contextual factors into our work.
As discussed below and in detail elsewhere (Feldstein & Glasgow, 2008; Trinkley et al., 2024), PRISM consists of two main components—the contextual domains and the RE-AIM outcomes. Figure 1 illustrates how PRISM adds explicit attention to multilevel contextual factors that impact RE-AIM outcomes. There are four contextual PRISM domains, each of which is multilevel. These are: (1) recipient characteristics at multiple levels (e.g., individuals such as workers, patients, and students; delivery staff and settings; organizational decision makers and communities); (2) multilevel recipient perspectives on the intervention and implementation strategies (e.g., perceived feasibility, history with similar programs, relationships, mental models); (3) implementation and sustainability infrastructure (e.g., resources, capacity; staff roles and responsibilities; monitoring and evaluation systems); and (4) external environment (e.g., policies, guidelines, health and social system structures). Over the past decade, we have found it challenging to communicate that PRISM includes RE-AIM. PRISM is the contextually expanded version of RE-AIM, which adds contextual domains to the RE-AIM outcomes (Glasgow et al., 2019a, b; Rabin et al., 2022; RE-AIM Workgroup, 2021).
The purposes of this paper are to: (1) describe the original PRISM and its key components; (2) discuss its use to date and range of application; (3) summarize the evolution of PRISM from 2008 to 2024 including use to address health equity and iterative use; (4) summarize guidance, resources, and tools for the optimal and practical application of PRISM; and (5) discuss the strengths and limitations of, and future directions for the use of PRISM in research and practice.
Original PRISM
PRISM added multilevel and multi-perspective context to RE-AIM. The RE-AIM framework was developed in 1998 with the intention of identifying a set of key outcomes associated with the population health relevance and impact of a program. RE-AIM grew to be one of the most widely used and cited public health and implementation science TMFs (Tabak et al., 2023; Vinson et al., 2018). We attribute the broad success and uptake of RE-AIM to its intuitiveness for both researchers and practitioners. PRISM (Feldstein & Glasgow, 2008; Glasgow et al., 2019a, b) emerged from observations in earlier applications of RE-AIM that there were different outcomes under different conditions, even with the same intervention delivered in the same way by the same staff. Some, but not most, of these differential effects were associated with patient characteristics (Glasgow et al., 1990; Lichtenstein et al., 1996; Stevens et al., 2000). We wanted to account for factors that might explain more of these differential results. In reviewing the literature on innovations, we drew upon the work of Rogers on characteristics of the ‘innovation’ or intervention (Rogers, 2003) and Wagner and colleagues on key features of quality chronic illness care (Wagner et al., 2001). It is important to note that PRISM focuses on the perspectives of different types and levels of invested partners. These are perceptions based on a person’s experience and positionality. For example, burden or fit with existing workflow is not viewed as an objective characteristic of the intervention but rather as the perspective or perception of that person.
PRISM was developed as a pragmatic and intuitive model to improve translation of research-tested interventions into health systems practice and ultimately population health impact (Feldstein & Glasgow, 2008). The original PRISM can be considered a process, determinant, and evaluation framework in the classification system developed by Nilsen (2015). As Fig. 1 illustrates, PRISM considers how perspectives of the program, policy, or intervention; the external environment; the implementation and sustainability infrastructure; and the characteristics of multiple levels of “recipients” (e.g., implementers, beneficiaries) influence program adoption, implementation, and maintenance. Within the program or intervention domain, PRISM incorporates the perspectives of both the patients (recipients or participants) and the organizational members with different roles (e.g., top leadership, mid-level managers, and frontline staff) (Ehrhart et al., 2014; Weiner et al., 2020; Williams & Glisson, 2020) to help understand what factors within and external to the implementation setting need to be considered and addressed for successful implementation and sustainment of complex interventions (Kwan et al., 2022; Skivington et al., 2021). Inclusion of the Implementation and Sustainability Infrastructure domain was based on our experience in healthcare settings: those settings that implemented and sustained programs most consistently had the types of infrastructure, support processes, and resources noted above.
Space limitations preclude detailed review of the more widely known RE-AIM outcome dimensions of Reach, Effectiveness, Adoption, Implementation, and Maintenance (Sustainment). The RE-AIM framework and its evolution are covered in Glasgow et al. (1999, 2019a, b), Holtrop et al. (2021a, 2021b), and Kwan et al. (2019). However, we highlight two RE-AIM outcome issues that are especially important in the application of PRISM. The first is that of representativeness or equity of results on the various RE-AIM outcomes. It is important to stress that representativeness, or equity, is important across all RE-AIM dimensions, not just Reach as is commonly reported (Fort et al., 2023; RE-AIM Workgroup, 2021).
The second issue concerns adaptations. In RE-AIM, adaptation is a component of its Implementation dimension and refers to modifications that are made to initial or intended interventions or implementation strategies (occasionally it is necessary to adapt context also). Adaptations were added to the RE-AIM Implementation dimension to balance its original focus on fidelity. The goal in RE-AIM is to have fidelity to the core functions (e.g., principles, objectives) while permitting or even encouraging modifications to the “forms” or specific strategies through which these functions are achieved (Miller et al., 2020; Movsisyan et al., 2019; Perez Jolles et al., 2019).
Use of PRISM to Date
In late 2022, Rabin and colleagues published a review of the use of PRISM (Rabin et al., 2022). They found a steady increase in use of the model from 3 publications in 2008 to 31 in 2019 (the last full year of literature reviewed). Of the 180 publications identified that included PRISM in some way, only 31 publications representing 23 studies were found to use PRISM in an ‘integrated’ manner and were included in the detailed analyses. The review found that PRISM was used to study a number of health issues in a wide range of settings, including three in low- and middle-income countries. Topics included primary or secondary prevention, mental health, veterans’ health, cancer, infectious disease, reproductive health, clinical guidelines and other conditions. The majority (74%) of studies were conducted in the United States and in healthcare settings, especially outpatient clinics. We were pleased to find that 78% addressed equity in some way, most often through a focus on low-resource settings or underserved populations. PRISM was used primarily for study evaluation, planning and development, and implementation, with fewer studies addressing sustainment and dissemination. Over half of the studies reported on most or all of the PRISM domains. PRISM context domains were operationalized most often using qualitative methods followed by quantitative, mixed methods, and multi-methods approaches. One study used a narrative approach. Of special note, most studies treated PRISM and RE-AIM as separate rather than one integrated framework. Only a minority of the studies that reported on PRISM domains also included RE-AIM outcomes and only two reported relationships between PRISM domains and RE-AIM outcomes.
Range of Applications
Originally developed with healthcare and broad public health impact in mind, PRISM (and its RE-AIM dimensions) has been used and adapted for diverse settings, populations, and topics, including projects outside of health. PRISM has also been used in combination with other TMFs, including technology and equity-focused TMFs (Trinkley et al., 2020, 2021). It is being increasingly used iteratively and in more diverse settings, including those with low resources and in countries beyond the USA. Table 1 provides examples of the diverse topics, settings, and purposes for which PRISM has been used and summarizes how it was adapted to fit different contexts. Four of the nine examples in Table 1 are in non-US settings, and the examples address diverse topics including hypertension management in Indonesia, public health wildfire smoke communication in Canada, EHR decision support and technology, high-risk youth in community school settings in Uganda, and social risk screening. We provide the examples in Table 1 to encourage thoughtful adaptations to and pragmatic use of PRISM, especially for diverse, new, or low-resource settings. PRISM should continue to evolve over time, and its constructs should not be used rigidly or robotically. When adaptations are needed, documentation and transparent reporting of why and how adaptations are made is important for interpretation, replication, and rigor.
One area in which PRISM has been applied fairly extensively in recent years is with health technology interventions of various formats (e.g., mHealth, EHR-based alerts) (Glasgow et al., 2021; Maw et al., 2022; Trinkley et al., 2020, 2021), for various audiences (e.g., physical therapists, ambulatory patients, primary care clinicians) and settings [e.g., single and multi-site academic and community health organizations, learning health systems (LHS), and federally-qualified health clinics]. PRISM is often used with human- or user-centered design approaches and other frameworks (e.g., Theory of mHealth) that are traditionally used in the technology sector (Bull & Ezeanochie, 2016; Glasgow et al., 2021; Trinkley et al., 2020, 2021). In these integrated approaches to technology-based implementations, PRISM serves as the overarching framework to capture representation of multilevel perspectives and drive sustainability, generalizability, and equity, whereas the traditional technology approaches and frameworks provide more focused attention to socio-technical issues and methods to optimize usability and local relevance. In applying this integrated approach to technologies within LHS, PRISM has been adapted in multiple ways. First, given limitations of EHR and other available data, not all of PRISM’s RE-AIM outcomes can be assessed and, in some cases, the original definitions of these outcomes were adapted. For example, with interruptive EHR alerts, the clinician-level definition of adoption has been adapted to reflect clinician responses to the alert, given that the interruptive nature of the alert supersedes clinician choice to use the alert (Trinkley et al., 2021).
LHS timeline expectations, priorities, or resource constraints influence how comprehensively the components of and approaches to using PRISM are applied. For example, the scope, frequency, and depth of engagement to assess PRISM’s contextual constructs is often limited by time or other resource constraints within LHS (Trinkley et al., 2022). In other instances, automation (e.g., dashboards) has been used to enable efficient, near real-time iterative assessment and feedback on contextual issues and RE-AIM outcomes to rapidly identify needed adaptations for LHS applications (Maw et al., 2022).
PRISM Evolution and Recent Applications
There has been significant expansion in the range of applications of PRISM and integration of PRISM into implementation science since the Rabin et al. (2022) review. Although an additional review is not yet warranted, there are important new directions worthy of discussion. These activities include (1) translating the original PRISM health systems focus on an intervention to a more generalizable implementation science conceptualization emphasizing implementation strategies; (2) using PRISM to guide adaptations; (3) using PRISM to conceptualize and address health inequities and low-resource settings; and (4) using PRISM iteratively across implementation phases. These topics are each discussed below.
Integration of Context, Intervention, Implementation Strategies, and Outcomes
To clarify the central implementation science issue of aligning implementation strategies with relevant context and with the intervention or program being applied, we recently revised the core PRISM figure (Fig. 2). It illustrates that PRISM contextual factors (such as setting and patient characteristics, infrastructure processes and resources, and policy and reimbursement issues) work through their fit or alignment with both the intervention and implementation strategies, rather than directly determining RE-AIM outcomes. As can be seen, it is the combined influence of these three factors—intervention, context, and implementation strategies—and their alignment that impacts RE-AIM outcomes. This logic-model characterization of PRISM is more compatible with current conceptualizations of implementation science and the centrality of implementation strategies (Rabin et al., 2022, Rabin et al., 2014–2024; Smith et al., 2020; Trinkley et al., 2024).
Using PRISM to Guide Adaptations
Research on PRISM has advanced from simply documenting contextual factors or implementation outcomes to using PRISM to understand, identify the need for, and guide adaptations. The PRISM adaptation process works best when employed with implementation teams having strong representation of the various persons responsible for making funding decisions, those supervising or implementing the program, and those receiving the services. As with any community engagement approach, being aware of and addressing issues of differential power, respecting other perspectives, and facilitating participation from all members is essential (Minkler, 2010; Ramanadhan et al., 2018).
To obtain unbiased input from all members when using PRISM, each team member can independently and confidentially complete the PRISM survey questions in Table 2 that ask respondents to report their perceptions, using all data available to them, as well as their personal experiences. Whenever possible we encourage use of objective data such as enrollment records to assess Reach, but it is important to use subjective and qualitative data when such measures are not available. The first 15 questions in Table 2 are the RE-AIM questions to assess implementation outcomes including equity, and the last 6 assess PRISM contextual factors. The questions in Table 2 are worded for use during the planning stage of a program. Those for other phases are presented in Gomes et al. (2022). The content of the PRISM (and RE-AIM outcomes) survey questions is the same across implementation phases, but the wording and ‘referents’ differ across phases. For example, questions during the planning phase ask respondents to estimate the degree of alignment of their current plans with their multi-level organizational context and the projected impact of the planned program on each of the RE-AIM outcomes. During the implementation phase respondents rate the extent to which the program is proving to be aligned with context and the RE-AIM outcomes are being achieved.
Addressing Equity
From its inception RE-AIM has focused on equity by assessing not only the overall level of each RE-AIM outcome dimension (such as the overall percentage of smokers who respond to an invitation to participate in a tobacco cessation program, i.e., Reach) but also the representativeness of participants (Fort et al., 2023; Glasgow, 2013; Glasgow et al., 1999, 2019a, b; Henderson et al., 2020). For example, are those who participate representative of all patients in the implementation setting on social needs or health literacy? These representativeness issues apply across all RE-AIM dimensions. For example, are the settings and staff that adopt the program or maintain it after the research evaluation representative of those that do not? The focus on representativeness allows for assessment and reporting of equitable outcomes.
There are also systems issues (Northridge & Metcalf, 2016; Sterman, 2006) involving relationships among RE-AIM outcomes that are described in the paper by Fort et al. (2023). Since the RE-AIM outcomes are conceptually distinct, but not independent, changes in one can produce unintended negative (or positive) impacts on another. For example, disparities in reach may in fact be increased by selecting high-intensity interventions to increase effectiveness, or by prioritizing ease of delivery over meeting needs within the population, addressing structural inequities, or adapting to local context. Fort et al. (2023) describe the importance of a continuous process of addressing community priorities and responding to capacity and infrastructure needs and changes. It is often not possible to address all PRISM contextual factors or RE-AIM outcomes, and in such cases, having the community decide which issues to prioritize is strongly recommended (Glasgow & Estabrooks, 2018).
Another key issue in applying PRISM and other participatory strategies is its emphasis on representation—in addition to the representativeness addressed above (Adsul et al., 2022; Baumann & Cabassa, 2020; Minkler, 2010). In applying PRISM to proactively increase equity, it is important to assess which groups, perspectives, and priorities should be included in different activities such as planning, priority setting, implementation, and evaluation. With respect to evaluation, PRISM’s RE-AIM outcomes should consider representation of the many people involved and be informed by their diverse perspectives and priorities. It is equally important to assess who is not represented and to understand why not. It is not sufficient to engage only those community members and implementers who are most eager to be involved or who share similar backgrounds with the research team. Limited representation in the guidance of implementation is likely to perpetuate societal inequities (Ginther et al., 2011).
As discussed in Fort et al. (2023) and illustrated in Fig. 3, there are multiple ways that implementation strategies can address PRISM contextual domains, RE-AIM outcomes, or both to assess and address equity issues. Various strategies and actions to enhance equity are listed on the right- and left-hand sides of this figure. Arrows point to the PRISM issue that is most likely to be impacted by each strategy, recognizing that many actions will impact more than one contextual factor or implementation outcome. For example, a monitoring and evaluation system is likely to most impact the Implementation and Sustainability Infrastructure as well as multiple RE-AIM outcomes. Adaptation and co-creation activities are likely to impact all components of the PRISM framework, as indicated on the far-left side of Fig. 3. Recently, Perez Jolles and colleagues described how PRISM can be used as part of a co-creation process. They show how PRISM can be integrated with the function-form approach to working with implementers to balance fidelity (to key functions or purposes) while allowing for and helping to guide adaptations (forms) to fit local context (Jolles et al., 2024; Pérez Jolles et al., 2022).
The extent to which PRISM components overlap with other TMFs to address health equity is a complex issue. Several of the key contextual domains are similar to those in models such as Woodward et al. (2019), but equity-specific models are generally more comprehensive in their assessment of factors such as structural racism (Adsul et al., 2022; Baumann & Cabassa, 2020; Shelton et al., 2020). For this reason, such equity-focused models are often used with PRISM to provide both broad consideration of contextual issues and more specific attention to the equity-related context. The RE-AIM components of PRISM specify equity impacts across a broader range of implementation outcomes (e.g., representativeness across all five RE-AIM dimensions) than most equity models. Further examination of this issue could be informed by use of the ‘special topics’ section of the dissemination-implementation.org webtool on health equity, which compares TMFs on the extent to which they include various constructs relevant to health equity.
Iterative Use of PRISM
Our iterative use of PRISM built on experiences using RE-AIM iteratively. Initial work with iterative RE-AIM involved using it to assess progress at various times during the implementation process and to periodically reset priorities. Using the RE-AIM questions (or slight variants) in Table 2, the implementation team assesses progress toward goals on each of the RE-AIM outcomes. After the team reviews the data from the perspectives of all members and identifies priorities for the next time period, it selects specific implementation strategies to enhance results on the one or two RE-AIM outcomes with the largest ‘gap’ between current priority and current progress. Experience using iterative RE-AIM was positive, but we realized that results could be enhanced by paying specific attention to, and addressing alignment with, contextual factors. Thus, we incorporated assessment of PRISM contextual factors into the iterative adaptation process.
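The gap-based selection of RE-AIM outcomes described above can be sketched in code. This is only an illustrative sketch, not part of the published PRISM materials: the outcome ratings, the 0–10 scale, and the `prioritize_outcomes` function are all hypothetical assumptions chosen to make the idea concrete.

```python
def prioritize_outcomes(ratings, top_n=2):
    """Rank outcomes by the gap between team priority and current progress,
    returning the top_n outcomes with the largest gaps (hypothetical sketch)."""
    gaps = {
        outcome: r["priority"] - r["progress"]
        for outcome, r in ratings.items()
    }
    # Largest priority-minus-progress gap first
    return sorted(gaps, key=gaps.get, reverse=True)[:top_n]

# Hypothetical team-mean ratings on a 0-10 scale
team_ratings = {
    "Reach":          {"priority": 9, "progress": 4},
    "Effectiveness":  {"priority": 8, "progress": 7},
    "Adoption":       {"priority": 7, "progress": 3},
    "Implementation": {"priority": 6, "progress": 6},
    "Maintenance":    {"priority": 5, "progress": 5},
}

print(prioritize_outcomes(team_ratings))  # → ['Reach', 'Adoption']
```

Here Reach (gap of 5) and Adoption (gap of 4) would be selected as the focus of the next round of implementation strategies, mirroring the 'gap' logic described in the text.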
An example of iterative use of PRISM is its application in a colorectal cancer screening, follow-up, and referral study that was funded as part of the Accelerating Colorectal Cancer Screening and Follow-Up through Implementation Science (ACCSIS) Initiative (ACCSIS, 2024). In the San Diego ACCSIS Program (Castañeda et al., 2023), iterative PRISM assessment was conducted during the planning (pre-implementation) and implementation phases of the project. Representatives of participating federally qualified health centers (implementation partners), a linkage agency facilitating the academic–health center partnership, and academic partners completed a set of questions linked to each PRISM component. During pre-implementation, questions assessed the likelihood of achieving PRISM alignment and prioritized RE-AIM outcomes; during the implementation phase, perceived progress toward these same outcomes was assessed. Comments on each rating were requested via comment boxes. Questions were programmed in a REDCap survey system and shared via email. Results from the surveys were summarized and presented to the participants in a virtual meeting, eliciting discussion and inviting participants to prioritize outcomes based on progress and to engage in developing strategies to improve outcomes and alignment. Findings from the surveys, the discussion, and specific linked action items taken by the research team and linkage agency were also shared with each health center in a summary document. Use of iterative PRISM was perceived as a positive activity that contributed to a sense of meaningful bidirectional collaboration and learning. A number of adaptations were undertaken to address concerns identified as part of the iterative process.
Steps in Implementing Iterative PRISM
To clarify the iterative PRISM process and enable others to replicate it, we summarize below the steps involved. More detailed instructions, examples, and materials for iterative PRISM are found in Gomes et al. (2022).
Step 1: Identify and engage partners, and establish the context and logistics of the iterative PRISM process. Highlight the main purpose and potential benefits of iterative PRISM, emphasizing the need to learn from diverse partners about how implementation will go or is going and to incorporate this learning into improvement activities. The format might be a brief meeting with key partners who will engage in the iterative PRISM process, using a presentation with minimal academic jargon to describe the process and decide the best way to implement iterative PRISM in their setting. Participants should include a combination of implementation partners and research team members.
Step 2: Share the iterative PRISM survey questions via the group’s preferred modality. Possible data collection methods include REDCap, Qualtrics, or another survey program; paper-based data collection; or the iPRISM webtool discussed below. Collecting qualitative explanations for ratings is critically important, and the use of comment boxes should be emphasized and encouraged, especially in global health applications and other settings where cultural issues may be especially relevant for understanding responses.
Step 3: Summarize data from surveys and comments to share with participating partners. Visuals and easy-to-follow summary tables are the best way to convey this information. A meeting of the implementation research team should prioritize PRISM components with less-than-perfect scores and highlight comments to be discussed during the follow-up meeting.
Step 4: Reconvene partners to review and discuss survey and comment data and to act on these results. Keep presentations brief and allow for plenty of discussion during the meeting. Do not attempt to present all data, instead focus on key areas identified in Step 3. Make the full data available to partners during or after the meeting. When discussing findings, inquire about lower scores, issues on which there are significant differences of perspective, or identified barriers. Engage partners in developing feasible strategies to address key challenges and specify how these strategies will be implemented (i.e., who, when, what, etc.). This group reflection, open discussion, collaborative prioritizing, and identification of feasible action strategies are the heart of the iterative PRISM process.
Step 5: Follow up to assess progress in the prioritized key areas and repeat the process. Check-ins can take the form of a standing meeting, e-mail updates, or follow-up meetings. The iterative PRISM process should be repeated periodically. The timing and frequency of iterations should be decided by the partners and tailored to the project, resources, burden, and how rapidly the targeted behaviors change. We recommend 3–4 iterations for a project 6–12 months in length, depending on the above factors, and iteration on an ongoing basis for projects that are institutionalized. If at all possible, assessments and goal setting should occur during pre-implementation, during mid-implementation, and as part of sustainment planning.
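The summarization in Steps 3 and 4 can be loosely illustrated in code: flag survey items where the team mean is low or where team members disagree widely, so the follow-up meeting can focus on them. The question labels, the 1–5 rating scale, and the flagging thresholds below are hypothetical assumptions for illustration only.

```python
from statistics import mean, stdev

def flag_items(responses, low_mean=4.0, high_spread=1.0):
    """Flag survey questions with a low mean score or wide disagreement
    across team members (illustrative thresholds, not PRISM guidance)."""
    flagged = {}
    for question, scores in responses.items():
        m = mean(scores)
        s = stdev(scores) if len(scores) > 1 else 0.0
        if m < low_mean or s > high_spread:
            flagged[question] = {"mean": round(m, 2), "stdev": round(s, 2)}
    return flagged

# Hypothetical ratings from four team members on a 1-5 agreement scale
responses = {
    "Fit with clinic workflow":    [4, 5, 4, 5],  # high, consistent
    "Adequacy of staff resources": [2, 3, 2, 3],  # uniformly low
    "Leadership support":          [5, 2, 5, 1],  # strong disagreement
}

print(flag_items(responses))
```

In this sketch, the workflow item passes, while the staff-resources item (low mean) and the leadership item (large spread) would both surface for discussion at the Step 4 meeting.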
Variations of iterative PRISM have now been evaluated in at least four studies (Castañeda et al., 2023; Glasgow et al., 2022; Maw et al., 2022; Pittman et al., 2021; Trinkley et al., 2023), including one that applied it in three separate projects (Glasgow et al., 2020). As described in these reports, iterative PRISM has consistently been found to be broadly applicable and helpful in prioritizing and setting improvement goals and in engaging partners who represent diverse perspectives on program implementation barriers, facilitators, and progress.
Web-Based Application
To expand the accessibility and efficiency of using PRISM, the iPRISM webtool (https://prismtool.org) was developed and is available in English and Spanish (Trinkley et al., 2023). This tool uses the same process and measures as the iterative use described above but has advantages for both research and practice, notably its ability to standardize and simplify the application of PRISM and to optimize efficiency of use by teams. It provides a guided experience in how to apply PRISM, including embedded education and prompts that dynamically facilitate: the assessment of a project’s contextual alignment and actual or anticipated RE-AIM outcomes using the 21 assessment questions in Table 2; identification and prioritization of feasible and high-impact implementation strategies or adaptations to improve alignment and outcomes; and action planning via example strategies and templates. The webtool promotes efficiency by automatically generating summary figures and tabulated data of both individual and team member responses, making it easy to immediately identify areas of lower mean ‘scores’ and of variability in team members’ ‘scores.’ Users can view their results as a graphic ‘radar’ plot, a simple bar chart, or a table, based on their preference. Although the iPRISM webtool can be completed by individuals, its ability to efficiently support team-based use is possibly its greatest advantage; it facilitates data capture, analysis, and representation of each team member’s perceptions.
The embedded education includes an introductory video illustrating use of the tool, quick-start guides for general use and for team facilitators, and ‘hover’ or ‘info button’ features that provide definitions and examples of PRISM terms. The education is designed so that research and practice-based users, with and without implementation science expertise, can tailor their experience and get ‘just in time’ support throughout the process of applying PRISM. For those with implementation science expertise, specific terminology and discussion of nuances may be key to interpretation, and the webtool allows access to this detail if desired, but the jargon is minimized to optimize ease of understanding for those without this expertise. The iPRISM webtool was developed using user-centered design procedures and adapted for different settings and user groups as described in Trinkley et al. (2023). As shown in Table 1, it is currently being used in low-resource settings, including community health centers, and globally as part of the St. Jude Network and Proyecto EVAT, an implementation of pediatric early warning signs in 36 oncology centers (Agulnik et al., 2022). The webtool has not, however, been evaluated with different levels of facilitation.
Resources, Tools, and Guidance for the Optimal Use of PRISM
There is an increasing number of resources available to facilitate informed use of PRISM. Table 3 summarizes and describes these resources. The single best and most frequently updated source is the www.re-aim.org website (RE-AIM Workgroup, 2021), which contains a wide variety of resources on both the overall PRISM and its RE-AIM components (which are often used without the PRISM contextual components). The website has undergone recent enhancements, expansion, and improvements to navigation based on user testing, which will be described in a future publication. Table 3 also summarizes the increasing variety of other resources that provide examples and guidance on applying PRISM for different purposes and via different modalities (e.g., workshops, podcasts, reference guides). These resources should be useful for both practitioners and researchers interested in applying PRISM. They have been developed over time in response to questions and requests for assistance; informal feedback from users indicates their helpfulness.
Discussion
In this paper, we provided an overview of PRISM, summarized its use and evolution, discussed key innovations and recent uses, and provided guidance and resources for applying PRISM. In this section we summarize key lessons learned in applying PRISM and share recommendations for its future use. First, it is important to emphasize that PRISM builds upon, includes, and expands RE-AIM, so almost all the characteristics, uses, strengths, and limitations of RE-AIM discussed in depth elsewhere also apply to PRISM (Glasgow & Estabrooks, 2018; Glasgow et al., 2019a, b; Harden et al., 2015; Holtrop et al., 2021a, 2021b). The use of PRISM contextual factors in addition to RE-AIM outcomes has increased substantially and relatively consistently since the first publication on PRISM in 2008. As with most TMFs, many applications of PRISM have not included some key issues (e.g., representativeness, perspectives of all key parties, use in analyses), and application has been incomplete: Rabin et al. (2022) found that only 18% of the publications they identified made comprehensive or ‘integrated’ use of PRISM. This is not unusual, but it does suggest that additional guidance, resources, and examples, such as those recently produced and summarized in Table 3, are needed. Hopefully the discussion, guidance, and resources provided here and available online will enhance the consistency and comprehensiveness of PRISM use across program planning, assessment, iteration, and sustainment phases.
PRISM appears broadly applicable. It has been used across multiple clinical, community and public health content areas and diverse settings including low resource clinics, some low- and middle-income countries, and different cultures and languages (Glasgow et al., 2019a, b; Harden et al., 2015; Kwan et al., 2019; Rabin et al., 2022). PRISM and RE-AIM outcomes have been assessed using both qualitative and quantitative methods: we recommend mixed methods applications to understand not only what happened, but why and how (Holtrop et al., 2018; Qualitative Methods in Implementation Science, 2020). Recent applications of PRISM facilitate (but do not ensure) equitable engagement through a focus on representation, including perspectives of all implementation partners, and on assessing and enhancing representativeness across all RE-AIM outcomes. Recent guidance (Fort et al., 2023; Gomes et al., 2022; RE-AIM Workgroup, 2021) provides specific assessments and implementation strategies to enhance equity and prevent, detect, and minimize unintended adverse consequences.
Limitations
Despite the broad application of PRISM, the model and its accompanying measures and resources need additional work. The overall PRISM, and especially its RE-AIM outcomes, is less complex than many other implementation science TMFs, which contain many more constructs and especially ‘determinants’ (Damschroder et al., 2022). Still, continued work is needed to enhance its accessibility to non-implementation science audiences. We are hopeful that more diverse examples, brief instructional videos, podcasts, and the like will help address frequently asked questions. We are continuing to reduce the amount of implementation science jargon and considering whether we can produce effective tools for general use with less PRISM terminology.
More work is needed to validate and to provide norms and other characteristics of PRISM measures. Studts et al. (2023) recently reported data showing that most RE-AIM measures had the predicted relationships with several ‘service’ (Proctor et al., 2011) or quality-of-care outcomes such as equity, safety, efficiency, and effectiveness. There are few data on the relationships of PRISM domains to each other or to RE-AIM outcomes. This is complicated because such relationships also depend on the context, the implementation strategies used, the intervention, and likely other moderating factors (Pawson, 2013). Some PRISM domains and measures may need to be substantially modified or replaced. Conversely, further research may identify gaps and other factors that need to be added to PRISM.
Future Directions
Like its RE-AIM component, and as recommended by Kislov et al. (2019) (Glasgow et al., 2019a, b), PRISM will and should continue to evolve. Key steps for future use include addressing the limitations outlined above. In particular, the new iPRISM webtool presents opportunities to further validate and adapt PRISM assessments. It efficiently collects, automatically tabulates, and summarizes responses to standardized questions about PRISM contextual domains and RE-AIM outcomes, as well as characteristics of the project (e.g., setting, content area, team member roles). With broader use, this database will allow establishment of norms for the individual questions, summary scores, and different content areas. These data can then be used to facilitate comparisons across projects and to create norms based on different domains, types and phases of projects, and settings. As data accrue on both contextual determinants and pragmatic outcomes, they could also be used for causal predictions of a project’s likelihood of success and sustainability as well as to better understand mechanisms of effect (Geng et al., 2022; Lewis et al., 2022; Mody et al., 2023).
The lists below summarize other recommendations for both future research and practice for PRISM based upon our research and implementation experience, the literature review by Rabin et al. (2022), and lessons learned operationalizing PRISM for various purposes across diverse settings, populations, and topics.
For research, we recommend:
(1) Development and validation of more quantitative measures of PRISM, especially measures that meet pragmatic criteria such as those recommended by Glasgow and Riley and the PAPERS criteria (Glasgow & Riley, 2013; Lewis et al., 2021). Studies should use more standardized PRISM definitions, assessments, and criteria to allow for cross-project comparisons.
(2) Although it is preferable to use all PRISM components and to consider PRISM across all phases of a program to comprehensively incorporate and advance implementation science, for pragmatic use it is not necessary to include all components or to use PRISM across all program phases. When PRISM is not used comprehensively, authors should briefly and transparently state why certain components were omitted or why PRISM was not used across all time points. Justifications might include lack of community priority or relevance for certain components or time points, or lack of resources to complete more frequent or broader assessments.
(3) Comparisons of PRISM to, and in combination with, other TMFs, and creation of crosswalks between PRISM and other TMFs, as has been done for RE-AIM outcomes (Lewis et al., 2023; Reilly et al., 2020).
(4) Comparison of PRISM contextual domains with other determinant TMFs in relating to and understanding RE-AIM outcomes.
(5) Assessment of the time and resources required for various applications of PRISM, for both researchers and implementation partners.
For implementation practice, we recommend:
(1) Broader use of PRISM with different types of implementation partners and in multi-sector research. Specifically, more use is indicated in sectors such as business, civic groups, and governmental policy, and in disciplines such as environmental science and economics.
(2) Increased application of PRISM in global health, especially in low- and middle-income countries. Specific recommendations include evaluating the usefulness of PRISM contextual categories in these countries and whether assessment and iterative implementation strategies can be successfully conducted in these settings.
(3) Use of PRISM in logic models and participatory modeling to identify implementation strategies that best address community-prioritized outcomes.
(4) Adaptation of PRISM terminology (e.g., recipients) to make it more user-friendly and relevant in the context of local implementation teams.
(5) Continued development and usability evaluations of the current iPRISM webtool and additional interactive tools and resources, including videos that illustrate and guide use of PRISM.
Conclusion
PRISM is now being applied widely and has generally been found useful across an increasingly wide variety of settings and problems. It is still a work in progress, however, and newer applications will inform future modifications to PRISM. We hope that this paper along with other resources such as the Guidebook, the iPRISM webtool, and the RE-AIM website can enhance informed and pragmatic application for both research and practice. We invite others to apply, report on, and help improve the model.
Data availability
Not applicable.
References
Aarons, G. A., Green, A. E., Palinkas, L. A., Self-Brown, S., Whitaker, D. J., Lutzker, J. R., Silovsky, J. F., Hecht, D. B., & Chaffin, M. J. (2012). Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implementation Science, 7(1), 32. https://doi.org/10.1186/1748-5908-7-32
Adsul, P., Chambers, D., Brandt, H. M., Fernandez, M. E., Ramanadhan, S., Torres, E., Leeman, J., Baquero, B., Fleischer, L., Escoffery, C., Emmons, K., Soler, M., Oh, A., Korn, A. R., Wheeler, S., & Shelton, R. C. (2022). Grounding implementation science in health equity for cancer prevention and control. Implementation Science Communication, 3(1), 56. https://doi.org/10.1186/s43058-022-00311-4
Agulnik, A., Gonzalez Ruiz, A., Muniz-Talavera, H., Carrillo, A. K., Cárdenas, A., Puerto-Torres, M. F., Garza, M., Conde, T., Soberanis Vasquez, D. J., Méndez Aceituno, A., Acuña Aguirre, C., Alfonso, Y., Álvarez Arellano, S. Y., Argüello Vargas, D., Batista, R., Blasco Arriaga, E. E., Chávez Rios, M., Cuencio Rodríguez, M. E., Fing Soto, E. A., . . ., EVAT Study Group. (2022). Model for regional collaboration: Successful strategy to implement a pediatric early warning system in 36 pediatric oncology centers in Latin America. Cancer, 128(22), 4004–4016. https://doi.org/10.1002/cncr.34427
Baumann, A. A., & Cabassa, L. J. (2020). Reframing implementation science to address inequities in healthcare delivery. BMC Health Services Research, 20, 1–9.
Bull, S., & Ezeanochie, N. (2016). From Foucault to Freire through Facebook: Toward an integrated theory of mHealth. Health Education and Behavior, 43(4), 399–411. https://doi.org/10.1177/1090198115605310
Castañeda, S. F., Gupta, S., Nodora, J. N., Largaespada, V., Roesch, S. C., Rabin, B. A., Covin, J., Ortwine, K., Preciado-Hidalgo, Y., Howard, N., Halpern, M. T., & Martinez, M. E. (2023). Hub-and-spoke centralized intervention to optimize colorectal cancer screening and follow-up: A pragmatic, cluster-randomized controlled trial protocol. Contemporary Clinical Trials, 134, 107353. https://doi.org/10.1016/j.cct.2023.107353
Chambers, D. A., Glasgow, R. E., & Stange, K. C. (2013). The Dynamic Sustainability Framework: Addressing the paradox of sustainment amid ongoing change. Implementation Science, 8(1), 1–11.
Chambers, D. A., & Norton, W. E. (2016). The Adaptome: Advancing the science of intervention adaptation. American Journal of Preventative Medicine, 51(4 Suppl 2), S124-131. https://doi.org/10.1016/j.amepre.2016.05.011
Choy, A., Shellington, E. M., Rideout, K., Roushorne, M., Joshi, P., & Carlsten, C. (2023). Engaging interested parties to optimize wildfire smoke communication in Canada: Challenges with initiating change. Frontiers in Public Health, 11, 1268249. https://doi.org/10.3389/fpubh.2023.1268249
Damschroder, L., Reardon, C. M., Widerquist, M. A. O., & Lowery, J. C. (2022). The updated consolidated framework for implementation research: CFIR 2.0. Implementation Science, 17, 75.
Ehrhart, M. G., Aarons, G. A., & Farahnak, L. R. (2014). Assessing the organizational context for EBP implementation: The development and validity testing of the implementation climate scale (ICS). Implementation Science, 9, 157. https://doi.org/10.1186/s13012-014-0157-1
Eisman, A. B., Kilbourne, A. M., Dopp, A. R., Saldana, L., & Eisenberg, D. (2020). Economic Evaluation in Implementation Science: Making the business case for implementation strategies. Psychiatry Research, 283, 112433. https://doi.org/10.1016/j.psychres.2019.06.008
Ekawati, F. M., Emilia, O., Gunn, J., Licqurish, S., & Lau, P. (2020). The elephant in the room: An exploratory study of hypertensive disorders of pregnancy (HDP) management in Indonesian primary care settings. BMC Family Practice, 21(1), 242. https://doi.org/10.1186/s12875-020-01303-w
Ekawati, F. M., Licqurish, S., Emilia, O., Gunn, J., Brennecke, S., & Lau, P. (2019). Developing management pathways for hypertensive disorders of pregnancy (HDP) in Indonesian primary care: A study protocol. Reproductive Health, 16(1), 12. https://doi.org/10.1186/s12978-019-0674-0
Feldstein, A. C., & Glasgow, R. E. (2008). A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. The Joint Commission Journal on Quality and Patient Safety, 34(4), 228–243.
Fort, M. P., Manson, S. M., & Glasgow, R. E. (2023). Applying an equity lens to assess context and implementation in public health practice and research using the PRISM framework. Frontiers in Health Services. https://doi.org/10.3389/frhs.2023.1139788
Geng, E. H., Baumann, A. A., & Powell, B. J. (2022). Mechanism mapping to advance research on implementation strategies. PLoS Medicine, 19(2), e1003918. https://doi.org/10.1371/journal.pmed.1003918
Ginther, D. K., Schaffer, W. T., Schnell, J., Masimore, B., Liu, F., Haak, L. L., & Kington, R. (2011). Race, ethnicity, and NIH research awards. Science, 333(6045), 1015–1019. https://doi.org/10.1126/science.1196783
Glasgow, R. E. (2013). What does it mean to be pragmatic? Pragmatic methods, measures, and models to facilitate research translation. Health Education and Behavior, 40(3), 257–265.
Glasgow, R. E., Battaglia, C., McCreight, M., Ayele, R., Maw, A. M., Fort, M. P., Holtrop, J. S., Gomes, R. N., & Rabin, B. A. (2022). Use of the reach, effectiveness, adoption, implementation, and maintenance (RE-AIM) framework to guide iterative adaptations: Applications, lessons learned, and future directions (Original Research). Frontiers in Health Services. https://doi.org/10.3389/frhs.2022.959565
Glasgow, R. E., Battaglia, C., McCreight, M., Ayele, R. A., & Rabin, B. A. (2020). Making implementation science more rapid: Use of the RE-AIM framework for mid-course adaptations across five health services research projects in the Veterans Health Administration. Frontiers in Public Health, 8, 194.
Glasgow, R. E., & Estabrooks, P. E. (2018). Pragmatic applications of RE-AIM for health care initiatives in community and clinical settings. Preventing Chronic Disease, 15, E02. https://doi.org/10.5888/pcd15.170271
Glasgow, R. E., Harden, S. M., Gaglio, B., Rabin, B. A., Smith, M. L., Porter, G. C., Ory, M. G., & Estabrooks, P. A. (2019a). RE-AIM Planning and Evaluation Framework: Adapting to new science and practice with a 20-year review. Frontiers in Public Health, 7, 64.
Glasgow, R. E., Hollis, J. F., Ary, D. V., & Lando, H. A. (1990). Employee and organizational factors associated with participation in an incentive-based worksite smoking cessation program. Journal of Behavioral Medicine, 13(4), 403–418. https://doi.org/10.1007/bf00844887
Glasgow, R. E., Huebschmann, A. G., Krist, A. H., & Degruy, F. V. (2019b). An adaptive, contextual, technology-aided support (ACTS) system for chronic illness self-management. Milbank Quarterly, 97(3), 669–691. https://doi.org/10.1111/1468-0009.12412
Glasgow, R. E., Knoepke, C. E., Magid, D., Grunwald, G. K., Glorioso, T. J., Waughtal, J., Marrs, J. C., Bull, S., & Ho, P. M. (2021). The NUDGE trial pragmatic trial to enhance cardiovascular medication adherence: Study protocol for a randomized controlled trial. Trials, 22(1), 528. https://doi.org/10.1186/s13063-021-05453-9
Glasgow, R. E., & Riley, W. T. (2013). Pragmatic measures: What they are and why we need them. American Journal of Preventative Medicine, 45(2), 237–243. https://doi.org/10.1016/j.amepre.2013.03.010
Glasgow, R. E., Vogt, T. M., & Boles, S. M. (1999). Evaluating the public health impact of health promotion interventions: The RE-AIM framework. American Journal of Public Health, 89(9), 1322–1327. https://doi.org/10.2105/ajph.89.9.1322
Gomes, R., Battaglia, C., Fort, M. P., Maw, A., McCreight, M., Rabin, B., Robertson, E., Studts, C., Trinkley, K. E., & Glasgow, R. E. (2022). A guidebook to the pragmatic and iterative use of PRISM and RE-AIM for planning, implementation, and sustainment. University of Colorado School of Medicine. https://medschool.cuanschutz.edu/docs/librariesprovider94/di-docs/guides-and-tools/iprism-and-reaim-guidebook_wip.pdf?sfvrsn=adea27bb_1
Guerin, R. J., Glasgow, R. E., Tyler, A., Rabin, B. A., & Huebschmann, A. G. (2022). Methods to improve the translation of evidence-based interventions: A primer on dissemination and implementation science for occupational safety and health researchers and practitioners. Safety Science, 152, 105763. https://doi.org/10.1016/j.ssci.2022.105763
Guerin, R. J., Okun, A. H., Barile, J. P., Emshoff, J. G., Ediger, M. D., & Baker, D. S. (2019). Preparing teens to stay safe and healthy on the job: A multilevel evaluation of the talking safety curriculum for middle schools and high schools. Prevention Science, 20(4), 510–520. https://doi.org/10.1007/s11121-019-01008-2
Harden, S. M., Gaglio, B., Shoup, J. A., Kinney, K. A., Johnson, S. B., Brito, F., Blackman, K. C., Zoellner, J. M., Hill, J. L., Almeida, F. A., Glasgow, R. E., & Estabrooks, P. A. (2015). Fidelity to and comparative results across behavioral interventions evaluated through the RE-AIM framework: A systematic review. Systematic Reviews, 4, 155. https://doi.org/10.1186/s13643-015-0141-0
Henderson, V., Tossas-Milligan, K., Martinez, E., Williams, B., Torres, P., Mannan, N., Green, L., Thompson, B., Winn, R., & Watson, K. S. (2020). Implementation of an integrated framework for a breast cancer screening and navigation program for women from underresourced communities. Cancer, 126(Suppl 10), 2481–2493. https://doi.org/10.1002/cncr.32843
Holtrop, J. S., Estabrooks, P. A., Gaglio, B., Harden, S. M., Kessler, R. S., King, D. K., Kwan, B. M., Ory, M. G., Rabin, B. A., Shelton, R. C., & Glasgow, R. E. (2021a). Understanding and applying the RE-AIM framework: Clarifications and resources. Journal of Clinical and Translational Science, 5(1), e126. https://doi.org/10.1017/cts.2021.789
Holtrop, J. S., Rabin, B. A., & Glasgow, R. E. (2018). Qualitative approaches to use of the RE-AIM framework: Rationale and methods. BMC Health Services Research, 18(1), 177. https://doi.org/10.1186/s12913-018-2938-8
Holtrop, J. S., Scherer, L. D., Matlock, D. D., Glasgow, R. E., & Green, L. A. (2021b). The importance of mental models in implementation science. Frontiers in Public Health, 9, 680316. https://doi.org/10.3389/fpubh.2021.680316
Jolles, M. P., Fort, M. P., & Glasgow, R. E. (2024). Aligning the planning, development, and implementation of complex interventions to local contexts with an equity focus: Application of the PRISM/RE-AIM Framework. International Journal of Equity Health, 23(1), 41. https://doi.org/10.1186/s12939-024-02130-6
Jones Rhodes, W. C., Ritzwoller, D. P., & Glasgow, R. E. (2018). Stakeholder perspectives on costs and resource expenditures: Tools for addressing economic issues most relevant to patients, providers, and clinics. Translational Behavioral Medicine, 8(5), 675–682. https://doi.org/10.1093/tbm/ibx003
Khan, S., Chambers, D., & Neta, G. (2021). Revisiting time to translation: Implementation of evidence-based practices (EBPs) in cancer control. Cancer Causes and Control, 32(3), 221–230. https://doi.org/10.1007/s10552-020-01376-z
Kilbourne, A. M., Elwy, A. R., Sales, A. E., & Atkins, D. (2017). Accelerating research impact in a learning health care system: VA’s Quality Enhancement Research Initiative in the Choice Act Era. Medical Care, 55(7 Suppl 1), S4-s12. https://doi.org/10.1097/mlr.0000000000000683
Kirk, M. A., Moore, J. E., Wiltsey Stirman, S., & Birken, S. A. (2020). Towards a comprehensive model for understanding adaptations’ impact: The model for adaptation design and impact (MADI). Implementation Science, 15(1), 56. https://doi.org/10.1186/s13012-020-01021-y
Kislov, R., Pope, C., Martin, G. P., & Wilson, P. M. (2019). Harnessing the power of theorising in implementation science. Implementation Science, 14(1), 103. https://doi.org/10.1186/s13012-019-0957-4
Kwan, B. M., Brownson, R. C., Glasgow, R. E., Morrato, E. H., & Luke, D. A. (2022). Designing for dissemination and sustainability to promote equitable impacts on health. Annual Review of Public Health, 43, 331–353.
Kwan, B. M., McGinnes, H. L., Ory, M. G., Estabrooks, P. A., Waxmonsky, J. A., & Glasgow, R. E. (2019). RE-AIM in the real world: Use of the RE-AIM framework for program planning and evaluation in clinical and community settings. Frontiers in Public Health, 7, 345. https://doi.org/10.3389/fpubh.2019.00345
Lewis, C. C., Klasnja, P., Lyon, A. R., Powell, B. J., Lengnick-Hall, R., Buchanan, G., Meza, R. D., Chan, M. C., Boynton, M. H., & Weiner, B. J. (2022). The mechanics of implementation strategies and measures: Advancing the study of implementation mechanisms. Implementation Science Communications, 3(1), 114. https://doi.org/10.1186/s43058-022-00358-3
Lewis, C. C., Matthews, K., Proctor, E., & Brownson, R. (2023). Measurement issues in dissemination and implementation research. In R. Brownson, G. Colditz & E. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (3rd ed., pp. 327–344). Oxford Academic.
Lewis, C. C., Mettert, K. D., Stanick, C. F., Halko, H. M., Nolen, E. A., Powell, B. J., & Weiner, B. J. (2021). The psychometric and pragmatic evidence rating scale (PAPERS) for measure development and evaluation. Implementation Research and Practice, 2, 26334895211037390. https://doi.org/10.1177/26334895211037391
Lichtenstein, E., Hollis, J. F., Severson, H. H., Stevens, V. J., Vogt, T. M., Glasgow, R. E., & Andrews, J. A. (1996). Tobacco cessation interventions in health care settings: Rationale, model, outcomes. Addictive Behaviors, 21(6), 709–720. https://doi.org/10.1016/0306-4603(96)00030-5
Maw, A. M., Morris, M. A., Glasgow, R. E., Barnard, J., Ho, P. M., Ortiz-Lopez, C., Fleshner, M., Kramer, H. R., Grimm, E., Ytell, K., Gardner, T., & Huebschmann, A. G. (2022). Using iterative RE-AIM to enhance hospitalist adoption of lung ultrasound in the management of patients with COVID-19: An implementation pilot study. Implementation Science Communication, 3(1), 89. https://doi.org/10.1186/s43058-022-00334-x
McCreight, M. S., Rabin, B. A., Glasgow, R. E., Ayele, R. A., Leonard, C. A., Gilmartin, H. M., Frank, J. W., Hess, P. L., Burke, R. E., & Battaglia, C. T. (2019). Using the practical, robust implementation and sustainability model (PRISM) to qualitatively assess multilevel contextual factors to help plan, implement, evaluate, and disseminate health services programs. Translational Behavioral Medicine, 9(6), 1002–1011. https://doi.org/10.1093/tbm/ibz085
McKay, M. M., Sensoy Bahar, O., & Ssewamala, F. M. (2020). Implementation science in global health settings: Collaborating with governmental and community partners in Uganda. Psychiatry Research, 283, 112585. https://doi.org/10.1016/j.psychres.2019.112585
Miller, C. J., Wiltsey-Stirman, S., & Baumann, A. A. (2020). Iterative decision-making for evaluation of adaptations (IDEA): A decision tree for balancing adaptation, fidelity, and intervention impact. Journal of Community Psychology, 48(4), 1163–1177. https://doi.org/10.1002/jcop.22279
Minkler, M. (2010). Linking science and policy through community-based participatory research to study and address health disparities. American Journal of Public Health, 100(S1), S81–S87. https://doi.org/10.2105/ajph.2009.165720
Mody, A., Filiatreau, L. M., Goss, C. W., Powell, B. J., & Geng, E. H. (2023). Instrumental variables for implementation science: Exploring context-dependent causal pathways between implementation strategies and evidence-based interventions. Implementation Science Communications, 4(1), 157. https://doi.org/10.1186/s43058-023-00536-x
Moore, G., Campbell, M., Copeland, L., Craig, P., Movsisyan, A., Hoddinott, P., Littlecott, H., O’Cathain, A., Pfadenhauer, L., Rehfuess, E., Segrott, J., Hawe, P., Kee, F., Couturiaux, D., Hallingberg, B., & Evans, R. (2021). Adapting interventions to new contexts-the ADAPT guidance. BMJ, 374, n1679. https://doi.org/10.1136/bmj.n1679
Movsisyan, A., Arnold, L., Evans, R., Hallingberg, B., Moore, G., O’Cathain, A., Pfadenhauer, L. M., Segrott, J., & Rehfuess, E. (2019). Adapting evidence-informed complex population health interventions for new contexts: A systematic review of guidance. Implementation Science, 14(1), 105. https://doi.org/10.1186/s13012-019-0956-5
National Academy of Medicine. (2024). The learning health system series. National Academy of Medicine. Retrieved June 19, from https://nam.edu/programs/value-science-driven-health-care/learning-health-system-series/
National Cancer Institute. (2020). Qualitative methods in implementation science. NCI. https://cancercontrol.cancer.gov/sites/default/files/2020-09/nci-dccps-implementationscience-whitepaper.pdf
National Cancer Institute. (2024, January 5). Accelerating colorectal cancer screening and follow-up through implementation science (ACCSIS). National Cancer Institute. Retrieved January 12, 2024, from https://healthcaredelivery.cancer.gov/accsis/
Nilsen, P. (2015). Making sense of implementation theories, models and frameworks. Implementation Science, 10, 53. https://doi.org/10.1186/s13012-015-0242-0
Nilsen, P., & Bernhardsson, S. (2019). Context matters in implementation science: A scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Services Research, 19(1), 189. https://doi.org/10.1186/s12913-019-4015-3
Northridge, M. E., & Metcalf, S. S. (2016). Enhancing implementation science by applying best principles of systems science. Health Research Policy and Systems, 14(1), 74. https://doi.org/10.1186/s12961-016-0146-8
Pawson, R. (2013). The science of evaluation: A realist manifesto. SAGE Publications Ltd.
Perez Jolles, M., Lengnick-Hall, R., & Mittman, B. S. (2019). Core functions and forms of complex health interventions: A patient-centered medical home illustration. Journal of General Internal Medicine, 34(6), 1032–1038. https://doi.org/10.1007/s11606-018-4818-7
Pérez Jolles, M., Willging, C. E., Stadnick, N. A., Crable, E. L., Lengnick-Hall, R., Hawkins, J., & Aarons, G. A. (2022). Understanding implementation research collaborations from a co-creation lens: Recommendations for a path forward. Frontiers in Health Services, 2, 942658.
Pfadenhauer, L. M., Gerhardus, A., Mozygemba, K., Lysdahl, K. B., Booth, A., Hofmann, B., Wahlster, P., Polus, S., Burns, J., Brereton, L., & Rehfuess, E. (2017). Making sense of complexity in context and implementation: The context and implementation of complex interventions (CICI) framework. Implementation Science, 12(1), 21. https://doi.org/10.1186/s13012-017-0552-5
Pittman, J. O. E., Lindamer, L., Afari, N., Depp, C., Villodas, M., Hamilton, A., Kim, B., Mor, M. K., Almklov, E., Gault, J., & Rabin, B. (2021). Implementing eScreening for suicide prevention in VA post-9/11 transition programs using a stepped-wedge, mixed-method, hybrid type-II implementation trial: A study protocol. Implementation Science Communications, 2(1), 46. https://doi.org/10.1186/s43058-021-00142-9
Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., Griffey, R., & Hensley, M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38(2), 65–76. https://doi.org/10.1007/s10488-010-0319-7
Rabin, B. A., Cakici, J., Golden, C. A., Estabrooks, P. A., Glasgow, R. E., & Gaglio, B. (2022). A citation analysis and scoping systematic review of the operationalization of the practical, robust implementation and sustainability model (PRISM). Implementation Science, 17(1), 1–26.
Rabin, B. A., Swanson, K., Glasgow, R. E., Ford, B., Huebschmann, A. G., Gomes, R., Tabak, R. G., Brownson, R., Malone, S., & Padek, M. (2014–2024). Dissemination and implementation models in health. University of Colorado Denver. https://dissemination-implementation.org/
Ramanadhan, S., Davis, M. M., Armstrong, R., Baquero, B., Ko, L. K., Leng, J. C., Salloum, R. G., Vaughn, N. A., & Brownson, R. C. (2018). Participatory implementation science to increase the impact of evidence-based cancer prevention and control. Cancer Causes and Control, 29(3), 363–369. https://doi.org/10.1007/s10552-018-1008-1
RE-AIM Workgroup. (2021). Reach effectiveness adoption implementation maintenance (RE-AIM) website. RE-AIM Workgroup. Retrieved February 2, from www.re-aim.org
Reilly, K. L., Kennedy, S., Porter, G., & Estabrooks, P. (2020). Comparing, contrasting, and integrating dissemination and implementation outcomes included in the RE-AIM and implementation outcomes frameworks. Frontiers in Public Health, 8, 430.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Free Press.
Shelton, R. C., Chambers, D. A., & Glasgow, R. E. (2020). An extension of RE-AIM to enhance sustainability: Addressing dynamic context and promoting health equity over time. Frontiers in Public Health, 8, 134.
Skivington, K., Matthews, L., Simpson, S. A., Craig, P., Baird, J., Blazeby, J. M., Boyd, K. A., Craig, N., French, D. P., McIntosh, E., Petticrew, M., Rycroft-Malone, J., White, M., & Moore, L. (2021). A new framework for developing and evaluating complex interventions: Update of Medical Research Council guidance. BMJ, 374, n2061. https://doi.org/10.1136/bmj.n2061
Smith, J. D., Li, D. H., & Rafferty, M. R. (2020). The implementation research logic model: A method for planning, executing, reporting, and synthesizing implementation projects. Implementation Science, 15, 1–12.
Ssewamala, F. M., Sensoy Bahar, O., McKay, M. M., Hoagwood, K., Huang, K. Y., & Pringle, B. (2018). Strengthening mental health and research training in Sub-Saharan Africa (SMART Africa): Uganda study protocol. Trials, 19(1), 423. https://doi.org/10.1186/s13063-018-2751-z
Sterman, J. D. (2006). Learning from evidence in a complex world. American Journal of Public Health, 96(3), 505–514. https://doi.org/10.2105/ajph.2005.066043
Stevens, V. J., Glasgow, R. E., Hollis, J. F., & Mount, K. (2000). Implementation and effectiveness of a brief smoking-cessation intervention for hospital patients. Medical Care, 38(5), 451–459. https://doi.org/10.1097/00005650-200005000-00002
Strifler, L., Cardoso, R., McGowan, J., Cogo, E., Nincic, V., Khan, P. A., Scott, A., Ghassemi, M., MacDonald, H., Lai, Y., Treister, V., Tricco, A. C., & Straus, S. E. (2018). Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. Journal of Clinical Epidemiology, 100, 92–102. https://doi.org/10.1016/j.jclinepi.2018.04.008
Studts, C., Ford, B., & Glasgow, R. E. (2023). RE-AIM implementation outcomes and service outcomes: What’s the connection? Results of a cross-sectional survey. BMC Health Services Research, 23(1), 1417. https://doi.org/10.1186/s12913-023-10422-w
Tabak, R. G., Chambers, D., Khoong, E. C., & Brownson, R. C. (2013). Models in dissemination and implementation research: Useful tools in public health services and systems research. Retrieved March 18, from http://uknowledge.uky.edu/cgi/viewcontent.cgi?article=1012&context=frontiersinphssr
Tabak, R. G., Nilsen, P., Woodward, E. N., & Chambers, D. A. (2023). The conceptual basis for dissemination and implementation research: Lessons from existing theories, models, and frameworks. In R. C. Brownson, E. Proctor & G. A. Colditz (Eds.), Dissemination and implementation research in health: Translating science to practice (3rd ed., pp. 86–105). Oxford University Press. https://doi.org/10.1093/oso/9780197660690.001.0001
Trinkley, K. E., Glasgow, R. E., D’Mello, S., Fort, M. P., Ford, B., & Rabin, B. A. (2023). The iPRISM Webtool: An interactive tool to pragmatically guide the iterative use of the practical, robust implementation and sustainability model in public health and clinical settings. Implementation Science Communications, 4(1), 116. https://doi.org/10.1186/s43058-023-00494-4
Trinkley, K. E., Guerin, R. J., Pittman, J. O. E., Huebschmann, A. G., Glasgow, R. E., & Rabin, B. (2024). Applying the practical robust implementation and sustainability model (PRISM). In P. Nilsen (Ed.), Theory and application of implementation science. Routledge.
Trinkley, K. E., Ho, P. M., Glasgow, R. E., & Huebschmann, A. G. (2022). How dissemination and implementation science can contribute to the advancement of learning health systems. Academic Medicine, 97(10), 1447–1458.
Trinkley, K. E., Kahn, M. G., Bennett, T. D., Glasgow, R. E., Haugen, H., Kao, D. P., Kroehl, M. E., Lin, C. T., Malone, D. C., & Matlock, D. D. (2020). Integrating the practical robust implementation and sustainability model with best practices in clinical decision support design: Implementation science approach. Journal of Medical Internet Research, 22(10), e19676. https://doi.org/10.2196/19676
Trinkley, K. E., Kroehl, M. E., Kahn, M. G., Allen, L. A., Bennett, T. D., Hale, G., Haugen, H., Heckman, S., Kao, D. P., Kim, J., Matlock, D. M., Malone, D. C., Page, R. L., II, Stine, J., Suresh, K., Wells, L., & Lin, C. T. (2021). Applying clinical decision support design best practices with the practical robust implementation and sustainability model versus reliance on commercially available clinical decision support tools: Randomized controlled trial. JMIR Medical Informatics, 9(3), e24359. https://doi.org/10.2196/24359
Vinson, C. A., Stamatakis, K. A., & Kerner, J. F. (2018). Dissemination and implementation research in community and public health settings. In R. C. Brownson, G. A. Colditz & E. Proctor (Eds.), Dissemination and implementation research in health (2nd ed., p. 363). Oxford University Press. https://doi.org/10.1093/oso/9780190683214.001.0001
Wagner, E. H., Austin, B. T., Davis, C., Hindmarsh, M., Schaefer, J., & Bonomi, A. (2001). Improving chronic illness care: Translating evidence into action. Health Affairs (Millwood), 20(6), 64–78. https://doi.org/10.1377/hlthaff.20.6.64
Weiner, B. J., Clary, A. S., Klaman, S. L., Turner, K., & Alishahi-Tabriz, A. (2020). Organizational readiness for change: What we know, what we think we know, and what we need to know. In B. Albers, A. Shlonsky & R. Mildon (Eds.), Implementation science 3.0 (pp. 101–144). Springer. https://doi.org/10.1007/978-3-030-03874-8_5
Williams, N. J., & Glisson, C. (2020). Changing organizational social context to support evidence-based practice implementation: A conceptual and empirical review. In B. Albers, A. Shlonsky & R. Mildon (Eds.), Implementation science 3.0 (pp. 145–172). Springer. https://doi.org/10.1007/978-3-030-03874-8_6
Woodward, E. N., Matthieu, M. M., Uchendu, U. S., Rogal, S., & Kirchner, J. E. (2019). The health equity implementation framework: Proposal and preliminary study of hepatitis C virus treatment. Implementation Science, 14(1), 26. https://doi.org/10.1186/s13012-019-0861-y
Acknowledgements
Rebekah Gomes assisted with content and the creation of figures and tables; Meredith Fort and members of the RE-AIM Research Consortium have led papers and research that have advanced PRISM over the years; Cathy Battaglia and Marina McCreight have led the application of PRISM in a large VA Quality Enhancement Research Project; and the Colorado Implementation Science Center in Cancer Control team has collaborated in numerous ways, including reading and commenting on different sections and leading some of the research efforts described as part of our current P50 center activities.
Funding
Funding support was provided in part by grants from the National Cancer Institute (P50CA244688); National Institutes of Health Grant Award Number 5UH3CA233314-03 (Martinez, Gupta, Roesch); Health Services Research and Development Merit Award 1I01HX003079-01A1 (Pittman); National Heart, Lung, and Blood Institute 1K23HL161352; and the U.S. Department of Veterans Affairs Quality Enhancement Research Initiative (QUERI) Program (Triple Aim QUERI Program, QUERI 15-268).
Ethics declarations
Conflict of interest
There are no conflicts of interest to disclose.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Glasgow, R.E., Trinkley, K.E., Ford, B. et al. The Application and Evolution of the Practical, Robust Implementation and Sustainability Model (PRISM): History and Innovations. Glob Implement Res Appl (2024). https://doi.org/10.1007/s43477-024-00134-6