  • Study protocol
  • Open access

Testing the implementation and sustainment facilitation (ISF) strategy as an effective adjunct to the Addiction Technology Transfer Center (ATTC) strategy: study protocol for a cluster randomized trial



Improving the extent to which evidence-based practices (EBPs)—treatments that have been empirically shown to be efficacious or effective—are integrated within routine practice is a well-documented challenge across numerous areas of health. In 2014, the National Institute on Drug Abuse funded a type 2 effectiveness–implementation hybrid trial titled the substance abuse treatment to HIV Care (SAT2HIV) Project. Aim 1 of the SAT2HIV Project tests the effectiveness of a motivational interviewing-based brief intervention (MIBI) for substance use as an adjunct to usual care within AIDS service organizations (ASOs) as part of its MIBI Experiment. Aim 2 of the SAT2HIV Project tests the effectiveness of implementation and sustainment facilitation (ISF) as an adjunct to the Addiction Technology Transfer Center (ATTC) model for training staff in motivational interviewing as part of its ISF Experiment. The current paper describes the study protocol for the ISF Experiment.


Using a cluster randomized design, case management and leadership staff from 39 ASOs across the United States were randomized to receive either the ATTC strategy (control condition) or the ATTC + ISF strategy (experimental condition). The ATTC strategy is staff-focused and includes 10 discrete strategies (e.g., provide centralized technical assistance, conduct educational meetings, provide ongoing consultation). The ISF strategy is organization-focused and includes seven discrete strategies (e.g., use an implementation advisor, organize implementation team meetings, conduct cyclical small tests of change). Building upon the exploration–preparation–implementation–sustainment (EPIS) framework, the effectiveness of the ISF strategy is examined via three staff-level measures: (1) time-to-proficiency (i.e., preparation phase outcome), (2) implementation effectiveness (i.e., implementation phase outcome), and (3) level of sustainment (i.e., sustainment phase outcome).


Although not without limitations, the ISF Experiment has several strengths: a highly rigorous design (randomized, hypothesis-driven), a high-need setting (ASOs), a large sample size (39 ASOs), broad geographic representation (23 states and the District of Columbia), and testing along multiple phases of the EPIS continuum (preparation, implementation, and sustainment). Thus, study findings will significantly improve generalizable knowledge regarding the best preparation, implementation, and sustainment strategies for advancing EBPs along the EPIS continuum. Moreover, increasing ASOs' capacity to address substance use may improve the HIV Care Continuum.

Trial registration: ClinicalTrials.gov, NCT03120598.


Background and rationale for the implementation and sustainment facilitation experiment

Improving the extent to which evidence-based practices (EBPs)—treatments that have been empirically shown to be efficacious or effective—are integrated within routine practice is a well-documented challenge across numerous areas of health [1,2,3,4,5]. A comprehensive systematic review of studies on the costs and efficiency of integrating HIV/AIDS services with other health services noted, “Unfortunately, few of the studies found adequately address the central questions currently concerning many program managers at this moment in time: not whether to integrate, but when to, how to and which model is most efficient in which setting?” [6]. The need to address these central questions about the integration of substance use disorder (SUD) services within HIV care settings is particularly pressing, given the high prevalence of substance use [7,8,9] and associated problems among individuals living with HIV/AIDS [10,11,12,13,14,15,16,17].

In 2013, the National Institute on Drug Abuse (NIDA) sought to fund research that would advance understanding of how best to improve the integration of SUD treatment services within HIV/AIDS service delivery settings [18]. In 2014, NIDA funded a type 2 effectiveness–implementation hybrid trial called the substance abuse treatment to HIV Care (SAT2HIV) Project [19]. As shown in Fig. 1, Aim 1 of the SAT2HIV Project tests the effectiveness of a motivational interviewing-based brief intervention (MIBI) for substance use as an adjunct to usual care within AIDS service organizations (ASOs) as part of its multisite MIBI Experiment [20]. Aim 2 of the SAT2HIV Project tests the effectiveness of implementation and sustainment facilitation (ISF) as an adjunct to the Addiction Technology Transfer Center’s (ATTC) model for training staff in motivational interviewing as part of its ISF Experiment. The current paper describes the study protocol for the ISF Experiment and has been written in accordance with the SPIRIT guidelines [21, 22] (see Additional file 1). A cluster randomized design with staff randomized within clusters of ASOs was used to minimize the likelihood of contamination across study conditions. Importantly, although randomization was at the cluster level (i.e., organization level), our objective and hypotheses pertain to staff-level outcomes. The study protocol for the MIBI Experiment, also written in accordance with the SPIRIT guidelines, has been published separately [20]. With this background, we describe below the objective, design, and methods for the SAT2HIV Project’s ISF Experiment.

Fig. 1: Conceptual overview of the ISF experiment within the context of the parent SAT2HIV Project.

Note: MIBI motivational interviewing-based brief intervention; ISF implementation and sustainment facilitation; UC usual care; bolded arrows represent hypothesized relationships; dashed arrows represent interactions and cross-level interactions that will be examined

Rationale for the ISF Experiment’s EBP, outcomes, and strategies

Rationale for the targeted EBP

The selection of motivational interviewing as the to-be-implemented EBP was based on several factors, including (a) research reviews supporting the effectiveness of motivational interviewing in reducing substance use [23,24,25], (b) the availability of psychometrically sound measures for assessing the extent to which motivational interviewing was implemented with adherence and competence [26], and (c) a research review suggesting that HIV care settings have been receptive to implementing motivational interviewing for HIV medication adherence [27].

Rationale for the primary outcomes

Proctor et al. [28] defined "implementation outcomes" as the effects of deliberate and purposeful actions to implement new treatments, practices, and services. However, our interest in comparing the effectiveness of the two strategies during the preparation, implementation, and sustainment phases of the exploration–preparation–implementation–sustainment (EPIS) continuum [29] required the selection of distinct preparation, implementation, and sustainment outcomes. Building on prior preparation research [30], time-to-proficiency (measured in days) was selected as the ISF Experiment's primary preparation outcome. Klein and Sorra's implementation effectiveness construct (i.e., the consistency and quality of targeted organizational members' use of an innovation) [31] was selected as the ISF Experiment's primary implementation outcome. Implementation effectiveness is important, given that it has been hypothesized to be a function of implementation strategies and implementation climate [32,33,34]. Finally, building on sustainment research that has used raw units (e.g., number of staff trained, number of clients served) to operationalize sustainment outcomes [35], the number of MIBIs delivered during the project's sustainment phase was selected as the ISF Experiment's primary sustainment outcome.

Rationale for the strategies tested

Guidance for strategy selection was drawn from the research of Miller et al. [30], which experimentally compared strategies for training individuals in motivational interviewing. Relative to the other conditions examined (e.g., workshop training, workshop plus feedback, workshop plus coaching), the most effective condition for helping individuals demonstrate proficiency in motivational interviewing was the workshop training plus feedback plus coaching condition. Given its empirical support, each of these discrete strategies is encompassed within the overarching strategy of centralized technical assistance that ATTCs across the United States use in training individuals in motivational interviewing [36] (hereafter referred to as the ATTC strategy).

Although the staff-focused ATTC strategy is viewed as necessary for helping staff learn motivational interviewing, we argue that it may be insufficient on its own for optimizing the preparation, implementation, and sustainment processes. As such, we sought to identify an effective adjunct to the ATTC strategy. Each of the discrete strategies identified by Powell et al. [37] was considered as a potential adjunct to the ATTC strategy. Use of an improvement or implementation advisor was selected as the overarching strategy to be tested, as Gustafson et al. [38] found that, of the strategies compared, clinic-level coaching (i.e., use of an improvement advisor) was the best strategy for decreasing patient wait time and increasing the number of new patients. In addition, six other discrete strategies (develop tools for quality improvement, organize implementation team meetings, identify and prepare champions, assess for readiness and identify barriers, conduct local consensus discussions, and conduct cyclical small tests of change) were packaged with the implementation advisor strategy and branded together as the ISF strategy.

The ISF Experiment's objective and scientific hypotheses

Testing the effectiveness of the ISF strategy as an adjunct to the ATTC strategy is the ISF Experiment’s key objective. Table 1 lists the planned scientific hypotheses for the ISF Experiment, which were guided by use of a decomposed-first strategy [39] that advocates for starting with moderation-focused hypotheses to avoid biases associated with conflated effects.

Table 1 Planned scientific hypotheses


Participants, interventions, and outcomes

Study setting

The ISF experiment was conducted in community-based ASOs (N = 39; i.e., clusters) located across the United States in 23 states and the District of Columbia. ASOs conduct HIV prevention efforts and provide medical and nonmedical case management services (e.g., retention in care, medication adherence, referral to social services and specialty treatment) to individuals living with HIV/AIDS. ASOs are distinct from HIV primary care organizations, which provide medical services including prescriptions for antiretroviral therapy (ART), CD4 T-lymphocyte testing, and HIV viral load testing [40].

Eligibility criteria

To be eligible to participate, an ASO (i.e., the cluster) had to (1) serve a minimum of 100 individuals living with HIV/AIDS per year, (2) have at least two case management staff who were willing to be trained in the MIBI for substance use (hereafter referred to as BI staff) [20], and (3) have at least one leadership staff (e.g., supervisor, manager, director) willing to help ensure BI staff were given sufficient time for project participation. There were no exclusion criteria.

Intervention: preparation, implementation, and sustainment strategies

As highlighted by Proctor et al. [41], despite the importance of providing full and precise descriptions of the implementation strategies (i.e., the methods or techniques used to enhance the adoption, implementation, and sustainment of a clinical program or practice) used or tested, few studies provide adequate detail in their publications. Thus, Proctor et al.'s recommended guidelines were used to identify, define, and operationalize the ATTC strategy (see Table 2) and the ISF strategy (see Table 3) along six key dimensions: actor, actions, targets of the actions, temporality, implementation outcomes affected, and justification. Complementing Tables 2 and 3, the dose (i.e., frequency and intensity) of the ATTC strategy and the ISF strategy is detailed for each of the three project phases: the preparation phase (see Table 4; see Additional file 2 for a single-page version), the implementation phase (see Table 5; see Additional file 3 for a single-page version), and the sustainment phase (see Table 6; see Additional file 4 for a single-page version).

Table 2 Specification overview of the multifaceted Addiction Technology Transfer Center (ATTC) strategy
Table 3 Specification overview of the multifaceted implementation and sustainment facilitation (ISF) strategy
Table 4 Dose for each overarching strategy during the preparation phase (months 1–6)
Table 5 Dose for each overarching strategy during the implementation phase (months 7–12)
Table 6 Dose for each overarching strategy during the sustainment phase (months 13–18)

Addiction Technology Transfer Center strategy

Although the ATTC strategy has been used in addiction treatment settings, its use in HIV/AIDS service delivery settings is novel and thus one of the project’s innovations. The ATTC strategy represents a “blended strategy,” the term reserved for instances in which several discrete strategies are packaged together and protocolized or branded [37]. Centralized technical assistance is the overarching strategy of the ATTC strategy. Encompassed within the ATTC strategy are an additional nine discrete strategies. Descriptions of each, which supplement the specifications provided as part of Table 2, are provided here.

  (A) Centralized technical assistance. Consistent with prior research [36, 42,43,44], centralized technical assistance was operationalized as an individualized, hands-on approach to building an entity's capacity for quality implementation of innovations. Squires et al. [36] successfully used this strategy to implement contingency management in substance use disorder treatment organizations.

  (B) Develop educational materials. Educational materials, such as intervention manuals, have been found to be useful for learning [45, 46]. Thus, we developed an online introduction-to-motivational-interviewing course [47] and a training manual for the MIBI protocol [48].

  (C) Develop and organize quality monitoring system. Building on prior research [49,50,51], a Web-based quality monitoring system was developed. Key functions of this system were (a) secure uploads of session recordings by BI staff, (b) efficient adherence and competence rating of session recordings by trained raters, (c) automated sending of session quality rating feedback to BI staff, and (d) generation of custom summary reports (e.g., by organization, by month) of session quality ratings.

  (D) Develop tools for quality monitoring. The Independent Tape Rater Scale (ITRS) [26, 52, 53] was developed and validated for monitoring the level of adherence and competence of 10 core motivational interviewing skills (e.g., open-ended questions, reflective statements, fostering collaboration).

  (E) Distribute educational materials. Consistent with research supporting the importance of using multiple dissemination strategies [45, 46, 54], the educational materials were distributed to BI staff: BI staff were emailed links to the online educational course [47], and printed copies of the MIBI protocol manual [48] were hand-delivered to staff at the in-person workshop training.

  (F) Conduct educational meetings. Research has not found educational materials by themselves to be sufficient for learning motivational interviewing [30, 55]. Thus, Web-based and in-person educational meetings were also provided, including a two-day in-person workshop training for BI staff on the MIBI protocol.

  (G) Make training dynamic. Role plays that enable trainees to practice with other trainees and facilitate understanding of the EBP from both the staff and client perspectives have been found to make motivational interviewing training more dynamic [30, 55, 56]. In addition to using role plays multiple times during the in-person workshop training, trainees were given role plays to complete during the week after the workshop training.

  (H) Audit and provide feedback. There is support for audit and feedback as an effective strategy, both in general [57,58,59,60] and specifically for learning motivational interviewing [30]. Thus, standardized feedback reports based on ratings using the validated Independent Tape Rater Scale [26] were provided to BI staff for all sessions completed and recorded.

  (I) Provide ongoing consultation. Providing ongoing consultation following workshop training has been supported as an important strategy to facilitate learning of psychosocial interventions [30, 36, 61]. During the 10-week post-workshop-training practice period, each trainee was allowed up to four individual consultation sessions with a member of the Motivational Interviewing Network of Trainers (MINT) [62].

  (J) Create a learning collaborative. The use of a learning collaborative has been identified as an important method of learning [63,64,65]. Thus, each month during the 6-month implementation phase, a motivational interviewing expert from MINT [62] organized and moderated two 1-h learning collaborative meetings: one for the ATTC-only condition and one for the ATTC + ISF condition.

Implementation and sustainment facilitation strategy

Encompassed within the overarching strategy of using an implementation advisor are six additional discrete strategies. Supplementing the specifications in Table 3, descriptions of each of these strategies are provided here.

  (K) Use an improvement/implementation advisor. Consistent with prior research [38, 66,67,68,69], use of an implementation advisor was operationalized as having an individual external to the organization who utilized interactive problem-solving and support to help the organization identify and achieve improvement and implementation goals.

  (L) Develop tools for quality improvement. Five quality improvement tools were developed and are described below.

First, the past implementation effort exercise, based on research emphasizing the importance of using past performance to improve future practice [70], was developed to have organizations share with their advisor a past experience implementing an innovation. In addition to describing the past effort, organizations discussed the extent to which the effort was ultimately successful, unsuccessful, or had mixed results. Advisors used reflective listening skills to highlight the importance of the organization's past implementation effort and how learning from the past might help the organization achieve the goals of the current project's preparation, implementation, and sustainment phases.

Next, the decisional-balance exercise was developed based on supporting research [71] and sought to evoke reasons behind the organization’s decision to implement the MIBI for substance use and to identify potential barriers.

Third, the ISF Workbook (a Microsoft Excel-based electronic workbook) was developed to standardize implementation of the ISF strategy; a lack of standardization has been a criticism of many implementation studies [41]. The ISF Workbook has five worksheets: (1) a project charter worksheet that lists the project's goals, the staff working on the project (SWOP) team members, and the implementation advisor's name and contact information; (2) a meeting attendees and notes worksheet with placeholders for documenting the date of each expected ISF meeting, the SWOP team members who attended, summary notes, and a link to the meeting recording; (3) a preparation phase worksheet that includes the goals of the preparation phase and the ISF strategy's performance review, evaluation, and planning exercise; (4) an implementation phase worksheet that includes the goals of the implementation phase and the same review, evaluation, and planning exercise; and (5) a sustainment phase worksheet with a placeholder for entering the organization's chosen sustainment goal(s), if any, and the same review, evaluation, and planning exercise.

Next, the process walk-through exercise was developed based on prior research that has found walking through the steps of a process to be a helpful quality improvement tool [38, 72]. The exercise was conducted by having the SWOP team review a detailed process flow diagram, with four key questions emphasized throughout: What is working well? What needs improvement? What is the plan for improving what needs improvement? What is the plan for maintaining what is working well? Although time was spent on what was working well and on plans for maintaining it, explicit emphasis was placed on identifying what needed improvement and on plans to enact improvements.

Last, the implementation climate evaluation exercise was developed to standardize an advisor's process of evaluating the implementation climate for the MIBI (i.e., the extent to which use of the MIBI is expected and supported within the organization). Implementation climate has been hypothesized as a key mechanism of change for implementation strategies' impact on implementation effectiveness [31,32,33, 73]. When there was no consensus on the implementation climate or when the implementation climate was poor, the ISF advisor sought to evoke reasons for the current beliefs, find ways to better align staff members' beliefs, and develop plans to optimize the implementation climate. In contrast, when there was consensus on the implementation climate or when the implementation climate was strong, advisors facilitated discussion around maintaining or further improving it.

  (M) Organize implementation team meetings. Organizing implementation team meetings that SWOP team members were willing and able to regularly attend was one of the most important strategies [74, 75]. ISF advisors sought to organize recurring implementation team meetings early in the process. Monthly team meetings were conducted via a Web-based collaboration tool with advanced phone conference and screen-sharing capabilities. In addition, a limited number of in-person team meetings (typically just one) were organized for a day during the second month of the implementation phase.

  (N) Identify and prepare champions. Consistent with research highlighting the importance of having someone champion the organization's implementation efforts [31, 32, 76, 77], an ISF advisor's focus on champion identification began immediately upon formal introduction to the organization and its SWOP team. The ISF advisor paid attention to the extent to which SWOP team members responded to emails and meeting discussions as a way of gauging team members' levels of engagement and team influence. Once an ISF advisor identified a potential champion, they sought to optimize that individual's commitment to the project and its goals.

  (O) Assess for readiness and identify barriers. Building on extant research on readiness assessment and barrier identification [78,79,80,81,82], the ISF strategy included exercises developed to assist with assessing readiness and identifying barriers (e.g., the past implementation effort exercise, the decisional balance exercise, and the process walk-through exercise), which were described earlier (see Develop tools for quality improvement).

  (P) Conduct local consensus discussions. Consensus-building is an important implementation strategy [83, 84]. Thus, concerted efforts were directed toward conducting local consensus discussions with key stakeholders (i.e., internal or external individuals whom the SWOP team considered central to directly and/or indirectly helping sustain MIBI services over time). Key stakeholders were invited to attend the in-person ISF meeting to learn about the project and participate in a formal sustainment planning discussion.

  (Q) Conduct cyclical small tests of change. Cyclical small tests of change, such as plan-do-study-act cycles, are a valuable quality improvement strategy [85,86,87]. Within the ISF strategy, however, this cycle was reframed as a study-act-plan-do cycle. This reframing emphasized beginning with the study phase (assessing existing performance) and then deciding whether action was needed. When action or change was deemed necessary, a plan was developed and then implemented in the do phase.


Outcomes

Table 7 describes the three staff-level outcome measures (i.e., time-to-proficiency, implementation effectiveness, and level of sustainment) used to examine the extent to which the ISF strategy serves as an effective adjunct to the ATTC strategy. Additionally, Table 7 describes the two staff-level measures (i.e., personal recovery status and motivational interviewing experience) and four organizational-level measures (i.e., readiness for implementing change, implementation climate, leadership engagement, and tension for change) that have been hypothesized as moderators of the relationship between organizational condition assignment (ATTC vs. ATTC + ISF) and each respective primary outcome measure.

Table 7 Instruments, instrument-related procedures, and measures

Participant timeline

Figure 2 depicts the participant flow for the ISF Experiment, which was organized by the four-phased EPIS framework [29]. For each of the three ASO cohorts, which were spaced 1 year apart, the exploration phase was initiated by disseminating standardized project introductions via emails and phone calls to all ASOs within the cohort's geographically based catchment area (i.e., Central, Western, and Eastern states). ASOs interested in learning more about the project were invited to participate in an introductory meeting (see Recruitment below). Following the meeting, ASOs that met project eligibility criteria were emailed a project participation agreement to be completed and returned to the project's Principal Investigator for finalizing. Once a cohort's target number of participation agreements was reached, the exploration phase concluded by having each ASO's designated SWOP team members (2–4 leadership staff and 2 BI staff) complete a confidential baseline assessment survey. As described in the allocation section, data from these surveys, conducted under the auspices of RTI's Institutional Review Board (IRB) and requiring written consent, were used as part of the condition assignment process. Following the completion of the exploration phase, ASOs and their SWOP teams completed the project's three 6-month phases: preparation (months 1–6), implementation (months 7–12), and sustainment (months 13–18).

Fig. 2: Flow of participating AIDS service organizations (ASOs)

Note: t time; ATTC Addiction Technology Transfer Center; ISF implementation and sustainment facilitation

Sample size

Sample size for the ISF Experiment was determined via power analyses with Optimal Design Software [88]. We assumed an equal number of BI staff (2 per ASO) and an intraclass correlation coefficient of .05. With 78 BI staff nested within 39 ASOs, there is 80% power to detect statistically significant (p < .05) differences when the effect size is .67 or greater.
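The power assumptions above can be cross-checked with a short script. The following is a simplified normal-approximation sketch of the minimum detectable effect size for a two-arm cluster randomized design, not the Optimal Design calculation itself (which also accounts for cluster-level degrees of freedom):

```python
import math

def mdes(n_clusters=39, staff_per_cluster=2, icc=0.05,
         alpha=0.05, power=0.80):
    """Approximate minimum detectable effect size (Cohen's d) for a
    two-arm cluster randomized design, via a normal approximation."""
    n = n_clusters * staff_per_cluster          # 78 BI staff in total
    deff = 1 + (staff_per_cluster - 1) * icc    # design effect = 1.05
    n_eff = n / deff                            # effective sample size
    z_alpha = 1.959964                          # two-sided alpha = .05
    z_beta = 0.841621                           # power = .80
    return (z_alpha + z_beta) * math.sqrt(4 / n_eff)

print(round(mdes(), 2))  # -> 0.65
```

Under these assumptions the approximation yields a minimum detectable effect size of about .65, in line with the .67 reported above.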


Recruitment

The identification and recruitment of ASOs were conducted by the Principal Investigator (BG) and project coordinators (DK, EB). Potential ASOs were identified via searches of organization directories [89, 90]. Identified ASOs were sent standardized introduction emails, with follow-up calls completed as necessary by project coordinators. ASOs interested in learning more about the project participated in a 45- to 60-min, organization-specific, Web-assisted informational webinar, which was conducted by the Principal Investigator or one of the project coordinators.

In addition to providing information about the project, a key goal of the informational webinar was to gather information about the ASO, including (a) whether describing their organization as a community-based ASO was accurate, (b) the key services provided to individuals living with HIV/AIDS, (c) the number of individuals living with HIV/AIDS served annually, (d) the number of case-management staff, (e) their level of interest in participating in the project, and (f) their reasons for wanting to participate. Upon review of the collected information, the Principal Investigator and project coordinators identified ASOs that did not represent a good fit for the project. ASOs deemed to be a good fit were contacted via email and/or phone, and official participation was documented by having the ASO’s signing official sign and date a project participation agreement.

Assignment of interventions


Allocation

Participating ASOs were assigned to one of two study conditions via urn randomization [91]. Using staff survey data collected during the exploration phase from the BI staff and leadership staff, seven organizational-level factors (importance of substance use screening, importance of brief intervention for substance use, innovation-value fit, implementation strategy-value fit, implementation climate for substance use brief intervention, implementation readiness for substance use brief intervention, and implementation effectiveness for substance use brief intervention) were entered into the urn randomization program gRAND [92], which optimized the balance of the two study conditions across these seven factors. Written consent was obtained from BI staff and leadership staff before survey completion.
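gRAND's urn randomization algorithm is not reproduced here, but the general flavor of covariate-adaptive assignment can be sketched as follows. This is a toy illustration only; the arm labels, the single summed-imbalance metric, and the biased-coin probability are assumptions for the sketch, not the project's actual procedure:

```python
import random

ARMS = ("ATTC", "ATTC+ISF")  # hypothetical arm labels for the sketch

def assign_condition(new_scores, assigned, p_biased=0.75, rng=random):
    """Toy covariate-adaptive assignment in the spirit of urn
    randomization (NOT the gRAND algorithm): the arm that would keep
    the two arms' covariate means closest is favored with probability
    p_biased; otherwise the other arm is used.

    new_scores: covariate vector for the incoming ASO.
    assigned:   list of (arm, covariate_vector) for prior ASOs.
    """
    def imbalance(candidate_arm):
        groups = {arm: [] for arm in ARMS}
        for arm, scores in assigned:
            groups[arm].append(scores)
        groups[candidate_arm].append(new_scores)
        if any(not rows for rows in groups.values()):
            return 0.0  # an arm is still empty: any assignment is fine
        means = {arm: [sum(col) / len(rows) for col in zip(*rows)]
                 for arm, rows in groups.items()}
        return sum(abs(a - b) for a, b in zip(means[ARMS[0]], means[ARMS[1]]))

    favored = min(ARMS, key=imbalance)
    other = ARMS[1] if favored == ARMS[0] else ARMS[0]
    return favored if rng.random() < p_biased else other
```

Keeping the biased-coin probability below 1 preserves an element of chance, which is what distinguishes urn-style randomization from deterministic minimization.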

Blinding (masking)

ASOs and their staff were not blinded to study condition. However, the ATTC training and rating staff were blinded to study condition.

Data collection, management, and analysis

Data collection and management

The Independent Tape Rater Scale (ITRS) was used to assess proficiency in motivational interviewing and implementation effectiveness. The ITRS is a well-validated tool for assessing two key factors: adherence and competence [26]. Confirmatory factor analysis has supported the two-factor structure of the ITRS [26], and excellent levels of inter-rater reliability have been found for both motivational interviewing adherence (mean ICC .89; range .66–.99) and competence (mean ICC .85; range .69–.97) [26].

The lead developer of the MIBI protocol (co-author SM) oversaw the selection, training, calibration, and supervision of the project's 15 MIBI raters, who were blinded to study condition. Booster trainings and recalibration of the raters were conducted between cohorts. Consistent with the established guidelines promoted in the Motivational Interviewing Assessment: Supervisory Tools for Enhancing Proficiency [93], BI staff were considered to have demonstrated proficiency when at least half of the 10 motivational interviewing consistent items were rated 4 or greater on a 7-point scale for both adherence and competence. Submissions of MIBI sessions by BI staff and ratings by MIBI raters were enabled via a secure, Web-based implementation tracking system adapted from one used in our prior implementation research [94]. In addition to enabling MIBI raters to stream the audio files rather than download them (an important security feature), the Web-based system allowed MIBI raters to enter adherence and competence ratings directly into a secure, backed-up database located on RTI's servers.
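Under one reading of that proficiency rule (counting, for each dimension separately, how many of the 10 motivational interviewing consistent items reach the threshold), the check can be sketched as follows; the item scores below are illustrative, not ITRS data:

```python
def demonstrated_proficiency(adherence, competence, threshold=4, min_items=5):
    """Sketch of the proficiency rule described above: at least half of
    the 10 motivational interviewing consistent items must be rated 4 or
    greater (on the 7-point scale) for adherence and for competence."""
    assert len(adherence) == 10 and len(competence) == 10
    adherence_ok = sum(s >= threshold for s in adherence) >= min_items
    competence_ok = sum(s >= threshold for s in competence) >= min_items
    return adherence_ok and competence_ok

# Hypothetical ratings: 7 of 10 items reach the threshold on each dimension
adherence = [5, 4, 4, 6, 3, 4, 2, 5, 3, 4]
competence = [4, 4, 5, 5, 2, 4, 3, 6, 1, 4]
print(demonstrated_proficiency(adherence, competence))  # -> True
```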

As shown in Table 7, ASO staff participating in the ISF Experiment were invited to complete staff surveys at three time points: the exploration phase assessment at month zero, the implementation phase assessment at month 13, and the sustainment phase assessment at month 19. In addition to collecting background information for each participant (e.g., age, race, ethnicity, gender, educational level, tenure in profession, tenure with organization, salary, substance use recovery status), staff surveys assessed several domains theorized to be of importance (namely, innovation values-fit, tension for change, implementation climate, implementation readiness, and leadership engagement) and assessed both the number of clients screened for substance use and the number of clients to whom a brief intervention for substance use was delivered.

Given the professional level of ASO staff, surveys were self-administered. To help ensure the highest quality data possible, however, surveys were Excel-based, which helped prevent common data quality issues such as out-of-range responses. In addition to these real-time quality assurance measures, all staff surveys received a quality assurance review from a project coordinator, who contacted the participant via email and/or phone to resolve any issues identified. Once complete, each staff survey was exported into a master database on one of RTI’s secure, access-controlled servers, which are backed up nightly. Each survey required about 30–45 min to complete, and participants received a $25 e-gift card as compensation for their time.
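The real-time range checks mentioned above can be illustrated with a short sketch (field names and allowed ranges are hypothetical; the actual surveys enforced comparable checks within Excel):

```python
# Hypothetical allowed ranges for a few survey fields; values outside
# these bounds would be flagged at entry time rather than at analysis.
VALID_RANGES = {
    "age": (18, 99),               # years
    "tenure_with_org": (0, 50),    # years
    "clients_screened": (0, 999),  # count per reporting period
}

def out_of_range_fields(response):
    """Return the names of fields whose values fall outside their range."""
    return [field
            for field, (low, high) in VALID_RANGES.items()
            if field in response and not (low <= response[field] <= high)]

print(out_of_range_fields({"age": 17, "tenure_with_org": 5}))  # ['age']
```

Catching such errors at entry time, rather than during the coordinator's later review, reduces the number of follow-up contacts needed to resolve data issues.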

Statistical methods

Statistical analyses will follow an intention-to-treat approach, analyzing all ASOs as randomized. Hot-deck imputation [95, 96] will be used to address missing data, which is anticipated to be minimal (i.e., less than 5%). Analyses will be conducted using HLM software [97], which is well suited to clustered data (i.e., time nested within staff, nested within organization), and will proceed in the order outlined in Table 1. In addition to the coefficient, standard error, 95% confidence interval, and p value, results will include effect size indicators.
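As an illustration of the imputation approach, here is a minimal random hot-deck sketch. It pools all observed values as donors; a real analysis would typically draw donors from matched classes (e.g., the same study condition), and the variable and values below are hypothetical:

```python
import random

def hot_deck_impute(values, seed=0):
    """Random hot-deck imputation: replace each missing entry (None)
    with a value drawn at random from the observed 'donor' entries
    of the same variable."""
    rng = random.Random(seed)  # seeded for reproducibility
    donors = [v for v in values if v is not None]
    if not donors:
        raise ValueError("no observed values available to donate")
    return [v if v is not None else rng.choice(donors) for v in values]

# Hypothetical staff-level outcome with two missing observations
scores = [3.2, None, 4.1, 3.8, None, 4.5]
completed = hot_deck_impute(scores)
print(completed)  # observed values unchanged; None replaced by donor values
```

Because imputed values are drawn from the observed distribution, hot-deck imputation preserves the variable's range and plausibility in a way that mean imputation does not.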


Data monitoring

The ISF Experiment was conducted under the auspices of RTI International’s IRB. The Principal Investigator of the ISF Experiment, however, assumes ultimate responsibility for the project’s data and safety monitoring.


Minimal risks are associated with the study and are limited to the potential breach of confidentiality. All adverse events are reported to the Principal Investigator within 24 h. Adverse events are reported to the IRB within 2 weeks of the Principal Investigator’s awareness of the event, with serious adverse events reported within 1 week.


RTI International’s IRB conducts annual and random audits to assess adherence to federal human subjects protection regulations and to ensure that the rights and welfare of human subjects are protected.

Ethics and dissemination

Research ethics approval

The ISF Experiment was reviewed and approved by RTI International’s IRB, under Federalwide Assurance No. 3331 from the Department of Health and Human Services’ Office for Human Research Protections.

Protocol amendments

Any protocol modification that may affect the conduct of the study, potential benefit to the participants, or participant safety requires a protocol amendment. All amendments were submitted to RTI International’s IRB for approval, with no protocol modification implemented until after notification of IRB approval.


In addition to having ASOs complete a project participation agreement, written consent was obtained from both leadership staff and BI staff. The project’s IRB-approved informed consent form was emailed to potential participants along with a password-protected assurance of consent form, the password for which was sent in a separate email. Individuals could not participate in the project without first completing the form.


Information provided as part of the study is confidential and not shared with anyone outside the study. The exception, however, is if the participant has a plan to harm himself or herself or another specific person. Efforts to protect participant confidentiality include the following: (1) use of a unique participant ID number only accessible to the ASO study staff and a limited number of RTI study staff, (2) any study document (paper or electronic) that contains both the participant name and ID number is securely stored (e.g., locked file cabinet in a secure building, folder located on a password-protected server in a secure building), and (3) when study results are presented at meetings or published in journals, no identifying participant information will be included. Except for the assurance of consent form, which is required to be stored for at least 3 years after study completion, documents with identifying information will be destroyed within 90 days of study completion.

Declaration of interests

The authors have no competing interests to declare.

Access to data

Access to data is restricted during the active data collection period and is limited to the Principal Investigator, data coordinators, statistician, and statistical programmer. Following the completion of the study, a public access dataset will be created and made available upon request to the Principal Investigator.

Ancillary and post-trial care

No ancillary or post-trial care is planned.

Dissemination policy

Irrespective of the magnitude or direction of effect, study findings will be disseminated. Dissemination efforts will include presentations at professional scientific conferences and publication in peer-reviewed journals. To the extent possible, we will seek to ensure study publications are open access (i.e., available online to readers without financial, legal, or technical barriers beyond those inseparable from gaining access to the internet).


In this paper, the study protocol for the SAT2HIV Project’s ISF Experiment, a cluster-randomized trial on the effectiveness of the ISF strategy as an adjunct to the ATTC strategy (Aim 2 of the parent SAT2HIV Project), has been described in accordance with the SPIRIT guidelines [21, 22]. In the sections below, we highlight and discuss: (1) key trial-relevant events (anticipated and unanticipated) that have occurred to date, (2) key limitations and strengths of the ISF Experiment, and (3) key anticipated impacts of the ISF Experiment.

Trial-relevant events that have occurred to date

Table 8 summarizes key anticipated and unanticipated trial-relevant events that have occurred and that help illustrate the ISF Experiment’s progression and changing outer context.

Table 8 Key trial-relevant events to date

Key limitations and strengths of the ISF Experiment

The SAT2HIV Project’s ISF Experiment has limitations and strengths that are important to acknowledge. Key limitations include (1) the sustainment phase observation period being limited to 6 months, (2) the level of sustainment being limited to self-reports, and (3) cost-effectiveness not being examined. These limitations, however, are outweighed by the project’s many strengths.

Key strengths include the ISF Experiment’s (1) highly rigorous design as a randomized, hypothesis-driven experiment using psychometrically sound measures, (2) focus on the high-need setting of ASOs, (3) large sample size of 39 ASOs with 4–6 staff per ASO, (4) large geographic representation (23 states and the District of Columbia), and (5) examination of multiple phases of the EPIS continuum (preparation phase, implementation phase, and sustainment phase).

Potential impacts of the ISF Experiment

Panel A of Fig. 3 illustrates the current state of implementation research, where generalizable knowledge regarding the best approach for advancing EBPs along the EPIS continuum is limited, represented by question marks. Panel B of Fig. 3 illustrates that, regardless of the extent to which the ISF strategy is found to be an effective adjunct to the ATTC strategy, the ISF Experiment’s examination (represented by checkmarks) will increase generalizable knowledge regarding preparation, implementation, and sustainment strategies for advancing EBPs along the EPIS continuum. Beyond its impact on implementation research, the ISF Experiment may positively impact one or more key performance measures along the HIV Care Continuum (e.g., being linked to care, being engaged in care, being prescribed ART, achieving viral suppression). Indeed, the ISF Experiment may help advance ASOs’ capacity to address substance use, which is important given that substance use has been shown to negatively impact being engaged in care, the most significant break point along the U.S. HIV Care Continuum [98,99,100].

Fig. 3 Potential impacts of the SAT2HIV Project’s ISF Experiment


The SAT2HIV Project’s ISF Experiment represents one of the largest and most rigorous implementation research experiments to date. Nonetheless, should study findings support the ISF strategy as an effective adjunct to the ATTC strategy for implementing a motivational interviewing-based brief intervention for substance use within ASOs, future research must examine the extent to which those findings can be replicated, improved upon, and generalized to other contexts and EBPs. Our hope is that the ISF strategy is a replicable strategy that can be used to help improve public health by advancing EBPs along the EPIS continuum.



Abbreviations

AIDS: acquired immunodeficiency syndrome
ART: antiretroviral therapy
ASO: AIDS service organization
ATTC: Addiction Technology Transfer Center
BI: brief intervention
EBP: evidence-based practice
EPIS: exploration–preparation–implementation–sustainment
HIV: human immunodeficiency virus
IRB: Institutional Review Board
ISF: implementation and sustainment facilitation
ITRS: Independent Tape Rater Scale
MIA: motivational interviewing assessment
MIBI: motivational interviewing-based brief intervention
NIDA: National Institute on Drug Abuse
SAT2HIV: substance abuse treatment to HIV Care
SPIRIT: Standard Protocol Items: Recommendations for Interventional Trials
SUD: substance use disorder
staff working on the project
UC: usual care
UC+MIBI: usual care plus motivational interviewing-based brief intervention


  1. Institute of Medicine. Bridging the gap between practice and research: forging partnerships with community-based drug and alcohol treatment. Washington: National Academy Press; 1998.

    Google Scholar 

  2. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington: National Academy Press; 2001.

    Google Scholar 

  3. Hogan MF. The President’s New Freedom Commission: recommendations to transform mental health care in America. Psychiatr Serv. 2003;54:1467–74.

    Article  PubMed  Google Scholar 

  4. Institute of Medicine. Improving the quality of health care for mental and substance-use conditions: quality chasm series. Washington: National Academy Press; 2006.

    Google Scholar 

  5. Garner BR. Research on the diffusion of evidence-based treatments within substance abuse treatment: a systematic review. J Subst Abuse Treat. 2009;36:376–99.

    Article  PubMed  Google Scholar 

  6. Sweeney S, Obure CD, Maier CB, Greener R, Dehne K, Vassall A. Costs and efficiency of integrating HIV/AIDS services with other health services: a systematic review of evidence and experience. Sex Transm Infect. 2012;88:85–99.

    Article  PubMed  Google Scholar 

  7. Bing EG, Burnam A, Longshore D, Fleishman JA, Sherbourne CD, London AS, et al. Psychiatric disorders and drug use among human immunodeficiency virus-infected adults in the United States. Arch Gen Psychiatry. 2001;58:721–8.

    Article  CAS  PubMed  Google Scholar 

  8. Gaynes BN, Pence BW, Eron JJ Jr, Miller WC. Prevalence and comorbidity of psychiatric diagnoses based on reference standard in an HIV+ patient population. Psychosom Med. 2008;70:505–11.

    Article  PubMed  PubMed Central  Google Scholar 

  9. Hartzler B, Dombrowski JC, Crane HM, Eron JJ, Geng EH, Christopher Mathews W, et al. Prevalence and predictors of substance use disorders among HIV care enrollees in the United States. AIDS Behav. 2017;21:1138–48.

    Article  PubMed  Google Scholar 

  10. Lucas GM, Cheever LW, Chaisson RE, Moore RD. Detrimental effects of continued illicit drug use on the treatment of HIV-1 infection. J Acquir Immune Defic Syndr. 2001;27:251–9.

    Article  CAS  PubMed  Google Scholar 

  11. Arnsten JH, Demas PA, Grant RW, Gourevitch MN, Farzadegan H, Howard AA, et al. Impact of active drug use on antiretroviral therapy adherence and viral suppression in HIV-infected drug users. J Gen Intern Med. 2002;17:377–81.

    Article  PubMed  PubMed Central  Google Scholar 

  12. King WD, Larkins S, Hucks-Ortiz C, Wang PC, Gorbach PM, Veniegas R, et al. Factors associated with HIV viral load in a respondent driven sample in Los Angeles. AIDS Behav. 2009;13:145–53.

    Article  PubMed  Google Scholar 

  13. Malta M, Strathdee SA, Magnanini MM, Bastos FI. Adherence to antiretroviral therapy for human immunodeficiency virus/acquired immune deficiency syndrome among drug users: a systematic review. Addiction. 2008;103:1242–57.

    Article  PubMed  Google Scholar 

  14. Friedman MS, Marshal MP, Stall R, Kidder DP, Henny KD, Courtenay-Quirk C, et al. Associations between substance use, sexual risk taking and HIV treatment adherence among homeless people living with HIV. AIDS Care. 2009;21:692–700.

    Article  PubMed  Google Scholar 

  15. Hendershot CS, Stoner SA, Pantalone DW, Simoni JM. Alcohol use and antiretroviral adherence: review and meta-analysis. J Acquir Immune Defic Syndr. 2009;52:180–202.

    Article  PubMed  PubMed Central  Google Scholar 

  16. Azar MM, Springer SA, Meyer JP, Altice FL. A systematic review of the impact of alcohol use disorders on HIV treatment outcomes, adherence to antiretroviral therapy and health care utilization. Drug Alcohol Depend. 2010;112:178–93.

    Article  PubMed  PubMed Central  Google Scholar 

  17. Palepu A, Tyndall M, Yip B, O’Shaughnessy MV, Hogg RS, Montaner JS. Impaired virologic response to highly active antiretroviral therapy associated with ongoing injection drug use. J Acquir Immune Defic Syndr. 2003;32:522–6.

    Article  PubMed  Google Scholar 

  18. National Institute on Drug Abuse, Department of Health and Human Services.

  19. NIH Research Portfolio Online Reporting Tools (RePORT). Project information. Substance abuse treatment to HIV care (SAT2HIV) (Project No. 7R01DA038146-02).

  20. Garner BR, Gotham HJ, Tueller SJ, Ball E, Kaiser D, Stilen P, et al. Testing the effectiveness of a motivational interviewing-based brief intervention for substance use as an adjunct to usual care in community-based AIDS service organizations: Study protocol for a multisite randomized controlled trial. Addict Sci Clin Pract. (2017).

  21. Chan AW, Tetzlaff JM, Altman DG, Laupacis A, Gotzsche PC, Krleza-Jeric K, et al. SPIRIT 2013 statement: defining standard protocol items for clinical trials. Ann Intern Med. 2013;158:200–7.

    Article  PubMed  PubMed Central  Google Scholar 

  22. Chan AW, Tetzlaff JM, Gotzsche PC, Altman DG, Mann H, Berlin JA, et al. SPIRIT 2013 explanation and elaboration: guidance for protocols of clinical trials. BMJ. 2013;346:e7586.

    Article  PubMed  PubMed Central  Google Scholar 

  23. Dunn C, Deroo L, Rivara FP. The use of brief interventions adapted from motivational interviewing across behavioral domains: a systematic review. Addiction. 2001;96:1725–42.

    Article  CAS  PubMed  Google Scholar 

  24. Lundahl BW, Kunz C, Brownell C, Tollefson D, Burke BL. A meta-analysis of motivational interviewing: twenty-five years of empirical studies. Res Soc Work Pract. 2010;20:137–60.

    Article  Google Scholar 

  25. Vasilaki EI, Hosier SG, Cox WM. The efficacy of motivational interviewing as a brief intervention for excessive drinking: a meta-analytic review. Alcohol Alcohol. 2006;41:328–35.

    Article  PubMed  Google Scholar 

  26. Martino S, Ball SA, Nich C, Frankforter TL, Carroll KM. Community program therapist adherence and competence in motivational enhancement therapy. Drug Alcohol Depend. 2008;96:37–48.

    Article  PubMed  PubMed Central  Google Scholar 

  27. Hill S, Kavookjian J. Motivational interviewing as a behavioral intervention to increase HAART adherence in patients who are HIV-positive: a systematic review of the literature. AIDS Care. 2012;24:583–92.

    Article  PubMed  Google Scholar 

  28. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76.

    Article  PubMed  Google Scholar 

  29. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.

    Article  PubMed  Google Scholar 

  30. Miller WR, Yahne CE, Moyers TB, Martinez J, Pirritano M. A randomized trial of methods to help clinicians learn motivational interviewing. J Consult Clin Psychol. 2004;72:1050–62.

    Article  PubMed  Google Scholar 

  31. Klein KJ, Conn AB, Sorra JS. Implementing computerized technology: an organizational analysis. J Appl Psychol. 2001;86:811–24.

    Article  CAS  PubMed  Google Scholar 

  32. Helfrich CD, Weiner BJ, McKinney MM, Minasian L. Determinants of implementation effectiveness: adapting a framework for complex innovations. Med Care Res Rev. 2007;64:279–303.

    Article  PubMed  Google Scholar 

  33. Klein KJ, Sorra JS. The challenge of innovation implementation. Acad Manag Rev. 1996;21:1055–80.

    Google Scholar 

  34. Weiner BJ, Lewis MA, Linnan LA. Using organization theory to understand the determinants of effective implementation of worksite health promotion programs. Health Educ Res. 2009;24:292–305.

    Article  PubMed  Google Scholar 

  35. Hunter SB, Ayer L, Han B, Garner BR, Godley SH. Examining the sustainment of the adolescent-community reinforcement approach in community addiction treatment settings: protocol for a longitudinal mixed method study. Implement Sci. 2014;9:104.

    Article  PubMed  PubMed Central  Google Scholar 

  36. Squires DD, Gumbley SJ, Storti SA. Training substance abuse treatment organizations to adopt evidence-based practices: the Addiction Technology Transfer Center of New England science to service laboratory. J Subst Abuse Treat. 2008;34:293–301.

    Article  PubMed  Google Scholar 

  37. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69:123–57.

    Article  PubMed  Google Scholar 

  38. Gustafson DH, Quanbeck AR, Robinson JM, Ford JH 2nd, Pulvermacher A, French MT, et al. Which elements of improvement collaboratives are most effective? A cluster-randomized trial. Addiction. 2013;108:1145–57.

    Article  PubMed  PubMed Central  Google Scholar 

  39. Preacher KJ, Zhang Z, Zyphur MJ. Multilevel structural equation models for assessing moderation within and across levels of analysis. Psychol Methods. 2016;21:189–205.

    Article  PubMed  Google Scholar 

  40. Blair JM, McNaghten AD, Frazier EL, Skarbinski J, Huang P, Heffelfinger JD. Clinical and behavioral characteristics of adults receiving medical care for HIV infection—Medical Monitoring Project, United States, 2007. MMWR Surveill Summ. 2011;60:1–20.

    PubMed  Google Scholar 

  41. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

    Article  PubMed  PubMed Central  Google Scholar 

  42. Davis D, Barrington T, Phoenix U, Gilliam A, Collins C, Cotton D, et al. Evaluation and technical assistance for successful HIV program delivery. AIDS Educ Prev. 2000;12(Suppl A):115–25.

    CAS  PubMed  Google Scholar 

  43. Wandersman A, Chien VH, Katz J. Toward an evidence-based system for innovation support for implementing innovations with quality: tools, training, technical assistance, and quality assurance/quality improvement. Am J Community Psychol. 2012;50:445–59.

    Article  PubMed  Google Scholar 

  44. Chaple M, Sacks S. The impact of technical assistance and implementation support on program capacity to deliver integrated services. J Behav Health Serv Res. 2016;43:3–17.

    Article  PubMed  Google Scholar 

  45. Farmer AP, Legare F, Turcot L, et al. Printed educational materials: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2008;3:CD004398.

    Google Scholar 

  46. Giguere A, Legare F, Grimshaw J, et al. Printed educational materials: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;10:CD004398.

    PubMed  Google Scholar 

  47. HealtheKnowledge. On-Demand Courses.

  48. Martino S, Garner BR, Gotham H, Speck K, Vandersloot D. Motivational Interviewing-based brief intervention (BI) protocol for HIV-Infected Clients with Risky Substance Use. Unpublished treatment manual. 2014.

  49. Godley SH, Garner BR, Smith JE, Meyers RJ, Godley MD. A large-scale dissemination and implementation model for evidence-based treatment and continuing care. Clin Psychol (New York). 2011;18:67–83.

    Google Scholar 

  50. Miller WR, Moyers TB, Arciniega L, Ernst D, Forcehimes A. Training, supervision and quality monitoring of the COMBINE Study behavioral interventions. J Stud Alcohol. 2005;15(Suppl):188–95.

    Article  Google Scholar 

  51. Schneider EC, Zaslavsky AM, Landon BE, Lied TR, Sheingold S, Cleary PD. National quality monitoring of Medicare health plans: the relationship between enrollees’ reports and the quality of clinical care. Med Care. 2001;39:1313–25.

    Article  CAS  PubMed  Google Scholar 

  52. Gibbons CJ, Carroll KM, Ball SA, Nich C, Frankforter TL, Martino S. Community program therapist adherence and competence in a motivational interviewing assessment intake session. Am J Drug Alcohol Abuse. 2010;36:342–9.

    Article  PubMed  PubMed Central  Google Scholar 

  53. Martino S, Ball S, Nich C, Frankforter TL, Carroll KM. Correspondence of motivational enhancement treatment integrity ratings among therapists, supervisors, and observers. Psychother Res. 2009;19:181–93.

    Article  PubMed  PubMed Central  Google Scholar 

  54. McCormack L, Sheridan S, Lewis M, Boudewyns V, Melvin CL, Kistler C, et al. Communication and Dissemination Strategies To Facilitate the Use of Health-Related Evidence Report/Technology Assessment No. 213 (prepared by the RTI International-University of North Carolina Evidence-based Practice Center under Contract No. 290-2007-10056-I.) AHRQ Publication No. 13(14)-E003-EF. Rockville, MD: Agency for Healthcare Research and Quality; November 2013.

  55. Madson MB, Loignon AC, Lane C. Training in motivational interviewing: a systematic review. J Subst Abuse Treat. 2009;36:101–9.

    Article  PubMed  Google Scholar 

  56. Lane C, Hood K, Rollnick S. Teaching motivational interviewing: using role play is as effective as using simulated patients. Med Educ. 2008;42:637–44.

    Article  PubMed  Google Scholar 

  57. Jamtvedt G, Young JM, Kristoffersen DT, et al. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2006;2:CD000259.

    Google Scholar 

  58. Hysong SJ. Meta-analysis: audit and feedback features impact effectiveness on care quality. Med Care. 2009;47:356–63.

    Article  PubMed  PubMed Central  Google Scholar 

  59. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012:CD000259.

  60. Colquhoun HL, Brehaut JC, Sales A, Ivers N, Grimshaw J, Michie S, et al. A systematic review of the use of theory in randomized controlled trials of audit and feedback. Implement Sci. 2013;8:66.

    Article  PubMed  PubMed Central  Google Scholar 

  61. Sholomskas DE, Syracuse-Siewert G, Rounsaville BJ, Ball SA, Nuro KF, Carroll KM. We don’t train in vain: a dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. J Consult Clin Psychol. 2005;73:106–15.

    Article  PubMed  PubMed Central  Google Scholar 

  62. Motivational Interviewing Network of Trainers. Welcome to the motivational interviewing page! Fairfax, VA: Motivational Interviewing Network of Trainers; 2016. Accessed 10 Jan 2017.

  63. Roosa M, Scripa JS, Zastowny TR, Ford JH 2nd. Using a NIATx based local learning collaborative for performance improvement. Eval Program Plan. 2011;34:390–8.

    Article  Google Scholar 

  64. Stephan SH, Connors EH, Arora P, et al. A learning collaborative approach to training school-based health providers in evidence-based mental health treatment. Child Youth Serv Rev. 2013;35:1970–8.

    Article  Google Scholar 

  65. Haine-Schlagel R, Brookman-Frazee L, Janis B, Gordon J. Evaluating a learning collaborative to implement evidence-informed engagement strategies in community-based services for young children. Child Youth Care Forum. 2013;42:457–73.

    Article  Google Scholar 

  66. Kauth MR, Sullivan G, Blevins D, Cully JA, Landes RD, Said Q, et al. Employing external facilitation to implement cognitive behavioral therapy in VA clinics: a pilot study. Implement Sci. 2010;5:75.

    Article  PubMed  PubMed Central  Google Scholar 

  67. Kilbourne AM, Abraham KM, Goodrich DE, Bowersox NW, Almirall D, Lai Z, et al. Cluster randomized adaptive implementation trial comparing a standard versus enhanced implementation intervention to improve uptake of an effective re-engagement program for patients with serious mental illness. Implement Sci. 2013;8:136.

    Article  PubMed  PubMed Central  Google Scholar 

  68. Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, et al. Role of “external facilitation” in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006;1:23.

    Article  PubMed  PubMed Central  Google Scholar 

  69. Cully JA, Armento ME, Mott J, Nadorff MR, Naik AD, Stanley MA, et al. Brief cognitive behavioral therapy in primary care: a hybrid type 2 patient-randomized effectiveness–implementation design. Implement Sci. 2012;7:64.

    Article  PubMed  PubMed Central  Google Scholar 

  70. Miller K, Dunn D. Using past performance to improve future practice: a framework for method evaluation and improvement. In: Zupancic J, Wojtkowski WG, Wojtkowski W, Wrycza S, editors. Evolution and Challenges in System Development. US: Springer; 1999. p. 99–107.

    Chapter  Google Scholar 

  71. Miller WR, Rose GS. Motivational interviewing and decisional balance: contrasting responses to client ambivalence. Behav Cogn Psychother. 2015;43:129–41.

    Article  PubMed  Google Scholar 

  72. Ford JH 2nd, Green CA, Hoffman KA, Wisdom JP, Riley KJ, Bergmann L, et al. Process improvement needs in substance abuse treatment: admissions walk-through results. J Subst Abuse Treat. 2007;33:379–89.

    Article  PubMed  PubMed Central  Google Scholar 

  73. Weiner BJ, Belden CM, Bergmire DM, Johnston M. The meaning and measurement of implementation climate. Implement Sci. 2011;6:78.

    Article  PubMed  PubMed Central  Google Scholar 

  74. Rapp CA, Etzel-Wise D, Marty D, Coffman M, Carlson L, Asher D, et al. Evidence-based practice implementation strategies: results of a qualitative study. Community Ment Health J. 2008;44:213–24.

    Article  PubMed  Google Scholar 

  75. Zwarentein M, Goldman J, Reeves S. Interprofessional collaboration: effects of practice-based interventions on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2009;3:CD000072.

    Google Scholar 

  76. Hysong SJ, Best RG, Pugh JA. Clinical practice guideline implementation strategy patterns in Veterans Affairs primary care clinics. Health Serv Res. 2007;42:84–103.

    Article  PubMed  PubMed Central  Google Scholar 

  77. Pare G, Sicotte C, Poba-Nzaou P, Balouzakis G. Clinicians’ perceptions of organizational readiness for change in the context of clinical information system projects: insights from two cross-sectional surveys. Implement Sci. 2011;6:15.

    Article  PubMed  PubMed Central  Google Scholar 

  78. Weiner BJ, Amick H, Lee SY. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev. 2008;65:379–436.

    Article  PubMed  Google Scholar 

  79. Weiner BJ. A theory of organizational readiness for change. Implement Sci. 2009;4:67.

    Article  PubMed  PubMed Central  Google Scholar 

  80. Lehman WE, Greener JM, Simpson DD. Assessing organizational readiness for change. J Subst Abuse Treat. 2002;22:197–209.

    Article  PubMed  Google Scholar 

  81. Helfrich CD, Li YF, Sharp ND, Sales AE. Organizational readiness to change assessment (ORCA): development of an instrument based on the Promoting Action on Research in Health Services (PARIHS) framework. Implement Sci. 2009;4:38.

    Article  PubMed  PubMed Central  Google Scholar 

  82. Kajermo KN, Bostrom AM, Thompson DS, Hutchinson AM, Estabrooks CA, Wallin L. The BARRIERS scale—the barriers to research utilization scale: a systematic review. Implement Sci. 2010;5:32.

    Article  PubMed  PubMed Central  Google Scholar 

  83. Marshall T, Solomon P, Steber SA. Implementing best practice models by using a consensus-building process. Adm Policy Ment Health. 2001;29:105–16.

    Article  CAS  PubMed  Google Scholar 

  84. Margerum RD. Collaborative planning—building consensus and building a distinct model for practice. J Plan Educ Res. 2002;21:237–53.

    Article  Google Scholar 

  85. Nembhard IM. Learning and improving in quality improvement collaboratives: which collaborative features do participants value most? Health Serv Res. 2009;44:359–78.

    Article  PubMed  PubMed Central  Google Scholar 

  86. Nadeem E, Olin SS, Hill LC, et al. Understanding the components of quality improvement collaboratives: A systematic literature review. Milbank. 2013;91:354–94.

    Article  Google Scholar 

  87. Taylor MJ, McNicholas C, Nicolay C, Darzi A, Bell D, Reed JE. Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Qual Saf. 2014;23:290–8.

    Article  PubMed  Google Scholar 

  88. Raudenbush SW, et al. Optimal design software for multi-level and longitudinal research (Version 3.01) [Software]. 2011.

  89. Find HIV testing sites and care services—locator map.

  90. POZ Magazine. Health services directory. Find your local HIV/AIDS health care and service organizations. Smart + Strong. Accessed 7 Apr 2015.

  91. Stout RL, Wirtz PW, Carbonari JP, Del Boca FK. Ensuring balanced distribution of prognostic factors in treatment outcome research. J Stud Alcohol. 1994;12(Suppl):70–5.

    Article  CAS  Google Scholar 

  92. Charpentier PA. Urn Randomization Program gRand [computer program]. 1.10th ed. New Haven: Yale University; 2003.

    Google Scholar 

  93. Martino S, Ball SA, Gallon SL, Hall D, Garcia M, Ceperich S, et al. Motivational interviewing assessment: supervisory tools for enhancing proficiency. Salem, OR: Northwest Frontier Addiction Technology Transfer Center, Oregon Health and Science University; 2006.

  94. Garner BR, Godley SH, Dennis ML, Hunter B, Bair C, Godley MD. Using pay for performance to improve treatment implementation for adolescent substance use disorders: results from a cluster randomized trial. Arch Pediatr Adolesc Med. 2012;166:938–44.

    Article  PubMed  Google Scholar 

  95. Figueredo AJ, McKnight PE, McKnight KM, Sidani S. Multivariate modeling of missing data within and across assessment waves. Addiction. 2000;95(Suppl 3):S361–80.

    Article  PubMed  Google Scholar 

  96. Little RJA, Rubin DB. The analysis of social science data with missing values. Sociol Meth Res. 1989;18:292–326.

    Article  Google Scholar 

  97. Raudenbush SW, Bryk AS, Congdon R. HLM 7.01 for Windows [Computer software]. Stokie: Scientific Software International, Inc.; 2013.

    Google Scholar 

  98. HIV/AIDS Care Continuum. 2015. Accessed 11 Jan 2017.

  99. Altice FL, Kamarulzaman A, Soriano VV, Schechter M, Friedland GH. Treatment of medical, psychiatric, and substance-use comorbidities in people infected with HIV who use drugs. Lancet. 2010;376:367–87.

    Article  PubMed  PubMed Central  Google Scholar 

  100. Gwadz M, de Guzman R, Freeman R, Kutnick A, Silverman E, Leonard NR, et al. Exploring how substance use impedes engagement along the HIV care continuum: a qualitative study. Front Public Health. 2016;4:62.

  101. Martino S, Haeseler F, Belitsky R, Pantalon M, Fortin AH. Teaching brief motivational interviewing to year three medical students. Med Educ. 2007;41:160–7.

  102. Haeseler F, Fortin AH, Pfeiffer C, Walters C, Martino S. Assessment of a motivational interviewing curriculum for year 3 medical students using a standardized patient case. Patient Educ Couns. 2011;84:27–30.

  103. Prochaska JO, Velicer WF, Rossi JS, Goldstein MG, Marcus BH, Rakowski W, et al. Stages of change and decisional balance for 12 problem behaviors. Health Psychol. 1994;13:39–46.

  104. Prestwich A, Lawton R, Conner M. The use of implementation intentions and the decision balance sheet in promoting exercise behaviour. Psychol Health. 2003;18:707–21.

  105. McFarlane WR, McNary S, Dixon L, Hornby H, Cimett E. Predictors of dissemination of family psychoeducation in community mental health centers in Maine and Illinois. Psychiatr Serv. 2001;52:935–42.

  106. Shea CM, Jacobs SR, Esserman DA, Bruce K, Weiner BJ. Organizational readiness for implementing change: a psychometric assessment of a new measure. Implement Sci. 2014;9:7.

  107. Jacobs SR, Weiner BJ, Bunger AC. Context matters: measuring implementation climate among individuals and groups. Implement Sci. 2014;9:46.

  108. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

Authors’ contributions

Study conceptualization and design were led by Dr. Garner. All authors were involved in developing and editing the manuscript and have given final approval of the submitted version. All authors read and approved the final manuscript.


Acknowledgements

Special thanks go to St. Louis Effort For AIDS and its leadership staff (Ann Ritz, Cheryl Oliver) for their great help and support during the initial development of the SAT2HIV Project. Special thanks also go to Tracy Karvinen for helping connect Dr. Garner to St. Louis Effort For AIDS and its amazing staff. Finally, special thanks go to several of RTI International’s Research Operations Center staff for helping with the initial development of the SAT2HIV Project: Tamara Terry, Todd Prince, Adam Kaderabek, Lynda Tatum, and David Schultz.

Competing interests

The authors declare that they have no competing interests.

Availability of data and materials

Study data and materials may be made available upon reasonable request to the corresponding author.

Consent for publication

Not applicable.

Ethics approval

The current study was conducted under the auspices of RTI International’s IRB.


Funding

This work was supported by the National Institute on Drug Abuse (NIDA; R01DA038146; PI Garner). NIDA had no role in the design of this study and will not have any role during its execution, analyses, interpretation of the data, or decision to submit results. The contents of this publication are solely the responsibility of the authors and do not necessarily represent the official views of the government or participating organizations.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information


Corresponding author

Correspondence to Bryan R. Garner.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Garner, B.R., Zehner, M., Roosa, M.R. et al. Testing the implementation and sustainment facilitation (ISF) strategy as an effective adjunct to the Addiction Technology Transfer Center (ATTC) strategy: study protocol for a cluster randomized trial. Addict Sci Clin Pract 12, 32 (2017).
