Table 2 Specification overview of the multifaceted Addiction Technology Transfer Center (ATTC) strategy

From: Testing the implementation and sustainment facilitation (ISF) strategy as an effective adjunct to the Addiction Technology Transfer Center (ATTC) strategy: study protocol for a cluster randomized trial

Each discrete implementation strategy below is listed with its defining characteristic according to Proctor et al. [41], followed by the operational definition of its key dimensions: Actor(s), Action(s), Target(s) of the action, Temporality, Dose, Targeted implementation outcome(s), and Justification.

A. Centralized technical assistance: Develop and use a system to deliver technical assistance focused on implementation issues
Actor(s): Regional ATTC (e.g., Mid-America, Northwest, Northeast)
Action(s): The overarching discrete implementation strategy, which encompasses the other discrete implementation strategies listed below
Target(s) of the action: 2 BI staff per ASO
Temporality: The initial kickoff meeting should be held within 1 month of completing the exploration phase
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [36, 42, 43, 44]

B. Develop educational materials: Develop and format guidelines, manuals, toolkits, and other supporting materials in ways that make it easier for stakeholders to learn about the innovation and for clinicians to learn how to deliver the clinical innovation
Actor(s): Regional ATTC
Action(s): The Motivational Interviewing-Based Brief Intervention (MIBI) protocol manual, which provides information and knowledge about how the MIBI is intended to be implemented
Target(s) of the action: 2 BI staff per ASO
Temporality: Educational materials (e.g., the MIBI protocol manual) should be finalized prior to the initial kickoff meeting
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [45, 46]

C. Develop and organize quality monitoring system: Develop and organize systems and procedures that monitor clinical processes and/or outcomes for quality assurance and improvement
Actor(s): Regional ATTC
Action(s): A Web-based system (sat2hivproject.org) that enables secure and efficient sharing of data relevant to the evidence-based practice (EBP) preparation and implementation process
Target(s) of the action: 2 BI staff per ASO
Temporality: The quality monitoring system (i.e., sat2hivproject.org) should be finalized prior to the initial kickoff meeting
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [49, 50, 51]

D. Develop tools for quality monitoring: Develop, test, and introduce quality-monitoring tools with inputs (e.g., measures) specific to the innovation being implemented
Actor(s): Regional ATTC
Action(s): The Independent Tape Rater Scale (ITRS), which enables reliable and valid rating of the extent to which BI staff deliver the EBP with fidelity
Target(s) of the action: 2 BI staff per ASO
Temporality: Tools for quality monitoring (i.e., the ITRS) should be finalized prior to the initial kickoff meeting
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [26, 52, 53]

E. Distribute educational materials: Distribute educational materials (e.g., manuals) in person, by mail, and/or electronically
Actor(s): Regional ATTC
Action(s): Distribute professionally printed copies of the MIBI protocol manual to each BI staff member
Target(s) of the action: 2 BI staff per ASO
Temporality: Distribute at the training workshop
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [45, 46, 54]

F. Conduct educational meetings: Hold meetings targeted toward providers, administrators, other organizational stakeholders, and community, patient or consumer, and family stakeholders to teach them about the clinical innovation
Actor(s): Regional ATTC
Action(s): In-person and Web-based meetings that enable direct interaction between the actors (ATTC) and targeted users of the EBP (BI staff)
Target(s) of the action: 2 BI staff per ASO
Temporality: Educational meetings should begin at least 3 months before the implementation phase begins
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [30, 36, 55, 61]

G. Make training dynamic: Vary the information delivery methods to cater to different learning styles and work contexts and shape the training in the innovation to be interactive
Actor(s): Regional ATTC
Action(s): Incorporate standardized role plays that enable EBP trainees (BI staff) to practice with each other and that facilitate understanding of the EBP from both staff and client perspectives
Target(s) of the action: 2 BI staff per ASO
Temporality: Should begin during the first contact
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [55, 56, 101, 102]

H. Audit and provide feedback: Collect and summarize clinical performance data over a specified period, and give data to clinicians and administrators in the hopes of changing provider behavior
Actor(s): Regional ATTC
Action(s): Generate and email standardized feedback reports to EBP trainees (BI staff) using the standardized quality monitoring tool (ITRS)
Target(s) of the action: 2 BI staff per ASO
Temporality: Should begin approximately 1–2 weeks following the end of the in-person educational training workshop
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [30, 57, 58, 59, 60]

I. Provide ongoing consultation: Provide clinicians with continued consultation with an expert in the clinical innovation
Actor(s): Regional ATTC
Action(s): Phone-based individualized meetings that enable direct contact between the actor (ATTC trainer) and one EBP trainee (BI staff)
Target(s) of the action: 2 BI staff per ASO
Temporality: Should begin approximately 1–2 weeks following the end of the in-person educational training workshop
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [30, 36, 61]

J. Create a learning collaborative: Develop and use groups of providers or provider organizations that will implement the clinical innovation and develop ways to learn from one another to foster better implementation
Actor(s): Regional ATTC
Action(s): Web-based group meetings that enable direct contact between the actor (ATTC trainer) and a group of 10–14 targeted users of the EBP (BI staff), who can share lessons learned
Target(s) of the action: 2 BI staff per ASO
Temporality: Should begin approximately 3–4 weeks after the implementation phase begins
Dose: See Tables 4, 5 and 6
Targeted implementation outcome(s): Fidelity (i.e., proficiency and implementation effectiveness)
Justification: [63, 64, 65]