Life House Community Profile
Below is the beginning of a community profile for Life House.
Grand Junction has two main hospital systems, Mountain Medical—Grand Junction and
South Side General. The client population of each hospital tends to be differentiated by
geography and socioeconomic status. Mountain Medical serves predominantly privately
insured residents from the north side of Grand Junction, and South Side General serves
publicly insured, lower-income families from Grand Junction's southern communities.
Both health-care systems offer inpatient mental health services, generally based on
emergency room admissions or referrals from primary care physicians. Last year,
Mountain Medical logged over 500 emergency room admissions for severe mental
illness, while South Side General logged approximately 1,000 emergent psychiatric
admissions. Following discharge, mental health services continue at a number of
outpatient agencies throughout the greater Grand Junction area.
Last year, the Grand Junction Police Department received over 2,500 domestic violence
calls, a rate slightly higher than the national average of 6.3 calls for every 1,000 adults. The
majority of domestic violence police dispatches occur on the south side of Grand
Junction, so South Side General carries more cases of domestic violence-related
trauma. South Side General also receives more emergency room admissions
for alcohol- and drug-related accidents and trauma than the rest of Grand Junction.
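The profile compares a raw call count with a per-capita national rate. As a minimal illustration of how that comparison could be made, the Python sketch below converts the call count into a rate per 1,000 adults; the adult population figure is an assumption used for illustration only, not a Grand Junction statistic.

# Hypothetical illustration: converting the raw domestic violence call count
# into a rate per 1,000 adults so it can be compared with the national
# average of 6.3 calls per 1,000 adults cited above.
dv_calls = 2500              # calls logged by the Grand Junction Police Department (from the profile)
adult_population = 390_000   # ASSUMED adult population, for illustration only

rate_per_1000 = dv_calls / adult_population * 1000
print(f"Local rate: {rate_per_1000:.1f} calls per 1,000 adults (national average: 6.3)")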
The north side of Grand Junction has seen more rapid gentrification because it is situated
nearer the revitalized downtown and has easy access to nearby shopping and
restaurants. With an influx of retirees to the north side, there has been growth in
retirement communities and assisted living facilities. Referrals for general counseling
support for aging populations tend to be drawn from the north side. The south side
presents a wider range of mental health needs, reflecting demand for support with aging
in place, increased economic stressors, and higher reported rates of alleged elder abuse.
Last year, the Grand Junction school district made over 1,200 reports of alleged child
maltreatment to child protective services. A substantially higher proportion of reports
(60%) came from neighborhoods with high rates of unemployment. School-based social
workers in these economically distressed neighborhoods also reported higher rates of
truancy and classroom behavioral problems among the children. In parent-support
meetings, many parents referenced the need for additional supports to help mitigate
economic and psychological stressors and improve parenting skills during crises.
Grand Junction does have a public transportation system, but it is limited. Bus service is
infrequent at night and on the weekends, which creates transportation issues for those
in need of services who do not have—or cannot arrange for—rides to attend classes,
counseling, or mandated meetings or services.
1 – How do you reconcile the top-down (executive or administrative) interpretation of what is
needed with the bottom-up (client or community) interpretation of what is needed from a
program evaluation?
2 – For each of the below stakeholders:
• Client
• Practitioner
• Administrator
• Executive
• Funder
• Identify the program you have selected for evaluation.
• In your own words, describe what the process evaluation is and why you are doing it.
• List 3–5 questions you would like answered to help inform your process evaluation.
• Explain why the questions you selected are important.
• Describe next steps—what will you do once you have collected this initial data?
Answer the 3–5 questions posed by the program evaluator. Use any online research into similar
programs to help answer the questions as realistically as possible. Use fictional (but informed)
responses, as needed.
3 – Create a needs assessment plan for your selected program at the Life House agency. You will
use the program to narrow the focus of the related social problem and define the steps and
strategies necessary to collect valid and applicable data to inform the program's evaluation.
To Prepare
• Read Chapter 11 of the course text.
• Revisit Chapter 10 of the course text, as needed.
• Review the document, Community Profile, in the Learning Resources this week.
• Search for and select at least two additional empirical sources to inform your needs
assessment plan. Use the course text and the Assignment prompt to identify keywords for
your search.
Submit a 2- to 3-page needs assessment plan. Address the following in your plan:
• Restate the program and related social problem.
• Explain the steps you need to take to focus your needs assessment before collecting data.
• Describe the four types of social needs and how they inform the social problem and,
ultimately, program goals.
• Explain the role a community profile plays in the planning and implementation of a needs
assessment.
• Describe how you would collect data to perform your needs assessment. Be sure to
include both quantitative and qualitative approaches.
4 – Determining whether a program is operating as intended can also help the program deal with
new demands, such as a sudden influx of clients. These types of changes may call for
adjustments in program plans that will allow a program to take these demands into account.
Showing that you can make these adjustments will help a potential funder to understand and
accept changes in your program that may have resulted from unforeseen circumstances.
For this Assignment, you will use the data you collected from your role play Discussion this
week, as well as your knowledge of steps and requirements of a process evaluation, to conduct
an initial process evaluation for your selected Life House program.
To Prepare
• Review Chapter 12 of the course text again, as needed.
• Review the process evaluation articles by Holbrook et al. (2018) and Sorenson and Llamas
(2018) in the Learning Resources this week.
Note: These articles are more comprehensive than what you will need to do in your Assignment
this week—including some additional elements you will complete as part of your Final Project—
but they provide a good overview of the key sections for your own process evaluation.
Review the activities and outputs in your logic model document from Week 5. The information
you produced in your logic model will help inform your process evaluation.
Submit your 2- to 3-page process evaluation as a Word document. Address the following:
• Briefly describe the Life House program you have selected for evaluation.
• Describe 2–3 process challenges in your program.
• Identify appropriate methods to evaluate each process challenge.
• Make recommendations on how the program can be run more efficiently.
Community Mental Health Journal (2018) 54:921–929
https://doi.org/10.1007/s10597-017-0224-6
ORIGINAL PAPER
Implementation of Dialectical Behavior Therapy in Residential
Treatment Programs: A Process Evaluation Model for a Community-Based Agency
Amber M. Holbrook · Susan R. Hunt · Mary Renata See
Received: 15 May 2017 / Accepted: 27 December 2017 / Published online: 12 January 2018
© Springer Science+Business Media, LLC, part of Springer Nature 2018
Abstract
Dialectical behavior therapy (DBT) can be challenging to implement in community-based settings. Little guidance is available on models to evaluate the effectiveness or sustainability of training and implementation efforts. Residential programs
have much to gain from introduction of evidence-based practices, but present their own challenges in implementation. This
paper presents a low-cost process evaluation model to assess DBT training piloted in residential programs. The model targets
staff and organizational factors associated with successful implementation of evidence-based practices and matches data
collection to the four stages of the DBT training model. The strengths and limitations of the evaluation model are discussed.
Keywords Community-based organizations · Community research · Dialectical behavior therapy · Evidence-based
practice · Implementation research · Program evaluation
Introduction
Dialectical behavior therapy (DBT; Linehan 1993) is an
evidence-based intervention that integrates a cognitive-behavioral model and tenets of dialectical and Zen-Buddhist
philosophical perspectives into its operational framework
(Hawkins and Sinha 1998; Neacsiu et al. 2010). Originally
developed to reduce self-injurious behavior associated
with borderline personality disorder, DBT is a promising
treatment choice for a range of behavioral health concerns,
including eating disorders (Telch et al. 2001; Lenz et al.
2014), oppositional defiant disorder (Nelson-Gray et al.
2006), adult ADHD (Hesslinger et al. 2002), and substance
use disorders (van den Bosch et al. 2002). The five essential
functions of DBT (enhancing client capabilities and motivation, ensuring generalization, structuring the environment,
and enhancing therapist capabilities) are typically delivered
through a combination of modes including a client skills
training group, individual therapy, telephone consultation,
case management, and therapist consultation meetings (Feigenbaum 2007).
Initially designed for the adult outpatient treatment
setting, DBT is currently applied in diverse settings,
including inpatient units, youth mental health services,
the family therapy milieu, and assertive community treatment programs (Burroughs and Somerville 2013; Carmel
et al. 2014a; Miller et al. 2002). However, as with other
evidence-based treatments, the conditions necessary for
successful implementation of DBT outside of controlled
settings are not always well-understood or easily met (Burroughs and Somerville 2013). The DBT model can be particularly difficult to implement in resource-scarce treatment
settings. In addition to the initial clinician training, ongoing
supervision, team consultation, and data collection are considered necessary for fidelity monitoring (Herschell et al.
2009). These activities do not represent billable services,
often precluding their incorporation in community-based
agencies (McHugh et al. 2009; Burroughs and Somerville
2013). The extent to which exclusion of a component
impacts the fidelity or sustainability of DBT delivery is
not well-studied. However, previous findings suggest that
staff skills in implementing evidence-based practices may
regress to baseline in as little as 3 months following training when not adequately reinforced within the agency setting (Miller and Mount 2001).
Implementation of evidence-based practices in residential programs can be especially challenging. Swales
et al. (2012) identified several organizational factors that
inhibited implementation and sustainability of DBT across
behavioral health settings, including lack of management
“buy-in”, failure to protect time needed to deliver the
treatment, insufficient funding, competing staff roles, and
high staff turnover rates. Staff retention is notably poor in
residential treatment settings (Colton and Roberts 2007;
Connor et al. 2003) and represents one of the major barriers to sustainability of evidence-based treatments such
as DBT (Swales et al. 2012; Swenson et al. 2002). Lack
of flexibility in staff schedules for training efforts due to
the 24-h nature of care and few resources also represent
significant barriers.
Additionally, direct service professionals (DSPs) in
residential programs often do not possess the master's-level education and independent licensure recommended
by the Linehan Institute as a pre-requisite for DBT training (DBT-Linehan Board of Certification 2017). However,
scant guidance on the effectiveness of potential modifications to the training model is available to community
agencies who wish to adapt DBT training to their staff
population and resources. The absence of readily available
measures to assess the fidelity of DBT implementation
(McCay et al. 2016) and the paucity of literature describing evaluative approaches to the training and implementation process (Carmel et al. 2014b) present an additional
challenge to community agencies who wish to ensure the
effectiveness of their training efforts.
The identification of factors related to effective implementation is critical in the task of translating DBT across
community-based settings. Prior research has suggested
the importance of data collection from multiple stakeholders (Bloch et al. 2006; Carmel et al. 2014a; Gray
et al. 2007) to assist in isolating critical implementation
factors, including challenges in training delivery, barriers and facilitating characteristics of the organizational
setting, and methods of assessing fidelity (Carmel et al.
2014a; Gotham 2006). Multi-level evaluative approaches
that assess the effectiveness of training models adapted to
the community treatment setting are needed. These evaluations can assist in identifying both effective and feasible
training models, as well as shaping strategies for ensuring ongoing fidelity and sustainability of the treatment
intervention.
The Present Study
This paper describes a low-cost process evaluation model
to assess the effectiveness of a DBT training model
adapted for a community-based residential setting. The
primary objectives of the evaluation were to: assess
individual staff knowledge and retention following DBT
training; evaluate the relationship of the DBT training to
organizational culture; and to assess the fidelity of service
delivery to the DBT model. We describe the process evaluation model, and outline the strengths and limitations of
the approach to assessing DBT training implementation.
Implications for dissemination of DBT in community-based programs are discussed.
Process Evaluation Model
The training model evaluated was one adapted for implementation in five residential and shelter programs serving
adults with psychiatric and substance use disorders. The
four phase training model (Staff Training, Service Design,
Implementation, and Maintenance) sought to address both
staff and program-level factors associated with successful
implementation of DBT.
The process evaluation was conducted simultaneously to
provide ongoing feedback on the training implementation,
with data collection matched to each of four training phases
(see Fig. 1). The evaluation gathered data from multiple
stakeholders including administrators, line staff, and clients
to target both individual and program-level factors associated with successful EBP implementation. In the following
section, training and implementation activities at each of
the four stages are introduced, followed by the associated
evaluation activities and measurement tools.
Phase 1: Staff Training
Consistent with the team-based model of DBT training recommended by Linehan and Wilkes (2015), each program
received 24 h of training for both administrative and line
staff concurrently, with DBT Skills Group facilitators receiving an additional 3 h of training. The content of the training
focused on DBT commitment strategies, the core components of the DBT approach, and group facilitation skills.
Both didactic and experiential pedagogical techniques were
utilized for each training module. Consistent with a recursive training model (Swales 2010), experiential activities
included role play, use of diary cards, and staff homework to
practice application of specific DBT skills in their personal
lives.
[Fig. 1 Schedule of training and evaluation activities. Phase 1, Staff Training: 3-day staff training (implementation); DBT Knowledge Assessment Quiz (evaluation). Phase 2, Service Design: meetings with supervisory and administrative staff (implementation); Organizational Readiness to Change assessment (evaluation). Phase 3, Implementation: DBT services provided with titrated consultation (implementation); DBT Skills Group Observation Tool and Session Rating Scales (evaluation). Phase 4, Maintenance: team consultation meetings (implementation); qualitative staff interviews and staff turnover data (evaluation).]
Following conclusion of the initial DBT training modules,
program staff complete the Dialectical Behavior Therapy
Knowledge Assessment Quiz that includes items on the core
components and skills modules of DBT (see “Appendix”).
Scored by the trainer, the quizzes provide on-the-spot feedback to
training participants and allow the trainer to offer focused
attention on areas that require improved comprehension.
When aggregated, the quizzes offer feedback to the trainer
on the efficacy of the training delivery.
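As a minimal sketch of the aggregation step described above (not taken from the article), the following Python fragment groups invented quiz results by DBT module and reports the mean proportion correct, the kind of summary that could alert a trainer to a module that was poorly retained. Staff IDs, module labels, and scores are all hypothetical.

from collections import defaultdict

# Each record: (staff_id, DBT module, proportion of that module's items answered correctly)
quiz_results = [
    ("S01", "mindfulness", 0.90), ("S01", "distress tolerance", 0.60),
    ("S02", "mindfulness", 0.80), ("S02", "distress tolerance", 0.55),
    ("S01", "emotion regulation", 0.70), ("S02", "interpersonal effectiveness", 0.85),
]

by_module = defaultdict(list)
for _, module, score in quiz_results:
    by_module[module].append(score)

for module, scores in sorted(by_module.items()):
    print(f"{module}: mean proportion correct = {sum(scores) / len(scores):.2f} (n = {len(scores)})")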
Demographic information was also collected to determine
what relationship, if any, staff characteristics may have to
retention of DBT-related skills and knowledge. Although the
current sample size is too modest to detect correlations, data
accumulated from future extension of the training to additional programs may guide adjustments to the training delivery responsive to staff educational or cultural backgrounds.
Measures
Dialectical Behavior Therapy Knowledge Assessment
Quiz This is a self-administered, 18-item quiz developed by
the community agency to assess staff post-training retention
of DBT principles and skills. It includes questions on each
of the four modules of DBT: emotional regulation, interpersonal effectiveness, mindfulness, and distress tolerance.
Items include multiple-choice, true–false, word-matching, and fill-in-the-blank responses (see "Appendix").
Demographics Staff demographic data collected include
age, gender, length of service in the agency, years of educational attainment, and race/ethnicity.
Phase 2: Service Design
In the second phase of DBT implementation, the staff
trainer meets with the administrative and supervisory
staff of the participating program to design DBT services.
This series of meetings is intended to clarify roles in DBT
implementation and maintenance, ensure that adequate
resources are devoted (such as designated staff time for
supervision), plan for long-term fidelity monitoring, and
to assess any practical or cultural organization barriers to
successful implementation that need to be addressed. The
meetings are guided, in part, by the Program Elements
of Treatment Questionnaire (PETQ; Schmidt et al. 2008),
which was designed as a self-study tool for programs to
assess their fulfillment of the DBT model and their preparedness for DBT program accreditation.
As part of the process evaluation, the trainer administers the Organizational Readiness to Change (ORC-D4; Texas Christian University Institute of Behavioral
Research 2009) measure to supervisory and direct service
professionals during the monthly staff meeting. Data on
the organizational climate provides feedback to the trainer
on programmatic challenges that need to be navigated or
mitigated for successful EBP implementation, including
clarity of program mission and goals, and staff attributes
such as adaptability (Weiner et al. 2008). Aggregate data
over time may assist in identifying programmatic factors
that predict successful DBT implementation in residential
settings.
Measures
Organizational Readiness for Change (ORC-D4; Texas Christian University Institute of Behavioral Research 2009) The
ORC-D4 is a four part, standardized assessment that
addresses the motivation to change, staff attributes, adequacy of resources, and organizational climate. The instrument is designed to enable the four parts to be administered
separately. To reduce response burden and focus on the
attributes particular to the participating programs, Part D
Organizational Climate only was administered. The Organizational Climate scale is comprised of six subscales: mission (α = 0.62), cohesion (α = 0.83), autonomy (α = 0.52),
communication (α = 0.67), stress (α = 0.82), and change
(α = 0.49) (Lehman et al. 2002).
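For readers unfamiliar with the reliability coefficients reported above, the values are Cronbach's alpha estimates of internal consistency. As a standard reference formula (not reproduced from the article), alpha for a subscale of k items is

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)

where \sigma^{2}_{Y_i} is the variance of item i and \sigma^{2}_{X} is the variance of the total subscale score; values closer to 1 indicate a more internally consistent subscale.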
Phase 3: Implementation
In the implementation phase, the staff commences provision
of DBT services to clients, including operation of the weekly
DBT Skills group, individual DBT skills coaching for clients, DBT-informed clinical supervision and staff meetings,
and fidelity monitoring activities. With support from the
trainer, the program engages in treatment and monitoring
activities, and staff assume their designated roles in the DBT
team. The trainer troubleshoots any operational and clinical
issues that cannot be solved by the program independently
and provides more intensive, time-limited fidelity monitoring. As the program becomes more confident in its DBT
implementation, the trainer titrates consultation based on
the assessed level of quality and fidelity to the DBT model
and verification that the program has infused DBT principles
into existing clinical structures, such as staff meetings and
supervision.
The evaluation activities in this phase focus on fidelity
assessment, through direct observation of staff facilitating
the DBT skills group, and measuring therapeutic alliance
with clients. Direct observation, using the DBT Skills Group
Observation Tool (DBT Institute of Michigan 2009), focused
on delivery of key components of DBT, allowing provision
of detailed, timely feedback to group facilitators as they
implement newly acquired skills. The observations serve as
a touchstone for the trainer in adjusting the intensity and
content of consultation for the program.
Working therapeutic alliance was also assessed, using
the Session Rating Scale (Miller et al. 2000). One of the
improved treatment outcomes offered by DBT is that of
increased client retention rates (Ben-Porath et al. 2004;
Rathus and Miller 2002; Sunseri 2004). The primary mechanism by which DBT is hypothesized to impact retention rates
is through enhancement of therapeutic rapport and client
engagement in their treatment (Koerner and Linehan 2000).
Measures
Dialectical Behavior Therapy Skills Group Observation Tool
(DBT Institute of Michigan 2009) The DBT Skills Group
Observation Tool was created as a tool for the evaluation and
training of DBT Skills Group facilitators. This structured,
qualitative instrument guides the observer in recording key
elements related to provision of skills training (mindfulness
exercise, skills training, practice review of previously taught
skills), demonstration of group facilitation skills (time management, communication strategies, observing limits), and
use of therapeutic techniques (contingency management,
validation strategies, and dialectic balance).
Session Rating Scale (Miller et al. 2000; SRS) The SRS is
a measure of “working therapeutic alliance” designed for
daily clinical use. The four-item visual analog instrument
assesses therapeutic alliance as a product of: relational bond
between therapist and client; agreement on goals and tasks
of therapy; and confident collaboration (faith in the therapist and outcome of therapy). The instrument is scored by
summation of marks made by the client on a 10 cm line,
with each question yielding 10 possible points. The SRS has
an internal consistency of .88 and fair test–retest reliability
(0.70) over six administrations ranging from 4 weeks to 3
months (Duncan et al. 2003).
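As a minimal sketch of the scoring rule just described (with assumed item labels, not the official SRS materials), the Python fragment below sums a client's marks, each measured in centimeters along the 10 cm line, into a total between 0 and 40.

# Hypothetical SRS administration: each value is the position of the client's
# mark on a 10 cm visual analog line, so each item contributes 0-10 points.
marks_cm = {
    "relational bond": 8.6,
    "goals and tasks": 7.9,
    "approach fit": 9.1,
    "overall": 8.4,
}

total_score = sum(marks_cm.values())
print(f"SRS total: {total_score:.1f} out of 40")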
Phase 4: Maintenance
In the final phase of implementation, staff from the previously-trained programs meet monthly as a community of
practice (Wenger et al. 2002) to enhance and reinforce DBT
skills and knowledge, and to prevent “drift” from the DBT
model. Therapist team consultation meetings comprise one
of the core components of the classic DBT model. These
team consultation meetings serve to extend clinician skills,
support motivation, and to monitor burnout (Feigenbaum
2007).
Evaluation activities for this final stage focused on elucidating the ways in which these team consultation meetings supported staff work with clients and the impact of
DBT training on staff retention. Qualitative interviewing
was utilized to explore the functioning and impact of the
meetings, given that little research exists on the effectiveness of DBT consultation teams, communities of practice, or
similar strategies in disseminating and reinforcing evidencebased practice (Ranmuthugala et al. 2011). Feedback guides
improvements to the structure and content of the meetings
to ensure continued participation and engagement, and to
improve effectiveness of the meetings in supporting DBT
implementation.
Program-level data on staff turnover were also collected.
Limited evidence suggests that provision of DBT training
may lower clinician stress associated with provision of
treatment (Perseius et al. 2007) and burnout (Carmel et al.
2014a). Given that inadequate training and “burnout” have
been identified as substantial contributing factors to staff
turnover in residential care (Connor et al. 2003), it is possible that DBT training may have a positive impact on staff
turnover rates in the participating programs.
Measures
Qualitative Interviews Individual qualitative interviews
with team consultation participants consisted of six questions focused on: the role of the staff person in the consultation meeting; the impact on their view of themselves as
practitioners and their clinical practice; and perception of
the strengths and weaknesses of the team consultation.
Staff Turnover Staff turnover data for the year prior to and
following provision of DBT training were obtained from
the agency Human Resources department. Staff turnover
was expressed as the proportion of staff that vacated their
positions to the total number of staff positions within the
program.
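A minimal sketch of the turnover calculation described above, using invented counts, is shown below; the function simply expresses vacated positions as a proportion of total program positions, computed separately for the year before and the year after training.

def turnover_rate(positions_vacated: int, total_positions: int) -> float:
    """Proportion of staff positions vacated relative to total positions in the program."""
    return positions_vacated / total_positions

# Hypothetical counts for illustration only
pre_training = turnover_rate(9, 20)    # year prior to DBT training
post_training = turnover_rate(6, 20)   # year following DBT training
print(f"Turnover before training: {pre_training:.0%}; after training: {post_training:.0%}")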
This project was approved by the West Chester University Institutional Review Board. The authors have no conflicts of interest to report and accept responsibility for the
manuscript.
Discussion
The process evaluation described was designed to assess
a DBT training and implementation model adapted for
residential treatment settings. While effective for treating
a broad range of behavioral health disorders, DBT can be
an especially challenging treatment to implement, requiring substantial clinician training and often necessitating
expenses that community agencies are ill-equipped to meet,
particularly in settings with high staff turnover rates (Swenson et al. 2002; Trupin et al. 2002). The absence of validated, publicly available fidelity measures for DBT practice
means that clinicians must rely on DBT experts for guidance. In the absence of funding for this ongoing consultation, ineffective implementation and drift from the model
are a substantial risk (Swenson et al. 2002). There is a need
for models of community-based evaluation that are less
resource-intensive to provide feedback and quality-control
for DBT training efforts.
Adaptation to residential treatment programs presents its
own challenges. Initially developed for application in outpatient clinical programs, not all components of the classic
DBT service model are appropriate across treatment settings (Wolpow et al. 2000). While some success has been
reported in utilizing DBT skills groups in residential settings (Wolpow et al. 2000), the high staff turnover rates and
burnout, complex client issues, inadequate staff training,
and budgetary constraints often complicate implementing evidence-based practices in residential care (James et al. 2013).
The majority of studies in the published literature have
focused on client outcomes as evidence of successful implementation of DBT adaptations for new client populations
or service settings. While these studies provide important
information on the appropriate applications of DBT, they do
little to identify those aspects of training and implementation that are critical to achieving positive outcomes (Burroughs and Somerville 2013; Carmel et al. 2014b). Data on
training models and implementation are especially crucial
to providers wishing to adapt DBT for their community
programs.
Strengths of the Evaluation Model
One of the main strengths of the evaluation model presented is its matching of data collection points to each
stage of training and implementation. This provides some
key advantages. First, with the exception of the qualitative
data collected on the monthly community of practice meetings and the second administration of the DBT Knowledge
Assessment Quiz, the data collection and evaluation activities are entirely integrated into the implementation model, in
a manner that provides immediate feedback to improve
delivery. Second, the data collected offers the potential
for increased knowledge through long-term aggregation
of the data and critical examination of each component of
the training and implementation model.
For example, administration of the DBT Knowledge
Quiz following the initial staff training offers on-the-spot
data alerting the trainer that an individual staff member
requires additional assistance, or that a concept was not
adequately explicated during the training. As the questions
on the quiz are matched to the four DBT modules, over
time, analysis may identify if there are specific aspects
of the DBT principles and approaches that are less easily
understood and retained.
Correlating knowledge assessment results with the
demographic data collected may assist the agency in tailoring the training to address relevant staff characteristics
that impact interpretation and application of the training
content. Staff receptivity to organizational change, such
as the introduction of an evidence-based treatment model,
is often impacted by their confidence in their role (Austin
and Claassen 2008) and level of stress. Although little
evidence exists to support this requirement, DBT training is also currently recommended only for master's-level
clinicians, few of whom are employed in provision of residential care. Given the lack of training, high stress levels,
and elevated rates of staff turnover prevalent in residential care (Colton and Roberts 2007; Connor et al. 2003),
assessing length of tenure with the agency, perception of
job role burden, and educational background were deemed
important staff variables in fully understanding the implementation effort.
Similarly, administration of a measure of organizational
readiness to change in Phase 2 assists in identifying programmatic barriers related to resources, structure, and culture that may impede implementation of DBT so that they
can be acknowledged and addressed. As more programs are
trained in the course of the initiative, the aggregated data
may indicate what types of programmatic barriers tend to be
intransigent and/or prevent successful DBT implementation.
This knowledge can assist the agency in targeting its DBT
training resources to those programs most likely to benefit,
while working through other venues to improve operations
and program culture in programs that are not yet ready to
implement DBT.
This attention to both individual and program factors relevant to implementation is a crucial aspect of the evaluation
model. Staff frequently identify organizational factors, rather
than factors related to the treatment model, as their primary
challenges to sustainability of DBT (Swales et al. 2012).
Programmatic factors impact the success of implementation
from the first consideration of whether or not the proposed
treatment represents a good fit with the program mission
and goals to provisions for ongoing monitoring of client outcomes and treatment integrity (Bloch et al. 2006; Swales
2010). The ability and willingness to adjust job roles and
responsibilities to facilitate staff training, and delivery and
monitoring of the intervention are key (Swales et al. 2012).
Motivation, resources, staff attributes, and program climate
are all contributing factors to successful implementation of
evidence-based practice (Simpson 2009; Weiner et al. 2008).
Related to the assessment of both individual and organizational factors is the importance of incorporating multiple
stakeholder perspectives in the evaluation process (Carmel
et al. 2014a). The evaluation model described includes collection of data from direct service professionals, program
administrators, and clients. Program administrators and
line staff may often have different day-to-day priorities and
concerns, as well as different vantage points from which to
view the implementation effort. Both are essential to creating "buy-in", shaping the implementation process (Austin
and Claas