The unprecedented growth of DAP during my tenure was complicated by the emergence of many competing claims about the effectiveness of new interventions and new programs. As both a scientist and the leader of a publicly funded program, I felt compelled to ensure that we were using the latest research and the best data to guide our decision making, both at the level of the individual student and for the program as a whole.
Evidence-Based Practice Reviews
At various times, we were presented with recommendations for specific programmatic changes from parents, professionals, and state legislators: for example, to adopt the Son-Rise program, to integrate verbal behavior programming, and to endorse the use of special diets. In each case, I led reviews conducted with multiple partners, and drafted formal written reports and responses as needed.
This led me to help develop and adopt standards of evidence-based practice and data-based decision-making (Doehring and Winterling, 2011, 2013). For example, I developed criteria for identifying whether any studies of dietary interventions were well-controlled, and presented the findings to a multi-agency state panel. This led to a decision to seek additional state resources for raising awareness about established dietary disorders, but not to endorse universal screening for dietary sensitivities or to promote specialized diets in the absence of a clearly established diagnosis. In the case of verbal behavior approaches, I successfully challenged claims that a single, evidence-based model existed, but actively pursued training and coaching in specific, promising techniques.
Data-based decision-making
I instituted data-based decision-making statewide. At the program level, this included negotiating with state officials to obtain and utilize state data on ASD enrollment for the analyses needed to project future growth, and conducting analyses seeking to establish DAP's success in serving the population of the state (Doehring and Winterling, 2011, 2013; Doehring, 2008). At the individual level, I instituted and enforced policies that based recommendations regarding the need for behavior support, safety procedures, and extended educational services on specific progress (or lack thereof) toward individual educational goals.
State policy and regulations
I introduced evidence-based practice language into a wide range of state legislation, regulations, and memoranda of agreement addressing assessment, intervention, and oversight.
I led parent and staff training to raise awareness about how to understand and use research (Doehring, 2007; Doehring and Reichow, 2007; Doehring, 2005).
To replicate a practice or program, you must understand its infrastructure, such as funding, staffing, target populations, curriculum, and settings
The many requests I received to consider adopting other practices and programs prompted me to look closely at what might be required to replicate a practice or program. Some of these requirements draw on lessons described elsewhere, like the need to understand which population(s) the program actually serves, and how to ensure program fidelity and the necessary level of treatment intensity.
In addition to these, it is important to understand the program's infrastructure: how big it is, how it is staffed, how it is funded, how it relates to other services or agencies, and how training and oversight are provided. In some cases, it quickly becomes clear that this infrastructure may play a significant role in the program's success: for example, a program that charges high fees or maintains a high staff:student ratio may be more successful simply because of the intensity of support it can offer. If those fees cannot be charged, or those ratios maintained, it may become difficult to replicate the program's outcomes.