ASD Roadmap

Train and Hope

A popular but misguided theory among researchers and leaders about how new practices transform programs

November 30, 2016

Behavior analysts are familiar with the “Train and Hope” assumption underlying some teaching of students with ASD and related conditions. Specifically, professionals assume that skills taught in a structured context will generalize naturally to other contexts; for example, that a child taught to request help using discrete trials will ask for help on the playground. Experienced clinicians and educators know that hope alone is not enough, and that you must specifically plan to help students generalize these skills. This has spurred research into a wide range of strategies: how the task is presented, how reinforcement is used, and so on. Research has since demonstrated that these modifications can be effective. Most clinicians and educators now know better than to Train and Hope.

A similar Train and Hope hypothesis underlies burgeoning efforts to transform programs by encouraging professionals to adopt new clinical or teaching practices. Simply stated, researchers, program leaders, funders, and advocates promoting the adoption of new practices and programs invest almost exclusively in delivering a traditional program of training. Whether we are talking of a specific practice related to ASD diagnosis or treatment, or a more comprehensive program addressing multiple skills, developers mistakenly assume that clinicians and educators just need more training!

To be sure, a rigorous program of training that develops expertise will be pivotal to any efforts to promote widespread adoption of a new practice. Most people now recognize that traditional training based largely on lectures and handouts is insufficient for all but the most basic practices. In my 2013 book, I offer a summary of considerations in developing comprehensive training programs.

While a comprehensive program of training is often necessary, it is rarely sufficient.  The over-reliance on training to promote the adoption of new practices and programs is a significant impediment to translating research into outcomes in community-based programs.  This is especially true when seeking to promote adoption on any meaningful scale.

Why do researchers and program leaders Train and Hope? One reason is that they may not understand how a new practice or program might fit into existing practices or programs. As a result, they try to incorporate new practices and programs as is, sometimes forcing a round peg into a square hole.

My mother taught me that you can choke on food if you swallow it whole. Science has also taught us that chewing your food helps to prepare it for digestion. Leaders sometimes choke when they try to swallow new practices or programs whole. They really should listen to my mother, and first chew on some important questions. This helps to determine if and how a new practice can be incorporated; that is, how a proposed practice is best digested by the program. Only then would you design the training needed. Who should really be trained? What level of training is needed given each professional's role? Is the training an opportunity to engage a range of professionals across departments or agencies in support of a more coordinated approach? These questions suggest a more carefully tailored or tiered program of training, and may be asked as part of a broader review of the infrastructure of the program under consideration.

Consider the example of ASD identification. A growing number of programs seek to increase the accuracy and timeliness of identification by training community-based practitioners in the Autism Diagnostic Observation Schedule, or ADOS, the gold standard upon which researchers rely. I embarked on such a program in 2001 at the Delaware Autism Program. Since then, I have become convinced that an ADOS is not needed to diagnose less complex cases of ASD, and that an insistence on its use potentially wastes time and resources that might otherwise be used to broaden screening efforts and increase options for more comprehensive assessments in more complex cases.

So what do I mean? I joined the Children's Hospital of Philadelphia's (or CHOP's) new Center for Autism Research in 2008 to be in a better position to advocate for a more coordinated approach. The motivation there was that the waiting list for initial ASD diagnoses by CHOP's developmental pediatricians was 12 to 18 months. Waiting lists like these plague specialized autism research and treatment centers across the country, and have barely budged despite increasing recognition of the importance of early intervention. So instead of simply training more community-based practitioners to administer the ADOS, I promoted a multi-pronged approach that tailored training to the role of each practitioner and agency. I advocated for a greater role for nurse practitioners at CHOP in diagnosing less complex cases, and incorporated the necessary training into our LEND training program. I initiated a new program for LEND trainees in ASD assessment and diagnosis. I also began to train community-based practitioners in rural regions who typically referred patients to CHOP; this training focused on the use of a simple protocol - that is, without an ADOS - to diagnose ASD in less complex cases. And I began to train other community-based teams to utilize the ADOS in more complex cases, as part of a more comprehensive protocol.

To streamline the referral process, I also considered the training needs of many different kinds of professionals (like speech pathologists, occupational therapists, physical therapists, and so on) who are more likely to encounter and refer children at risk but who do not formally diagnose ASD. So at CHOP, I initiated the first training in use of the Modified Checklist for Autism in Toddlers, or M-CHAT, for professionals within these other hospital departments. I did the same for their trainees participating in the LEND program, and brought them to practice ASD screening in homeless shelters next door to CHOP. Taken together, this strategy was intended to relieve the pressure on CHOP's diagnosticians, reserve their expertise for the most complex cases, and build a better pipeline of referrals to make the training in ASD diagnosis at CHOP even more relevant.

Are there other examples of multi-layered programs of training that work only because they dovetail with changes in services? Probably, but these are rarely documented in the level of detail needed, and so I have only my own example to offer. This reflects a second barrier I began to describe in my first opinion piece: the relative lack of interest in questions of implementation among the community of ASD researchers.

Other barriers can be found in the original research promoting the practice (an important issue which extends far beyond the scope of this opinion piece). For example, training in a new practice is of little use without fidelity or integrity; that is, a mechanism to ensure that the practice is being delivered in the manner intended by its developers. Too many outcome studies fail to establish treatment fidelity, or fail to make fidelity checklists or treatment manuals readily available for the practices under consideration. I helped to systematically review more than 100 outcome research studies addressing severe behavior problems published between 1995 and 2012. Only about 1 in 7 studies included measures of fidelity, and only 2 studies described employing a manual. Without a clear path to assuring treatment integrity, practitioners may be reluctant to try something new, just as their program leaders may be reluctant to invest the resources needed. Practitioners and leaders may not plant the seed through training if they do not know what the plant will look like.

There is also a lot of excitement surrounding the emerging framework offered by "implementation science". Implementation science looks far beyond traditional didactic training for strategies to promote system change. For example, the development of a community of practice can effectively complement training by building consensus surrounding the need for training, sharing ideas and resources, and troubleshooting problems as they arise. But in my experience, communities of practice take time to develop, and can struggle to increase buy-in among the more skeptical. And while a community of practice may mobilize champions for change, it will never provide boots on the ground: you must still empower someone within an agency to deliver the training needed and to undertake the other specific, concrete steps required for the new practice to be fully implemented.

Our current model of science may only begin to fully embrace implementation science once we can define a more systematic and empirically based approach to documenting successful implementation. Until then, I am not sure what kind of "science" supports implementation. This may be one reason why the journals of interest to ASD researchers rarely publish examples of successful implementation; and only when they do will agencies consider funding projects to demonstrate it. So, though interest in implementation science is emerging, the field is fallow and cannot yet support vigorous growth. Until we can help this field to grow, we have no forum for sharing innovations in the use of training together with other program changes to promote the adoption of new practices. We may seed innovative implementation, but we cannot re-seed it elsewhere.

What are examples of concrete steps that advocates and leaders can take more quickly to help training stick, to help the seeds to grow? Consider what kind of oversight might be needed to ensure that practices continue to be used as designed, after the initial enthusiasm engendered by training subsides. Is the training accompanied by specific tools that supervisors or program leaders can use to monitor effective implementation? Shaping treatment fidelity checklists into more comprehensive practice guidelines is one way to draw a straight line from research to practice. Tailoring a parallel training for supervisors and program leaders in these tools also provides an opportunity to make them into champions by clearly indicating the potential benefits. And sometimes including advocates can build the excitement and commitment needed to sustain longer-term changes.

Consider looking beyond the training to actual implementation. Are there enough staff - and the right kind of staff - to deliver these new practices? And how do they incorporate the new practice into their day-to-day work? This was a critical component of the consultation I provided to a preschool program concerned that a proposal for a traditional program of training and coaching in ABA-based teaching techniques would not change the practice of teachers who had relied on more developmental approaches for many years. In the end, training in these new techniques led to their successful adoption program-wide within six months because I looked carefully at - and then systematically addressed - implementation barriers: I reviewed existing schedules and lesson plans on a case-by-case basis with the teachers, mapped out how these new techniques could not only fit in but enhance their work, and then identified and implemented the right level of support for teachers and assistants. All of this was accomplished without any appreciable increase in overall cost, except for my consultation time.

These kinds of considerations can also raise related policy issues. How are increases in services funded? Which students, patients, or clients are eligible for this new practice? Do we need to re-define roles and responsibilities for front-line staff or supervisors? Will these changes lead to an overall increase in the number of people served and the size of the program? If so, how will this be phased in?

How do we make sure that these other considerations are not overlooked when developing a new program of training? We might begin with the training and experience of researchers and leaders pursuing this work. The lead researcher may have no formal training in delivering services, and the program leader may struggle to understand the relevant research behind the proposed program. Professionals on the research team participating as co-investigators or consultants may have never held a position delivering services in the community, outside of a university setting.  And while some have been responsible for delivering services in the community, few are likely to have experience leading community-based programs.

Fixing this problem requires another significant shift in our current approaches to the training of academics and program leaders. Until this is fixed, program leaders and researchers will probably continue to assume we can just Train and Hope. We may see more training just wither on the vine, or even fail to take root. In the meantime, funders of research and other programs seeking to disseminate new practices might consider asking more questions when a new proposal lands on their desk - a proposal that assumes we just need to... train and hope.

My Related Publications

Doehring, P. (2013). Autism Services Across America: Roadmaps for Improving State and National Education, Research, and Training Programs. Baltimore, MD: Paul H. Brookes Publishing Co.

Ott, M. T., Levy, S. E., & Doehring, P. (2010). Early Autism Screening and Identification Clinic (EASI): A Nurse Practitioner & Physician Clinic Model. National Autism Conference, State College, PA.

Doehring, P., Reichow, B., Palka, T., Philips, C., & Hagopian, L. (2014). Behavioral Approaches to Managing Intense Aggression, Self-Injury, and Destruction in Children with Autism Spectrum and Related Developmental Disorders: A Descriptive Analysis. Child and Adolescent Psychiatric Clinics of North America, 23(1), 25-40.

Guideposts

Straight Lines Build professional development around core principles that translate research into practice guidelines with clear outcomes

Straight Lines To replicate a practice or program, consider the infrastructure of funding, staffing, populations, curriculum, and settings

Other Lessons Create tiered professional development programs that tailor content to the role and expertise of each staff member and agency

Other Lessons Optimize the amount and type of staffing support needed by creating a schedule that maps activities by intensity and expertise