Stories about ASD

The Mystery of SB93

ASD Roadmap

Why define and document impact?

Require public reports of outcomes that document real improvements in the lives of people with ASD

September 19, 2019

 

Tracking down a project summary describing the impact of a new program or a new practice can be very rewarding... or very frustrating. Find it, and you might decide immediately to try a new program or method. Fail to find it, and you continue the search, hoping that the claims of effectiveness are more than just hot air. A generation of tracking down claims of effective programs has taught me that where you should look and what you should expect to find depend on the nature of the proposal and the institution.

But find a project summary, and you are just getting started.  Your next task is to critically consider how progress was defined: does the progress reported merely describe the activities undertaken, or does it actually capture meaningful improvements in the life of someone with ASD?  Let's consider both tasks - documenting impact and defining impact - in the case of SB93.

Documenting impact

Aside from the original text of the legislation, no summary of the main initiatives funded through SB93, or the progress attained to date, is readily available to the general public (subsequent inquiries turned up reports provided to state legislators - more on that below).  Is this significant? It depends on what you might realistically have expected to find, given the nature of the project.

Published summaries of research projects? Easier to find

For projects that result in new knowledge gained through scientific research, the expectations are generally clear: every attempt will be made to publish any significant findings arising from the project.  This does not describe SB93, but it may well apply to other projects of interest to readers, or to other work conducted in university settings, so let's elaborate.

Significant research findings are typically reported through publications in academic journals, and often presented at national professional conferences. Records of the publications and presentations - and often the publications themselves - are readily accessed. Publications in peer-reviewed academic journals are especially valuable, because the projects they describe are more likely to meet certain methodological standards (although the rise of pay-to-publish journals has upended the process). These publications are especially important for projects funded through federal grants. They are also very important for project leaders who are university-based, and whose advancement often depends on their record of publications. Publication is a slow process, however; analyses are usually undertaken only once the entire study has been completed, and final revisions through the peer-review process may add another 1 to 2 years.  And many peer-reviewed journals come with pricey subscriptions that only a university can afford, making these kinds of articles inaccessible to anyone who is not university-based.

But the likelihood that you will find a summary decreases dramatically when the findings are insignificant or ambiguous. The failure to publish failures - the relegation of failed studies to the file drawer - is a well-established problem.  In fact, the combination of such failures and the publication delays referenced above probably explains why almost one-half of the applied research grants included in my recent review of research on ASD identification had yet to yield a single relevant publication. So even when expectations are clear and the incentives to publish are powerful, publications do not always result.

Published summaries of other projects like SB93? Not always so easy to find

Finding summaries of projects focused on delivering services or training instead of research, and summaries of projects funded through state legislation or donations instead of federal grants, is more difficult.  As a state-funded project focused on improving services, training, and policy, SB93 falls clearly into this category.  Why are summaries of these kinds of projects harder to find?

First, relatively few articles in academic journals are dedicated to summarizing programs of ASD services or training, especially in community-based settings. This gap may simply reflect that research on practices and their implementation in community settings is relatively rare; in my recent review, only 1 out of every 7 dollars ostensibly dedicated to improving the screening and diagnosis of ASD through research actually went to testing new practices and their implementation.

Second, there may also be few incentives to publish these kinds of summaries.  For university-based faculty being considered for promotion, descriptions of programs of ASD services or training may carry less weight than a more traditional research publication.  And for other leaders working outside of academic settings, whose pay or promotion is not tied to publications, the effort required to craft a detailed summary is a disincentive to publish. Thus, the lack of academic publications or presentations at professional conferences describing the activities of the ICA or DNEA is not entirely unexpected.

The third reason - perhaps the most important - is that state and private funders do not always set or enforce clear expectations that a summary of the project's findings be made available to the public in any form. While some funders may require an annual report, this report might not be made available to the public, and it may focus on expenditures without addressing some or all of the 5 criteria listed earlier. If this seems surprising, consider that these funders are much less likely to have someone with the expertise needed to review a more detailed summary.

Other kinds of information?  Interesting, but rarely helpful in reliably establishing the progress achieved

So if our Plan C does not yield any relevant project summaries, what next?... Back to Google!! And these days, it seems like you cannot have a project without generating some posts likely to show up in a Google search: a blog, a calendar, social media, and so on, maybe bookended by the occasional press release. For a reviewer deep into Plan C and desperate to find any possibly relevant information, these posts can sometimes seem to fit the bill.  But do they? Consider these examples.

  • The press release: A well-crafted press release marking a project's launch can be an effective tool for building interest and awareness among key constituents, and can bode well for the project's success. But sometimes press releases do little more than list broad aspirations that simply sate an agency's appetite for publicity.
  • The annual report: The annual reports produced by publicly traded companies are chock-full of details about the company's progress and performance. These reports are so carefully scrutinized by shareholders eager to determine whether to buy, sell, or hold that the company can be sued if a report misleads investors about progress!  The annual reports typically produced by agencies that provide ASD services, training, or advocacy are, however, quite different. While they can celebrate key milestones, describe core activities, and acknowledge important funders, they rarely provide the level of detail a funder needs to begin to determine the return on investment (ROI). Some are little more than a glossy collection of carefully curated statistics intended to impress, scattered among testimonials intended to inspire.
  • The event calendar: In the absence of clear reports of progress, event calendars can still provide a window into the ongoing work, especially when they are linked to supporting documents.  In my review of statewide committees and programs, I often had to rely on these calendars to gain insights into the activities undertaken (or at least to be assured that anything was going on!).  This is especially true for projects like SB93 that are centered on key events, like the trainings organized by the DNEA. In such cases, the absence of this kind of information is not insignificant.
  • Meeting minutes and handouts: For initiatives centered on periodic meetings (like the ICA), minutes or handouts resulting from presentations at these meetings can offer some insights into the activities undertaken. Upload these documents to a website and they are readily available to any interested party.  But unless these include systematic summaries of progress to date with respect to specific and meaningful goals, they rarely provide the level of information needed to establish an ROI.

In the case of SB93, relatively little information falling into these categories turned up in web searches.  Not surprisingly, the little information gleaned offers few insights into the project or the progress achieved.

So what is reasonable to expect from projects like those funded through SB93?

In the end, the funding and the nature of the project really dictate whether a publicly available summary should be expected. Consider a project funded through a private donation to develop a new program in a single school.  The only product is the program itself; there is no expectation that the program would be expanded to, or serve as a model for, other settings. And the only person to whom a project leader is directly accountable is the funder. Why provide a summary if the funder does not require one?

Projects like those funded through SB93 are different.  As a state-funded initiative, project leaders are accountable to state legislators, who are themselves accountable to voters.  This kind of project is intended to increase capacity and improve coordination of public services across the state - in part by engaging the public. And raising awareness and mobilizing professionals and the public alike are central to the project's success.

So in the case of initiatives funded through SB93, I think it just makes sense to expect that a summary documenting the program's impact be posted to a project website. Postings like this avoid the delays introduced by peer review and the barriers created by journals that require a pricey subscription.  Thus, it should not be surprising that these kinds of expectations were incorporated into SB93; the legislation stipulated that a website describing the activities of the ICA and DNEA would be developed and maintained.

Defining impact

Read the goals listed for almost any project involving ASD services, training, and research, and chances are you will find language implying that these efforts will directly and/or eventually benefit people with ASD in some important way. Don't kid yourself: designing projects that achieve meaningful and measurable improvements in the lives of people with ASD is a challenging task, one that will require a separate essay to describe.  Nonetheless, there are some lessons that can be quickly drawn about the kind of impact typically reported for initiatives like those funded through SB93.

The most common problem is that most initiatives do not directly assess whether someone's life is measurably improved. Instead, they measure other things that are believed to be associated with, or to eventually lead to, these kinds of improvements. In almost every case, however, this is just a belief, because research has not demonstrated that these proxies reliably correlate with real improvement.  But this practice is so pervasive that we do not think twice about it.

Consider this example: new outcome research identifies practices that help people with ASD sleep better, and a training manual based on these practices has been shown to deliver the same results when used by typical practitioners.  We can imagine what might happen next.

  1. A training is delivered to some practitioners.
  2. Practitioners who attend this training show increases in their knowledge.
  3. Practitioners apply these new techniques with their patients and clients.
  4. Patients and clients have access to coaches who can answer their questions as the professionals helping them implement these new techniques.

So what's missing?  Any confirmation that people with ASD are actually sleeping better!  We might gather evidence demonstrating that we have undertaken steps 1, 2, 3, and/or 4, but there is no guarantee - or even a reasonable level of assurance - that sleep has actually improved.

Nonetheless, I suspect that the vast majority of research and other projects depend on this implicit and similarly flawed logic to define their outcomes and justify their choice of methods.  The most common assumption is that more knowledge leads to better practice and on to better outcomes.  How many researchers hail as a breakthrough the discovery of a characteristic that distinguishes ASD from other disorders, or that correlates with some measure of outcome, assuming that this knowledge alone will translate immediately into improved practice?  How many trainers congratulate themselves for a job well done when trainees report being delighted with the training they received and demonstrate increased knowledge about the practice in question? While evidence that steps 1 to 4 have been successfully completed is a step in the right direction, it is just one small step on a much longer and more uncertain journey.  For evidence, we need look no further than the pervasive gaps in early and accurate identification of ASD that persist despite 20 years of intense research and advocacy. Clearly, knowledge alone has been insufficient to generate widespread improvements in rapid and accurate diagnosis!!

As is the case with most other projects I have seen, these are the kinds of data used to describe the outcomes of SB93.  These data were summarized in reports provided to state legislators describing the progress achieved to date (these reports have yet, however, to be made public). The reports list how many trainings were provided, how many people attended each training, how satisfied attendees were with the training, and how much their knowledge increased as a result. These data certainly speak to the efforts undertaken to date, and begin to offer insights into just how much time is needed to deliver these kinds of trainings.  But they leave more critical questions unanswered. Do trainees implement these practices? Do people with ASD benefit from them?

And yet it is sometimes not that difficult to define meaningful outcomes that can be measured through initiatives like these. Consider two of the four priorities established by the ICA related to improving the identification of ASD in school and hospital settings, a goal central to statewide initiatives undertaken elsewhere in the US.  There are many reasons why this is a worthwhile goal: we already have tools for reliably identifying ASD, research has clearly documented persistent gaps in early and accurate diagnosis (especially for traditionally underserved groups), resources are already allocated within schools and hospitals to undertake these assessments... the list goes on.

So given these priorities, how can you define desired outcomes in addition to (or maybe even instead of!) just identifying best diagnostic practices or providing more training? Well, you can survey parents of recently diagnosed children to learn about the delays and barriers they experienced, and you can follow up with parents who seek help from navigators to access these services for their child.  You can begin to establish the delays and barriers to accurate diagnosis that people in Delaware face right now. While a traditional researcher might ask valid questions about whether these kinds of data offer the most accurate estimate of the extent of the problem, they nonetheless offer a tremendously useful baseline against which to begin to measure progress. This baseline is especially informative for the leaders responsible for assuring rapid and accurate diagnosis. It might lead you to define a goal to decrease these delays and barriers. These kinds of data might also help to identify specific gaps and barriers that future training and policy initiatives might begin to target.  And they provide the kind of direct and immediate feedback that assures advocates and funders that the initiative is having the desired impact.

As an aside, it is worthwhile to reflect briefly on the complex but pivotal roles that family navigators - perhaps the most successful component of SB93 to date - can play in these kinds of initiatives. Of course, navigators play a critical role by helping families get the right information and resources (in this case, to access a rapid diagnosis). And navigators also provide support and guidance to overcome the barriers that might arise across a whole host of issues. The immediate benefit provided by family navigators compensates somewhat for the inevitably delayed benefits of capacity building, achieved through training and program development, that initiatives like SB93 promote.  But some parents also function like canaries in the coal mine, their concerns providing an early, unfiltered warning of potentially dangerous breakdowns in the system of care.  Program leaders need look no further than feedback from navigators to identify worthwhile goals; indeed, data on the calls fielded by family navigators in Delaware, shared at recent ICA meetings, confirm that diagnosis remains an ongoing concern across the state.  And data gathered by navigators can help identify specific gaps and barriers to target.

There is no doubt that gathering and evaluating these kinds of data requires additional planning and often some re-allocation of effort but, believe me, it is well worth it! My commitment to this kind of data-based decision-making is driven by 15 years of trial and error (and error and error) leading specialized school- and hospital-based programs, and experimenting with many different strategies for program development.  It would be wonderful if you could change practice and deliver outcomes just by providing knowledge... but that almost never happens!! There is no shortcut in the long term: developing new programs and implementing new practices requires a lot of time, effort, and resources. There is tremendous pressure on leaders of service programs to find a shortcut, because they juggle so many responsibilities - accountable to staff for their time, to the bean-counters for budgets, and most importantly to patients, students, and their parents for results.  But in the end, there is never any real shortcut to demonstrating real and meaningful outcomes!

 


