Audio Tour of Behavioral Progress Monitoring Tools Charts

Transcript

The National Center on Intensive Intervention has established four Technical Review Committees, or TRCs, to review the scientific rigor of commercially available tools and interventions. These tools and interventions can be used as part of a data-based individualization program for educating students with disabilities who require intensive intervention due to persistent learning and behavior problems. Results from the TRC reviews are published on NCII’s website through a series of what we call tools charts. The purpose of this video is to provide a demonstration of the tools chart on behavioral progress monitoring.

Each TRC is made up of between ten and twenty-nine individuals with relevant expertise. The TRC on behavioral progress monitoring includes eleven experts selected for their strong measurement expertise and their experience related to behavioral progress monitoring. Since 2012, this TRC has issued calls for behavioral progress monitoring tools. In response to these calls, interested vendors have submitted their behavioral progress monitoring tools, as well as evidence documenting the efficacy of those tools, for consideration and review.

On the progress monitoring tools chart, in the first column on the left, you will see the names of the behavioral progress monitoring tools that completed the TRC review process. In the second column from the left, you will see the scales that correspond to each of the progress monitoring tools. For each tool, the chart contains information divided into four tabs, one for each aspect of the tool: ratings on the psychometric standards, ratings on the progress monitoring standards, ratings on the data-based individualization standards, and usability information.

The first tab shows the psychometric standards and how each tool was rated by the TRC on each of these standards. There are three major psychometric standards that were established by the TRC. These are: reliability, validity, and disaggregated reliability and validity data. In the cells underneath these headers, you will see how each tool was rated against these standards. As described in the legend underneath the chart, a full bubble means that the evidence submitted for that standard was rated by the TRC as “Convincing.” A half-filled bubble means that the evidence submitted for that standard was rated by the TRC as “Partially convincing.” An empty bubble means that the evidence submitted for that standard was rated by the TRC as “Unconvincing.” A dash indicates that the vendor submitted no evidence for that particular standard.

To learn more about what a particular standard means and how it was rated by the TRC, you can click on the circled “i” in the corresponding column heading. For example, if you want to know about how the TRC evaluated reliability, click on the “i” in that heading, and you will see an explanation of the term. You will also see a rubric that explains how the TRC determined the tools’ ratings on each dimension.

The next tab shows how the tools were rated on the progress monitoring standards. There are two progress monitoring standards: sensitive to student change, and levels of performance specified. Together, these standards are related to how well the assessment functions as a progress monitoring tool that can accurately detect small changes in student behavior over time. Technical information about how to interpret each of the individual standards can be found by clicking on the lower-case “i” in any of the column headers.

Third, the data-based individualization standards tab within the tools chart includes two standards: data to support intervention change, and data to support intervention choice. Together, these standards are related to the extent to which use of the tool is associated with positive student or teacher outcomes. Again, click on the lower-case “i” to learn more about what each individual standard means and how it was rated by the TRC.

Last, the usability tab provides basic information on the implementation of the behavioral progress monitoring tools. There are three usability components: assessment format, rater/scorer, and usability study conducted. Together, these columns detail how the tool is used.

There are a few additional features of the chart that are important to note, as they provide users with additional information to help them in making decisions about which tools might best meet their needs. First, by clicking on any of the actual tool names in the first column, users will come to an implementation table. So, for example, if you click on BIMAS, you will come to a table that provides practical, descriptive information about the program, including what it’s used for, how it works, how much it costs, what resources and training are required to administer it, and where to go for technical support.

In addition, users have the option of clicking on any of the ratings themselves, which will then provide more details about the actual data that were submitted for review. For example, if you click on any of the bubbles in the column for reliability, you will see the actual reliability data submitted by the vendor for this tool.

If you are interested in seeing all of this information in one place, simply click on the name of a tool to view the implementation table and scroll down; all of the supplementary information will appear together below the table.

It is important to note that the National Center on Intensive Intervention does not endorse or recommend the products included in the chart. Rather, the Center is providing this information to assist educators and practitioners in becoming informed consumers who can select intervention and progress monitoring programs that best meet their individual needs.

For more information about the TRC process or the tools charts, please contact the National Center on Intensive Intervention by email at NCII@air.org.

Developed By: 
National Center on Intensive Intervention