The 2017 Supreme Court decision Endrew F. v. Douglas County School District highlighted the importance of monitoring students’ progress toward appropriately challenging individualized education program (IEP) annual goals and making changes to students’ educational programs when needed. In this guide, we explain how educators can establish IEP goals that are measurable, ambitious, and appropriate in light of the student’s circumstances.
Successful implementation of a multi-tiered system of supports (MTSS) and, specifically, intensive intervention through the data-based individualization (DBI) process, demands the collection and analysis of data. As teams consider data collection, challenges may occur with assessment administration, scoring, and data entry (Taylor, 2009). This resource reviews three data collection and entry challenges and strategies to ensure data about risk status and responsiveness accurately represent student performance and minimize measurement errors.
Diagnostic tools provide data to assist educators in designing individualized instruction and intensifying intervention for students who do not respond to validated intervention programs. Diagnostic tools can be either informal, which are easy-to-use tools that can be administered with little training, or standardized, which must be delivered in a standard way by trained staff. Teams may find it helpful to initially consider using more informal and easily accessible diagnostic tools and data to avoid loss of instructional time. Standardized diagnostic tools, which require more time to administer and interpret, may be required for students who continually demonstrate a lack of response or who require special education.
These professional learning training materials are intended to assist district or school teams involved in initial planning or implementation of data-based individualization (DBI) as a framework for providing intensive intervention in academics and behavior. The modules listed below provide an overview of the DBI process and more in-depth exploration of the various components of DBI.
This document addresses five guiding questions for educators to consider when reviewing and interpreting assessment data for English Learners and includes links to selected resources.
In this video, John M. Hintze, Professor in the Department of Student Development at the University of Massachusetts Amherst, explains why it is important to consider whether an assessment is biased against a specific subgroup.
This IRIS Star Legacy Module, the second in a series on intensive intervention, offers information on making data-based instructional decisions. Specifically, the resource discusses collecting and analyzing progress monitoring and diagnostic assessment data. Developed in collaboration with the IRIS Center and the CEEDAR Center, this resource is designed for individuals who will be implementing intensive interventions (e.g., special education teachers, reading specialists, interventionists).
When a student fails to respond to a validated intervention, teams need to identify why the student is not responding to determine how to adapt the intervention. Diagnostic data can assist teams in this process. They may be used to understand a student’s specific skill deficits and strengths or to identify the environmental events that predict and maintain the student’s problem behavior.
In this video, Mike Jacobsen, Assessment and Curriculum Director of the White River School District in Washington State, discusses how the district planned for and implemented intensive intervention within its RTI model.
Norms for oral reading fluency (ORF) can be used to help educators decide which students might need intervention in reading and to help monitor students’ progress once instruction has begun. This paper describes the origins of the widely used curriculum-based measure of ORF and how the creation and use of ORF norms have evolved over time. Using data from three widely used, commercially available ORF assessments (DIBELS, DIBELS Next, and easyCBM), a new set of compiled ORF norms for grades 1–6 is presented here, along with an analysis of how they differ from the norms created in 2006.