In a healthy RTI/MTSS practice, a data-driven approach is not only important for guiding decisions about individual student needs; it is also critical for evaluating the quality and impact of the practice at the school and district levels. We recommend that school and/or district leadership meet three times a year, following the administration of universal screening assessments, to reflect on and evaluate their practice. The goal of this meeting is to understand the health of the school-level RTI/MTSS practice by looking at the percentage of students being adequately served by the core, the equity of instruction across demographic groups, and improvement in student outcome measures since the last meeting. These metrics are used to evaluate the quality of practice across tier 1, 2, and 3 levels of support and to guide school-level improvement plans.
WHO SHOULD ATTEND?
Data specialists (e.g., AP or counselor)
Student service/instructional service representative
Special Ed representative/teacher
Grade-level rep (large schools) OR Gen Ed teacher rep (small schools)
WHAT ARE THE GUIDING QUESTIONS?
Question 1: Is our core curriculum adequately supporting students?
In a healthy MTSS practice, the core curriculum should be strong enough to meet the needs of approximately 80% of students, i.e., 80% of students are on track to meet grade-level expectations and require only tier 1 support. Unfortunately, many schools across the country have become familiar with the term “flipped pyramid,” in which the core is not sufficient for the majority of students, and a far greater percentage of students have screening data that suggest the need for tier 2 or 3 support. It’s simply not possible to run an efficient MTSS practice while providing additional support to the majority of students, and far too often, schools address this by adjusting their criteria for tier 2 or 3 support so that they only provide intervention to the highest-need students. To run an efficient and effective MTSS practice, schools instead need to fix the quality of their core instruction.
After each screening period, school and district leaders should review these data at the grade and campus level to identify areas in need of a stronger core curriculum. For areas that do require improvement, first verify that what is being used is evidence-based. More information on evaluating the evidence base of your core curriculum can be found here.
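As a minimal sketch of this grade-level review, the snippet below computes the percentage of students on track from hypothetical screening records and flags grades falling below the 80% benchmark discussed above (the record format and grade labels are illustrative assumptions, not from any particular assessment platform):

```python
from collections import defaultdict

# Hypothetical screening records: (grade, on_track) pairs.
records = [
    ("Grade 3", True), ("Grade 3", True), ("Grade 3", False),
    ("Grade 4", True), ("Grade 4", False), ("Grade 4", False),
]

def percent_on_track(records):
    """Return the percentage of students on track, per grade."""
    totals, on_track = defaultdict(int), defaultdict(int)
    for grade, ok in records:
        totals[grade] += 1
        if ok:
            on_track[grade] += 1
    return {g: 100 * on_track[g] / totals[g] for g in totals}

# Flag grades where the core appears insufficient (below ~80% on track).
for grade, pct in sorted(percent_on_track(records).items()):
    if pct < 80:
        print(f"{grade}: {pct:.0f}% on track -- review core curriculum")
```

The same calculation can be repeated at the campus level by swapping the grade label for a campus identifier.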
In addition to being evidence-based, it’s important to evaluate the quality of implementation to ensure teachers are delivering the content as intended and with fidelity. The following instructional strategies should be used to ensure that the curriculum and lessons are impactful (Chard et al., 2008):
Systematic Explicit Instruction: Skills are taught from less to more complex using direct, clear and concise instructional language.
Differentiated Instruction: Students have different levels of background knowledge and school readiness; differentiated instruction engages each student in active learning according to his/her needs. The content of instruction, delivery of instruction, and targeted level of instruction can be differentiated.
Flexible Grouping: A combination of whole group, small group, and individual instruction allows teachers to create fluid groups that meet the needs of all students.
Active Student Engagement: Ensuring all students are actively involved during instruction and are not passive recipients; this can be accomplished with high rates of opportunities to respond, ample time to practice skills, and prompt corrective feedback.
Classroom Behavior Strategies: Proactively and explicitly teaching the expected behaviors and routines, frequent use of reinforcement and praise (4:1 positive to negative feedback loop), quick and efficient transition times, and consistent instructional response to misbehavior.
Question 2: Is the core supporting students equitably?
After examining the overall quality of the core instruction, it’s important to take a deeper dive to determine whether it’s really providing the same quality of instruction for all students. School and district leaders should examine the percentage of students on track to meet expectations on their screening assessment by demographic group (gender, race, ethnicity, English Language Learner status, and socioeconomic status). It is important to examine these percentages with the screening assessment data prior to adjusting tier placement because implicit biases and expectations can influence our decisions about performance and level of need. See our recent blog post and webinar. It’s also important to make sure the assessment you’re using for screening has been formally evaluated for bias. Your screening assessment provider should be able to confirm this, and the analyses for most major assessments can be found on The Center for Intensive Intervention.
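The demographic comparison above can be sketched in the same spirit: compute the on-track rate per group and flag groups falling well below the overall rate. The records, group labels, and the 10-point gap threshold are all illustrative assumptions; in practice each student record would carry several demographic fields and the comparison would be run for each one.

```python
from collections import defaultdict

# Hypothetical records: (demographic_group, on_track) pairs.
students = [
    ("Group A", True), ("Group A", True), ("Group A", False), ("Group A", True),
    ("Group B", True), ("Group B", False), ("Group B", False), ("Group B", False),
]

def on_track_rates(students):
    """Return the on-track rate (0..1) for each demographic group."""
    totals, hits = defaultdict(int), defaultdict(int)
    for group, ok in students:
        totals[group] += 1
        hits[group] += ok
    return {g: hits[g] / totals[g] for g in totals}

rates = on_track_rates(students)
overall = sum(ok for _, ok in students) / len(students)

# Flag groups falling well below the overall rate (a 10-point gap is an
# illustrative threshold, not a formal standard).
for group, rate in sorted(rates.items()):
    if overall - rate > 0.10:
        print(f"{group}: {rate:.0%} on track vs {overall:.0%} overall")
```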
Question 3: What’s the quality of tier 2 and tier 3 levels of support?
Although this question seems a bit daunting to answer, simply examining the number of students moving up vs. down in tier can provide great insight. At the end of each screening period, school and district leaders should examine how students moved at each tier level. For example, how many tier 2 students went down to tier 1, stayed at tier 2, or moved up to tier 3?
If more students moved down to less intensive support than moved up to more intensive support, the tier 2 support can be considered healthy. If more students are moving up, the tier 2 support isn’t working and should be more closely examined in the following ways:
What’s the evidence-level of interventions being implemented at that tier?
Are teachers aligning the intervention to the skill needs of the students?
Do teachers have training in delivering the selected interventions?
Are the interventions being delivered for an adequate amount of time?
Are the interventions being delivered with fidelity?
As with evaluating the equity of core instruction, it’s also important to evaluate the equity of tier level support across demographic groups. To do this, examine the tier movement for each demographic group separately and compare.
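The up-vs.-down tally described above can be sketched as follows, assuming hypothetical (prior tier, new tier) pairs for students who started the period at tier 2; to check equity, the same tally would be repeated for each demographic group separately.

```python
# Hypothetical tier movements: (prior_tier, new_tier) for tier 2 students.
movements = [(2, 1), (2, 1), (2, 2), (2, 3), (2, 1)]

# Count students stepping down to less intensive vs. up to more intensive support.
moved_down = sum(1 for prev, new in movements if new < prev)
moved_up = sum(1 for prev, new in movements if new > prev)

# More students stepping down than up suggests the tier support is healthy.
status = "healthy" if moved_down > moved_up else "needs closer examination"
print(f"down: {moved_down}, up: {moved_up} -> tier 2 support {status}")
```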
In addition to identifying areas that need improvement, it’s also helpful to look for the bright spots and take a deeper dive into what those schools or grades are doing to promote successful outcomes.
WHAT DO WE DO NEXT?
Make an improvement plan. Although several areas may be identified as needing support, it’s important to leave the meeting with a goal for what you will work to improve before the next leadership meeting. As with student goals, improvement plan goals should be SMART:
Specific: they should have a clear articulation of what you are trying to accomplish
Measurable: they should be evaluated using a quantitative assessment
Attainable: they should be both feasible and ambitious
Relevant: they should be grounded in clear context of why you’ve determined the goal
Time-bound: they should clearly state when the goal should be achieved
The following forms can be used to help guide the conversation, document the data, and create your goals for the next meeting. To see them in action, join our webinar (access to registration below).
Dr. Dundas is the Chief Learning Officer of Branching Minds, where she pursues her mission to bridge the gap between the science of learning and education practice. Dr. Dundas has a Ph.D. in Developmental and Cognitive Psychology from Carnegie Mellon University where she conducted research on how the brain develops when children acquire visual expertise for words and faces. Her research also explores how the relationship between neural systems (specifically language and visual processing) unfolds over development, and how those dynamics differ with neurodevelopmental disorders like dyslexia and autism. She has published articles on that subject in the Journal of Cognitive Neuroscience, Neuropsychologia, Journal of Experimental Psychology: General, and Journal of Autism and Developmental Disorders. Dr. Dundas also has a M.Ed. in Mind, Brain, and Education from Harvard University; and a B.S. in Neuroscience from the University of Pittsburgh.