So, you have identified students needing a support plan, created goals, selected and implemented an appropriate intervention, and collected data using a progress monitoring tool or assessment. Fantastic! These are all necessary steps toward supporting students through your Multi-Tiered System of Supports (MTSS). But, NOW WHAT? How do you know if the intervention is actually “working”?
In my career as an instructional coach helping teachers implement intervention plans with struggling students, it all came down to the data. When reviewing data to identify and support students, it was especially important to set Specific, Measurable, Attainable, Relevant, and Time-bound, or SMART, goals. This was the starting point when building and adjusting plans. But what specific data do we look at, and how do we know if the intervention plan is successful or needs to be adjusted? And when is it appropriate for students to move within tiers?
Part of this work is about determining the “just right” level of intervention, and like Goldilocks, there is a time to make adjustments. Just as Goldilocks tried each option to find what worked for her, an intervention plan may need to be adjusted as data is collected.
As you and your MTSS support team consider the effectiveness of an intervention plan, the right set of questions can help you make informed decisions about student progress and next steps. This collaborative practice will not only set current students up for success, but also help determine effective instructional practices and interventions for the future.
1. How Many Data Points Do You Have?
It is a best practice to gather at least three progress monitoring data points before the plan is reviewed for effectiveness. Why three? Because with three data points, it is possible to calculate a trend line showing the Rate of Improvement (ROI), which lets us see how the student is growing in the specific skill. If growth is uncertain based on the three data points, wait for more data points before making any decisions.
Keep in mind that the data may fluctuate above and below the goal line due to a variety of factors, including the testing environment, the time of day the student is monitored for progress, fatigue or illness, etc. For example, if the student is taking the assessment in a classroom versus in a quiet library, or before lunch versus after lunch, the student might perform differently. For this reason, more data is better! Analyzing multiple data points rather than just a few will provide a more accurate assessment of the student’s progress. At the same time, you don’t want to wait too long if the intervention isn’t providing what the student needs. Strike a balance (just right!), and be sure that the progress monitoring tools are standards-aligned and appropriate for the student's needs based on the skill and goal area in need of improvement.
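For readers who want to see the math behind the trend line, here is a minimal sketch. The weekly scores, the words-per-week unit, and the function name are illustrative assumptions, not part of any specific progress monitoring tool:

```python
# Sketch: fit a least-squares trend line to progress monitoring data and
# report its slope as the Rate of Improvement (ROI) in points per week.
# (Illustrative only; real PM tools compute this for you.)

def rate_of_improvement(scores):
    """Slope of the best-fit line through (week, score) pairs."""
    n = len(scores)
    if n < 3:
        raise ValueError("Gather at least three data points first.")
    weeks = range(1, n + 1)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    den = sum((w - mean_w) ** 2 for w in weeks)
    return num / den

# Example: words-correct-per-minute over five weekly checks.
# Scores bounce around (week 3 dips), but the trend is upward.
print(rate_of_improvement([42, 45, 44, 49, 52]))  # 2.4 words per week
```

Note how the slope smooths out the week-to-week noise discussed above: a single low score does not erase an overall upward trend.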
2. What Is the Student’s Rate of Improvement (ROI)?
As you look at the progress monitoring data, what is it telling you? Calculate the Rate of Improvement (ROI) to visually analyze the progress a student is making toward their identified SMART goal. An MTSS software platform such as Branching Minds makes it easy to see this trend line along with the goal line to identify whether a student is responding to intervention. If the student’s ROI is at or above the goal line, then the student is on track to meet the goal.
[Chart: Rate of Improvement Showing Sufficient Growth]
If it is difficult to see a pattern and the ROI is uncertain, take a close look at intervention fidelity and ensure that you have collected enough data points. If the rate of improvement is below the goal line, again check the intervention fidelity. If fidelity is adequate, the intervention is not working for the student, and the plan should be adjusted or changed. If a student or group of students is not showing any growth at all, the scenarios in this BRM article, “How To Respond to an MTSS Intervention Plan Showing No Growth” can be used to help you make decisions about plan adjustments.
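The on-track check described above can be sketched in a few lines. The function names, baseline, goal, and timeline here are hypothetical examples, not the calculation of any particular MTSS platform:

```python
# Sketch: compare the student's ROI (trend-line slope) to the slope of
# the goal line to judge whether the student is responding.
# (All numbers below are illustrative assumptions.)

def goal_line_slope(baseline, goal, weeks):
    """Points-per-week growth needed to reach the goal on time."""
    return (goal - baseline) / weeks

def is_on_track(student_roi, baseline, goal, weeks):
    """True if the student's trend meets or exceeds the goal line."""
    return student_roi >= goal_line_slope(baseline, goal, weeks)

# Example: a SMART goal to grow from 40 to 60 words correct per minute
# in 10 weeks requires a slope of 2.0 words/week. A student growing at
# 2.4 words/week is on track; one growing at 1.5 is not.
print(is_on_track(2.4, baseline=40, goal=60, weeks=10))  # True
print(is_on_track(1.5, baseline=40, goal=60, weeks=10))  # False
```

As the text stresses, a result of "not on track" should first prompt a fidelity check, not an automatic plan change.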
See the charts below for guidance when making decisions about student progress with Tier 2 or Tier 3 plans:
Guidance for Tier 2 Decision Rules:
Growth/Rate of Improvement → Decision
3 consecutive PM data points at or above the 25th-percentile goal line → Move to Tier 1: Discontinue or fade out Tier 2 targeted small-group instruction
PM data consistently between the 10th and 25th percentiles → Stay in Tier 2: Maintain the current Tier 2 targeted small-group instruction for another cycle
Uncertain or insufficient growth → Stay in Tier 2: Revise the current Tier 2 targeted small-group instruction and implement for another cycle
4 consecutive PM data points between the 0 and 9th percentiles → Move to Tier 3: Increase intervention intensity to reflect Tier 3 level of support and implement for another intervention cycle
Guidance for Tier 3 Decision Rules:
Growth/Rate of Improvement → Decision
3 consecutive PM data points at or above the 10th percentile → Move to Tier 2: Revise the plan to reflect Tier 2 targeted small-group instruction, and implement for another cycle
PM data consistently below the 10th percentile → Stay in Tier 3: Maintain the current Tier 3 intervention for another cycle; Stay in Tier 3: Revise the current Tier 3 intervention and implement for another intervention cycle; or Consider Special Ed Referral: Review criteria and schedule a referral meeting with the team and parents
*The “percentile” represents the comparison of the student’s growth to what is average for the grade level.
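To make the Tier 2 decision rules above concrete, here is a hedged sketch of them as a function. The input (a list of recent growth percentiles) and the return strings are illustrative; real movement between tiers is a team decision, not a script's output:

```python
# Sketch of the Tier 2 decision rules: map recent progress monitoring
# growth percentiles to a suggested next step. (Illustrative only.)

def tier2_decision(growth_percentiles):
    """Suggest a next step from the most recent PM growth percentiles."""
    recent = growth_percentiles[-4:]  # look at the latest data points
    # 3 consecutive points at or above the 25th percentile -> Tier 1
    if len(recent) >= 3 and all(p >= 25 for p in recent[-3:]):
        return "Move to Tier 1: fade out targeted small-group instruction"
    # 4 consecutive points in the 0-9th percentile range -> Tier 3
    if len(recent) >= 4 and all(p <= 9 for p in recent):
        return "Move to Tier 3: increase intervention intensity"
    # Consistently between the 10th and 25th percentiles -> maintain
    if all(10 <= p <= 25 for p in recent):
        return "Stay in Tier 2: maintain the current intervention"
    # Anything else is uncertain or insufficient growth -> revise
    return "Stay in Tier 2: revise the plan and run another cycle"

print(tier2_decision([28, 30, 27]))    # suggests moving to Tier 1
print(tier2_decision([8, 5, 7, 4]))    # suggests moving to Tier 3
print(tier2_decision([12, 18, 22]))    # suggests maintaining Tier 2
```

Even as a toy, this highlights why the rules require *consecutive* data points: a single strong or weak score never triggers a tier change on its own.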
Before adjusting plans, each component of the plan should be analyzed. Below are some considerations for each piece of the plan:
SMART Goal: When reviewing plans, you can ask, does the goal align with the needs of the student? Is the student aware of what the goal is and how to reach it? Make this explicit! Your goal should also be aligned with the norming chart based on the progress monitoring assessment used. Remember, this is where our support begins when building support plans. Understanding the student's needs and identifying a specific skill needing improvement is a crucial first step.
Progress Monitoring: Does the progress monitoring tool or assessment align with the goal? As mentioned previously, progress monitoring needs to be consistent and standards-aligned. When building our math support plans for a group of sixth graders, we used weekly short quizzes aligned to the SMART goal. If students were showing progress on the quizzes, we knew we could continue the intervention. If they weren’t successful, we would discuss as a team why the student might be struggling. When analyzing student data in MTSS meetings, we reviewed multiple pieces of data to determine growth. For example, I recently worked with a school to review a student’s progress monitoring data as well as their district benchmark data, NWEA Measure of Academic Progress (MAP) scores, and qualitative data from the classroom teacher to determine the student’s progress.
Intervention Fit: One size does not fit all when selecting an intervention. Ensure that the intervention is research-based, targeted to the student's specific skill deficit, and culturally relevant. Also, consider whether there is an underlying issue, such as a more foundational skill the student has not yet acquired. One Branching Minds partner school saw huge gains in a student’s iReady growth. What was the change? The student was receiving intervention in their first language (Spanish) and was responding very positively based on the progress monitoring data.
Intervention Fidelity: Fidelity of interventions should be consistently tracked by the problem-solving team. Documentation of implementation dates, number of minutes, and any other notes may be kept in logs and reviewed to help analyze the effectiveness and integrity of an intervention plan. In my work as an instructional specialist, our team gave each plan at least six weeks of implementation before deciding whether it worked for the student. In addition, MTSS support personnel should periodically observe the intervention in action to identify adjustments that might be needed to improve the support implementation. I found it very helpful to directly observe students with our interventionists and offer feedback on their intervention implementation. Observing the intervention in progress allowed me to see what was working and what small adjustments might be needed. As a team, we were able to problem-solve together to find what worked best for the students and the interventionists.
If the intervention has not been delivered with fidelity, the plan should be discontinued and issues discussed in an MTSS team meeting. It could be that additional support and accountability are required or that a more feasible plan should be developed based on available resources and the student's needs. If the above critical factors are in place, but student performance is not improving, a change is required. Maybe the cadence or length of the intervention needs to be adjusted. The problem-solving team should come together and identify what needs to be adjusted in the plan by reviewing and analyzing the data.
Regardless of the type of intervention or progress monitoring tool used, an objective look at the data should be the guiding factor in making decisions about the effectiveness of an intervention plan. A problem-solving team that meets consistently to analyze data will be able to understand what is working in an intervention plan and what needs to be adjusted. Be a “Goldilocks” with your adjustments: intervention plans are meant to be about finding the “just right” support to help the student make progress, and the Rate of Improvement provides an objective guideline for these adjustments.