Building in Continuous Improvement

Overview: Why are continuous improvement systems critical to sustainability?

All tutoring programs, and especially new ones, need improvement. Continuous improvement systems allow you to gather, act on, and share the information needed to reach and exceed program goals while informing stakeholders and building their support. Without intentional continuous improvement systems embedded in your program, you will neither generate the outcomes you need nor have the data available to advocate for the program's ongoing support and growth.

How should you use data for continuous improvement?

What are data reviews for?

Routine data review and formal reflections provide a systematic and timely way to evaluate effectiveness and assess how to adjust the model or its implementation when necessary.

Routinely reviewing recent student data on a small scale (e.g., weekly reviews of school-wide data, or even daily reviews of class-wide data) will allow tutors and their supervisors to catch small gaps before they widen, then adapt implementation tactics to meet the specific needs of individual students. Formal data reflections at a larger scale (e.g., monthly, quarterly, or annual analyses of district-wide data) will allow your program to tell a clear story to all stakeholders about its impact so far, then make data-informed recommendations for changes in implementation strategy (or even revisions to the underlying program model).

Read more about Program Evaluation and Improvement on the National Student Support Accelerator website.

Assigning Responsibilities

For each dataset you collect for your Performance Measurement Plan, outline the following:

  • Who is responsible for collecting these data? When and how will they collect them? How often?
  • Who is responsible for reviewing these data? When and how will they review them and distill insights?
  • Who is responsible for acting on the insights distilled from the data review? What is their timeline?
  • Who is responsible for supporting the people acting on the data, and what form will this support take?
  • Who needs to be informed about the data, insights, and actions? Who will inform them, and by when?
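If your program tracks many datasets, it can help to record these answers in one consistent, structured format. Below is a minimal sketch in Python; the DatasetPlan record and every name, role, and cadence in the example are hypothetical illustrations, not prescribed fields:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetPlan:
    """One Performance Measurement Plan entry: who does what with a dataset, and when."""
    dataset: str
    collector: str            # who gathers the data
    collection_cadence: str   # when and how often they collect
    reviewer: str             # who reviews and distills insights
    review_cadence: str
    owner_of_actions: str     # who acts on the insights
    action_timeline: str
    supporters: list[str] = field(default_factory=list)  # who supports the actors
    inform: list[str] = field(default_factory=list)       # who must be kept informed

# Example entry (all names and cadences are illustrative):
attendance_plan = DatasetPlan(
    dataset="Tutoring session attendance",
    collector="Site coordinator",
    collection_cadence="Daily, via sign-in sheets",
    reviewer="Program lead",
    review_cadence="Weekly",
    owner_of_actions="Tutor supervisors",
    action_timeline="Within one week of each review",
    supporters=["Program lead"],
    inform=["District liaison", "School principals"],
)
```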

Data Review Protocol

Standardizing a data review process helps set a clear expectation that the objective is not simply knowledge, but action based on knowledge. Any Data Review Protocol should ensure that raw data are converted into a clear and digestible format beforehand, so that reviewers can focus on interpreting the data rather than deciphering them.
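As one illustration of that pre-work, the sketch below assumes a hypothetical CSV export of per-session assessment scores (session_scores.csv, with student_id, date, skill, and score columns) and rolls the raw rows up into a per-student summary a reviewer can scan quickly:

```python
import pandas as pd

# Hypothetical raw export: one row per tutoring session,
# with columns student_id, date, skill, score.
raw = pd.read_csv("session_scores.csv", parse_dates=["date"])

# Digestible view: most recent week only, one row per
# student and skill, with average score and session count.
last_week = raw[raw["date"] >= raw["date"].max() - pd.Timedelta(days=7)]
summary = (
    last_week.groupby(["student_id", "skill"])["score"]
    .agg(avg_score="mean", sessions="count")
    .round(1)
    .reset_index()
)
print(summary.to_string(index=False))
```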

Below is a detailed template agenda/protocol for a data review. Your tutoring program leadership team might apply this protocol to end-of-year outcome data; a program lead might apply it to training data at the end of tutor preservice training; a leadership team might apply it to quarterly caregiver feedback. While this protocol can be used in a wide variety of contexts, the rationale for data review remains consistent:

  • When: Review data as soon as possible after collection, because outdated data are less valuable and less actionable.
  • Why: Focus on learning and improving rather than assigning blame for shortfalls.
  • Who: Empower the facilitator to guide the conversation and make sure every voice is heard.
  • What: Review both aggregated data and disaggregated data. Disaggregating data by demographics and other characteristics will reveal impact across lines of difference: race, gender, IEP status, home language, school, etc. (see the sketch after this list).
  • How: Prioritize quality over speed, and adjust the time allotted based on the scale of the review:
    • Tutors reviewing daily assessment data for their students should only need about 15 minutes.
    • A full team reviewing a year's worth of program data might need an entire day.
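The disaggregation described in the "What" bullet above takes only a few lines with standard data tooling. Here is a minimal sketch in Python with pandas, assuming a hypothetical student_outcomes.csv that pairs a growth measure with the demographic columns you plan to break out:

```python
import pandas as pd

# Hypothetical outcomes table: one row per student, with a growth
# measure plus the demographic fields you plan to disaggregate by.
outcomes = pd.read_csv("student_outcomes.csv")

# Aggregated view: overall average growth.
print("Overall average growth:", round(outcomes["growth"].mean(), 2))

# Disaggregated views: the same measure broken out across lines of
# difference (race, gender, IEP status, home language, school).
for dimension in ["race", "gender", "iep_status", "home_language", "school"]:
    print(f"\nAverage growth by {dimension}:")
    print(outcomes.groupby(dimension)["growth"].agg(["mean", "count"]).round(2))
```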
Data Review Protocol Template

Step 1: WHAT did we want to happen?
Purpose: Ensure all participants are on the same page about the goal or intended outcome.
Possible questions:
  • What was our goal? (Refer to any relevant performance expectations from the Performance Measurement Plan.)
  • What was our plan for reaching this goal?

Step 2: WHAT actually happened?
Purpose: Ensure all participants are on the same page about the actual outcome or result, and explore divergences between expectations and realities.
Possible questions:
  • Did we meet our goal? What did we achieve?
  • Did we follow our plan? If not, where did we diverge from it?
  • Where were the differences between our intent and our impact?

Step 3: WHAT did we learn?
Purpose: Reflect on successes and failures over the course of the project, activity, event, or task; asking "Why?" builds understanding of their root causes.
Possible questions:
  • What worked?
  • What didn't work?
  • What could have gone better?
  • Was our plan a success? Why or why not?

Step 4: WHAT can we do better in the future?
Purpose: Generate clear, actionable recommendations and next steps for future projects.
Possible questions:
  • What would we do differently next time?
  • What advice would you give yourself if you could go back to the start of the project?
  • What two or three key lessons would you share with others?
  • What should be different one year from now (or after the next similar project), given this conversation?
  • What comes next for us on this project?
  • Are there any lessons for you, personally, to internalize that may not be relevant to the wider group?

Step 5: WHAT changes do we need to make to our project and individual plans?
Purpose: Incorporate key lessons into future actions, and document them for those who may inherit this project.
Possible questions:
  • Have we added all reflections and next steps to individual plans?
  • Have we added all reflections and next steps to project plans?