Schools that conducted more frequent analysis of data—and then used that analysis to shape their instruction—made greater gains than schools that analyzed and used data less frequently. However, schools that analyzed data more frequently without using that data to shape instruction were, if anything, less effective in raising student achievement.
First, let’s be specific about what this means, because it’s a bit tricky. CEPR is not saying that analyzing data is bad. In fact, they found the opposite: students in schools where educators spent more time analyzing data and using it to shape their instruction achieved greater gains than students in schools that spent less time analyzing data. This was true across the study, both for schools that partnered with ANet and for schools in the control group.
But more data analysis was not correlated with more learning once you hold constant how teachers used data to shape their instruction. For example, say two teachers used data each month to create small groups in their classrooms. Both teachers’ students outperformed students in classrooms where teachers were not using data to create small groups. However, students in the classroom of the teacher who analyzed her data once that month performed just as strongly as students in the classroom of the teacher who analyzed his data four times that month but made no further adjustments to the groups. If anything, more analysis within the month correlated with lower student performance. Again, this was true in both ANet schools and the control group.
It’s no wonder that so many teachers find assessments and data distracting. They can be, if they aren’t implemented in a way that empowers teachers to use data as a springboard for instruction. In many ways, this finding is intuitive. We knew that looking at data alone wouldn’t increase student achievement. In fact, this is why we included coaching in our program to begin with: we knew schools needed help using data to shape action, and that the path to helping them ran not through one-off workshops on how to use our assessments but through side-by-side coaching with school leaders, so that teachers would feel regularly supported in using data as a springboard for instruction.
But the finding has also highlighted that we must keep examining how best to help educators build these skills, since not all schools took instructional action based on their data. How have instructional leadership teams moved from analysis to action with their data? Here are two themes that reflect the best approaches we have seen, and that we are incorporating more deeply into our coaching support for leadership teams:
When data analysis becomes one more thing we have to do rather than something that shapes the things we are already doing, it makes teachers feel stressed and it doesn’t help students. Effective schools use data to reinforce a small number of school-wide instructional priorities.
Watch the first 90 seconds of this interview with the leader of one of our partner schools in Illinois. Listen to the way Principal Love talks about the school’s first school-wide priority, math achievement, and the way his leadership team chose to shift focus to another priority, text-dependent questions, for a specific time period based on their data. There are two great things to note here. First, rather than adding another set of things to do on top of the existing math priority, the team looked at the data and chose to stop doing some things. Second, they aligned all their actions with that decision. For example, rather than continuing their weekly PD sessions as originally planned, they refocused the content on their chosen priorities. Those actions by the leadership team make it easier for teachers to move beyond data analysis and toward consistent actions that will help students learn.
The “data meeting” has become the anchor for analyzing interim assessment data in most schools, but it is often disconnected from the rest of the teaching and learning cycle, particularly the structures leaders use to observe classroom instruction. In our most successful partner schools, the “data meeting” might better be termed the “prepare-for-your-teaching-and-feedback” meeting, because it is deeply interconnected with the way the school uses observation to help its teachers improve over time.
Here’s a simple practice that educators at one of our partner schools in Boston use: before they leave a data analysis meeting, teachers send a calendar invite to the math or ELA lead in their building. The invite includes the topic they are going to teach, a lesson plan, and the time of the lesson they will deliver. This lets the math or ELA lead focus his or her precious observation time on the specific areas where the teacher needs help, and it helps the teacher move from analyzing data to using it by creating a teaching plan that will help both teacher and students learn. Finally, it sets up data as a tool for leaders to foster a supportive feedback culture among the adults in the building.
We’ve already taken steps since the study began to help data use fit more seamlessly into teachers’ everyday planning and instruction. For example, we’ve embedded videos, vertical progression maps, sample lessons, and other instructional resources directly into the platform our partners use for data analysis. This lets teachers go directly from lesson planning to their data to confirm they are anticipating the content students may struggle with, and directly from their data to a sample lesson that can help them address specific student needs.
Today, teachers spend as much time looking at the planning and instructional resources on our online platform as they do examining the data from the assessments. And they have started using our assessment content as a tool to help them establish a vision for the level of mastery they are trying to help their students achieve. In this way, assessments are no longer a distraction from instruction, but they actually help shape it.
We see this as a positive shift that connects to our i3 study findings: from data analysis alone to instructional action. But we can do more, and we need to do more. Teachers in our partner schools have told us that, in order to incorporate data use into the way they support their students, they also need help drawing explicit connections between their assessment data and the standards they are teaching. So we’ve been working closely with Student Achievement Partners, the Vermont Writing Collaborative, and others to ensure that our coaches are deeply knowledgeable about the instructional changes the Common Core demands. Staying at the forefront of current thinking on rich, rigorous content and great instruction means our coaches can help educators draw those connections between the content they are teaching and what their students know.
We hear the desire from teachers for more help on instruction. We also hear it from principals. And the i3 results reinforced why: when there is no one to help draw explicit connections between assessment data and the standards that teachers are teaching, data can feel like a distraction. But by helping make those connections clear—and by offering expertise on the instruction that the standards demand—we can help our partners support deeper student learning.