When Providing School Climate Data, Researchers and Districts Should Also Provide Supports for Data-informed Decision Making

Research Brief | Healthy Schools | Dec 17, 2020

Correction 1/5/2021: We’ve updated this brief to correct errors that misstated findings from Walking a fine line: School climate surveys in state ESSA plans.

The correct number of states that included measures of school climate as their SQSS indicator or incorporated measures of school climate for quality improvement purposes is 13—not 16. Additionally, the correct number of states that indicated plans to pilot and/or incorporate measures of school climate is three plus the District of Columbia—not 13.


School decision makers have access to a broad and growing range of data that can inform decisions about how to best support students and improve schools. The No Child Left Behind Act of 2001 (NCLB) arguably shaped our current data culture by emphasizing assessment-based accountability, “scientifically based research,” and evidence of “effectiveness.” This law ushered in a new norm of data-informed decision making that focused largely on assessment, instructional practice, and teacher quality, but has since broadened to include other aspects of educational policy and practice. With the passage of the 2015 Every Student Succeeds Act (ESSA), states were required to add a fifth indicator on “School Quality or Student Success” (SQSS) to their school accountability systems. An analysis of submitted ESSA state plans found that 13 states included measures of school climate as their SQSS indicator or incorporated measures of school climate for quality improvement purposes. In addition, three states and the District of Columbia indicated plans to pilot and/or incorporate measures of school climate in the future,[1] signaling growing recognition of the role of school climate assessment and improvement in supporting the whole child.

As an increasing number of states consider including school climate measurement as an indicator within their ESSA plans, and as more districts move toward implementing district-wide school climate surveys, it is critical to equip schools with the tools they need to engage productively with school climate data. This brief presents advice for researchers and school districts that provide school climate data to schools, to help ensure that schools can make the best use of those data.


Project Overview

From 2016 to 2020, Child Trends partnered with a group of public schools and public charter schools in Washington, DC to implement the “Improving School Climate in DC” project (ISC-DC). ISC-DC was supported by a grant from the National Institute of Justice (NIJ) under the Comprehensive School Safety Initiative (CSSI). Participating schools implemented Safe School Certification (SSC), a technical assistance and certification model consisting of eight key elements (leadership, data, buy-in, policies, student engagement, family and community engagement, training, and programs). SSC is designed to build on schools’ existing school climate work and develop their capacity to make data-informed decisions about programs and policies.

All ISC-DC schools had the opportunity to participate in four years of school climate survey data collection using the U.S. Department of Education School Climate Survey tools (ED-SCLS),[2] and to receive annual detailed data reports based on their survey results. An additional two schools that were not part of the evaluation—and that did not implement SSC—also participated in school climate survey data collection and received a detailed data report.

Child Trends developed and provided schools with school climate data reports designed to contextualize and explain the data (see example data report). The reports did this in several ways:

  1. The reports presented aggregate survey results for each individual school—based on data from all students who participated in the survey—as well as differences by grade level, gender, race/ethnicity, sexual orientation, and gender identity when at least 10 students were represented in a given group, a threshold intended to protect student privacy.
  2. In the first year of the project, the reports compared each school’s climate scores to the scores of other ISC-DC schools, identifying strengths and areas of growth relative to other schools.
  3. After the first year of the project, the reports compared each school’s climate scores to its own scores from prior years, highlighting change over time—and, if applicable, the climate domains in which the school had consistently scored low or consistently shown gaps between groups of students. Reports also provided benchmarks for scores as defined by the U.S. Department of Education.
  4. Finally, the reports included brief narrative summaries of key findings, in addition to data tables and figures. These summaries were followed by action-oriented recommendations based on identified differences between groups and/or climate domains in which the school scored low relative to other schools or other climate domains.
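The small-cell suppression rule described in the list above—reporting subgroup results only when at least 10 students are represented—can be sketched in a few lines of code. This is a minimal illustration, not the project's actual reporting pipeline; the field names, score scale, and the choice to report suppressed groups as `None` are all assumptions for the example.

```python
from collections import defaultdict

# Suppression threshold: subgroups with fewer respondents are not reported,
# mirroring the 10-student privacy threshold described in the brief.
MIN_GROUP_SIZE = 10

def disaggregate(responses, group_key, score_key="climate_score"):
    """Average climate scores by subgroup, suppressing any subgroup
    with fewer than MIN_GROUP_SIZE respondents."""
    groups = defaultdict(list)
    for response in responses:
        groups[response[group_key]].append(response[score_key])

    report = {}
    for group, scores in groups.items():
        if len(scores) < MIN_GROUP_SIZE:
            report[group] = None  # suppressed: too few respondents to report
        else:
            report[group] = round(sum(scores) / len(scores), 1)
    return report

# Illustrative (fabricated) data: 12 ninth graders, 4 tenth graders.
responses = (
    [{"grade": 9, "climate_score": s}
     for s in [72, 75, 78, 70, 74, 76, 73, 77, 71, 79, 74, 75]]
    + [{"grade": 10, "climate_score": s} for s in [68, 70, 69, 71]]
)

print(disaggregate(responses, "grade"))  # → {9: 74.5, 10: None}
```

In this sketch, the grade 10 average is withheld because only four students responded; a real report would likely flag the suppression explicitly (e.g., "fewer than 10 respondents") rather than omit the value silently.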

Child Trends evaluated implementation[3] and outcomes associated with SSC.[4] As part of the implementation evaluation, we conducted semi-structured interviews with school points of contact and technical assistance specialists working with a subset of schools to understand their perspectives on implementation. These interviews, combined with informal feedback from schools, shed light on how schools reacted to, interpreted, and used the school climate data provided in the data reports.


Strategies to Help Schools Use School Climate Data

Across four years of collecting and sharing school climate data with participating schools, our research team identified five strategies that researchers, school districts, or others providing data to schools can use to ensure that schools are able to use those data for decision making.

Report data in a timely manner and in a format that supports data interpretation.

Data reports should be timely and go beyond summary statistics to present detailed, disaggregated, and contextualized findings that address the information needs of school decision makers.[5],[6],[7],[8] In the ISC-DC project, many school leaders mentioned that they were originally drawn to the project because of its focus on data. In fact, some schools had been actively looking for ways to become more data-driven. Moreover, of the framework’s eight elements, the data element was the most consistently implemented across schools. Even schools that never succeeded in assembling a core leadership team for the project still participated in survey data collection year after year. We found that delays in receiving the data report during the first year of the project led some school teams to dismiss survey results as outdated. We also learned, during the first year of the project, that school teams valued receiving data in a way that allowed them to readily identify their school’s strengths and challenges and track progress over time, but that they did not value the ability to compare their school climate data with those of other schools in the project. We incorporated this feedback beginning in the second year of the project and received positive feedback from school points of contact about the analyses included in subsequent data reports.

“It’s been really nice to have the data synthesized and summarized, with just, ‘Hey, these are some key points that we noticed.’ Without necessarily, you know, giving an opinion. But at the same time, with the resources available, ‘Maybe explore this further, this area. Or this little data point.’ I think that was really helpful. Because when I see the whole spreadsheet, it’s like, ‘Wow, there’s just lots of numbers and arrows pointing up and down.’ And getting it in a different language, so to speak, that was helpful.”

–School point of contact

Provide data-focused consultation or technical assistance, along with training on data interpretation and data-informed decision making.

Given competing priorities and limited time and resources, it is important to help school leaders gain the capacity to understand, compare, and make use of a wide variety of data sources.[9],[10],[11],[12] School leaders in the ISC-DC project often had to juggle multiple demands on their time. This made it difficult for them to dedicate the necessary time to the project, including the time to review data reports and use the survey data for decision making. The task of reviewing the data was sometimes delegated to staff with more time available but with limited data interpretation and analysis capacity and/or knowledge of the school context. Technical assistance specialists reported that school climate data were most readily used for decision making by school leaders who were already comfortable with reading and analyzing data. Moreover, when school teams found time to engage with technical assistants in a guided discussion of the data, they were able to hone their data interpretation skills and more strategically use school climate data for decision making.

“I saw [some of] them thinking about their data a little bit differently … not just their [school climate survey] data, but also other data that they collect, and just kind of going through the process of asking why they collect it and what are some of the potential uses. What information can they actually glean from [it] … I think that sometimes there’s too much to digest, but I saw them developing the beginnings of a sense of data literacy … a little bit more critical thinking around understanding that more information isn’t always better.”

–Technical assistance specialist

Build buy-in for data use by ensuring that data are high-quality and reflect the needs of the school community.

Working with school leaders in advance of data collection can help them understand the potential value of school climate data and build the sort of buy-in that can make a difference in whether schools use the data. When school leaders communicate the value of the data to the school community, their demonstrated support can improve response rates. This is critical because low response rates can lead members of the school community to question the validity of the data. In the ISC-DC project, technical assistance specialists noted that some school teams did not trust survey results when response rates were low or when they had concerns about the representativeness of survey respondents.

“One of the areas where we had trouble initially was in parent [survey] responses. And last year we were much more successful … It came with the shared leadership on the core leadership team, of teachers reaching out, telling their students what we were doing, why we needed help from their parents, sending messages to parents, talking to parents, asking them to complete the survey, telling them why it was important, how it was going to help us …”

–School point of contact

Engage multiple stakeholders in the review of data.

Encourage schools to engage diverse perspectives; this is critical for making sense of school climate data.[13] Students, teachers, school-based mental health professionals, and parents are often able to contribute pertinent information that can help contextualize results—for instance, by sharing personal insights or bringing to light other data sources that help explain why a certain score is high or low. This dialogue can lead to a more robust understanding of what the data mean for a school and how to best address the issues identified within the data. It also helps guard against bias—such as the tendency to rationalize or dismiss findings that do not align with one’s expectations—and builds data capacity among stakeholders. In the ISC-DC project, schools varied in the extent to which, and the ways in which, they used the survey data for decision making. At some schools, reviews of the school climate data led to meaningful conversations with various stakeholder groups. These conversations, in turn, elicited additional context for the data and engaged a broad range of perspectives on how best to address the needs identified in the report.

“I think having that diverse stakeholder group contribute to that shared work around school climate is so important … I learned so much from learning how to have better conversations with students around school climate and what it means, and what the school climate data mean … I believe that the knowledge about using data that we learned together will carry over [into the future] … [Before this project], we were looking at school climate data, but we weren’t looking at it with the same shared leadership perspective and approach. Were we using data? Yes, we were using data … [but] we learned how to have a better conversation about the data that was more inclusive, and open, and resulted in some specific action steps to help make improvements.”

–School point of contact

Emphasize the utility of school climate data for improvement, rather than for accountability.

Recognize that, historically, data have often been used in punitive ways, and be clear that the purpose of school climate data is to promote continuous improvement. Help school decision makers reflect on their data-related hopes, fears, and expectations to minimize bias when reviewing data and to increase the likelihood that data are used for school improvement. In an era of high-stakes assessments of student and staff performance, school leaders may be wary of school climate data.[14] In the ISC-DC project, technical assistance specialists observed that many school leaders and teams displayed an initial distrust of or discomfort with data, stemming from past experiences in which data were used in ways that were punitive toward schools, school leaders, and educators.

For instance, a number of high schools in Washington, DC have been labeled as failing—and administrators have felt that their jobs were at risk—when high percentages of students did not meet grade-level performance indicators on standardized assessments such as the Partnership for Assessment of Readiness for College and Careers (PARCC) test. Stressing the value of school climate data for improvement, rather than for accountability, can help attenuate negative past experiences with data and increase data use, as can investing in the lessons outlined above—presenting data in easy-to-understand formats, building the capacity of school leaders to interpret and use school climate data, building buy-in, and engaging multiple stakeholders.

“People turn off to data or are afraid of it because it’s become used as a weapon.”

–Technical assistance specialist


This project was supported by Award No. 2015-CK-BX-0016, awarded by the National Institute of Justice, Office of Justice Programs, U.S. Department of Justice. The opinions, findings, and conclusions or recommendations expressed in this report are those of the authors and do not necessarily reflect those of the Department of Justice.



References

[1] Jordan, P. W. & Hamilton, L. S. (2019). Walking a fine line: School climate surveys in state ESSA plans (Report). FutureEd. Retrieved from https://www.future-ed.org/wp-content/uploads/2019/12/FutureEdSchoolClimateReport.pdf

[2] Ryberg, R., Her, S., Temkin, D., Madill, R., Kelley, C., Thompson, J., & Gabriel, A. (2020). Measuring school climate: Validating the Education Department School Climate Survey in a sample of urban middle and high school students. AERA Open, 6(3). http://doi.org/10.1177/2332858420948024

[3] Solomon, B. J., Stratford, B., Steed, H., Sun, S., & Temkin, D. (2020). Implementation of a capacity building framework to improve school climate in an urban school system. [Manuscript submitted for publication].

[4] Ryberg, R., Her, S., Temkin, D., & Rodriguez, Y. (2020). Associations between a school organizational capacity building model and improvements in student perceptions of school climate. [Manuscript submitted for publication].

[5] Lachat, M. A. & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed at Risk, 10(3), 333-349. http://doi.org/10.1207/s15327671espr1003_7

[6] Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research (OP-170). RAND Corporation. Retrieved from https://www.rand.org/content/dam/rand/pubs/occasional_papers/2006/RAND_OP170.pdf

[7] Means, B., Padilla, C., DeBarger, A., & Bakia, M. (2009). Implementing data-informed decision making in schools—teacher access, supports and use. U.S. Department of Education. Retrieved from https://files.eric.ed.gov/fulltext/ED504191.pdf

[8] Rankin, J. G. (2016). Data systems and reports as active participants in data interpretation. Universal Journal of Educational Research, 4(11), 2493-2501. http://doi.org/10.13189/ujer.2016.041101

[9] Knapp, M. S., Swinnerton, J. A., Copland, M. A., & Monpas-Huber, J. (2006). Data-informed leadership in education. University of Washington, Center for the Study of Teaching and Policy. Retrieved from https://www.education.uw.edu/ctp/sites/default/files/ctpmail/PDFs/DataInformed-Nov1.pdf

[10] Lachat, M. A. & Smith, S. (2005).

[11] Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006).

[12] Means, B., Padilla, C., DeBarger, A., & Bakia, M. (2009).

[13] Schildkamp, K. (2019). Data-based decision-making for school improvement: Research insights and gaps. Educational Research, 61(3), 257-273. https://doi.org/10.1080/00131881.2019.1625716

[14] Lachat, M. A. & Smith, S. (2005).
