In January, the University of Chicago Consortium on Chicago School Research released a report titled “School Closings in Chicago: Understanding Families’ Choices and Constraints for New School Enrollment.” The report, which is available online, presented a variety of data on how families chose where to send their kids for the 2013-2014 school year after Chicago Public Schools closed the doors of forty-seven elementary schools. While CPS assigned designated “welcoming schools” for students whose schools closed, not all families automatically sent their kids to these schools, and a variety of factors influenced their final decisions. In a two-part series, the Weekly aims to provide contextual background and in-depth reporting on some of the report’s findings. This week, we present a timeline of the school closings process and the political controversy it generated, and an examination of the CPS performance ratings that influenced closing decisions.

Read our timeline of the school closings here.

In a January 29 endorsement interview with the Sun-Times, Rahm Emanuel defended his decision to close nearly fifty schools, arguing that many students were “trapped in a school that was not only under-enrolled, but consistently under-performing.” The closings, he said, gave students a chance at “a better future at a better school.” Emanuel’s defense refers to the rankings of closed and designated welcoming schools—the closed schools were all relatively low-performing (Level 2 or 3 on CPS’s now-outdated “Performance, Remediation, and Probation” rating scale), and the schools that students were assigned to were all rated higher.

However, the system of ratings used to designate higher- and lower-performing schools is controversial. Research on previous school closings had shown that only students who attended schools with substantially higher ratings showed improvement after their school closed; as a result, CPS claimed it would only send students to higher-rated welcoming schools. According to the “School Closings in Chicago” report, though, only thirty-nine percent of students were assigned to a “much higher-rated” welcoming school (designated by a difference of at least twenty performance policy points on a scale of one hundred). Also, many parents opted to send their children to non-designated schools, which sometimes had even lower ratings than their closed school.

This, the report said, was because parents often don’t judge schools based on “official markers” of academic quality, relying instead on factors such as class size, extracurricular programs, and school environment. Cassie Creswell, of the anti-standardized testing coalition More Than A Score, points out that these issues don’t factor into school rating policies at all: at the time of the closings, schools were solely rated by student performance on the ISAT test and attendance rates.

“What you can’t keep apart in Chicago is that school ratings are almost entirely based on test scores,” Creswell said. “We know that test scores are primarily a measure of income. They’re also a measure of some other socioeconomic factors—we know that African-American students score disproportionately low on these tests, even when you factor out income.”

Attendance rates are closely tied to social factors as well, like neighborhood safety and home stability—factors that schools can’t control, Creswell said.

The current “School Quality Rating Policy” (implemented in 2014-15) uses similar criteria of test scores and attendance rates, with the addition of the “5 Essentials” score, a survey-based measure of markers of school success that are not usually quantified. But even this measure, Creswell claims, has been skewed because it has been made high-stakes; she cites reports of principals asking students and teachers to give favorable answers on the surveys.

Charles Tocci, assistant professor at Loyola University’s School of Education, says school ratings aren’t particularly useful to parents compared to the local reputation of the school. This is in part because school rating systems are highly complicated. While parents may know roughly what schools are being graded on, the specifics of CPS’s rating algorithm and the weighting of different components are for the most part undisclosed.

“It’s hard to understand; it isn’t clear what all these categories are, what all these numbers mean,” Tocci said. “And then, when you mix in the wild card component—that the district CEO can change school rankings of her own volition—it makes the process unpredictable for parents.”

In the days after the closings decision, CPS CEO Barbara Byrd-Bennett defended the action in a press release, citing the high cost of maintaining “underutilized, under-resourced schools.” School performance ratings were a secondary factor in the decisions, she said, and an important mechanism in reassigning students to new schools. However, Creswell claims that arguments about utilization have been walked back in favor of rhetoric around “saving” students from poor schools, in order to validate closings that weren’t as economical as they promised to be (CPS is still paying millions of dollars in utility costs for the closed school buildings). In the end, she argues, it all boils down to demographics—the test scores, the attendance rates, the ratings, and the closings.

The closed schools were mostly located in low-income areas on the South and West Sides, which many critics saw as a targeting of vulnerable communities. Tocci says this was mostly a byproduct of a decades-long trend in Chicago demographics of shrinking populations on the South and West Sides. Underutilized schools, in turn, tend to be low-rated, since school funding is distributed on a per capita basis.

“Fewer kids means fewer resources, fewer enrichment activities, fewer extracurricular activities, fewer options in terms of elective activities, fewer resources in things like computer labs, which schools have discretionary funding for,” Tocci said. “So those are all indirect effects that [low populations] might have on academic performance.

“Ideally, the district would’ve been working off that model of demographic change for the past twenty years as they’ve allocated resources—which schools to keep open, which schools to phase out, where to direct limited resources in the district,” he continued. “They were not doing that, so they were left with the situation where they had to make those decisions very quickly.”

See the school closing report here.
