A Peer Group By Any Other Name
We have long said that the Department of Education rates schools at its own peril. In an attempt to assess the quality of each public school, the DOE has moved for the second year to a data-driven system that rates schools in four discrete categories and then weighs the results against a stated peer group of schools with roughly the same demographics as the school being rated.

It is a given that there are no level playing fields among city schools. Each is so different from the others that rating them head to head would be unfair. For example, rating the Scholars' Academy, which takes only high-performing students, against any school that must draw from its community without regard to achievement would be unfair.

We believe, however, that the present system is just as unfair in its own way as a head-to-head system would be. First, the weight given to each rating category - 15 points for "environment," 25 for "achievement," and 60 for "progress" - is blatantly skewed to devalue schools with a preponderance of achieving students and to overvalue schools with many failing students who manage to move from a "4" rating, the lowest, to a "3" rating, which is movement, but still failure.

The New York Times recently studied each of the 18 failing elementary and middle schools to see what would happen if those weights were changed, and it found that changing the way the report cards were weighted made all the difference in the world. Since no local schools received an "F" grade, none appear in the Times study, but a look at a typical elementary school in Brooklyn Heights is instructive. The school received an "F" on this year's DOE report card. When the Times removed the peer group rating, for example, the school would have received a B. Had performance counted more than progress, the school would have received a C.
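The arithmetic behind the editorial's complaint can be made concrete with a small sketch. Only the 15/25/60 category split comes from the report card system described above; the individual category scores, the alternative weights, and the letter-grade cutoffs below are hypothetical numbers invented purely to illustrate how re-weighting the same underlying scores can swing a school from an F to a B, as the Times exercise found.

```python
# Hypothetical sketch of a weighted report-card score.
# Only the 15/25/60 weighting comes from the DOE system described in
# the text; all scores, alternative weights, and grade cutoffs are
# invented for illustration.

def overall_score(scores, weights):
    """Weighted average of category scores (each assumed 0-100)."""
    total_weight = sum(weights.values())
    return sum(scores[c] * weights[c] for c in weights) / total_weight

def letter_grade(score):
    """Map a 0-100 overall score to a letter grade (hypothetical cutoffs)."""
    if score >= 80:
        return "A"
    if score >= 60:
        return "B"
    if score >= 45:
        return "C"
    if score >= 35:
        return "D"
    return "F"

# A hypothetical school: strong achievement, weak year-to-year "progress".
scores = {"environment": 30, "achievement": 95, "progress": 5}

# The weighting described in the text, versus a version that
# emphasizes achievement over progress.
doe_weights = {"environment": 15, "achievement": 25, "progress": 60}
alt_weights = {"environment": 15, "achievement": 60, "progress": 25}

print(letter_grade(overall_score(scores, doe_weights)))  # progress-heavy -> F
print(letter_grade(overall_score(scores, alt_weights)))  # achievement-heavy -> B
```

The same raw category scores produce an F under the progress-heavy weighting and a B under the achievement-heavy one, which is the editorial's point: the grade measures the choice of weights as much as it measures the school.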
That is a pretty wide range of possibilities, and it gives one pause as to the reliability of the entire report card process. In addition, while the schools that make up the peer group are listed on individual school report cards, even a detailed perusal of some of those schools gives no clue as to why they are considered peers of their Rockaway counterparts. In many cases, they seem as different from the Rockaway school to which they are linked as night from day.

As with many things, we believe that the report cards serve a more political purpose than an educational one - to show that the mayor has indeed put his hand on the school system and that everything is A-okay and getting better each year. To add to the absurdity, a large number of schools reported as "failing" by the federal government under the No Child Left Behind act got A's from the city, while 18 of the schools that got an F on their city report cards got high marks from the feds. We would like the perception and the reality to line up somewhere along the way.

Our schools have become a political football, to be kicked over the goal posts for the winning points by the mayor while, in reality, there is little substance to the ball under his control.