In anticipation of the release of the city Department of Education School Progress Reports, SchoolBook asked four experts for their take on the value of the reports: James S. Liebman, a creator of the reports; Aaron Pallas, a school data expert; Clara Hemphill, an education journalist who founded Inside Schools; and Christine Rowland, a high school teacher. Here are their views, lightly edited:
James S. Liebman is a professor at Columbia Law School. Starting in 2006, as chief accountability officer under Schools Chancellor Joel I. Klein, Mr. Liebman helped create the school progress reports that are still in use today.
At my first meeting with city principals about the Progress Reports I was designing, they listened intently, offering frank comments on how to make them better -- advice used to improve the reports each year. But one principal said not to bother measuring student results at his school because the families were so dysfunctional the kids couldn’t learn and the teachers didn’t try because the union contract let them shirk.
Of course, a big reason kids at that school weren’t learning was the principal. We know that, because dozens of the city’s 1,600 schools have children just like that school’s, yet their students learn more. If the kids at two schools are similar, but those at one school learn and those at the other don’t, you can’t blame the kids. It has to be the school -- and it’s a crime to make kids attend a school where no one learns.
That’s the logic of the Progress Report: telling parents, students, teachers, principals and the public whether students in each school are learning as much as we know they can based on experience, or whether they’re getting short-changed.
The report compares each school to others based on (1) how parents and teachers rate the learning environment; (2) average student attainment; (3) average gain in attainment, that is, learning each year; and (4) how well poor, minority, English-learner and special-needs students do. If a school under-performs others for two years, the principal is replaced or a better school is created.
Has it worked? Recently, the city’s average student attainment has risen more than in any other period since it has been measured, and more than in any other urban district in the state and most others nationwide. Those other districts have similar students who take the same tests.
It can’t be the kids or the tests that explain lower attainment there. It has to be something the districts are or aren’t doing. One thing New York City does to help children learn more each year is issue annual Progress Reports.
Clara Hemphill is the founding editor of Insideschools.org, a project of the Center for New York City Affairs at The New School.
My advice to public school parents: Don’t take the city’s Progress Reports too seriously. The letter grades from A to F may look simple, but schools where children do well sometimes get bad grades, while schools where most of the children are struggling sometimes get good grades.
The Progress Reports are based on a complex formula designed to highlight the yearly gains on standardized tests in reading and math made by various groups of students, including new immigrants, special needs children and black males.
The schools are ranked within a “peer group” of schools with similar demographics. That means a school in a poor neighborhood where kids make significant gains on standardized tests may get a better grade than a school in a middle class neighborhood where the kids consistently do well on the tests.
P.S. 43 in the Bronx, for example, got an A this year, even though fewer than half the students met state standards for reading. At the same time, P.S. 3 in Greenwich Village and P.S. 150 in Tribeca both got C’s, even though more than 80 percent of students at each school met state standards in reading.
There’s a problem with giving too much weight to standardized test scores, which make up 85 percent of a school’s grade on the Progress Reports. Schools that focus intensely on improving scores may neglect subjects that aren’t tested, like science, history, music and art. The tests don’t measure whether a school helps develop character, citizenship, good work habits and self-control -- qualities that children need to be successful adults.
I’ve visited a school that received an F this year: P.S. 277 in the South Bronx. It is a cheerful, orderly place where teachers work hard to give children a rich, well-rounded curriculum.
Despite these drawbacks, there are some important clues to school quality buried in the Progress Reports. Fifteen percent of a school’s grade comes from its attendance rate and the results of surveys measuring parent, teacher and student satisfaction.
These surveys, available on the Department of Education Web site, offer a clearer, more nuanced picture than the simple and often misleading letter grade on a Progress Report.
Aaron Pallas is a professor of Sociology and Education at Teachers College, Columbia University, who specializes in examining statistics and data on public schools.
The most telling feature of the Progress Reports produced by the New York City Department of Education is that they are described as an accountability tool. The primary purpose of the Progress Reports is to hold the principals and teachers who work within the city's public schools accountable for a set of outcomes, labeled this year as student progress, student performance, school environment and closing the achievement gap.
The audience for the Progress Report letter grades is first and foremost the administration of the school system; it is the administration that uses the letter grades to reward some schools and punish others, including the controversial decisions to close schools that receive low Progress Report ratings.
Parents and other members of the public are a secondary audience; there is a presumption that if the Department of Education publicizes the Progress Report grades, parents and others will "vote with their feet," increasing demand for access to the schools receiving A's and B's and seeking to avoid the schools receiving C's, D's and F's.
The issue of the audience for the Progress Reports is an important one. Parents might be interested in comparing schools to which their children might apply, and having a common scale for such comparisons is desirable. But the key question a parent might ask -- "Is this a good school for my child?" -- is unlikely to be answered in a satisfactory way by a Progress Report letter grade.
Parents might want to consider many features of the school, but the Progress Reports focus on just a few. At the elementary and middle school levels, for example, 85 percent of the overall Progress Report letter grade is based solely on the performance of the school's students on the New York State English Language Arts (E.L.A.) and mathematics examinations. Other features of what children might learn in school are not addressed at all.
Moreover, much like the college rankings produced by U.S. News and World Report, a school's overall grade, and the school's score on the components of that grade, might vary from year to year in ways that are puzzling.
Do good schools suddenly become bad schools from one year to the next? Is our conception of a good school that unstable? If you believe the Progress Reports, this year's good school might be next year's clunker.
Parents seeking to make sense of how a given school is functioning would be well-advised to look at many different sources of information.
The Progress Reports and the New York City Quality Reviews, which are based on the reports of brief visits by outside observers, each provide a partial perspective. So too do the narrative reports on schools produced by InsideSchools.
And, if I can be forgiven a shameless plug, SchoolBook now has some useful analytic tools for comparing schools, and is intended to be a site for individuals to share their thoughts and opinions about particular schools.
Taken together, these kinds of resources will be much more useful to parents and to the public than the Progress Reports examined in isolation.
Christine Rowland is a teacher of English Language Learners at Christopher Columbus High School in the Bronx.
The problem with the School Progress Reports is that the stakes are so high that they produce unintended consequences.
In elementary and middle schools, state tests make up 85 percent of each school’s grade. In high schools, one of the biggest problems is credit accumulation.
The high school reports reserve almost a third of their weight for the percentage of students gaining 10 or more class credits per year. Since the Progress Reports began, there has been a staggering increase in credit accumulation by high school students citywide. Among ninth graders alone, the citywide share gaining 10 or more credits rose from 63 percent in the 2006-2007 school year to 79 percent in 2009-2010, the most recent year for which data are available -- a relative increase of more than 25 percent.
Yet, during this same period, citywide attendance (one indicator of student performance) has risen only from 85.3 percent to 86.5 percent, a negligible amount. Most of these credit gains seem to have come either from school leaders' mandates to pass specific percentages of students regardless of performance or even attendance (thankfully my principal is not one of them) or from "credit recovery" projects -- frequently modest assignments given in lieu of ongoing class performance and attendance -- a practice that has risen astronomically since the Progress Reports’ inception. This could be seen as an indication that the system is being gamed.
Credit has traditionally been awarded for the successful completion of a semester-long course, which typically includes multiple measures of student performance and usually culminates in a final examination.
Teachers have always given students who were failing classes the opportunity to make up work when the situation warranted it. The move from this practice to credit recovery does not benefit students.
The public was shocked when recently released data indicated that only 20 percent of New York City high school graduates are "ready" for college, and in fact some schools with A or B grades on their Progress Reports had not a single student deemed "college ready." Progress Report grades may be improving, but what about real learning?
As for schools, they must play the game or face the consequences: negative media pressure and possible closure. Faculty and students alike feel the stigma of the poor grades.
A recommendation? Drop the letter grades and remove the "high stakes" nature of the reports. Use them to foster community conversations around teaching, learning and the social issues faced within our schools.
Maybe then the data could lead to more valuable and substantive educational experiences for students and their families and there would be a greater chance of attracting and retaining strong educators in our highest needs schools.