Beth Fertig is the contributing editor for education, covering the New York City public school system for WNYC on air and online at SchoolBook.org. She has covered education in the city for more than 15 years. Beth is the author of Why cant U teach me 2 read? Three Students and a Mayor Put Our Schools to the Test (FSG Books), which grew out of a radio series on the low graduation rate for special education students. Follow her @bethfertig.
Teachers Union Sues to Halt Release of Teacher Evaluations
Wednesday, October 20, 2010
The teachers' union is going to court to stop the city from releasing reports rating teachers based on student test scores.
The suit seeks to block the media from obtaining what are called "teacher data reports." These reports estimate how much of a difference an individual teacher makes in students' achievement, based on students' math and reading scores. But the union doesn't want teachers' names revealed to the public.
"There was an obligation on the behalf of the Department of Ed. to protect the teachers who were doing this work," says United Federation of Teachers President Michael Mulgrew. "And now they have gone back on their obligation."
Approximately 12,000 elementary and middle school teachers were rated by the city. Various media organizations have filed requests for the teacher data reports from 2008-2009, the latest year available, and the Department of Education was planning to release them this week. The DOE said it respects the union's right to sue, but that it believes the public has a right to this information and will release it Friday unless a court order stops it.
"That will be for a judge to decide," said Department of Education spokesperson Natalie Ravitz.
The union also claimed the ratings, which are based on how much of a difference teachers made on student test scores, are "unreliable, often incorrect, subjective analyses dressed up as scientific facts." The measurements are supposed to isolate a teacher's impact on student performance by also factoring in poverty, race, gender and other issues. But the union called that "a complicated algorithm" and sent reporters the graphic below. The UFT said it obtained this from the Value Added Research Center at the University of Wisconsin-Madison, which was hired by the city to create the teacher data reports. It's not known whether this is the same formula now being used in NYC.
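To make the general idea concrete: the article does not publish NYC's actual formula, but value-added models of this broad type typically predict each student's score from prior achievement and background factors, then credit a teacher with the average amount by which their students beat the prediction. The sketch below is a hypothetical, simplified illustration of that idea using invented data — it is not the Value Added Research Center's model, and all names and numbers in it are assumptions.

```python
# Hypothetical sketch of a generic "value-added" estimate.
# NOT the formula used by NYC or the Value Added Research Center;
# the data below is simulated purely for illustration.
import numpy as np

rng = np.random.default_rng(0)

n = 300
prior = rng.normal(650, 30, n)       # prior-year test score
poverty = rng.integers(0, 2, n)      # 1 = low-income indicator
teacher = rng.integers(0, 10, n)     # 10 hypothetical teachers
true_effect = rng.normal(0, 5, 10)   # unobserved true teacher effects
score = (0.9 * prior - 8 * poverty
         + true_effect[teacher] + rng.normal(0, 15, n))

# Step 1: predict each student's score from background factors (OLS)
X = np.column_stack([np.ones(n), prior, poverty])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
residual = score - X @ beta

# Step 2: a teacher's "value added" = mean residual of their students
value_added = np.array([residual[teacher == t].mean()
                        for t in range(10)])
print(np.round(value_added, 1))
```

The sketch also shows why the margin-of-error concern arises: each teacher's estimate is a mean over a small group of noisy residuals, so with few students per teacher the estimate can swing widely from year to year.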
Ravitz, at the Department of Education, said teachers are trying to have it both ways. She noted that the United Federation of Teachers signed on to the state's winning Race to the Top application — which included a proposal to rate teachers in part with student test scores.
But Mulgrew said the city's teacher data reports rely too heavily on exams — exams the state has acknowledged were too easy until it made them tougher to pass this year.
"Parents have been told all their children are doing better than ever, and all of a sudden all of the test scores came crashing down," Mulgrew said. "Those invalid tests are the same tests being plugged into the unreliable formula [used by the city]."
Mulgrew claimed the union also looked at 20 of the teacher data reports and found 13 had corrupted data, such as teachers assigned to the wrong students. But he doesn't dismiss the use of data completely.
"Teachers want to do as much as they can to help children," he explained, referring to the union's agreement in 2008 to take part in the data reports. "We saw the possibility of working inside a pilot to develop a program that might lead to a useful tool for us to help children, then we went and did it."
Mulgrew said the reports are being given too much weight, however, now that New York City is using them, in part, to determine teacher tenure.
"Value-added" reports like these are highly controversial. The Obama administration has encouraged states to use student test scores to measure teacher effectiveness, without making them the sole criterion. But academics have questioned their validity and say the jury's still out when it comes to creating the perfect formula. New York University Assistant Professor of Education Economics Sean Corcoran studied New York City's teacher data reports based on 2007-2008 test scores (he was not given the names of individual teachers or even schools). He found the biggest problem was imprecision.
"Each teacher's value-added estimate is exactly that, it's an estimate of their contribution to student learning," Corcoran said. "It's not clear that these models take into account other factors driving differences in achievement," such as poverty or class size.
"So therefore you're left with a very big margin of error around the resulting estimates," he said.
But he added, "New York City is doing as good a job as any place with these models and I think the fundamental problem is with the models and not necessarily what New York City is doing."
In late August, the Los Angeles Times released its own analysis of 6,000 teachers. It used data collected by the district that had never before been analyzed to measure a teacher's individual impact on student achievement. Teachers' unions and several academics objected to releasing the names of individual teachers and questioned the validity of the data. But U.S. Education Secretary Arne Duncan praised the Times for releasing the reports.
Here in New York City, UFT President Michael Mulgrew suggested there wouldn't be an issue if the Department of Education redacted the names of the teachers from its reports.
"That is not something they have offered us, and that is something we would look at," he said.