Beth Fertig is WNYC’s Contributing Editor for Education. She previously covered politics, which included City Hall during the Giuliani administration, and the U.S. Senate campaigns of Charles Schumer and Hillary Clinton. She also covered transportation and infrastructure.
Teacher Ratings: What Are Other Cities Doing?
Wednesday, December 08, 2010
New York City isn't the only district using student test scores to measure which teachers are most effective. The Obama administration is using federal Race to the Top grants and other funds to encourage states and districts to develop similar systems. Some are farther along than others. Below is a brief description of a few places that use the data.
In August, The Los Angeles Times published the names and ratings for 6,000 teachers whose effectiveness it determined through its own analysis. The Times linked individual teachers to their students' test scores, because the city doesn't normally use value-added data to evaluate teachers.
California has its own system of grading schools based on how their students perform on standardized tests each year. The Times requested detailed records it could match with teachers; under the California Public Records Act, information pertaining to public employees must be made available to the public. The Times then created its own "value-added" analysis, measuring the difference between each student's expected test performance and the actual score to produce a teacher's value-added rating.
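The basic arithmetic behind a value-added score can be illustrated with a deliberately simplified sketch. Everything here is a hypothetical stand-in — the student scores, the teacher names, and the use of a simple one-variable regression on prior-year scores. The Times' actual model, built by economist Richard Buddin, was considerably more elaborate.

```python
# Illustrative sketch of a simplified "value-added" calculation.
# All data and the one-variable model are hypothetical.

from statistics import mean

# Hypothetical records: (teacher, prior_year_score, actual_score)
records = [
    ("Teacher A", 60, 70), ("Teacher A", 80, 88),
    ("Teacher B", 60, 58), ("Teacher B", 80, 76),
]

# Step 1: predict each student's expected score with a least-squares
# line fit of actual scores on prior-year scores.
xs = [r[1] for r in records]
ys = [r[2] for r in records]
x_bar, y_bar = mean(xs), mean(ys)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar

# Step 2: a teacher's value-added score is the average gap between
# each student's actual score and expected score.
gaps_by_teacher = {}
for teacher, prior, actual in records:
    expected = intercept + slope * prior
    gaps_by_teacher.setdefault(teacher, []).append(actual - expected)

value_added = {t: mean(gaps) for t, gaps in gaps_by_teacher.items()}
# Here Teacher A's students beat expectations (positive score) and
# Teacher B's fall short (negative score).
```

In this toy example the model's only predictor is the prior-year score; real value-added models typically fold in many more controls, which is part of why the resulting ratings are debated.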
The Times hired Richard Buddin, a senior economist at the Rand Corporation, to crunch the data (Rand itself was not involved in the analysis). The paper found teaching quality varied more within individual schools than across the district. On December 5, the paper published a story finding that some of the most effective teachers were among those laid off because of budget cuts.
Buddin stands by the accuracy of his analysis, but warns that value-added scores are only a portion of a teacher’s overall evaluation and were never intended to stand alone. But they get a lot of attention because other "soft" data, such as classroom observations, aren't quantified the same way.
The teachers union called releasing the data "dangerous," according to an article in the Times. "Publishing the database is irresponsible and disrespectful to the hard working teachers of Los Angeles," said David Sanchez, President of the California Teachers Association. A fifth grade teacher took his own life shortly after the data was publicized. He had been ranked less effective than his peers. No direct link was ever established, though, between the teacher's suicide and the evaluation.
Those who support publicizing value-added data claim that parents have the right to know a teacher's scores. "This is an extraordinary opportunity to take teaching in L.A. to another level -- to identify those who year after year are getting great results in difficult circumstances," Education Secretary Arne Duncan told the Times. "It can really empower teachers to strengthen their craft and find out who are the great teachers around them." The state's education secretary, Bonnie Reiss, also told the paper, "Publishing this data is not about demonizing teachers. It's going to create a more market-driven approach to results."
Los Angeles was the first city where teachers' individual ratings were publicized.
Tennessee is considered the national leader in value-added data, which it has used since 1993. In fact, the methodology was first developed in Tennessee by Dr. William Sanders, a former math professor.
Individual teacher scores are available only to school administrators, but district and school information is available to the public. These data do not include demographic information on students, such as race or income, because the state says all students should progress at the same rate. Teachers aren't given incentive bonuses, but schools can get extra money if their scores go up.
Value-added data has accounted for eight percent of a teacher's evaluation. But starting this year, it accounts for a greater share, following an agreement between the governor and the unions to qualify for federal Race to the Top funding (Tennessee and Delaware were the first states to receive these coveted grants). The Tennessee Education Association has supported relying more on value-added data. Jerry Winters, TEA chief lobbyist, says, "I think we've stepped up to the plate and we're being accountable."
Houston gives bonuses to teachers based on several criteria, one of which is how effective the teachers are in raising student achievement, or "adding value." The bonuses ranged this year from $25 on the low end to as much as $15,530, with the typical award averaging $3,606. But the district doesn't share teacher ratings with the public.
In 2008, the Houston Chronicle tried to get around that barrier by requesting the amounts paid to each teacher through the district's performance pay program. By looking at who got the bonuses, the paper could figure out which teachers got the highest ratings. The Houston Independent School District denied the request. But it was overruled by the Texas state attorney general's office, allowing the Chronicle to publish the names of all the winners and the amounts they received as a bonus. Parents were therefore able to see which teachers were rated most effective based on bonus size.
In 2009, the Chronicle made a second request for a list of teachers who were in the top and bottom 10 percent in value-added rankings, along with their names and scores. This request was denied by the attorney general's office, on the grounds that state law classifies the evaluations as “confidential” and not public record. Once again, the paper only published the names of those teachers who received the bonuses.
In January 2010, a total of 88 percent of eligible teachers received bonuses. Houston’s use of value-added data has been met with consistent opposition from the Houston Federation of Teachers and its president, Gayle Fallon. She told WNYC that Houston’s use of data is “divisive” and an unreliable resource for making high-stakes personnel decisions. She opposes its lack of transparency, because teachers don't know the formula used by the district's consultant, and she says the data is a poor representation of the complexities of the Houston schools.
But the group Parent Visionaries disagrees. A spokesperson, Mary Nesbit, says the "information is really insightful." Nesbit calls the data a "very useful tool for evaluating a school's quality." But she acknowledges, "it's not the only tool. I think it's important to understand what conclusions you can't draw from this information. There can be a lot of misinformation."
Ultimately, she says the data "changed the conversation about what parents can expect from schools and what they insist upon from schools.” There's now an expectation, she says, that students can grow every year. "Parents will no longer accept a bad year,” she says.
Denver uses a teacher bonus program similar to Houston's: an alternative pay system called "ProComp." As in Houston, several weighted factors determine who gets a bonus. Denver teachers can be rewarded for advancing their teaching qualifications by pursuing a master's degree or attending seminars, for teaching in hard-to-serve communities, and for receiving successful evaluations. The teacher's impact on student test scores is also a factor.
In the 2009-10 school year, $25 million in bonus money was distributed among qualified teachers. But no breakdown has been provided to the public about who received those incentive bonuses or their value-added ratings. According to The Denver Post, 75 percent of teachers opted to take part in the bonus program and the average payout was $7,277 in the 2009-10 school year. The newspaper figured out that one special ed teacher made more than twice that amount (though not specifically for student growth). The paper has not published the names of teachers.
Washington DC's approach to rating teachers has been controversial. The city uses a program called IMPACT, a rigorous evaluation system combining value-added methods with observations. More teachers were labeled as ineffective after the system went into effect in 2009.
Value-added data has been used to make personnel decisions in D.C. Former schools chancellor Michelle Rhee came under fire from the teachers union last summer when she fired the bottom five percent of the city's teachers, who rated lowest under IMPACT. This information was not made available to the public. The release of evaluative records is not allowed under D.C. law, but Rhee told The Washington Post that if made public, value-added scores could be "the right sort of pressure we want to see to reform the system."
However, like many educators around the country, Rhee acknowledged that value-added data can be misread and can lead to complications, such as parents demanding the highest-scoring teachers. DC's interim chancellor Kaya Henderson, who worked under Rhee in developing the IMPACT program, has said she will uphold Rhee's evaluation system and pay initiatives (which, unlike IMPACT, must be worked out with the union). Teachers union president Nathan Saunders says he will oppose the district's evaluation system more aggressively. He says it penalizes those who teach under the most difficult social conditions, such as impoverished communities with high numbers of English language learners. "The best teachers are empowered teachers," he told the Post.
With reporting by Annalies Winny