Teachers React to Release of Data

Friday, February 24, 2012 - 04:04 PM

Teachers throughout the city have been objecting to the public release on Friday of individual teacher performance rankings. They called in to WNYC’s "The Brian Lehrer Show," they sent comments to SchoolBook and they wrote sometimes lengthy explanations on the forms that SchoolBook made available for teachers to respond to their individual reports.

Their comments spoke to the anguish they experienced in seeing the published reports, or their anger about their publication. Many pointed out what they saw as flaws in the methodology.

When some were contacted by phone, several said they were glad to have a chance to provide more context to their ratings. But many alluded to a decision by Gotham Schools not to publish the data released by the Department of Education. InsideSchools also decided not to publish the data, explaining: "This virtual wall of shame (and fame) will live online for years to come. But does it actually help parents to find the best schools and teachers? Not really."

Here are some of the comments submitted to SchoolBook and WNYC. Many of them have been condensed:

Randi, an eighth-grade teacher from Queens, in a call to "The Brian Lehrer Show":

I don’t mind having a report about children’s progress listed someplace, like in the paper or wherever. However, along with that I think there should be something with the students who are not showing progress, a little bit of their history. If a child is not showing progress, I’d like to know how many times he was absent from school, how many times he was late, how many homework assignments he missed. I think that that really is the true picture of a teacher -- if there is progress or not progress -- when you see the whole picture of what’s going on in the classroom.


Patricia Rivera, a teacher at P.S. 315 in the Bronx, responding on SchoolBook to her own report:

Although my reports show positive results, I do not agree with the release of this data, due to the fact that value-added reports such as this data report have wide margins of error. Experts from all over the country have stated that there are many problems with this type of data report and I stand in solidarity with other teachers across the country who reject this type of data as a measure of performance until its accuracy and margin of error are reasonable and fair. Furthermore, teaching is a complex and nuanced profession that requires multiple measures to capture all the aspects of teaching the whole child, academically, socially and emotionally. Publishing these reports does not address the formidable work of quantifying teacher performance and developing an evaluation system that supports optimum development of all teachers and ultimately benefits all the students they are entrusted to teach.


Arthur Goldstein, an English as a second language teacher and United Federation of Teachers chapter leader at Francis Lewis High School in Queens, in an e-mail to editors at SchoolBook:

I teach high school, so there is no report about me, good, bad, or otherwise. However, given the impending evaluation system, I can only suppose this is the precedent we are facing, and all teachers can soon be vilified for no good reason, as value added has not been established to be anything more than junk science. I wonder how that will affect first-year teachers, who will now need to shape up by second year or be dismissed. In fact, without tenure, it’s absolutely within the purview of principals to dismiss them without giving them a second year. I wonder how that will affect those pondering whether or not to enter the field of teaching, particularly when and if the economy turns around.


Steve, who said he was an eighth-grade English teacher from Manhattan, in a call to "The Brian Lehrer Show":

I remember looking at it with one eye closed, one eye opened. I was a little nervous. I think I was somewhere in the average range because it’s sort of peer-normed, I guess, and I remember thinking it didn’t tell me anything I didn’t already know about the fact that the tests are biased. The tests from year to year have changed. I saw that from one year I had a lot of growth and the next year I didn’t. I looked more at the test that was being used, not so much at my performance as a teacher.


Karen Fine, a teacher at P.S. 134 in Manhattan, responding on SchoolBook to her own report:

I first want to correct that for the academic year of 2009-2010 I taught Social Studies only to several fifth-grade classes and did not teach either English Language Arts or Mathematics, but may be held accountable for that year in the released data. My own data report issued to me by the Department of Education shows that for the year I did teach English Language Arts and Mathematics, 2008-2009, my "score" was 97 for English Language Arts and 65 for Math. When I checked on the DOE website today, the reading score was the same, but the math score appeared to have been changed to a much lower score. A subheading beneath 2008-2009 said "Last Two Years," but again, I did not teach these subjects in '09-10 and the scores for 2010-11 were not formulated into a value-added number. If this flawed analysis appears complex, it is, and should not be used for such high-stakes purposes as firing a teacher. I hope one day the people leading the cruel scapegoating of teachers find their conscience. They are ruining public education with reforms that are based on junk math and pseudo-science, and students will pay the price along with their teachers as the curriculum is further narrowed to teach to these tests. Creativity, critical thinking, questioning and exposure to the abstract will be supplanted by the limited learning of how to choose the best answer to a test question. It is deadening to teachers and their students, and it's very sad.


Lea Weinstein, a teacher at M.S. 45 in the Bronx, responding on SchoolBook to her own report:

This data is based on ONE test taken on ONE day when several variables, such as child poverty, quite possibly will affect student performance. Yes, I administered this test that generated this data to my sixth-graders two years ago. I no longer teach sixth grade, and I no longer teach in the same school, or even the same subject. How is this data relevant today?

I know many teachers, including myself, who spent hundreds of dollars of their hard-earned money and hours of their personal time working tirelessly to give their students the best education possible, while sacrificing time and money meant for their own families. Some of those teachers were rated "ineffective." Do you really think young, talented teachers are going to stick around after suffering this kind of humiliation?


Jamie Lilly, a teacher at P.S. 89, in Battery Park City, responding on SchoolBook to her own report:

I wonder if my score would have been higher had I spent more time "teaching to the test" rather than focusing my students on inquiry-based, well-rounded, higher level thinking skills throughout the year. In the end, I would rather be considered an "average" teacher to the test, knowing well that I work my hardest to instill a lifelong love of learning in my students day in and day out.

Another issue is the small sample size. In my case, 1/3 of the 24-32 students I taught were the same children moving from fourth grade (2008-2009) to fifth grade (2009-2010). Basically, my "value-added" score is compared against myself the previous year. Furthermore, it states that I taught 21 students when I taught several more than that. How can we trust the data when the numbers are not correct?


Donna Lubniewski, a teacher at P.S. 114 in the Bronx, responding on SchoolBook to her own report:

I participated in a pilot program during the 2009-2010 school year in which I only taught Reading and Writing and another teacher taught my students Math the whole year. Since these students were listed in my class, the math scores went under my name. This is why this evaluation should not be used to judge teachers! They are flawed!


Sarah Schauben-Fuerst, a teacher at the Lyons Community School, responding on SchoolBook to her own report:

My report says that I am an "above-average" teacher, or at least was, in the 2008-2009 school year. That may be true (though I have trouble believing it based on the incredibly faulty data it uses). What is not true is a lot of the other data the report includes. It says that I taught 40 students that year, when in reality I taught more than 70. It says that I taught only males, when in reality my classes were split almost fifty-fifty, male and female. It says that I taught no special education students or English language learners, when in reality, many of my students fit into those categories. But perhaps the biggest issue of this report, and there are many, many issues, is that the year this report is from was the last year before all the cut scores were looked at and it was found that test score inflation had gotten seriously out of control. If I had been scored on these exact same students, who took the exact same tests, in the exact same system just one year later, many of them would have gone down instead of up, and I might be judged as a "below-average" or even "low" teacher, whatever that means. Lucky for me, this is the only year that my data report says I am responsible for test scores, though I have been the literacy coach for the last three years and am presumably responsible for the decreased scores of many students in the 6-8th grades at my school, since I coach their teachers. But again, lucky for me, that is not the case. Instead, I am an "above-average" teacher, because one year, three years ago, I taught 40 (or was it 75?) general education students (or did they have special needs?) and they made substantive gains on their hugely inflated state test scores. Yes, lucky for me.


Gary Malone, a teacher at J.H.S. 189 in Flushing, responding on SchoolBook to his own report:

How accurate can these rankings be? I'm not sure which reports are being published, but in 2008-2009, I was ranked in the 88th percentile, only to plummet all the way down to the 38th percentile one year later. The 2009-2010 report also gives my 5-year average as being in the 40th percentile, but I've only received two data reports, the 88 and the 38, so I am unsure where the average figure comes from. The reports also include students who either transferred in or out of my class at some point during the year as well as students who were truant and rarely in class at all. Since we were never provided with the chance to verify the accuracy of these student rosters until last year, I have to rely on memory in determining their accuracy, and remembering the names of every student you taught since 2005 is a daunting task, and one I am not capable of doing. In addition, there is an entire class that seems to be missing from my 2008-2009 report. Lastly, and perhaps most importantly, it should be pointed out that unlike the Math exam, the ELA is NOT based on the curriculum. If the state provided us with a set curriculum for English Language Arts and then tested the kids on what we were paid to teach them, at least there would be some element of fairness involved. I feel like I have been a very respected teacher in my career thus far, but once I am falsely and libelously labeled publicly as a "38," I imagine gaining the respect of my students and their parents will be nearly impossible.

Patricia Willens of WNYC contributed reporting.
