Is Finding Extra Points on Regents Tests Cheating?


How long have state officials known that teachers sometimes find an extra point or two on their students' state Regents exam scores to help them pass?

That was the question posed by an article in The New York Post on Monday, which focused on a previously unreported 2004 e-mail in which a state education official cited statistics that showed how teachers statewide appeared to be helping some students over the bar.

According to the e-mail, fewer than 1 percent of high school students in 2004 — a total of about 1,500 students — scored 52, 53 or 54 on most of that year's Regents exams, and roughly the same number got a 55, the passing score.

That meant that roughly three times as many students scored exactly at the passing mark as at each of the scores just below it, a result not in keeping with a standard statistical distribution.

On the math test, the skewing was more pronounced, the official, James A. Kadamus, wrote in his e-mail. Altogether, just over 1 percent — 2,116 students — scored 52, 53 or 54, while about 2 percent — 4,058 students — scored a 55.

That meant nearly six times as many students just passed as got one of the scores just below passing.

Mr. Kadamus, then a deputy commissioner, wrote the e-mail to advise his boss on whether the state should institute a special appeals process for students who score just below passing. The state had just started requiring that all students pass Regents exams to graduate, and there was concern that many would be unable to make the grade.

The e-mail recommended against an appeals process, Mr. Kadamus said in an interview on Monday, because at the local level it seemed that something similar was already taking place.

“There is already a de facto compensatory scoring system being employed in the schools because of the high numbers scoring 55,” Mr. Kadamus wrote to the state education commissioner, Richard P. Mills, and other officials. “Obviously, teachers look for points to get kids to pass.”

No special appeals process was started, though six months later one would be, for students who scored just below 65. By then, there was already evidence that the state's practice of grading Regents exams within schools, sometimes by students' own teachers, was leading to inflated scores.

But though Mr. Kadamus’s e-mail now reads like a red flag of scoring misconduct, it was not treated that way at the time.

“This information didn’t sound the alarm because the numbers are not very high,” Mr. Kadamus said in the interview. “There is not a big difference between the numbers scoring 55 and the numbers scoring 52, 53 or 54.” In addition, he added, the total number of test takers scoring a 55 on most tests — about 1 percent of 200,000 students — was low.

Mr. Kadamus described a State Education Department that was busy with the complex challenge of transforming a system that had awarded diplomas to some students based on a standard of eighth-grade competency into one in which all students had to pass tough Regents exams to graduate. The change involved new curriculums, training and logistics.

“We were just trying to get it off the ground,” he said of the new system, “and I think there was less attention to what the results were and more attention to saying, 'let’s get the tests out on time, let’s get the scoring right, let’s get the data back.' ”

Yet state officials received repeated warnings about inaccuracies in Regents scoring even as they raised the stakes carried by the exams, and they did not modify the traditional way the tests had been graded for decades.

In 2003-4, the testing company CTB/McGraw-Hill rescored a sample of Regents exams and found that its scores were generally lower than the scores awarded by the schools, a sign that score inflation was taking place, according to a 2009 audit of Regents scoring by the state comptroller's office.

And in 2005, a team of the State Education Department's own experts rescored some June Regents exams and found a “significant tendency for local school districts to award full credit on questions requiring scorer judgment, even when the exam answers were vague, incomplete, inaccurate, or insufficiently detailed,” the comptroller’s audit reported, adding, “These inaccuracies have tended to inflate the academic performance of students and schools.”

Indeed, the auditors found the problem was not with detection — the state had known for years that there were weaknesses in its Regents scoring system — but with doing something about it. (A 1990 audit had found grading errors on 10 percent of the Regents tests it checked.)

Mr. Mills, interviewed on Tuesday, said that with the department focused on implementing the new Regents graduation standards, he did not recall concern over the issue of teachers grading their own students' tests as “a major focus.”

“You are imposing today’s concerns on the past,” he said. “In the past, our concentration was on making these exams as strong as we could, correcting problems when we found them to the best of our ability, and I think that’s what we did.”

Mr. Mills described the traditional situation of a teacher grading his students' exams as analogous to a doctor looking at his own patient's test results, or a scout leader making a decision about a merit badge.

He distinguished the gray area of a teacher rereading an exam that fell just below passing from cases of “outright cheating,” like providing students with exam questions beforehand, which the department dealt with harshly.

As for Mr. Kadamus's e-mail, he said: “This was a small percentage, so this was not viewed as evidence of a major problem. It seemed more like a reflection of the normal classroom experience that every teacher has had. The teacher is not trying to fail students. They are trying to give students a fair chance.”

Mr. Mills retired in June 2009; the state comptroller's audit came out that November. By then, the auditors believed that scoring discrepancies in local exam results did not amount to simply giving students the benefit of the doubt.

“Despite the seriousness of the review team findings and questions raised,” the auditors wrote, “there was little evidence that the State Education Department took action to follow up to address these matters with the officials of local school districts where the variant scoring took place.”

In the past year, the state has stopped requiring teachers to reread science and math exams that fall just below passing. The state is also moving away from a system in which teachers grade their own students' tests.

Now the phrase of the moment from the State Education Department is “test integrity.”

“We are relying more than ever on state exams — to measure student achievement, to evaluate teacher and principal effectiveness, and to hold schools and districts accountable for their performance,” Merryl H. Tisch, the Regents chancellor, said last month, in support of tightened grading practices. “If we’re going to use the tests in these ways, we need to be absolutely certain that our system is beyond reproach.”