Note to Test Designers: Bad Questions are not the Same as Hard Questions

Thursday, August 22, 2013 - 04:00 AM

(Stephen Nessen)

We have been hearing recently (and relentlessly) about how “hard” the recent state tests were.  We are told that they had to be made “hard” because we have to raise the standards in order to make sure all our students are college-ready.  The assumption is that teachers with master’s degrees do not know how to prepare students for college and that we need Pearson to show the way.

Having proctored the fifth-grade tests this spring, I have to wonder where the folks at testing company Pearson went to college. From my vantage point, the tests suggested that the designers have had little direct experience with literature, history, or math.  The tests were hard in the sense that they were hard to do well on.  But they were hard to do well on because they were poorly designed, with little connection to the work that students should be doing in school, whether in fifth grade or in college.

Here are some key problems I encountered with the latest state tests:

Problem #1:  There’s not one right answer in literature

When I teach fiction, I emphasize that different interpretations and opinions are fine, as long as students can support their ideas with evidence from the text.  Yet for three days in April, fifth graders were asked to read poems and stories and identify one and only one theme of each selection.  Not a main idea - I believe it is possible to pinpoint one main idea in a reading selection - but one theme, as if a story can’t be about both friendship and trying new things.

This could be fair if three of the four choices provided were clearly wrong, yet they weren’t.  As a savvy test-taker, I could usually identify which choice Pearson considered right.  But as I watched students make their “wrong” choices, I could see how they could make a beautiful argument with evidence from the text to support their answer.  And sometimes I couldn’t choose.  One question had two choices that were different ways of saying the exact same thing and another had two choices that seemed equally possible, no matter how many times I reread the poem in question.  

As a teacher, I know how to prepare students for college and I know how to assess their reading; this test did not do either. 

Problem #2:  Expository texts require visuals

Reading expository texts is challenging for many students at any level, including college. Good texts are filled with maps, graphs, photos, charts, etc. to help the reader visualize and organize new information.  In fact, good teaching involves teaching students how to read all those visuals and integrate the information provided with what is in the text, and we should be assessing whether students can do that.  Yet Pearson expected fifth graders to comprehend an article about Jefferson’s vision for westward exploration that ultimately led to the Lewis and Clark expedition while providing no visuals.

There was no map to help students understand that “The United States” was just a tiny sliver of the land between the Atlantic and Pacific Oceans, or what “The Louisiana Purchase” referred to. There was no timeline to help students grasp the time span covered in a few short paragraphs or to keep track of the different generations of Clarks.  This was not an example of challenging text; this was an example of poor text, making it a poor assessment.  

Problem #3:  Math should make sense

We hear a lot about students who can perform the basic operations in math but can’t apply them.  Presumably the new test would ask students to apply their math skills to real-life situations.  Instead, Pearson’s goal seemed to be to come up with situations that would never happen in real life to test students’ stamina and ability to complete nonsensical operations.  

A person has change in her pocket.  She spends varying amounts at various places.  In the end she has 1/45 left; how much money does she have?  1/45?  When in college or in life does anyone have occasion to work with forty-fifths?  In another question we learn how long it takes kids to do something a certain number of times, and the question asks how long it takes them to do it once.  The answer, which all the fifth graders I was with could figure out, was four minutes.  The problem?  Pearson wanted the answer in hours, and since it was not multiple choice, it was anyone’s guess what answer they were looking for.  I hoped that the kids who wrote “1/15 hour” - four minutes is 4/60, or 1/15, of an hour - got it right.

Question after question, either the scenario described made no sense or the math required made no sense.  I like math. I think it can be fun and I know it is useful.  But this test was like math through the looking glass because there was no point to any of it, least of all to assess students’ math skills. 

If Pearson stood by their tests, they would release them to the public and let the public be the judge.  And I would challenge any adults who think these tests reveal something about children’s learning to take the tests themselves.  I would love to see those scores.

Editor's Note: The state education department released a sample of test questions on their website. You can see them here.

Contributors:

Katherine Sorel


Comments [8]

mark pelletier from Bozeman, MT

Just my two cents: this is life now, and it makes no sense. An example: I am a designer, and over a number of years I finally learned AutoCAD, a computer program for architects and designers. What does Autodesk (the designers of AutoCAD) do every year? Yep, they change the program, change the icons, reinvent the wheel, make it so... well... so that you don't know how to run the program and get your work done with the confidence you once had. It's all about selling new, more, updated technology. This is how the programmers make money. So in this day and age, I must say, get used to it: even people with vast experience in their profession are being phased out for the new, and educators are complicit in recreating our own field of knowledge to justify their batch of students. Either keep learning weird, unfathomable stuff or just quit and go back to pencil drawing.

Jan. 28 2014 10:17 PM
Dorothy LaBarbera from Texas

This is such a good observation. When I was teaching 8th Grade Math I often prepared quizzes and quick tests. During the grading of these tests, if most students were responding incorrectly on a particular item, I knew one of two things had occurred. Either the test question was poorly constructed or I had failed to effectively teach the concept. This did not happen very often but I was up front with my students about it. Mathematics is a language and, when we construct testing materials, we are interweaving it with the English language. This can be quite tricky. Good communication in all settings is both an art and a science.

Jan. 28 2014 08:22 AM
KGray

So many things that used to be considered public goods and were appropriately handled by government at federal, state, and local levels are now considered ill-placed in public hands but better positioned as private for-profit services and products. Who pays for them? The government. So instead of keeping school testing and increasingly other school services "in-house," we turn education into a scrum of corporate competition for government money to make private profits off the labor of our most vulnerable: children. Schools are not businesses, never have been. They do not exist to create profit. The more we pretend they do, or even can, the more we are going down the wrong path.

Jan. 23 2014 05:18 PM
Deanna from Indiana

Fantastic article. Our daughter suffered with those exams. She only got a local diploma because she did not score a 65 on two NY Regents exams! We presently reside in a different state, and she is now in community college and doing quite well! It is a travesty they keep pushing an assessment tool that clearly DOES NOT DEMONSTRATE WHAT STUDENTS LEARN or how smart they are! Most of the people who submit test questions most likely could NOT pass the exams themselves!

Sep. 11 2013 07:01 PM
nullhogarth

Reading the sample tests via the link provided at the end of the piece was quite an experience. The sense I have coming away is that of an attempt to force students, not merely to solve problems, but to adapt to and accept a kind of authoritarian, top-down manner of THINKING that I find offensive and repressive. It is a "you will think THIS way" manner of presentation that feels like mental enslavement. I hate these tests, and I hate a society that accepts this as a way of judging the "quality" of a young person's mind.
This is horrible. This is fascism, writ small and large.

Aug. 24 2013 11:36 AM
A NYC Teacher

I'm not sure if things will be different, but in the past, exams from previous years have always been made available online. While looking at a copy of an old exam may give some indication of what to expect on the next test, what would be of real value would be to see the actual test booklets/answer sheets the students completed (or didn't) the previous year. Teachers, parents, students, etc. are only told the final score. Johnny got a 645. He is at a level 2. Great. Why? Who knows? They never get to see the exam they took. This means they can't see how their work was evaluated; they can't see what they did well or what they did poorly on. Why? I assume that the testing companies, the state, the city, etc. do not want to deal with the inevitable inconsistencies that would be found in the scoring of the exams. Imagine the uproar when thousands of students, parents, principals, teachers, etc. notice errors in scoring. How would they deal with it? They won't. We should all be demanding to see the actual test papers: papers that are used to determine whether or not a kid is promoted, papers that will decide a teacher's rating, papers that will lead to so many schools being shut down and replaced with charters. It's a big scam.

Aug. 23 2013 10:31 PM
Ian Berger from Peekskill, NY

Teachers and parents are not even allowed to see the tests, ever. They are the intellectual property of Pearson and as a result are copyrighted. That's extremely creepy. How can a parent or teacher ever challenge the test if we never get to see it? How do teachers ever know what to review if we never see the past tests?

Aug. 22 2013 09:33 PM
Jennifer

I could not agree more with this post! Test questions are poorly designed because test creators are trying to measure something that cannot easily be measured in a timed test -- interpretation, nuance. Whatever happened to testing vocabulary and reading comprehension? (I read the sample texts and related questions that were released and was appalled.) Won't simple arithmetic/mathematical equations yield the same performance results regarding ability as convoluted, poorly constructed word problems?

We're not being honest about why some students are falling behind, nor addressing the myriad factors that affect their academic performance -- AND we're gambling the futures of all of the average and above average, even brilliant, students who are also very much a part of this public school system and come from every race and ethnic background. There's a pernicious political agenda that people like Meryl Tisch and Michael Bloomberg are engineering. Our children are not their lab rats!

How many people remember that a mere ten years ago, the city of New York had a contract with Princeton Review, which not only provided families with practice questions that they could do at home, but also posted the actual test online after scores were released so students and parents could see which questions they got wrong? Do people understand that now parents are given little to no information about their children's actual performance other than a number that represents a politically determined "norm"?

We have gone backwards and sideways, and we need to challenge the proponents of these tests, especially those who have or had their own children enrolled in private schools. I, for one, do not need Meryl Tisch's condescension. Private school students are not subjected to endless, pointless tests, nor is their performance year after year tracked and stored in a database with no clear protections or plans for obsolescence. And despite having tremendous material advantages, those students perform across a spectrum, too. The only difference is that private school students are not subjected to the subtle (or not so subtle) message that they are somehow less than or not measuring up, a message that public school students are told year after year. It's time for all of us taxpayers with children in our public schools to get really, really angry about what's going on!

Aug. 22 2013 12:51 PM
