First, some background on the National Assessment of Educational Progress (NAEP), often described as “the nation’s report card.”
And here is a taste of the critique of NAEP by Sam Wineburg, Mark Smith, and Joel Breakstone:
Students have never fared well on NAEP’s tests in these subjects. The first history test in 1987 found that half of the students couldn’t place the Civil War in the right half-century. Some 15 years later, following a decade of new standards, The Washington Post wrote that students on the 2001 exam “lack even a basic knowledge of American history.” In 2014, the last time history was tested, the New York Times fished into the recycling bin for this headline: “Most Eighth-Graders Score Low on History, Civics.”
But what would happen if instead of grading the kids, we graded the test makers? How? By evaluating the claims they make about what their tests actually measure.
For example, in history, NAEP claims to test not only names and dates, but critical thinking — what it calls “Historical Analysis and Interpretation.” Such questions require students to “explain points of view,” “weigh and judge different views of the past,” and “develop sound generalizations and defend these generalizations with persuasive arguments.” In college, students demonstrate these skills by writing analytical essays in which they have to put facts into context. NAEP, however, claims it can measure such skills using traditional multiple-choice questions.
We wanted to test this claim. We administered a set of Historical Analysis and Interpretation questions from NAEP’s 2010 12th-grade exam to high school students who had passed the Advanced Placement (AP) exam in U.S. History (with a score of 3 or above). We tracked students’ thinking by having them verbalize their thoughts as they solved the questions.
What we learned shocked us.
In a study that appears in the forthcoming American Educational Research Journal, we show that in 108 cases (27 students answering four different items), there was not a single instance in which students’ thinking resembled anything close to “Historical Analysis and Interpretation.” Instead, drawing on canny test-taking strategies, students typically did an end run around historical content to arrive at their answers.
Read the entire piece here. I need to share this piece with my “Teaching History” class. We are in the midst of reading Wineburg’s Historical Thinking and Other Unnatural Acts.