Episode 52: History of the iPhone Generation

Now that almost everyone carries a search engine in their pocket, why do we still need to study history? Our present age demonstrates just how deceptive the internet can be. Host John Fea and producer Drew Dyrli Hermeling make the case that historical thinking is a critical tool for surviving this “post-truth” era while also warning against the dangers of leaning too heavily into presentism. They are joined by Sam Wineburg (@samwineburg), the author of Why Learn History (When It’s Already on Your Phone).

Sponsored by the Lyndhurst Group (lyndhurstgroup.org) and Jennings College Consulting (drj4college.com).

Historians on Assessment

In his recent book Why Learn History (When It’s Already on Your Phone), Stanford University professor Sam Wineburg challenges history teachers to develop new assessments of student learning to see if the study of history really does teach the skills we claim it teaches. (Wineburg is scheduled to visit The Way of Improvement Leads Home Podcast in the next few weeks to talk about the book.) The chapter in Why Learn History is based on research conducted by Wineburg’s Stanford History Education Group. You can read more about that work here.

Yesterday at the annual meeting of the American Historical Association, a group of history educators explored some of Wineburg’s findings in a session titled “‘What Are We Learning?’: Innovative Assessments and Student Learning in College-Level History Classes.” Colleen Flaherty of Inside Higher Ed reported on the session. Here is a taste:

CHICAGO — A 2018 paper by members of the Stanford History Education Group called out historians for failing to value evidence of student learning as much as they value evidence in their historical analyses.

The authors’ occasion for rebuke? Their recent finding that many students don’t learn critical thinking in undergraduate history courses — a challenge to history’s sales pitch that its graduates are finely tuned critical thinkers.

Even among juniors and seniors in a sample of public university students in California, just two out of 49 explained that it was problematic to use a 20th-century painting of “The First Thanksgiving” to understand the actual 1621 event, wrote lead author Sam Wineburg, Margaret Jacks Professor of Education and professor of history at Stanford University, and his colleagues.

The paper, which included other similar examples, was distressing. But it wasn’t meant to be damning — just a wake-up call, or, more gently, a conversation starter. And that conversation continued Thursday at the annual meeting of the American Historical Association. A panel of professors here urged a sizable crowd of colleagues to embrace not just grades but formative, ongoing assessment to gauge student learning, or lack thereof, in real(er) time.

Suggested formative assessments include asking students to engage with primary-source documents such as maps, paintings, eyewitness event accounts, newspaper ads and unconventional historical artifacts via specific prompts. Others include asking students to examine a symbol of American nationhood, a local historical site or how pundits use history to advance arguments.

Panelist Lendol Calder, professor of history at Augustana College in Illinois, ran a study very similar to Wineburg’s on his own campus and said the disappointing results held up. In general, students either take any historical source at face value or — when they discover it was created by a human being — dismiss it outright as “biased,” he said, to chuckles.

Partly in response to that finding, Calder and his colleagues have doubled down on their ongoing campaign to discuss historical “sourcing” in every single class. That is part of a larger, existing departmental motto: LASER, an acronym for Love history, Acquire and analyze information, Solve difficult problems, Envision new explanations, and Reveal what you know. Sourcing work, which Calder called a “threshold concept” in history, means asking students to evaluate the reliability of various historical texts. Who made it? When? Why? What value does it hold for historians, if any?

Read the rest here.

How Do We Assess Student Learning in History?

Historians make a lot of claims. We claim that the study of history produces a set of transferable skills that students can use in the marketplace. We claim that history teaches principles necessary for a democratic society. We claim that history teaches critical thinking and might even be an antidote to fake news.

But can we prove it?

This is the question that Sam Wineburg and his team are asking today at Inside Higher Ed. A taste:

“What are you going to do with that — teach?” Uttered with disdain, it’s a question history majors have been asked many times. Clio’s defenders have a response. The head of the American Historical Association says that the study of history creates critical thinkers who can “sift through substantial amounts of information, organize it, and make sense of it.” A university president asserts that the liberal arts endow students with the “features of the enlightened citizen” who possesses “informed convictions … and the capacity for courageous debate on the real issues.” Historians pride themselves on the evidence for their claims.

So, what’s the evidence?

Not much, actually. Historians aren’t great at tracking what students learn. Sometimes they even resent being asked. Recently, however, the winner of the Bancroft Prize, one of history’s most distinguished awards, washed the profession’s dirty laundry in public. The article’s title: “Five Reasons History Professors Suck at Assessment.”

Anne Hyde described what happened when accreditors asked her colleagues to document what students learned. They paid little heed to the requests — that is, until Colorado College’s history department flunked its review. Committed teachers all, her colleagues “had never conducted assessment in any conscious way beyond reporting departmental enrollment numbers and student grade point averages.”

In many college history departments, this is routine. To address the issue of assessment, the American Historical Association in 2011 set out on a multiyear initiative to define what students should “be able to do at the end of the major.” Eight years, dozens of meetings and hundreds of disposable cups later, the Tuning Project produced a set of ambitious targets for student learning. But when it came to assessing these goals, it left a big question mark.

Read the rest here.

The Latest from Sam Wineburg: Historians Are Duped by Fake News More Often Than “Fact Checkers”

Here is the press release from Wineburg’s Stanford History Education Group:

How do expert researchers go about assessing the credibility of information on the internet? Not as skillfully as you might guess – and those who are most effective use a tactic that others tend to overlook, according to scholars at Stanford Graduate School of Education.

A new report recently released by the Stanford History Education Group (SHEG) shows how three different groups of “expert” readers – fact checkers, historians and Stanford undergraduates – fared when tasked with evaluating information online.

The fact checkers proved to be fastest and most accurate, while historians and students were easily deceived by unreliable sources.

“Historians sleuth for a living,” said Professor Sam Wineburg, founder of SHEG, who co-authored the report with doctoral student Sarah McGrew. “Evaluating sources is absolutely essential to their professional practice. And Stanford students are our digital future. We expected them to be experts.”

The report’s authors identify an approach to online scrutiny that fact checkers used consistently but historians and college students did not: The fact checkers read laterally, meaning they would quickly scan a website in question but then open a series of additional browser tabs, seeking context and perspective from other sites.

In contrast, the authors write, historians and students read vertically, meaning they would stay within the original website in question to evaluate its reliability. These readers were often taken in by unreliable indicators such as a professional-looking name and logo, an array of scholarly references or a nonprofit URL.

When it comes to judging the credibility of information on the internet, Wineburg said, skepticism may be more useful than knowledge or old-fashioned research skills. “Very intelligent people were bamboozled by the ruses that are part of the toolkit of digital deception today,” he said.

Read the rest here.