Neem: We Cannot “Think Critically” Without Knowledge


Johann Neem is on fire.  Earlier today we linked to his Chronicle of Higher Education piece calling for the elimination of the business major.  Now we link to his Hedgehog Review piece on “critical thinking.” I have ordered his book What’s the Point of College?: Seeking Purpose in an Age of Reform.

Neem argues that critical thinking cannot take place without knowledge–the kind of knowledge one learns in a particular discipline.  Or, as he puts it, colleges and universities should understand skill development “in relation to the goods of liberal education.”

Here is a taste:

Advocates of critical thinking contrast thinking critically with learning knowledge. College professors, they proclaim, teach a bunch of stuff (facts, dates, formulae) that students don’t need and won’t use. Instead, students need to have intellectual and cognitive skills. As New York Times columnist Thomas Friedman has proclaimed, “the world doesn’t care anymore what you know” but “what you can do.”

There are two problems with this perspective. First, it is fundamentally anti-intellectual. It presumes that the material colleges teach—the arts and sciences—does not matter, when, in fact, this is the very reason colleges exist. Second, these claims are wrong. Cognitive science demonstrates that if we want critical thinkers, we need to ensure that they have knowledge. Thinking cannot be separated from knowledge. Instead, critical thinking is learning to use our knowledge. The most effective critical thinkers, then, are those who learn history or physics. The stuff we learn about matters.

In many ways, the turn to skills is a defensive response. At a time when the humanities, in particular, are under attack, what better way to defend the humanities’ “useless knowledge” than by demonstrating that these are means to a larger end: critical thinking? However, one must acknowledge that these defenses reflect the capitulation of academics to utilitarian and pragmatic pressures. Lacking a convincing argument for the knowledge that anthropologists or historians have to offer, they instead proclaim that history and anthropology will serve employers’ needs better than will other fields. But if that’s the case, why does one really need to know anything about anthropology or history? Why should colleges hire anthropologists or historians instead of professors of critical thinking?

This is not an abstract question. When we turn from higher education to the K–12 system, we see that the focus on skills over knowledge has transformed the curriculum. Increasingly, especially under the Common Core State Standards, students devote their energies to learning skills, but they may not learn as much history or civics or science. Therefore, in contrast to the anti-intellectual rhetoric of many reformers, critical thinking must be defended because it encourages students to gain more insight from the arts and sciences.

Read the entire piece here.

Historians on Assessment



In his recent book Why Learn History (When It’s Already on Your Phone), Stanford University professor Sam Wineburg challenges history teachers to develop new assessments of student learning to see if the study of history really does teach the skills we claim it teaches.  (Wineburg is scheduled to visit The Way of Improvement Leads Home Podcast in the next few weeks to talk about the book.)  The relevant chapter of Why Learn History is based on research conducted by Wineburg’s Stanford History Education Group.  You can read more about that work here.

Yesterday at the annual meeting of the American Historical Association, a group of history educators explored some of Wineburg’s findings in a session titled “‘What Are We Learning?’: Innovative Assessments and Student Learning in College-Level History Classes.”  Colleen Flaherty of Inside Higher Ed reported on the session.  Here is a taste:

CHICAGO — A 2018 paper by members of the Stanford History Education Group called out historians for failing to value evidence of student learning as much as they value evidence in their historical analyses.

The authors’ occasion for rebuke? Their recent finding that many students don’t learn critical thinking in undergraduate history courses — a challenge to history’s sales pitch that its graduates are finely tuned critical thinkers.

Even among juniors and seniors in a sample of public university students in California, just two out of 49 explained that it was problematic to use a 20th-century painting of “The First Thanksgiving” to understand the actual 1621 event, wrote lead author Sam Wineburg, Margaret Jacks Professor of Education and professor of history at Stanford University, and his colleagues.

The paper, which included other similar examples, was distressing. But it wasn’t meant to be damning — just a wake-up call, or, more gently, a conversation starter. And that conversation continued Thursday at the annual meeting of the American Historical Association. A panel of professors here urged a sizable crowd of colleagues to embrace not just grades but formative, ongoing assessment to gauge student learning or lack thereof in real(er) time.

Suggested formative assessments include asking students to engage with primary-source documents such as maps, paintings, eyewitness event accounts, newspaper ads and unconventional historical artifacts via specific prompts. Others include asking students to examine a symbol of American nationhood, a local historical site or how pundits use history to advance arguments.

Panelist Lendol Calder, professor of history at Augustana College in Illinois, ran a study very similar to Wineburg’s on his own campus and said the disappointing results held up. In general, students either take any historical source at face value or — when they discover it was created by a human being — dismiss it outright as “biased,” he said, to chuckles.

Partly in response to that finding, Calder and his colleagues have doubled down on their ongoing campaign to discuss historical “sourcing” in every single class. That is part of a larger, existing departmental motto: LASER, an acronym for Love history, Acquire and analyze information, Solve difficult problems, Envision new explanations, and Reveal what you know. Sourcing work, which Calder called a “threshold concept” in history, means asking students to evaluate the reliability of various historical texts. Who made it? When? Why? What value does it hold for historians, if any?

Read the rest here.

“Critical Thinking” and the University


Over at his blog Blue Book Diaries, Jonathan Wilson reminds us that the teaching of “critical thinking” skills is not the primary purpose of a college education.  (Neither is job training.)  Here is a taste of his piece “The Most Misunderstood Purpose of Higher Ed.”

Let’s be realistic. Most of the time, in most institutions, both the notion that the academy is a free-for-all of critical thinking and the notion that it’s a re-education camp for the politically incorrect are myths. This is not to deny that ideological abuses of power do happen, nor that many students have rational awakenings in college, but neither is a realistic description of most people’s experiences in practice. And I don’t think they’re good descriptions of the academy’s behavior in theory either.

So what kind of thinking does the academy promote when it’s doing its job especially well? (For simplicity, let’s stick close to undergraduate applications.)

The key to provisional collective best thinking practices is that knowledge means something special to scholars, including successful college students. For scholarly purposes—and I believe this is true across disciplines—professional knowledge consists not simply of true beliefs, but of true beliefs reached in a valid way. And validity is judged not by the individual, but by a community of scholars in an ongoing conversation.

Here’s where things get truly scary: For rigorous scholarly purposes, knowledge includes in its implicit definition the possibility that it might ultimately be proven false. That’s the “ongoing conversation” part. The only thing that scholars, as such, know for sure (however certain they may feel) is that their knowledge hasn’t been discredited by valid scholarship yet.

Wilson argues that colleges and universities teach “that certain ideas are ‘true’ in an academic sense–as far as we know, according to the best available evidence so far–because we have worked them out in a collective process of examination.”  He adds, “We teach truths that are provisional but have been reached through the collective best thinking.”

Amen.  This is a great argument for the communal nature of higher education.  Wilson concludes: “…the mark of truly well-educated (as opposed to well-trained or well-spoken) people is their grasp of the way knowledge is collectively created….”

Two quick responses from where I sit, as a history professor at a private liberal arts college:

First, this is yet another argument for why the liberal arts classroom must not be a place of indoctrination.  Our job is not to tell students what to believe, but to teach them how knowledge is created so that they can make their own decisions about what to believe.  This is something that those on the Left and the Right must understand, though in the context of academia it is more pertinent to the Left.  The classroom is not a place for preaching.

Second, Wilson seems to be making an indirect argument for the disciplines.  Each liberal arts discipline offers a different way of examining the world and the human experience.  Each discipline provides a different set of skills and thinking habits for arriving at knowledge.  This is what makes me nervous about introducing “interdisciplinary” learning so early in students’ college experience.  How does one learn to think in an “interdisciplinary” fashion without first learning the thinking skills and practices associated with the individual disciplines?

“People fall for fake news because they fail to think”


Another argument for education:

In a recent article in the journal Cognition, psychologists Gordon Pennycook (University of Regina) and David Rand (MIT) argue that “individuals who are more willing to think analytically…are less likely to think that fake news is accurate.”

Pacific Standard has a piece on Pennycook and Rand’s study.  Here is a taste:

If humans have extraordinary reasoning abilities, why do so many of us fall for fake news? As the mid-term elections near, and our Facebook feeds load up with dubious posts, it’s an unusually urgent question.

One school of thought suggests our analytical skills can actually work against us, since we use them to convince ourselves of the correctness of our prejudices. While there is evidence of that distressing phenomenon, new research suggests the answer may be simpler—and perhaps even fixable.

It links susceptibility to misinformation with intellectual laziness.

“The evidence indicates that people fall for fake news because they fail to think,” report psychologists Gordon Pennycook of the University of Regina and David Rand of the Massachusetts Institute of Technology. “Individuals who are more willing to think analytically … are less likely to think that fake news is accurate.”

In the journal Cognition, the researchers describe three studies featuring a total of 3,446 people. The main study, conducted in the summer of 2017, featured 2,644 participants recruited online, split nearly evenly between Donald Trump and Hillary Clinton supporters.

Participants were presented with 12 fake and 12 real news headlines, all presented in the format of a social media post. The fake items were split evenly between ones that would appeal to Trump supporters and Trump opponents.

Read the entire piece here.

Rod Dreher Interviews Alan Jacobs on *How to Think*

Here is a taste from Dreher’s blog:

I initially thought How To Think would be a basic primer of informal logic. It’s not that at all, but something more interesting. What’s the book about, and why did you write it? 

Last year, when the Presidential election campaign was ramping up here in the U.S., and my British friends were being roiled about by the Brexit debate, I was working on a different book (an academic one), but kept being distracted by all the noise. It seemed to me that everyone was lining up and shouting at everyone else, and no one seemed able to step back from the fray and think a bit about the issues at stake. More and more what attracted my attention was what seemed a complete absence of actual thinking. And then I asked myself: What is thinking, anyway? And what have I learned about it in my decades as a teacher and writer? I sat down to sketch out a few blog posts on the subject, and then realized that I had something a good bit bigger than some blog posts on my hands. So I set my other book aside and got to work.

You write, “The person who wants to think will have to practice patience and master fear.” What do you mean? 

Practicing patience because almost all of us live in a social-media environment that demands our instantaneous responses to whatever stimuli assault us in our feeds, and gives us the tools (reposts, likes, faves, retweets) to make those responses. Everything in our informational world militates against thinking it over. And mastering fear because one of the consequences of thinking is that you can find yourself at odds with groups you want to belong to, and social belonging is a human need almost as important as food and shelter. I’ve come to believe that our need — a very legitimate need! — for social belonging is the single greatest impediment to thinking.

Read the entire interview here. Learn more about How to Think here.

 

Alan Jacobs Teaches Us How To Think

Baylor University humanities professor Alan Jacobs’s latest book is How to Think: A Survival Guide for a World at Odds.  Over at Religion News Service, Jacobs talks with journalist Jonathan Merritt about the book and the state of Christian thinking.

Here is a taste:

RNS: What do you see as the core problem with many “thinkers”?

AJ: It’s hard to name just one thing — there are so many problems! So much bad thinking! But if I were forced to name one universal one it would be a lack of awareness of our own motives and incentives. A failure to realize that there are forces at work on and in all of us to discourage thought or even prevent it altogether.

RNS: What about American Christians, generally speaking? Are they good thinkers?

AJ: Ummm … not so much.

RNS: How can followers of Jesus become better critical thinkers? Give us one or two points that come to mind.

AJ: Christians of all people ought to be attentive to our own shortcomings, and the ways our dispositions of mind and heart and spirit can get in the way of knowing what’s true. After all, we’re the people who are supposed to believe that “all have sinned and fallen short of the glory of God,” and “the heart is deceitfully wicked above all things” and that sort of stuff. If we want to think better, then the first step should be to take those beliefs as seriously as many of us say we do, and to turn a ruthlessly skeptical eye on ourselves — before we turn it on our neighbors. There’s a line about specks in our neighbors’ eyes and logs in our own that applies here.

There’s a lot more to say, obviously, but I think self-skepticism is the place to begin.

Read the entire interview here and find out why Jacobs thinks it is impossible to “think for yourself.”

Think For Yourself


Fifteen Ivy League scholars have published a letter encouraging young people from the class of 2021 to think for themselves.  The letter appears on the website of Princeton University’s James Madison Program in American Ideals and Institutions. Signers include Yale historian Carlos Eire, Princeton political scientist Robert George, Princeton humanities professor Joshua Katz, and Harvard Law Professor Mary Ann Glendon.

Here it is:

We are scholars and teachers at Princeton, Harvard, and Yale who have some thoughts to share and advice to offer students who are headed off to colleges around the country. Our advice can be distilled to three words:

Think for yourself.

Now, that might sound easy. But you will find—as you may have discovered already in high school—that thinking for yourself can be a challenge. It always demands self-discipline and these days can require courage.

In today’s climate, it’s all-too-easy to allow your views and outlook to be shaped by dominant opinion on your campus or in the broader academic culture. The danger any student—or faculty member—faces today is falling into the vice of conformism, yielding to groupthink.

At many colleges and universities what John Stuart Mill called “the tyranny of public opinion” does more than merely discourage students from dissenting from prevailing views on moral, political, and other types of questions. It leads them to suppose that dominant views are so obviously correct that only a bigot or a crank could question them.

Since no one wants to be, or be thought of as, a bigot or a crank, the easy, lazy way to proceed is simply by falling into line with campus orthodoxies.

Don’t do that. Think for yourself.

Thinking for yourself means questioning dominant ideas even when others insist on their being treated as unquestionable. It means deciding what one believes not by conforming to fashionable opinions, but by taking the trouble to learn and honestly consider the strongest arguments to be advanced on both or all sides of questions—including arguments for positions that others revile and want to stigmatize and against positions others seek to immunize from critical scrutiny.

The love of truth and the desire to attain it should motivate you to think for yourself. The central point of a college education is to seek truth and to learn the skills and acquire the virtues necessary to be a lifelong truth-seeker. Open-mindedness, critical thinking, and debate are essential to discovering the truth. Moreover, they are our best antidotes to bigotry. 

Merriam-Webster’s first definition of the word “bigot” is a person “who is obstinately or intolerantly devoted to his or her own opinions and prejudices.” The only people who need fear open-minded inquiry and robust debate are the actual bigots, including those on campuses or in the broader society who seek to protect the hegemony of their opinions by claiming that to question those opinions is itself bigotry.

So don’t be tyrannized by public opinion. Don’t get trapped in an echo chamber. Whether you in the end reject or embrace a view, make sure you decide where you stand by critically assessing the arguments for the competing positions.

Think for yourself.

Conor Friedersdorf has some context at The Atlantic.

Are We Really Entitled To Our Own Opinions?

Over at The Baffler, Maximillian Alvarez, a graduate student at the University of Michigan, has an absolutely fascinating piece on the way our “opinions” are tied, in an unhealthy way, to our “identities.”

Here is a taste:

What do we really mean when we say we’re “entitled to our opinions”? So many questions have been asked over the past year with the hope that the answers to them may help us better understand how our dangerously absurd political moment came to be. But this question is way more revealing than most.

I’ve been fortunate enough to design and teach my own college courses exploring, from literary, historical, and philosophical angles, the many complex processes that led to a Donald Trump presidency. But, as a teacher of argumentative writing, I’ve also been given a window through which to observe some of those processes in action, to see how their effects manifest in the peculiar ways people—namely, my students—think and act. In classes where argumentation is the center of gravity for everything else we do, my students and I begin every term by discussing whether or not, in our classroom and in the world at large, we are, in fact, entitled to our opinions. 

On a purely literal level, the first implication of this common refrain is that, no matter how out of whack your opinion may be, you’re entitled to have it—no one can physically stop you. Sure. That’s reasonable, if kind of banal. (You can physically punish or silence people who have certain opinions, but can you actually stop them from having the opinions in the first place?) But, as it’s generally understood, the second implication of the phrase is more troublesome.

As Patrick Stokes, Senior Lecturer at Deakin University, explains it, the phrase suggests that you’re “entitled to have your views treated as serious candidates for the truth.” As if there’s a social law that says all opinions are equal and all deserve, by right, to be treated equally. This is where lines start to blur—when opinions themselves are seemingly given their own protective rights—and the common refrain that people are “entitled to their opinions” absorbs into itself the pseudo-noble cliché that we must always “respect other people’s opinions.” For Stokes, the obvious problem is that this kind of customary treatment devalues the ways that opinions are supposed to earn serious consideration through logical argumentation, persuasion, rigorous research, and expertise. When these are thrown out the window, people start to expect that their views deserve to not only be taken seriously, but to also be protected from serious challenges, because, well, it’s their opinion.

As Stokes argues, this shared belief that every opinion has an equal claim to being right or true leads to the twisted state of things we have today where, say, anti-vaxxer conspiracy theories or climate change denialism are given plenty of media time and mainstream consideration even when it can be shown that some of their claims are verifiably wrong and have serious negative consequences. Stokes, in other words, is on to something here, but the problem goes much deeper. This prevailing situation hinges less on differing opinions that claim, by their own merits, to be “serious candidates for the truth” and more on the ways that opinions have been given cultural and political protection in the “free market of ideas.” Opinions have been subsumed under the various and more totalizing categories of identity, which are understood to be “off limits.”

Read the rest here.

This piece takes my brain in so many different directions–the court evangelicals, the intellectual culture of the college where I teach, the place of social media in democratic discourse, and the people that cable news networks choose to put on the air–that I better stop writing before I write something I am not quite ready to “put out there” yet.

*How To Think*

This is the title of Alan Jacobs’s forthcoming book.

Here is what you can expect:

Hi. This is the site for my forthcoming book, How to Think, which will be published in the U.S. by Convergent Books, and in the U.K. by Profile Books, in October of 2017.

Why did I write this book?

Across the political spectrum, people speak with a single voice on one point and one point only: our public sphere is a great big mess. Mistrust and suspicion of our neighbors, anger at their folly, inadvertent or deliberate misunderstanding of their views, attribution of the worst possible motives to those whose politics we despise: these are the dissonant notes we hear struck repeatedly every day, especially on social media. And while none of this began with the big political stories of 2016 — the Presidential election in the U.S., the Brexit decision in the U.K. — those events seem to have increased the volume pretty dramatically.

All this agitated hostility has grieved me, especially since I know and love people on all sides of the current culture wars. As someone who lives in both academic and religious communities, I am reminded every day of how deeply suspicious those groups can be of one another — and how little mutual comprehension there is. I’ve reflected a great deal on the major causes of our discontent and mutual suspicion, and I’ve wondered whether there might be some contribution I could make to the healing of these wounds.

Eventually two points occurred to me. The first is that many of our fiercest disputes occur because the people involved simply aren’t thinking: they’re reacting or emoting or virtue-signaling or ingroup-identifying. The second is that I have spent my entire career thinking and trying to teach others to think.

When those points became clear in my mind I understood what I needed to do. So I wrote this book.

Here are some of my key themes:

  • the dangers of thinking against others
  • the need to find the best people to think with
  • the error of believing that we can think for ourselves
  • how thinking can be in conflict with belonging
  • the dangers of words that do our thinking for us

Read more here.

 

What is Critical Thinking?


Why should we study history?  Why does the college where I teach require students to take a history course?  I asked my students these questions today.  A few of them mentioned the phrase “critical thinking” in their answers.

As Georgia State University English professor Rob Jenkins notes in a recent piece at The Chronicle of Higher Education, “critical thinking” is an overused catchphrase in American higher education.

But it can be defined.

Here is a taste:

Critical thinking, as the term suggests, has two components. The first is thinking — actually thinking about stuff, applying your brain to the issues at hand, disciplining yourself (and it does require discipline) to grapple with difficult concepts for as long as necessary in order to comprehend and internalize them.

This is important because we live in a society that increasingly makes it easy for people to get through the day without having to think very much. We have microwaveable food, entertainment at our fingertips, and GPS to get us where we need to go. I’m not saying those things are bad. Ideally, such time-saving devices free up our brains for other, more important pursuits. But the practical effect is that we’ve become accustomed to setting our brains on autopilot.

Actual thinking requires deep and protracted exposure to the subject matter — through close reading, for example, or observation. It entails collecting, examining, and evaluating evidence, and then questioning assumptions, making connections, formulating hypotheses, and testing them. It culminates in clear, concise, detailed, and well-reasoned arguments that go beyond theory to practical application….

The second component of critical thinking is the critical part. In common parlance, “critical” has come to mean simply negative — as in, “I don’t like to be around him, he’s always so critical.” But of course that’s not what it means in an academic context.

Think of movie critics. They cannot simply trash every film they see. Instead, their job is to combine their knowledge of films and filmmaking with their extensive experience (having no doubt seen hundreds, even thousands of films) and provide readers with the most objective analysis possible of a given movie’s merits. In the end, what we’re left with is just one critic’s opinion, true. But it’s an opinion based on substantial evidence.

To be “critical,” then, means to be objective, or as objective as humanly possible. No one is capable of being completely objective — we’re all human, with myriad thoughts, emotions, and subconscious biases we’re not even aware of. Recognizing that fact is a vital first step. Understanding that we’re not objective, by nature, and striving mightily to be objective, anyway, is about as good as most of us can do.

Read the rest here.