So What CAN You Do With a History Major?–Part 58

You can lead the country through the coronavirus pandemic just like Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases.  (OK–he was technically a classics major at the College of the Holy Cross–close enough!).

Here is a taste of a piece on Fauci at the Holy Cross Magazine:

Anthony Stephen Fauci was born in New York City on Christmas Eve 1940, the second of Stephen and Eugenia Fauci’s two children. His parents, both the children of immigrants, met as students at Brooklyn’s New Utrecht High School and married when they were just 18. He grew up in Bensonhurst, Brooklyn, where his father, a Columbia University educated pharmacist, owned a neighborhood drugstore, at 13th Ave. and 83rd St. The family lived in an apartment above the store, and all pitched in when needed—his father in the back, his mother and older sister, Denise, at the register.

“I was delivering prescriptions from the time I was old enough to ride a bike,” Fauci recalls.

Routinely cited in recent decades for the length of his work day and the peripatetic nature of his job, Fauci took on these habits early and came to them naturally. He was that kind of kid, too.

He grew up surrounded by disparate influences that he seems to have enjoyed and that seem to have benefited him: There was his pharmacist father, known as “Doc” in the neighborhood—whom he describes as “laid back”—and his mother, also college educated, whom he describes as “goal oriented.” There was an attraction to medicine and science fostered from an early age, and a commitment to the humanities nourished by premedical studies at Holy Cross that also encompassed the study of Latin, Greek and philosophy.

And there is early evidence, as well, that Fauci had a streak in him that was something between puckish and perverse—a stubborn adherence to his own values and interests in the face of local prejudice that had to have been fierce. Growing up in post-war Brooklyn, playing baseball in Dyker Heights Park, on Gravesend Bay, in the era of Jackie Robinson and Pee Wee Reese, Fauci was a Yankees fan. Among his heroes were Joe DiMaggio and Mickey Mantle, which, he says, made him something of a sports outcast among his friends, Brooklyn Dodgers fans all.

If he had been a sports outcast, he was an athletic one. In a 1989 interview with the NIH Historical Office, he remembers, “We used to play basketball from the beginning of basketball season to the end, baseball through the spring and summer, and then basketball and football again in the winter.” When he was younger, he played CYO basketball in the neighborhood; in high school, he captained the basketball team. Today, he’s a daily runner who has completed the New York and Marine Corps marathons.

He attended Regis High School, a Jesuit school on Manhattan’s Upper East Side. And the distance he had to travel to get there is difficult to explain, for reasons of time or geography and also for reasons of culture. Time and geography matter, of course, in multiple ways: the trip took 75 to 80 minutes each way, a bus and three subways during rush hour in both directions. By rough calculation, all the time he spent commuting during his four years at Regis, it cost him more than 70 days. And he didn’t just let the time go: then, as now, he was focused and organized. He was the kid on the subway—packed up against the other passengers, elbows against his body, wrists and forearms folded inward, a book almost on top his face, reading—in his case, probably Ignatius Loyola, at some point or other, and likely in Latin.

Time and geography also matter because Brooklyn was further away from Manhattan in the 1940s and 1950s than it is today, and Bensonhurst is deep Brooklyn, just a short three or four miles—a few stops on what was then the BMT Seabeach local line—from Coney Island and the beach. New York is New York, but it’s also five boroughs and a million neighborhoods. And working class, Italian and Jewish Bensonhurst, might as well have been 15 light years away from Manhattan’s Upper East Side, then, as now, one of the country’s most affluent zip codes.

In his commencement address this past May, U.S. Poet Laureate Billy Collins ’63—whose time at Holy Cross overlapped with Fauci’s, although they didn’t know each other—spoke with some nostalgia of the 10 o’clock dorm curfew of that era, and how students learned to “black out” their rooms with towels, newspapers and tin foil.

“It was behind these drawn shades,” Collins said, “that we indulged in the nefarious act of reading.”

Fauci came to Holy Cross in the fall of 1958. He played intramural sports when he had the time, but his days of more organized competition were over. He had entertained the vague idea that he might make the basketball team as a walk on, but the competition was fierce, and he didn’t quite have the height. Always a fully engaged student, moreover, he took to his premedical studies with gusto; “the nefarious act of reading” didn’t leave him a lot of spare time.

“There was a certain spirit of scholarship up there,” he remembers, “that was not matched in anything that I’d experienced. The idea of seriousness of purpose—I don’t mean nerdish seriousness of purpose—I mean the importance of personal development, scholarly development and the high standard of integrity and principles that became a part of everyday life at Holy Cross. And that, I think, was passed down from the Jesuits and from the lay faculty to the students.”

The premed program covered enough science to get the students into medical school, but also stressed the humanities—a continuation, in some ways, of what he had been taught in high school. Fauci often credits part of his professional success to the inculcation of Jesuit intellectual rigor that was a core part of his education: an emphasis on organization and logic, on succinctness and clarity of expression. Arguably, the twinning of science and the humanities has proved useful in his dual roles as physician and researcher as well.

 Read the entire piece here.

HT: John Schmalzbauer on Facebook.

Don’t Vilify Educated People

Have you seen memes like the one contrasting “Adam,” the philosophy major saddled with $100K in college debt, with “Chris,” the electrician supposedly making $80K a year?

Jonathan Couser, a history professor at Plymouth State University in Plymouth, New Hampshire, has some good thoughts about this meme.  Here is what he recently wrote on his Facebook page (used with permission):

Bash the meme time, children. This was recently shared by a friend who, appropriately, took it down. But it’s the kind of thing that circulates a lot so I’m going to share it myself – with some analysis.

At first glance, the meme appears to be pointing out the value of trade jobs, which provide solid employment with little or no college debt. That’s true enough, and valid. These careers are good options that young people should consider.

But that’s not all it’s doing.

It’s misleading on a number of points. While “Adam’s” $100K in college debt is not unheard of, it’s nowhere near typical. Actual average college debt is around $30K. Meanwhile, “Chris'” income figure is inflated – it’s possible to make $80K a year as an electrician, but the average figure is around half that, maybe three-quarters, depending on where you live.

The meme says that “Adam” can’t find “a philosophy job,” which is a no-brainer because, outside of academia, where you’d need a PhD rather than a BA, there’s no such thing as “a philosophy job.” That makes a cheap shot easy for the meme-creator, but disingenuously hides the realities.

Philosophy majors (and majors in other supposedly “worthless” degrees like History or English) actually do very well on the job market. The major is not designed as job training. Instead, they go into all kinds of careers where skills in writing, communicating, or analytical thinking are beneficial. They are also much better prepared than most to go on to graduate programs like an MBA or JD and become lawyers or business executives.

In fact, according to Five-Thirty-Eight in 2015, the average income of a philosophy major was – guess what? – $80K – the amount that was the inflated claim for “Chris'” income.

After being dishonest, the meme gets ugly.

Supposedly, “Adam” thinks that “Chris” is stupid. Meanwhile, “Chris” gleefully disconnects “Adam’s” electricity.

This is the rhetoric of grievance. It vilifies the educated people of the world, the philosophers, as a bunch of snobs who carry an unjustified contempt for working people. And it relishes the sense of vengeance, of getting even, that “we” (since we’re clearly supposed to be cheering for “Chris” by the time we read this far down the meme) are going to stick it to “them.” There’s no sense of empathy for “Adam” losing his electricity or blame that “Chris” does this to him. We’re supposed to think it’s just deserts.

To be sure, there are some educated snobs in the world. But I spend my life in academia, and I can honestly say that I can’t think of any of my colleagues, nor students, ever expressing contempt for working people. It’s a myth.

What’s really going on here is not a positive promotion of the value of a good trade career. What’s really going on is a toxic attack on higher education. The meme is designed to promote a sense of grievance, of resentment, and of contempt for education and the educated. By encouraging the “Chris'” of the world to despise the foolish “Adams”, the meme tells people they don’t need to listen to reasoning, they don’t need to respect expertise, and thus makes them pliable to misinformation, fake news and propaganda.

I agree with every word of Couser’s analysis.

The Humanities Will Set You Up for Life

More reasons to consider that humanities degree.

Here is a taste of Amanda Ruggeri’s article at the BBC website:

George Anders is convinced we have the humanities in particular all wrong. When he was a technology reporter for Forbes from 2012 to 2016, he says Silicon Valley “was consumed with this idea that there was no education but Stem education”.

But when he talked to hiring managers at the biggest tech companies, he found a different reality. “Uber was picking up psychology majors to deal with unhappy riders and drivers. Opentable was hiring English majors to bring data to restaurateurs to get them excited about what data could do for their restaurants,” he says.

“I realised that the ability to communicate and get along with people, and understand what’s on other people’s minds, and do full-strength critical thinking – all of these things were valued and appreciated by everyone as important job skills, except the media.” This realisation led him to write his appropriately-titled book You Can Do Anything: The Surprising Power of a “Useless” Liberal Arts Education.

Take a look at the skills employers say they’re after. LinkedIn’s research on the most sought-after job skills by employers for 2019 found that the three most-wanted “soft skills” were creativity, persuasion and collaboration, while one of the five top “hard skills” was people management. A full 56% of UK employers surveyed said their staff lacked essential teamwork skills and 46% thought it was a problem that their employees struggled with handling feelings, whether theirs or others’. It’s not just UK employers: one 2017 study found that the fastest-growing jobs in the US in the last 30 years have almost all specifically required a high level of social skills.

Or take it directly from two top executives at tech giant Microsoft who wrote recently: “As computers behave more like humans, the social sciences and humanities will become even more important. Languages, art, history, economics, ethics, philosophy, psychology and human development courses can teach critical, philosophical and ethics-based skills that will be instrumental in the development and management of AI solutions.”

Read the entire piece here.

Take the Humanities Course!

Here is Ronald Daniels, president of Johns Hopkins University:

Last fall, on the campus of Johns Hopkins University, where I serve as president, I happened to overhear a conversation among a group of students. One student was telling the others that he had decided not to enroll in an introductory philosophy course that he had sampled during the “add/drop” period at the start of the semester. The demands of his major, he said, meant that he needed to take “practical” courses. With an exaggerated sigh, he mused that “enlightenment” would simply have to wait. For now, employability was paramount. What can you do? His friends shrugged. You gotta get a job.

The students’ conversation has stayed with me, in part because it fits into a larger, disconcerting narrative about the role of the humanities in higher education. In a time of dizzying technological achievement and of rapid scientific innovation, skeptics of the humanities may question the usefulness of studying Aristotle, the history of the Italian Renaissance or modern Chinese fiction. At many universities across the country, beset by low enrollments and a lack of university support, the number of humanities course offerings and faculty members are dwindling. At meetings of university presidents, the humanities are frequently referred to as the “fragile disciplines.”

In hindsight, I regret not barging into the conversation of that student I overheard to argue for taking that introductory philosophy course. I would have started by reminding him that, for much of America’s history, college graduates were not deemed truly educated unless they had mastered philosophy, literature, political theory and history. The core role of higher education was to invite students into the millennia-spanning conversations about matters including what it means to be alive, the definition of justice and the tension between tyranny and democracy. Fostering engagement with these issues is still an essential part of the university’s function in society.

Read the entire piece at The Washington Post.

The humanities may be “the least risky way to prepare for employment in the 21st century economy”

Chris Gehrz, aka The Pietist Schoolman, makes another strong case for studying the humanities in college.

Here is a taste of his piece, “A Counterintuitive Economic Argument for Majoring in the Humanities.”

I know, I know: it seems risky to pick a major that doesn’t have an obvious pathway to a particular career. But hear me out…

First, you need to recognize that there may be a significant disconnect between your expectations for your kids and their actual working futures. If you’re a 40- or 50-something, you probably retain at least some sense of what it meant to grow up in an economy whose workers stayed in or close to one career, sometimes even at one or two employers, and retired at age 65. None of that is likely to be true for your child as she starts college in 2018.

On the other side of her college graduation is much less stability in employment at virtually every stage of a much longer work life. What else would you expect when life expectancy is increasing, technological and cultural change is accelerating, and both employers and employees seem to be interested in building a “gig economy” that doesn’t assume long-term working arrangements?

So while a college education remains one of the biggest investments of anyone’s life, it’s hard to know how best to use those expensive years to set someone up for future economic success. Do you encourage your child to pick a major because it aligns most closely with a career whose short-term employment prospects look good? You can… but they’ll risk joining a glut of increasingly similar candidates seeking jobs in a market whose bubble may well burst.

Instead, it might make longer-term sense to consider a major in a humanities field, for three reasons:

Read the entire piece here.

Episode 37: Should You Go to Grad School?

Anyone who has been paying attention to higher ed and the humanities knows that job prospects for recently minted Ph.D.s are abysmal. So why do people keep choosing to engage in such a difficult process that by many measures is unlikely to pay off? John Fea adds his thoughts to this question and is joined by Erin Bartram (@erin_bartram), the author of the viral blog post, “The Sublimated Grief of the Left Behind.”

Erin Bartram: “The Sublimated Grief of the Left Behind”

Mary Sanders Bracy (l) and Erin Bartram (r) at 2013 AHA in New Orleans

I am a big Erin Bartram fan.  We have been on a panel together.  She has written multiple posts here at The Way of Improvement Leads Home.  I have learned a lot from her about teaching.  Frankly, I can’t think of a person more deserving of a tenure-track teaching job in a college or university history department.

The academic profession needs to deal with her post about leaving academia.

Here is a taste:

It happened during AHA.

I was sitting at home, revising my manuscript introduction and feeling jealous of all of my historian friends at the conference, when I got an email telling me my last (and best) hope for a tenure-track job this year had evaporated.

I’d promised myself that this would be my last year on the market. Now, I’d promised myself that last year, and I’d decided to try again, but this time, I knew it was over.

I closed my laptop and walked out of my office. In that moment, I couldn’t bear to be surrounded by the trappings of a life that had just crumbled around me. The perfect reading lamp, the drawer of fountain pen ink, the dozens of pieces of scratch paper taped to the walls, full of ideas to pursue. The hundreds of books surrounding me, collected over nearly a dozen years, seemed like nothing more than kindling in that moment.

I cried, but pretty quickly I picked myself up and started thinking about the future. The circumstances of the job I didn’t get were particularly distressing, so I discussed it with non-academic friends, explaining over and over again that yes, this is the way my field works, and no, it wasn’t surprising or shocking to me, and no, I won’t be able to “come back” later, at least in the way that I’d want to, and yes, this was probably what was always going to happen. And then I started looking forward.

Only now do I realize how messed up my initial reaction was.

I was sad and upset, but I didn’t even start to grieve for several weeks, not because I hadn’t processed it, but because I didn’t feel I had the right to grieve. After all, I knew the odds of getting a tenure-track job were low, and I knew that they were lower still because I didn’t go to an elite program. And after all, wasn’t this ultimately my failure? If I’d been smarter, or published more, or worked harder, or had a better elevator pitch – if my brain had just been better, maybe this wouldn’t have happened. But it had happened, and if I were ultimately to blame for it, what right did I have to grieve?

Read the rest here.  Today I grieve with her.

Want to Get a Good Job and Be Happy?

Go to college and major in the humanities.

A recent study from the American Academy of Arts & Sciences is positive news for humanities students.  It reports on something we humanities folks already knew:  humanities majors get jobs, make good money, and live fulfilling lives.

Here is a taste of the report:

This report, based largely on original research commissioned by the American Academy of Arts and Sciences’ Humanities Indicators, examines a broader range of measures about holders of four-year bachelor’s degrees, including graduates’ satisfaction with their jobs, finances, and lives generally. The evidence shows that humanities graduates earn less and have slightly higher levels of unemployment relative to science and engineering majors. With respect to perceived well-being, however, humanities majors are quite similar to graduates from other fields. The data cannot explain the disparity between the objective and subjective measures, but they should provide a starting point for a more nuanced discussion about the relationship between field of undergraduate study, employment, and quality of life.

Learn more here.

More Good Reasons to Study the Humanities

These come from Ilana Gershon and Noah Berlatsky at The Pacific Standard.

Here is a taste of their piece “Studying Humanities Teaches You How to Get a Job.”

“If you’re studying interpretive dance, God bless you, but there’s not a lot of jobs right now in America looking for people with that as a skill set,” Kentucky governor Matt Bevin declared in September, at a conference about higher education. Bevin’s skepticism about the humanities and arts isn’t an anomaly; politicians regularly joke about the supposed uselessness of non-STEM training. In 2014, President Barack Obama told students to major in trades rather than art history. In 2011, Governor Rick Scott of Florida said that it wasn’t of “vital interest” to his state to have students major in anthropology. And so on. Math, engineering, science, trades: Those are practical, politicians agree. Literature, art, and anthropology? Those don’t help you get jobs.

In fact, the reverse is true: The skills you learn in the humanities are exactly the skills you use in a job search. The humanities teach students to understand the different rules and expectations that govern different genres, to examine social cues and rituals, to think about the audience for and reception of different kinds of communications. In short, they teach students how to apply for the kinds of jobs students will be looking for after college.

Read the rest here.

Hire a Humanities Major

Check out Scott Jaschik’s interview at Inside Higher Ed with Randall Stross, author of A Practical Education: Why Liberal Arts Majors Make Great Employees.  Stross has a Ph.D. in Chinese history from Stanford and currently teaches business at San Jose State University.  (Yes, you read that last sentence correctly).

Here is a taste of the interview:

Q: Many admissions leaders at liberal arts colleges report increasing difficulty in making the case for the liberal arts. What is your advice for them?

A: If it seems difficult to make the case now, imagine how difficult it would have been in the depths of the Great Depression, when the unemployment rate was 16 percent and headed for 24 percent and market demand for liberal arts majors had evaporated. The talk in the air was of the need for more vocational education. Yet William Tolley, in his inaugural address as the president of Allegheny College, did not falter. He made the case for a broad liberal education in 1931 whose contemporary relevance should hearten all of us who advocate for liberal education. “Specialists are needed in all vocations, but only as long as their vocations last, and vocations have a tendency now to disappear almost overnight,” he observed. He reasoned that in an ever-changing world the broad knowledge covered at a liberal arts college is “the finest vocational training any school can offer.” The argument is no less powerful today. But to make it seem well grounded, admissions leaders should have at their fingertips stories to share of graduates who left their schools with liberal arts majors and have gone on to interesting professional careers.

Q: Politicians seem to love to bash the liberal arts, asking why various majors are needed. How should educators respond?

A: Many politicians — perhaps most politicians — view the labor marketplace in terms defined entirely by “skills”: employers need workers equipped with specific skills; students either arrive with those skills or lack those skills. This is new, historically speaking. In a bygone era, 60 years ago, many large corporations hired college graduates in bulk, paying little heed to their majors, and spent the first years training the new hires themselves. So the defense of the liberal arts today must be delivered using the vocabulary of “skills.” Fortunately, conscientious students in the liberal arts can demonstrate great skill in many things: learning quickly, reading deeply, melding information from diverse sources smoothly, collaborating with others effectively, reasoning logically, writing clearly. I will resist the temptation to point out the apparent absence of these skills among those who are doing the bashing.

Read the rest here.

The Next Step in the Humanities “Counterattack” is “Translation”

In my book Why Study History?: Reflecting on the Importance of the Past, I wrote:

But there are also larger issues that history teachers and professors, and school and college administrators, must confront if they want to be effective career counselors.  For example, we must equip students to be confident in the skills that they have acquired as history majors….Rather than apologizing to potential employers about being history majors, our students should enter job interviews boldly, discussing their abilities to write, communicate, construct narratives out of small details, listen, empathize, analyze, and think critically.  As Stanton Green, a humanities administrator notes, “People find jobs where they look for jobs.”  We need to instill our students with confidence.  The ability to do this must somehow be embedded in a history department curriculum.

Over at Inside Higher Ed, Emily Levine and Nicole Hall of the University of North Carolina at Greensboro describe this process as “translation.”  Here is a taste of their piece:

After years of being on the back foot, the humanities have launched a counterattack. A shelf of new books, including Scott Hartley’s The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World (Houghton Mifflin, 2017) and Gary Saul Morson and Morton Schapiro’s Cents and Sensibility: What Economics Can Learn From the Humanities (Princeton, 2017), attest to the usefulness of the humanities for the 21st-century job market. Their fresh message makes the old creed that the humanities are a “mistake” or not “relevant” seem out of touch. Surveying these works in the July-August 2017 issue of Harvard Business Review, J. M. Olejarz dubs this countermovement “the revenge of the film, history and philosophy nerds….”

But where we go from here requires the hard work of identifying just what is the common denominator being learned in the humanities and how to parlay that knowledge and those skills into professional success. How do you apply Virginia Woolf to write better code or marshal your skills conjugating Latin verbs to execute an IPO?

At the University of North Carolina Greensboro, we have taken the next step of improving career outcomes for our students in the humanities by implementing the Liberal Arts Advantage, a strategy that articulates the value of the humanities to students, their parents and the community.

Directors of career development are realizing that they can’t do this work alone. They must engage faculty as their partners.

Jeremy Podany, founder, CEO, and senior consultant of the Career Leadership Collective, a global solutions group and network of innovators inside and near to university career services, says that helping faculty teach career development is part of the job. “I actually think we need to go to the faculty and say, ‘Let me teach you how to have a great career conversation,’” said Podany. The relationship between faculty members and career development offices — experts in the humanities and careers — is essential to preparing students for the job market.

Why? Because the central issue in realizing a long-term strategy for student career development is translation. That is, how students translate the skills they learn in the classroom into workplace success. This is particularly true in the case of the metacognitive skills that professors in the humanities can, and should, help contribute in their students.

Read the entire piece here.

Neem: “The STEM rubric undermines the unity between the humanities and sciences.”

Kentucky governor Matt Bevin

Back in June, we published a post on Kentucky governor Matt Bevin‘s endorsement of a bill allowing the Bible to be taught in the state’s public schools.  I later published a shorter version of this post at Religion News Service.

Governor Bevin is back in the news after he said that the state’s public universities should cut programs that are not “helping to produce” a “21st century educated workforce.”  Bevin urged university administrators in his state to “find entire parts of your campus…that don’t need to be there.”  He singled out “Interpretive Dance.”  Back in January, he singled out “French Literature.”  Bevin wants to put money and energy into growing engineering and other STEM programs at Kentucky universities. Ironically, according to Inside Higher Ed‘s coverage of Bevin’s remarks, the governor has an East Asian studies degree from Washington and Lee University.

Sadly, the interim president of the University of Louisville, Dr. Greg Postel, seems to agree with the governor. Postel told the Lexington Herald-Leader that his university’s engineering program is growing, making Bevin’s ideas for funding more STEM initiatives a “natural fit” at Louisville.  “Universities have to be aware of where the jobs are,” he told the Herald-Leader, “and that has to advise us as to which programs we choose to grow and put our resources in.”  If I were a humanities or liberal arts faculty member at Louisville, I would be up in arms right now.  Postel has no clue about two things: 1) college education is more than job training, and 2) liberal arts majors contribute to the economy and do a variety of jobs.

Check out Inside Higher Ed‘s coverage here.  It includes several faculty members who have pushed back.

Western Washington University historian Johann Neem is not mentioned in the Inside Higher Ed article, but back in February he responded to Bevin’s earlier comments on STEM. Neem believes that “science” should not be part of the STEM equation.  As he puts it, “The STEM rubric undermines the unity between the humanities and sciences.”

Here is a taste of his piece at the blog of the University of Wisconsin-Madison’s School of Education:

In theory, there are two major faculties on American college campuses, those who teach in the liberal arts and sciences, and those who offer professional education in such fields as business, education, engineering, social work, and various health fields. The two types of faculties are not necessarily in opposition, but they have different missions because they are oriented toward different goals.

To faculty in the arts and sciences, undergraduate education is liberal in nature — it is about gaining a broad knowledge about how the human and natural worlds work, because doing so can inspire students and because it serves a broader public good to have well-educated adults. Ideally, and often, there is no specific vocational outcome to these majors. In fact, to ask a history, English, biology, or geology major, “What are you going to do with that?” ought to be irrelevant since these are academic disciplines designed for academic purposes. When majors were first established, their goal was not job training but to offer intellectual depth and balance or, better put, to enhance a general education. Thus, majors in the arts and sciences exist for their educational purposes with no real or necessary relation to market needs.

Professional faculty, on the other hand, train people for specific jobs. Their success is measured by whether their students gain the knowledge and skills necessary for employment in specific fields. Students who major in engineering, for example, are right to ask their programs, “What can I do with that?” Moreover, students who choose to major in these fields may not receive the same kind of liberal education as those in the arts and sciences. Instead, they seek a direct line to employment. These fields, in other words, are tied closely to market needs.

The rhetoric of “STEM” (Science, Technology, Engineering, and Math) seeks to professionalize science faculty by reorienting their core community of identity. The sciences are not job training but part of liberal education. Math is a humanistic pursuit. Ideally, faculty and students in the sciences and math have different goals, perspectives, and aspirations than those in engineering and technology-related fields. Traditionally, science and math faculty have identified themselves with the broader purposes of the liberal arts, of which they are a part.

The more we use the term STEM — in praise, condemnation, or simply as a descriptor — the more we divide the arts and sciences faculty from each other. The arts and sciences exist as the educational core of the undergraduate collegiate curriculum. They are tied together conceptually. There is in fact no difference, from the perspective of liberal education, in choosing to major in philosophy or chemistry. Faculty in both disciplines, in all the arts and sciences, believe in the value of intellectual pursuit, in fostering curiosity about the world, and in graduating students who have breadth and depth. Yet, increasingly on campuses across the United States, colleges of arts and sciences are dividing into two units, the humanities and social sciences in one, and the sciences and math in another.

Neem concludes:

The STEM rubric undermines the unity between the humanities and sciences. For many policymakers, this is no doubt desirable. Yet, if faculty in the sciences and mathematics are not careful about how they identify themselves, they will be party to the erosion of the ideal of liberal learning, of which they remain an essential part. If faculty in the humanities and social sciences are not careful, they will find themselves marginalized as the sciences abandon liberal education to join forces with market-driven technology and engineering programs. If Americans are not careful, we will soon find that we have fundamentally changed the purposes and goals of collegiate education.

Read Neem’s entire piece here.

Should Young Academics Be On Twitter?

Oliver Bateman, a historian and journalist, explores this question over at The Atlantic.

Here is a taste:

Scholarly research has lent credence to anecdotal claims about social media’s growing importance as a networking tool for academics at all stages of their careers. In a 2012 paper that represented one of the first systematic studies of social media’s impact on academia, George Veletsianos, a professor at Royal Roads University in British Columbia, analyzed the usage patterns of academics. He concluded that “the participation observed on Twitter presents opportunities for … scholarly growth and reflection,” though it was still too early to make a definitive statement about what that might entail. (He also noted, rather tellingly, that “online practices may not be valued or understood by peers and academic institutions even though scholars themselves may have found scholarly value in participating in online spaces.”)

Four years later, the researchers Charles Knight and Linda Kaye evaluated the social-media practices of academics at a large university, determining that these academics’ “use of the [Twitter] platform for enhancing reputation is an implied acknowledgement of the importance of research within higher education and the increasingly public engagement agenda.” Professors on the campus they studied were far more likely to use Twitter for this purpose than they were for pedagogical reasons: “Academics want to use Twitter to inform the wider community of their activities rather than engage their students.” Networking, it seems, is one of social media’s principal purposes for those in academia.  

“Twitter is great for academic networking, because it can be an awesome way for introverts and people who aren’t already in close proximity with the people they want to talk with to start building genuine relationships,” said Jennifer Polk, a friend and academic and career coach who runs the From PhD to Life website. “Of course, it’s all public [unless you adjust your security settings], so you should be professional—whatever that means in your field. And I recognize that in this context, ‘professional’ is a loaded term.”

Read the rest here.

I think Twitter, Facebook, blogs, and other social media sites are great resources for networking, sharing ideas, and raising questions.  (Perhaps this is simply stating the obvious at this point in my career). Graduate students and young academics should be using them for these purposes.

But I also think graduate students and young academics should always remember that while social media is a very democratic space, academia is not.  Academic life, in order to function properly, must have some degree of hierarchy based on expertise and experience.  In other words, a young scholar who submits a journal article or book for review will inevitably have a senior scholar evaluate the manuscript and make a decision on it.  Senior scholars at colleges and universities will often have a lot to say about who gets hired in their departments.  In the course of searches for academic appointments and fellowships that have residency requirements, the search committee will often contact outside scholars who might be familiar with the candidate’s work and sense of collegiality.  And yes, I have been asked about a job or fellowship candidate’s sense of collegiality based on their social media presence.  It has actually happened more than once.

I entertain several of these requests a month.  I have even been in a position where a person argued with me on Twitter in a very unprofessional way and then applied for a job in my history department.  When I saw the application I went back to review the series of tweets this person had written, but they were deleted.  This person did not get the job.  There were stronger applicants in the pool that better served the needs of our department.  But I would be lying if I said that this Twitter exchange did not influence the way I thought about this person’s application. And I can tell a host of other stories like this from other committees on which I have served.

In the best of all possible worlds, decisions about publishing and teaching jobs should be made entirely on the merits of a candidate’s scholarship or teaching, but we do not live in the best of all possible worlds.  Young academics should have this in mind whenever they tweet or post.  I am often amazed when I see graduate students picking fights on Twitter or Facebook with senior people who one day might have to make a decision about the course of their future career.  Hopefully, for the sake of the candidate, that senior scholar will lay aside their memory of these social media exchanges and judge the candidate on the merits of their work.  But to do so requires a superior degree of discipline and professionalism.

Episode 21: Why We Need More Historians in the Silicon Valley

The liberal arts vs. STEM. A degree in the humanities vs. a degree in business. The current conversation around higher education consistently pits the study of history, philosophy, or English against more “practical” pursuits like engineering or computer science. But both data and the insights of business leaders tell us that this is a false dichotomy. Host John Fea and producer Drew Dyrli Hermeling discuss the value of the liberal arts within both the current economic and political climate. They are joined by venture capitalist Scott Hartley (@scottehartley), author of The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World.

Want to Be a Doctor? Study the Humanities

George Mellgard is graduating from Duke University next month with a degree in Classics.  It sounds like he is heading to medical school.

Here is a taste of his story from the pages of the Duke student newspaper, The Chronicle.

When I first came to Duke, I knew I wanted to take the pre-medicine route. I enjoyed both sciences and the humanities and thought the path of a doctor was most suited for exploring my interests. Therefore, I decided that, like the model pre-medicine student, I wanted to major in Neuroscience or Biology.

It was not until second semester that a friend of mine challenged me to take a risk. At the time, I was taking Roman History and had expressed to him my excitement at taking courses in the Classics. He suggested that rather than major in the sciences, I pursue a non-traditional major: that I consider majoring in the Classics.

While I first wrote him off as being idealistic, I eventually found that this non-traditional route was not merely a possibility, but something that I actually wanted to pursue. The rest will hopefully be history, since I hope to graduate in less than a month with a major in the Classics and having fulfilled the necessary requirements for pre-medicine.

Majoring in the Classics has provided me with an experience entirely different than the sciences. It has taught me to look to the past to inform myself about our present situation. It has also provided me with guiding principles not only for learning but also for how to live life. However, my college academic path could have very easily taken a different route had I not been pushed to seek something different.

For this column I want to talk about the ways in which we choose classes and learn at this university. After all, while I value my major, it does not directly relate to my chosen professional path. For many, classes serve not as a way to learn but as a means to an end. We take classes so that we can get that easy A, fulfill that ALP or QS and to ensure that we are able to get that pre-professional summer internship.

In fact, ask almost any student here about why they are taking their current classes and you will get something along the lines of “because I have to” or “because I should.” Less often than not will the answer be because “I want to.”

While it is important for us to consider our future careers and overall performance, we turn to the more nuanced purpose of learning today. What I hope tomorrow’s students avoid is picking a predetermined path or class for the sake of their future—for dodging risks or the Math Department in favor of the clearer and defined road.

Taking this road is understandable. After all, there will always be the sense of dread or the lingering suspicion that taking the risky class is a luxury just not afforded to the average Duke student. But taking that easy path prevents us from experiencing our authentic self, or even worse prevents us from discovering what might be our true calling in life.

Read the entire piece here.

23 Jobs for History Majors

I just came across a really interesting website titled “Sell Out Your Soul: A Career Guide for Lost Humanities Majors.”  It is run by James Mulvey, a former English student who now works at a global software company.  He started the site to “inspire others to run from the culture of fear, isolation, and single-mindedness that keeps many graduate students from finding employment outside of academia.”

Here is a taste of a post titled “23 of the Best Jobs for History Majors“:

If you’re wondering what careers are available for History majors, you’ve come to the right place. I’ve collected 23 of the best jobs for History majors—careers that pay well, complement the skills taught in History departments and have long-term growth.

Despite the lies you’ve been told from the annoying Engineering major or clueless Business major, History majors end up in a variety of interesting places.

So pour yourself a beer. Roll up your sleeves. And let’s take a fast tour of the best careers for History majors.

The point of the list isn’t to tell you the exact steps to get these careers. That would be a long post and I cover that in my book. Use this list to decide on a general direction. Then go and search those careers on the following sites: Glassdoor, LinkedIn advanced search, Twitter advanced search, and Reddit. This will give you a realistic view of what your day-to-day would be like and whether this career would be a good match for you.

The jobs include Exhibit Designer, Content Creator, Customer Success Manager, Business Analyst, Growth Hacker, Product Marketing, PR Manager, Internal Communications, Content Strategist, Web Developer, Journalist, Project Manager, Social Media Manager, Content Editor, Research Analyst at Think Tanks, Political Campaign Manager, Government work.

Read how James connects the skills of history majors to these jobs.

“The Myth of the Unemployable History Major Must Be Destroyed”

This is the title of a great post at “One Thing After Another,” the blog of the History Department at Saint Anselm College in New Hampshire.  Here is a taste:

History classes stress the analysis of various media—usually texts but also sources like film, music, painting, and so on. History majors ask and answer questions such as, “Who produced this source?” “Why did she produce it?” and “Under what circumstances was this source produced?” Ours is a reading-intensive discipline because reading is the only way to become practiced at this sort of thing. Doing this kind of work requires the development of analytical skills that lead students to sharpen their judgment. They come to understand what is likely or what is true. At the same time, they are required to synthesize a great deal of material to form a comprehensive picture of how people, places, and things have worked in the past—and how they may work in the future. They are then prepared to answer questions such as, “Why did this happen?” and “How did it occur?” What’s more, students in History are compelled by the nature of the discipline to articulate their thoughts in a systematic and compelling manner, both through discussion and on paper. In addition to being a reading-intensive discipline, we are also a writing-intensive one. Finally, the study of history leaves students with an enormous amount of cultural capital. Among other things, they encounter great literature, music, painting, movies, and rhetoric.  At the same time, they also learn about important events and noteworthy civilizations that we should all know something about—such as Han China, the French Revolution, the Zulu Kingdom, the Progressive Era in America, and World War II. Students educated in this fashion thus add to their stock of experience which helps them confront the challenges of the present.

To summarize, the course of study that History majors undergo provides them with high-level analytical skills, a capacity to synthesize large chunks of information, and an ability to present logical arguments in a persuasive fashion. Not only that, but their training offers them knowledge that helps them navigate and understand the world. These are the kind of attributes employers are looking for even in an age where STEM seems to be king (see here, here, here, here, here, and here—you get the idea).

We know these things to be true because we see what happens to our own majors after they graduate from Saint Anselm College. Our department recently surveyed alums who graduated between 2012 and 2015 with a degree in History. We determined that out of the three-quarters who responded to the survey, 100% were employed or attending graduate school. We also found they attained success in a wide variety of fields, most of which have nothing to do with history. For sure, we always have a number of students who double-major in history and secondary education. We are proud of these students, many of whom are high achievers; in 2014 and 2015, the winner of the Chancellor’s Award for the highest GPA in the graduating class was a history major who went on to teach. And yes, we also have a small number of graduates who go on to work in history-related fields (see here and here). But around 75% of our graduates are scattered among a wide range of other jobs.

Recently, One Thing after Another engaged in the exercise of naming all the positions held by History alumni whom the blog personally knows. This list is obviously not scientific; other members of the History Department know different alums who hold even more positions. Yet what follows ought to give the reader a sense of the wild diversity of jobs open to those who major in History. One Thing after Another knows many history majors who have gone on to law school and have since hung out their shingle as attorneys. Many of our alumni also work for the FBI, the CIA, and the DHS. Others have found employment as police officers and state troopers. We have a number of alumni who currently serve as commissioned officers in the armed forces. Many have gone into politics, serving as lobbyists, political consultants, legislative aids, and town administrators. Others have been on the staffs of governors and mayors. Large numbers work in sales for a variety of industries. We have managers at investment firms and folks who work on Wall Street. Other history majors this blog knows are in the health insurance business, serve as economic consultants, hold positions in import-export businesses, have become construction executives, and work in public relations. They have also become dentists, software engineers, filmmakers, nurses, social workers, journalists, translators, college coaches, and executive recruiters. Some work in the hospitality industry as the managers of resorts, hotels, and convention centers. Others are to be found on college campuses as administrators, financial aid officers, reference librarians, and so on. And then there are the archivists, curators, and museum staffers. Remember, this list (which was compiled in a somewhat off-hand manner) is not exhaustive. It only consists of alumni whom One Thing after Another knows personally. There are many other history alums out there doing even more things.

Read the entire post here.

Let’s try to keep chipping away at this myth.  We at The Way of Improvement Leads Home have been trying to do our part through our “So What CAN You Do With a History Major?” series and several chapters in Why Study History?: Reflecting on the Importance of the Past.


“Stale Ph.Ds” and “Overqualified” Applicants

This morning Michael Bowen is back with more insight on the academic job market in history.  As you now know, Michael has been writing for us from the 2017 Annual Meeting of the American Historical Association in Denver.  I think this post raises some very important points about the hiring process in history departments around the country.  Read all of Michael’s posts from the 2017 AHA here. –JF

After studying the job market for well over a decade, I have noticed some clear, systemic patterns. It might be too strong to call them biases, but their cumulative effect is to disqualify many good applicants right from the start. These observations will come over two blog posts, in the hope that interested search committee members might at least become more cognizant of them and that job seekers can be prepared and make smart decisions regarding publishing.

Some caveats are in order first. These are qualitative, not quantitative. I don’t have a spreadsheet in front of me crunching the statistics for every hire in the last decade. I am open to arguments that they may be unique to my situation, and I am sure that there are exceptions to every rule. These are also only valid for the initial screen, where committees determine their AHA/Skype lists. Since the majority of applicants for any given job never make it to the first interview, these decisions are the most crucial.

For this post, I want to focus on time from degree. The prevailing wisdom seems to be that job candidates from ABD to about four years from their defense date are hirable, while everyone beyond that is not. There are exceptions…I know of one person who went on the tenure track for the first time after ten years…but those hired after an extended time as contingent faculty are in the minority. The higher ed press refers to these individuals as “stale PhDs,” which is incredibly insulting and implies that good academic work can only be accomplished in the dissertation stage or on the tenure track. Controversy erupted a few years ago when a couple of English departments posted ads that explicitly required a degree received within the previous three years.  History has not been so brazen, but we have a similar bias.

Maybe once, long ago, before postdocs were readily available and the academy shifted the burden of instruction to adjuncts and lecturers, it made sense to make a “first cut” of applicants based on time to degree. Now, with individuals stringing together years and years of contingent appointments and producing good scholarship in the meantime, it seems unwise to do so. My dissertation director always told my cohort that as long as we could add something substantive to our vita every year, we would be fine. That has been my goal, which has been met eleven out of eleven years. He never envisioned a scenario where I would need to do that for eleven years, but his advice is still good. I would argue that such a benchmark would be a better measure of a candidate’s employability than an arbitrary line on the calendar.

However, the aforementioned measure for success runs counter to the second pattern prevalent in today’s job market: the “overqualified” applicant. With so many people finding survivable, contingent employment for extended periods of time, more and more applicants are going for assistant professor lines with books in hand and a significant number of courses under their belts. In theory, this should be a good thing…you can bring in a new faculty member whom you do not have to train and who needs little prep time. But it goes against the old idea that faculty lines are apprenticeships. An assistant professor must learn the ropes from their colleagues and, when deemed sufficiently qualified, be granted tenure. If someone exceeds those requirements from the start, should they be hired? In most cases, search committees say no.

Historians who are working on an extended contingent faculty track find themselves treading a fine line. Do you hold off publishing a book because it could hurt you on the job market, or do you go ahead and publish because it is ready? My first inclination is to say publish, but I have lost enough jobs (both VAPs and TT) to individuals with a single journal article or a handful of book reviews to question whether or not I should have published mine before I had a tenure-track line.

If you are a search committee member, do you see a book as a sign that an applicant will produce no further research of merit? It is a valid question. We all know of professors who have an early burst of scholarly productivity, get tenure, and then coast for the next thirty years. There may not be an answer to this, but it would be nice if we would get past the traditional expectations for a hire and take into account how academia has changed. Committees should factor in both logged experience and future potential.

Reflections on the Academic Job Search

We are very happy to have William S. Cossen writing for The Way of Improvement Leads Home this weekend from the 2017 Annual Meeting of the American Historical Association. William defended his dissertation, “The Protestant Image in the Catholic Mind: Interreligious Encounters in the Gilded Age and Progressive Era,” in October and graduated with a PhD in history from Penn State University in December. (Congratulations!). He is a faculty member of The Gwinnett School of Mathematics, Science, and Technology in Lawrenceville, Georgia.  Below you will find some of his reflections on Day 1 of the AHA.–JF

Greetings from sunny Denver!

Well, a little wishful thinking can’t hurt.

After a smooth flight from Atlanta and a scenic train ride from Denver International Airport to the city, I made it safely to AHA 2017.  I’ll be presenting on Saturday at the American Catholic Historical Association’s conference as part of a panel titled “Catholicism and Americanism in the 19th Century: New Perspectives on an Old Debate.”  It will be nice to have so much time before delivering my own paper to enjoy the rest of the conference.

Following a quick, efficient check-in process (thank you, AHA!), I made my way to my first panel of the conference, “Deciphering the Academic Job Search,” which was sponsored by the AHA’s Professional Division.  With the market seemingly getting tighter every year, I was eager to hear opinions on the process from a recent candidate, a search committee member, and an academic dean.

The recurring themes I picked up in all three presentations were the necessity of flexibility and the need for candidates to be able to compellingly present their research – specifically providing a clear answer to the “so what?” question, a skill which is also useful in academic publishing and grant writing – to those outside their fields.

The first presenter, Ava Purkiss of the University of Michigan, provided helpful advice for how candidates can make themselves stand out in the initial stages of the job search process.  One tip was for candidates to shop their job materials around widely before applying, not only among their advisors and committee members but also among other professors and graduate students.  A second tip was for candidates to seek out search committees’ evaluation and scoring criteria for job applications.  This might not be easy to find, but Dr. Purkiss mentioned an example of one university posting this information online publicly.  A final piece of advice, which is especially useful in an era of online applications, was to print out all components of the application before submitting them to search committees to find and fix any glaring errors.

The second presenter, Paul Deslandes of the University of Vermont, counseled prospective job candidates to be self-reflective.  He urged job seekers to answer an important question: What do you really want out of academia?  He noted importantly that if one does not see themselves enjoying teaching, then academia is probably not a good fit.  Dr. Deslandes emphasized one of the panel’s key themes, which was that job seekers need to learn how to communicate their research to departments in their entirety, or as he put it, “Speak the language of other people.”  Regarding job opportunities, he encouraged those on the job market to “be expansive.”

The final presenter, Catherine Epstein of Amherst College, offered practical advice for the all-important cover letter: the letter must make clear “why your work is interesting.”  While Dr. Epstein noted that candidates are not expected to write a brand new cover letter for each job, the letters need to be tailored to specific schools.  Responding directly to the job requirements found in a job advertisement demonstrates true interest in the position and shows search committees that a candidate has actually attempted to learn about the institution to which they are applying.

The question-and-answer session following the presentations reflected some of the larger anxieties of the current history job market, but I think that panel chair Philippa Levine’s reminder that this is very much an impersonal process is an important point for job seekers to take to heart, as difficult as that may be, if they are disappointed by the outcome of their search for employment in academia.  One essential fact is that the number of job seekers far outstrips the number of available tenure-track positions.  However, these sorts of panels do a good service for the profession by partially demystifying what is for many an often confusing, frequently disappointing process.

I’m excited for Friday’s full schedule of sessions – and, of course, also for the book exhibit.  As with other conferences of this size, I have upwards of ten panels which I would like to see simultaneously.  This is ultimately not a bad problem to have.  More to come!