The Humanities Will Set You Up for Life

More reasons to consider that humanities degree.

Here is a taste of Amanda Ruggeri’s article at the BBC website:

George Anders is convinced we have the humanities in particular all wrong. When he was a technology reporter for Forbes from 2012 to 2016, he says Silicon Valley “was consumed with this idea that there was no education but Stem education”.

But when he talked to hiring managers at the biggest tech companies, he found a different reality. “Uber was picking up psychology majors to deal with unhappy riders and drivers. Opentable was hiring English majors to bring data to restaurateurs to get them excited about what data could do for their restaurants,” he says.

“I realised that the ability to communicate and get along with people, and understand what’s on other people’s minds, and do full-strength critical thinking – all of these things were valued and appreciated by everyone as important job skills, except the media.” This realisation led him to write his appropriately titled book You Can Do Anything: The Surprising Power of a “Useless” Liberal Arts Education.

Take a look at the skills employers say they’re after. LinkedIn’s research on the most sought-after job skills by employers for 2019 found that the three most-wanted “soft skills” were creativity, persuasion and collaboration, while one of the five top “hard skills” was people management. A full 56% of UK employers surveyed said their staff lacked essential teamwork skills and 46% thought it was a problem that their employees struggled with handling feelings, whether theirs or others’. It’s not just UK employers: one 2017 study found that the fastest-growing jobs in the US in the last 30 years have almost all specifically required a high level of social skills.

Or take it directly from two top executives at tech giant Microsoft who wrote recently: “As computers behave more like humans, the social sciences and humanities will become even more important. Languages, art, history, economics, ethics, philosophy, psychology and human development courses can teach critical, philosophical and ethics-based skills that will be instrumental in the development and management of AI solutions.”

Read the entire piece here.

Take the Humanities Course!

Here is Ronald Daniels, president of Johns Hopkins University:

Last fall, on the campus of Johns Hopkins University, where I serve as president, I happened to overhear a conversation among a group of students. One student was telling the others that he had decided not to enroll in an introductory philosophy course that he had sampled during the “add/drop” period at the start of the semester. The demands of his major, he said, meant that he needed to take “practical” courses. With an exaggerated sigh, he mused that “enlightenment” would simply have to wait. For now, employability was paramount. What can you do? His friends shrugged. You gotta get a job.

The students’ conversation has stayed with me, in part because it fits into a larger, disconcerting narrative about the role of the humanities in higher education. In a time of dizzying technological achievement and of rapid scientific innovation, skeptics of the humanities may question the usefulness of studying Aristotle, the history of the Italian Renaissance or modern Chinese fiction. At many universities across the country, beset by low enrollments and a lack of university support, the numbers of humanities course offerings and faculty members are dwindling. At meetings of university presidents, the humanities are frequently referred to as the “fragile disciplines.”

In hindsight, I regret not barging into the conversation of that student I overheard to argue for taking that introductory philosophy course. I would have started by reminding him that, for much of America’s history, college graduates were not deemed truly educated unless they had mastered philosophy, literature, political theory and history. The core role of higher education was to invite students into the millennia-spanning conversations about matters including what it means to be alive, the definition of justice and the tension between tyranny and democracy. Fostering engagement with these issues is still an essential part of the university’s function in society.

Read the entire piece at The Washington Post.

The humanities may be “the least risky way to prepare for employment in the 21st century economy”

Chris Gehrz, aka The Pietist Schoolman, makes another strong case for studying the humanities in college.

Here is a taste of his piece, “A Counterintuitive Economic Argument for Majoring in the Humanities.”

I know, I know: it seems risky to pick a major that doesn’t have an obvious pathway to a particular career. But hear me out…

First, you need to recognize that there may be a significant disconnect between your expectations for your kids and their actual working futures. If you’re a 40- or 50-something, you probably retain at least some sense of what it meant to grow up in an economy whose workers stayed in or close to one career, sometimes even at one or two employers, and retired at age 65. None of that is likely to be true for your child as she starts college in 2018.

On the other side of her college graduation is much less stability in employment at virtually every stage of a much longer work life. What else would you expect when life expectancy is increasing, technological and cultural change is accelerating, and both employers and employees seem to be interested in building a “gig economy” that doesn’t assume long-term working arrangements?

So while a college education remains one of the biggest investments of anyone’s life, it’s hard to know how best to use those expensive years to set someone up for future economic success. Do you encourage your child to pick a major because it aligns most closely with a career whose short-term employment prospects look good? You can… but they’ll risk joining a glut of increasingly similar candidates seeking jobs in a market whose bubble may well burst.

Instead, it might make longer-term sense to consider a major in a humanities field, for three reasons:

Read the entire piece here.

Episode 37: Should You Go to Grad School?

Anyone who has been paying attention to higher ed and the humanities knows that job prospects for recently minted Ph.D.s are abysmal. So why do people keep choosing to engage in such a difficult process that by many measures is unlikely to pay off? John Fea adds his thoughts to this question and is joined by Erin Bartram (@erin_bartram), the author of the viral blog post, “The Sublimated Grief of the Left Behind.”

Erin Bartram: “The Sublimated Grief of the Left Behind”

Mary Sanders Bracy (l) and Erin Bartram (r) at 2013 AHA in New Orleans

I am a big Erin Bartram fan.  We have been on a panel together.  She has written multiple posts here at The Way of Improvement Leads Home.  I have learned a lot from her about teaching.  Frankly, I can’t think of a person more deserving of a tenure-track teaching job in a college or university history department.

The academic profession needs to deal with her post about leaving academia:

Here is a taste:

It happened during AHA.

I was sitting at home, revising my manuscript introduction and feeling jealous of all of my historian friends at the conference, when I got an email telling me my last (and best) hope for a tenure-track job this year had evaporated.

I’d promised myself that this would be my last year on the market. Now, I’d promised myself that last year, and I’d decided to try again, but this time, I knew it was over.

I closed my laptop and walked out of my office. In that moment, I couldn’t bear to be surrounded by the trappings of a life that had just crumbled around me. The perfect reading lamp, the drawer of fountain pen ink, the dozens of pieces of scratch paper taped to the walls, full of ideas to pursue. The hundreds of books surrounding me, collected over nearly a dozen years, seemed like nothing more than kindling in that moment.

I cried, but pretty quickly I picked myself up and started thinking about the future. The circumstances of the job I didn’t get were particularly distressing, so I discussed it with non-academic friends, explaining over and over again that yes, this is the way my field works, and no, it wasn’t surprising or shocking to me, and no, I won’t be able to “come back” later, at least in the way that I’d want to, and yes, this was probably what was always going to happen. And then I started looking forward.

Only now do I realize how messed up my initial reaction was.

I was sad and upset, but I didn’t even start to grieve for several weeks, not because I hadn’t processed it, but because I didn’t feel I had the right to grieve. After all, I knew the odds of getting a tenure-track job were low, and I knew that they were lower still because I didn’t go to an elite program. And after all, wasn’t this ultimately my failure? If I’d been smarter, or published more, or worked harder, or had a better elevator pitch – if my brain had just been better, maybe this wouldn’t have happened. But it had happened, and if I were ultimately to blame for it, what right did I have to grieve?

Read the rest here.  Today I grieve with her.

Want to Get a Good Job and Be Happy?

Go to college and major in the humanities.

A recent study from the American Academy of Arts & Sciences is positive news for humanities students.  It reports on something we humanities folks already knew:  humanities majors get jobs, make good money, and live fulfilling lives.

Here is a taste of the report:

This report, based largely on original research commissioned by the American Academy of Arts and Sciences’ Humanities Indicators, examines a broader range of measures about holders of four-year bachelor’s degrees, including graduates’ satisfaction with their jobs, finances, and lives generally. The evidence shows that humanities graduates earn less and have slightly higher levels of unemployment relative to science and engineering majors. With respect to perceived well-being, however, humanities majors are quite similar to graduates from other fields. The data cannot explain the disparity between the objective and subjective measures, but they should provide a starting point for a more nuanced discussion about the relationship between field of undergraduate study, employment, and quality of life.

Learn more here.

More Good Reasons to Study the Humanities

These come from Ilana Gershon and Noah Berlatsky at Pacific Standard.

Here is a taste of their piece “Studying Humanities Teaches You How to Get a Job.”

“If you’re studying interpretive dance, God bless you, but there’s not a lot of jobs right now in America looking for people with that as a skill set,” Kentucky governor Matt Bevin declared in September, at a conference about higher education. Bevin’s skepticism about the humanities and arts isn’t an anomaly; politicians regularly joke about the supposed uselessness of non-STEM training. In 2014, President Barack Obama told students to major in trades rather than art history. In 2011, Governor Rick Scott of Florida said that it wasn’t of “vital interest” to his state to have students major in anthropology. And so on. Math, engineering, science, trades: Those are practical, politicians agree. Literature, art, and anthropology? Those don’t help you get jobs.

In fact, the reverse is true: The skills you learn in the humanities are exactly the skills you use in a job search. The humanities teach students to understand the different rules and expectations that govern different genres, to examine social cues and rituals, to think about the audience for and reception of different kinds of communications. In short, they teach students how to apply for the kinds of jobs students will be looking for after college.

Read the rest here.

Hire a Humanities Major

Check out Scott Jaschik’s interview at Inside Higher Ed with Randall Stross, author of A Practical Education: Why Liberal Arts Majors Make Great Employees.  Stross has a Ph.D. in Chinese history from Stanford and currently teaches business at San Jose State University.  (Yes, you read that last sentence correctly).

Here is a taste of the interview:

Q: Many admissions leaders at liberal arts colleges report increasing difficulty in making the case for the liberal arts. What is your advice for them?

A: If it seems difficult to make the case now, imagine how difficult it would have been in the depths of the Great Depression, when the unemployment rate was 16 percent and headed for 24 percent and market demand for liberal arts majors had evaporated. The talk in the air was of the need for more vocational education. Yet William Tolley, in his inaugural address as the president of Allegheny College, did not falter. He made the case for a broad liberal education in 1931 whose contemporary relevance should hearten all of us who advocate for liberal education. “Specialists are needed in all vocations, but only as long as their vocations last, and vocations have a tendency now to disappear almost overnight,” he observed. He reasoned that in an ever-changing world the broad knowledge covered at a liberal arts college is “the finest vocational training any school can offer.” The argument is no less powerful today. But to make it seem well grounded, admissions leaders should have at their fingertips stories to share of graduates who left their schools with liberal arts majors and have gone on to interesting professional careers.

Q: Politicians seem to love to bash the liberal arts, asking why various majors are needed. How should educators respond?

A: Many politicians — perhaps most politicians — view the labor marketplace in terms defined entirely by “skills”: employers need workers equipped with specific skills; students either arrive with those skills or lack those skills. This is new, historically speaking. In a bygone era, 60 years ago, many large corporations hired college graduates in bulk, paying little heed to their majors, and spent the first years training the new hires themselves. So the defense of the liberal arts today must be delivered using the vocabulary of “skills.” Fortunately, conscientious students in the liberal arts can demonstrate great skill in many things: learning quickly, reading deeply, melding information from diverse sources smoothly, collaborating with others effectively, reasoning logically, writing clearly. I will resist the temptation to point out the apparent absence of these skills among those who are doing the bashing.

Read the rest here.

The Next Step in the Humanities “Counterattack” is “Translation”

In my book Why Study History?: Reflecting on the Importance of the Past, I wrote:

But there are also larger issues that history teachers and professors, and school and college administrators, must confront if they want to be effective career counselors.  For example, we must equip students to be confident in the skills that they have acquired as history majors….Rather than apologizing to potential employers about being history majors, our students should enter job interviews boldly, discussing their abilities to write, communicate, construct narratives out of small details, listen, empathize, analyze, and think critically.  As Stanton Green, a humanities administrator notes, “People find jobs where they look for jobs.”  We need to instill our students with confidence.  The ability to do this must somehow be embedded in a history department curriculum.

Over at Inside Higher Ed, Emily Levine and Nicole Hall of the University of North Carolina at Greensboro describe this process as “translation.”  Here is a taste of their piece:

After years of being on the back foot, the humanities have launched a counterattack. A shelf of new books, including Scott Hartley’s The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World (Houghton Mifflin, 2017) and Gary Saul Morson and Morton Schapiro’s Cents and Sensibility: What Economics Can Learn From the Humanities (Princeton, 2017), attests to the usefulness of the humanities for the 21st-century job market. Their fresh message makes the old creed that the humanities are a “mistake” or not “relevant” seem out of touch. Surveying these works in the July-August 2017 issue of Harvard Business Review, J. M. Olejarz dubs this countermovement “the revenge of the film, history and philosophy nerds….”

But where we go from here requires the hard work of identifying just what is the common denominator being learned in the humanities and how to parlay that knowledge and those skills into professional success. How do you apply Virginia Woolf to write better code or marshal your skills conjugating Latin verbs to execute an IPO?

At the University of North Carolina Greensboro, we have taken the next step of improving career outcomes for our students in the humanities by implementing the Liberal Arts Advantage, a strategy that articulates the value of the humanities to students, their parents and the community.

Directors of career development are realizing that they can’t do this work alone. They must engage faculty as their partners.

Jeremy Podany, founder, CEO, and senior consultant of the Career Leadership Collective, a global solutions group and network of innovators inside and near to university career services, says that helping faculty teach career development is part of the job. “I actually think we need to go to the faculty and say, ‘Let me teach you how to have a great career conversation,’” said Podany. The relationship between faculty members and career development offices — experts in the humanities and careers — is essential to preparing students for the job market.

Why? Because the central issue in realizing a long-term strategy for student career development is translation. That is, how students translate the skills they learn in the classroom into workplace success. This is particularly true in the case of the metacognitive skills that professors in the humanities can, and should, help cultivate in their students.

Read the entire piece here.

Neem: “The STEM rubric undermines the unity between the humanities and sciences.”

Kentucky governor Matt Bevin

Back in June, we published a post on Kentucky governor Matt Bevin‘s endorsement of a bill allowing the Bible to be taught in the state’s public schools.  I later published a shorter version of this post at Religion News Service.

Governor Bevin is back in the news after he said that the state’s public universities should cut programs that are not “helping to produce” a “21st century educated workforce.”  Bevin urged university administrators in his state to “find entire parts of your campus…that don’t need to be there.”  He singled out “Interpretive Dance.”  Back in January, he singled out “French Literature.”  Bevin wants to put money and energy into growing engineering and other STEM programs at Kentucky universities. Ironically, according to Inside Higher Ed‘s coverage of Bevin’s remarks, the governor has an East Asian studies degree from Washington and Lee University.

Sadly, the interim president of the University of Louisville, Dr. Greg Postel, seems to agree with the governor. Postel told the Lexington Herald-Leader that his university’s engineering program is growing, making Bevin’s ideas for funding more STEM initiatives a “natural fit” at Louisville.  “Universities have to be aware of where the jobs are,” he told the Herald-Leader, “and that has to advise us as to which programs we choose to grow and put our resources in.”  If I were a humanities or liberal arts faculty member at Louisville I would be up in arms right now.  Postel has no clue about two things:  1) college education is more than job training and 2) liberal arts majors contribute to the economy and do a variety of jobs.

Check out Inside Higher Ed‘s coverage here.  It includes several faculty members who have pushed back.

Western Washington University historian Johann Neem is not mentioned in the Inside Higher Ed article, but back in February he responded to Bevin’s earlier comments on STEM. Neem believes that “science” should not be part of the STEM equation.  As he puts it, “The STEM rubric undermines the unity between the humanities and sciences.”

Here is a taste of his piece at the blog of the University of Wisconsin-Madison’s School of Education:

In theory, there are two major faculties on American college campuses, those who teach in the liberal arts and sciences, and those who offer professional education in such fields as business, education, engineering, social work, and various health fields. The two types of faculties are not necessarily in opposition, but they have different missions because they are oriented toward different goals.

To faculty in the arts and sciences, undergraduate education is liberal in nature — it is about gaining a broad knowledge about how the human and natural worlds work, because doing so can inspire students and because it serves a broader public good to have well-educated adults. Ideally, and often, there is no specific vocational outcome to these majors. In fact, to ask a history, English, biology, or geology major, “What are you going to do with that?” ought to be irrelevant since these are academic disciplines designed for academic purposes. When majors were first established, their goal was not job training but to offer intellectual depth and balance or, better put, to enhance a general education. Thus, majors in the arts and sciences exist for their educational purposes with no real or necessary relation to market needs.

Professional faculty, on the other hand, train people for specific jobs. Their success is measured by whether their students gain the knowledge and skills necessary for employment in specific fields. Students who major in engineering, for example, are right to ask their programs, “What can I do with that?” Moreover, students who choose to major in these fields may not receive the same kind of liberal education as those in the arts and sciences. Instead, they seek a direct line to employment. These fields, in other words, are tied closely to market needs.

The rhetoric of “STEM” (Science, Technology, Engineering, and Math) seeks to professionalize science faculty by reorienting their core community of identity. The sciences are not job training but part of liberal education. Math is a humanistic pursuit. Ideally, faculty and students in the sciences and math have different goals, perspectives, and aspirations than those in engineering and technology-related fields. Traditionally, science and math faculty have identified themselves with the broader purposes of the liberal arts, of which they are a part.

The more we use the term STEM — in praise, condemnation, or simply as a descriptor — the more we divide the arts and sciences faculty from each other. The arts and sciences exist as the educational core of the undergraduate collegiate curriculum. They are tied together conceptually. There is in fact no difference, from the perspective of liberal education, in choosing to major in philosophy or chemistry. Faculty in both disciplines, in all the arts and sciences, believe in the value of intellectual pursuit, in fostering curiosity about the world, and in graduating students who have breadth and depth. Yet, increasingly on campuses across the United States, colleges of arts and sciences are dividing into two units, the humanities and social sciences in one, and the sciences and math in another.

Neem concludes:

The STEM rubric undermines the unity between the humanities and sciences. For many policymakers, this is no doubt desirable. Yet, if faculty in the sciences and mathematics are not careful about how they identify themselves, they will be party to the erosion of the ideal of liberal learning, of which they remain an essential part. If faculty in the humanities and social sciences are not careful, they will find themselves marginalized as the sciences abandon liberal education to join forces with market-driven technology and engineering programs. If Americans are not careful, we will soon find that we have fundamentally changed the purposes and goals of collegiate education.

Read Neem’s entire piece here.

Should Young Academics Be On Twitter?

Oliver Bateman, a historian and journalist, explores this question over at The Atlantic.

Here is a taste:

Scholarly research has lent credence to anecdotal claims about social media’s growing importance as a networking tool for academics at all stages of their careers. In a 2012 paper that represented one of the first systematic studies of social media’s impact on academia, George Veletsianos, a professor at Royal Roads University in British Columbia, analyzed the usage patterns of academics. He concluded that “the participation observed on Twitter presents opportunities for … scholarly growth and reflection,” though it was still too early to make a definitive statement about what that might entail. (He also noted, rather tellingly, that “online practices may not be valued or understood by peers and academic institutions even though scholars themselves may have found scholarly value in participating in online spaces.”)

Four years later, the researchers Charles Knight and Linda Kaye evaluated the social-media practices of academics at a large university, determining that these academics’ “use of the [Twitter] platform for enhancing reputation is an implied acknowledgement of the importance of research within higher education and the increasingly public engagement agenda.” Professors on the campus they studied were far more likely to use Twitter for this purpose than they were for pedagogical reasons: “Academics want to use Twitter to inform the wider community of their activities rather than engage their students.” Networking, it seems, is one of social media’s principal purposes for those in academia.  

“Twitter is great for academic networking, because it can be an awesome way for introverts and people who aren’t already in close proximity with the people they want to talk with to start building genuine relationships,” said Jennifer Polk, a friend and academic and career coach who runs the From PhD to Life website. “Of course, it’s all public [unless you adjust your security settings], so you should be professional—whatever that means in your field. And I recognize that in this context, ‘professional’ is a loaded term.”

Read the rest here.

I think Twitter, Facebook, blogs, and other social media sites are great resources for networking, sharing ideas, and raising questions.  (Perhaps this is simply stating the obvious at this point in my career). Graduate students and young academics should be using them for these purposes.

But I also think graduate students and young academics should always remember that while social media is a very democratic space, academia is not.  Academic life, in order to function properly, must have some degree of hierarchy based on expertise and experience.  In other words, a young scholar who submits a journal article or book for review will inevitably have a senior scholar evaluate the manuscript and make a decision on it.  Senior scholars at colleges and universities will often have a lot to say about who gets hired in their departments.  In the course of searches for academic appointments and fellowships that have residency requirements, the search committee will often contact outside scholars who might be familiar with the candidate’s work and sense of collegiality.  And yes, I have been asked about a job or fellowship candidate’s sense of collegiality based on their social media presence.  It has actually happened more than once.

I entertain several of these requests a month.  I have even been in a position where a person argued with me on Twitter in a very unprofessional way and then applied for a job in my history department.  When I saw the application I went back to review the series of tweets this person had written, but they were deleted.  This person did not get the job.  There were stronger applicants in the pool that better served the needs of our department.  But I would be lying if I said that this Twitter exchange did not influence the way I thought about this person’s application. And I can tell a host of other stories like this from other committees on which I have served.

In the best of all possible worlds, decisions about publishing and teaching jobs should be made entirely on the merits of a candidate’s scholarship or teaching, but we do not live in the best of all possible worlds.  Young academics should have this in mind whenever they tweet or post.  I am often amazed when I see graduate students picking fights on Twitter or Facebook with senior people who one day might have to make a decision about the course of their future career.  Hopefully, for the sake of the candidate, that senior scholar will lay aside their memory of these social media exchanges and judge the candidate on the merits of their work.  But to do so requires a superior degree of discipline and professionalism.

Episode 21: Why We Need More Historians in the Silicon Valley

The liberal arts vs. STEM. A degree in the humanities vs. a degree in business. The current conversation around higher education consistently pits the study of history, philosophy, or English against more “practical” pursuits like engineering or computer science. But both data and the insights of business leaders tell us that this is a false dichotomy. Host John Fea and producer Drew Dyrli Hermeling discuss the value of the liberal arts within both the current economic and political climate. They are joined by venture capitalist Scott Hartley (@scottehartley), author of The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World.

Want to Be a Doctor? Study the Humanities

George Mellgard is graduating from Duke University next month with a degree in Classics.  It sounds like he is heading to medical school.

Here is a taste of his story from the pages of the Duke student newspaper, The Chronicle.

When I first came to Duke, I knew I wanted to take the pre-medicine route. I enjoyed both sciences and the humanities and thought the path of a doctor was most suited for exploring my interests. Therefore, I decided that, like the model pre-medicine student, I wanted to major in Neuroscience or Biology.

It was not until second semester that a friend of mine challenged me to take a risk. At the time, I was taking Roman History and had expressed to him my excitement at taking courses in the Classics. He suggested that, rather than major in the sciences, I pursue a non-traditional route and consider majoring in the Classics.

While I first wrote him off as being idealistic, I eventually found that this non-traditional route was not merely a possibility, but something that I actually wanted to pursue. The rest will hopefully be history, since I hope to graduate in less than a month with a major in the Classics and having fulfilled the necessary requirements for pre-medicine.

Majoring in the Classics has provided me with an experience entirely different from that of the sciences. It has taught me to look to the past to inform myself about our present situation. It has also provided me with guiding principles not only for learning but also for how to live life. However, my college academic path could have very easily taken a different route had I not been pushed to seek something different.

For this column I want to talk about the ways in which we choose classes and learn at this university. After all, while I value my major, it does not directly relate to my chosen professional path. For many, classes serve not as a way to learn but as a means to an end. We take classes so that we can get that easy A, fulfill that ALP or QS, and ensure that we are able to get that pre-professional summer internship.

In fact, ask almost any student here about why they are taking their current classes and you will get something along the lines of “because I have to” or “because I should.” Less often than not will the answer be because “I want to.”

While it is important for us to consider our future careers and overall performance, we turn to the more nuanced purpose of learning today. What I hope tomorrow’s students avoid is picking a predetermined path or class for the sake of their future—for dodging risks or the Math Department in favor of the clearer and defined road.

Taking this road is understandable. After all, there will always be the sense of dread or the lingering suspicion that taking the risky class is a luxury just not afforded to the average Duke student. But taking that easy path prevents us from experiencing our authentic self, or even worse prevents us from discovering what might be our true calling in life.

Read the entire piece here.

23 Jobs for History Majors

I just came across a really interesting website titled “Sell Out Your Soul: A Career Guide for Lost Humanities Majors.”  It is run by James Mulvey, a former English student who now works at a global software company.  He started the site to “inspire others to run from the culture of fear, isolation, and single-mindedness that keeps many graduate students from finding employment outside of academia.”

Here is a taste of a post titled “23 of the Best Jobs for History Majors“:

If you’re wondering what careers are available for History majors, you’ve come to the right place. I’ve collected 23 of the best jobs for History majors—careers that pay well, complement the skills taught in History departments and have long-term growth.

Despite the lies you’ve been told by the annoying Engineering major or clueless Business major, History majors end up in a variety of interesting places.

So pour yourself a beer. Roll up your sleeves. And let’s take a fast tour of the best careers for History majors.

The point of the list isn’t to tell you the exact steps to get these careers. That would be a long post and I cover that in my book. Use this list to decide on a general direction. Then go and search those careers on the following sites: Glassdoor, LinkedIn advanced search, Twitter advanced search, and Reddit. This will give you a realistic view of what your day-to-day would be like and whether this career would be a good match for you.

The jobs include Exhibit Designer, Content Creator, Customer Success Manager, Business Analyst, Growth Hacker, Product Marketing, PR Manager, Internal Communications, Content Strategist, Web Developer, Journalist, Project Manager, Social Media Manager, Content Editor, Research Analyst at Think Tanks, Political Campaign Manager, and Government work.

Read how James connects the skills of history majors to these jobs.

“The Myth of the Unemployable History Major Must Be Destroyed”

This is the title of a great post at “One Thing After Another,” the blog of the History Department at Saint Anselm College in New Hampshire.  Here is a taste:

History classes stress the analysis of various media—usually texts but also sources like film, music, painting, and so on. History majors ask and answer questions such as, “Who produced this source?” “Why did she produce it?” and “Under what circumstances was this source produced?” Ours is a reading-intensive discipline because reading is the only way to become practiced at this sort of thing. Doing this kind of work requires the development of analytical skills that lead students to sharpen their judgment. They come to understand what is likely or what is true. At the same time, they are required to synthesize a great deal of material to form a comprehensive picture of how people, places, and things have worked in the past—and how they may work in the future. They are then prepared to answer questions such as, “Why did this happen?” and “How did it occur?” What’s more, students in History are compelled by the nature of the discipline to articulate their thoughts in a systematic and compelling manner, both through discussion and on paper. In addition to being a reading-intensive discipline, we are also a writing-intensive one. Finally, the study of history leaves students with an enormous amount of cultural capital. Among other things, they encounter great literature, music, painting, movies, and rhetoric.  At the same time, they also learn about important events and noteworthy civilizations that we should all know something about—such as Han China, the French Revolution, the Zulu Kingdom, the Progressive Era in America, and World War II. Students educated in this fashion thus add to their stock of experience which helps them confront the challenges of the present.

To summarize, the course of study that History majors undergo provides them with high-level analytical skills, a capacity to synthesize large chunks of information, and an ability to present logical arguments in a persuasive fashion. Not only that, but their training offers them knowledge that helps them navigate and understand the world. These are the kind of attributes employers are looking for even in an age where STEM seems to be king (see here, here, here, here, here, and here—you get the idea).

We know these things to be true because we see what happens to our own majors after they graduate from Saint Anselm College. Our department recently surveyed alums who graduated between 2012 and 2015 with a degree in History. We determined that out of the three-quarters who responded to the survey, 100% were employed or attending graduate school. We also found they attained success in a wide variety of fields, most of which have nothing to do with history. For sure, we always have a number of students who double-major in history and secondary education. We are proud of these students, many of whom are high achievers; in 2014 and 2015, the winner of the Chancellor’s Award for the highest GPA in the graduating class was a history major who went on to teach. And yes, we also have a small number of graduates who go on to work in history-related fields (see here and here). But around 75% of our graduates are scattered among a wide range of other jobs.

Recently, One Thing after Another engaged in the exercise of naming all the positions held by History alumni whom the blog personally knows. This list is obviously not scientific; other members of the History Department know different alums who hold even more positions. Yet what follows ought to give the reader a sense of the wild diversity of jobs open to those who major in History. One Thing after Another knows many history majors who have gone on to law school and have since hung out their shingle as attorneys. Many of our alumni also work for the FBI, the CIA, and the DHS. Others have found employment as police officers and state troopers. We have a number of alumni who currently serve as commissioned officers in the armed forces. Many have gone into politics, serving as lobbyists, political consultants, legislative aides, and town administrators. Others have been on the staffs of governors and mayors. Large numbers work in sales for a variety of industries. We have managers at investment firms and folks who work on Wall Street. Other history majors this blog knows are in the health insurance business, serve as economic consultants, hold positions in import-export businesses, have become construction executives, and work in public relations. They have also become dentists, software engineers, filmmakers, nurses, social workers, journalists, translators, college coaches, and executive recruiters. Some work in the hospitality industry as the managers of resorts, hotels, and convention centers. Others are to be found on college campuses as administrators, financial aid officers, reference librarians, and so on. And then there are the archivists, curators, and museum staffers. Remember, this list (which was compiled in a somewhat off-hand manner) is not exhaustive. It only consists of alumni whom One Thing after Another knows personally. There are many other history alums out there doing even more things.

Read the entire post here.

Let’s try to keep chipping away at this myth.  We at The Way of Improvement Leads Home have been trying to do our part through our “So What CAN You Do With a History Major?” series and several chapters in Why Study History?: Reflecting on the Importance of the Past.

“Stale PhDs” and “Overqualified” Applicants

This morning Michael Bowen is back with more insight on the academic job market in history.  As you now know, Michael has been writing for us from the 2017 Annual Meeting of the American Historical Association in Denver.  I think this post raises some very important points about the hiring process in history departments around the country.  Read all of Michael’s posts from the 2017 AHA here. –JF

After studying the job market for well over a decade, I have noticed some clear, systemic biases. It might be too strong to call them biases, but the cumulative effect is to disqualify many good applicants right from the start. These observations will come over two blog posts in the hopes that interested search committee members might at least be more cognizant of them and that job seekers can be prepared and make smart decisions regarding publishing.

Some caveats are in order first. These are qualitative, not quantitative. I don’t have a spreadsheet in front of me crunching the statistics for every hire in the last decade. I am open to arguments that they may be unique to my situation, and I am sure that there are exceptions to every rule. These are also only valid for the initial screen, where committees determine their AHA/Skype lists. Since the majority of applicants for any given job never make it to the first interview, these decisions are the most crucial.

For this post, I want to focus on time from degree. The prevailing wisdom seems to be that job candidates from ABD to about four years from their defense date are hirable, while everyone beyond that is not. There are exceptions…I know of one person who went on the tenure track for the first time after ten years…but those hired after an extended time as contingent faculty are in the minority. The higher ed press refers to these individuals as “stale PhDs,” which is incredibly insulting and implies that good academic work can only be accomplished in the dissertation stage or on the tenure track. Controversy erupted a few years ago when a couple of English departments posted ads that explicitly required a degree received within the previous three years.  History has not been so brazen, but we have a similar bias.

Maybe once, long ago, before postdocs were readily available and the academy shifted the burden of instruction to adjuncts and lecturers, it made sense to make a “first cut” of applicants based on time from degree. Now, with individuals stringing together years and years of contingent appointments and producing good scholarship in the meantime, it seems unwise to do so. My dissertation director always told my cohort that as long as we could add something substantive to our vita every year, we would be fine. That has been my goal, which has been met eleven out of eleven years. He never envisioned a scenario where I would need to do that for eleven years, but his advice is still good. I would argue that such a benchmark would be a better measure of a candidate’s employability than an arbitrary line on the calendar.

However, the aforementioned measure for success runs counter to the second pattern prevalent in today’s job market: the “overqualified” applicant. With so many people finding survivable, contingent employment for extended periods of time, more and more applicants are going for assistant professor lines with books in hand and a significant number of courses under their belts. In theory, this should be a good thing…you can bring in a new faculty member whom you do not have to train and who needs little prep time. But it goes against the old idea that faculty lines are apprenticeships. An assistant professor must learn the ropes from their colleagues and, when deemed sufficiently qualified, be granted tenure. If someone exceeds those requirements from the start, should they be hired? In most cases, search committees say no.

Historians who are working on an extended contingent faculty track find themselves treading a fine line. Do you hold off publishing a book because it could hurt you on the job market, or do you go ahead and publish because it is ready? My first inclination is to say publish, but I have lost enough jobs (both VAPs and TT) to individuals with a single journal article or a handful of book reviews to question whether or not I should have published mine before I had a tenure track line.

If you are a search committee member, do you see a book as a sign that an applicant will produce no further research of merit? It is a valid question. We all know of professors who have an early burst of scholarly productivity, get tenure, and then coast for the next thirty years. There may not be an answer to this, but it would be nice if we would get past the traditional expectations for a hire and take into account how academia has changed. Committees should factor in both logged experience and future potential.

Reflections on the Academic Job Search

We are very happy to have William S. Cossen writing for The Way of Improvement Leads Home this weekend from the 2017 Annual Meeting of the American Historical Association. William defended his dissertation, “The Protestant Image in the Catholic Mind: Interreligious Encounters in the Gilded Age and Progressive Era,” in October and graduated with a PhD in history from Penn State University in December. (Congratulations!). He is a faculty member of The Gwinnett School of Mathematics, Science, and Technology in Lawrenceville, Georgia.  Below you will find some of his reflections on Day 1 of the AHA.–JF

Greetings from sunny Denver!

Well, a little wishful thinking can’t hurt.

After a smooth flight from Atlanta and a scenic train ride from Denver International Airport to the city, I made it safely to AHA 2017.  I’ll be presenting on Saturday at the American Catholic Historical Association’s conference as part of a panel titled “Catholicism and Americanism in the 19th Century: New Perspectives on an Old Debate.”  It will be nice to have so much time before delivering my own paper to enjoy the rest of the conference.

Following a quick, efficient check-in process (thank you, AHA!), I made my way to my first panel of the conference, “Deciphering the Academic Job Search,” which was sponsored by the AHA’s Professional Division.  With the market seemingly getting tighter every year, I was eager to hear opinions on the process from a recent candidate, a search committee member, and an academic dean.

The recurring themes I picked up in all three presentations were the necessity of flexibility and the need for candidates to be able to compellingly present their research – specifically providing a clear answer to the “so what?” question, a skill which is also useful in academic publishing and grant writing – to those outside their fields.

The first presenter, Ava Purkiss of the University of Michigan, provided helpful advice for how candidates can make themselves stand out in the initial stages of the job search process.  One tip was for candidates to shop their job materials around widely before applying, not only among their advisors and committee members but also among other professors and graduate students.  A second tip was for candidates to seek out search committees’ evaluation and scoring criteria for job applications.  This might not be easy to find, but Dr. Purkiss mentioned an example of one university posting this information online publicly.  A final piece of advice, which is especially useful in an era of online applications, was to print out all components of the application before submitting them to search committees to find and fix any glaring errors.

The second presenter, Paul Deslandes of the University of Vermont, counseled prospective job candidates to be self-reflective.  He urged job seekers to answer an important question: What do you really want out of academia?  He noted, importantly, that if one does not see oneself enjoying teaching, then academia is probably not a good fit.  Dr. Deslandes emphasized one of the panel’s key themes, which was that job seekers need to learn how to communicate their research to entire departments, or as he put it, “Speak the language of other people.”  Regarding job opportunities, he encouraged those on the job market to “be expansive.”

The final presenter, Catherine Epstein of Amherst College, offered practical advice for the all-important cover letter: the letter must make clear “why your work is interesting.”  While Dr. Epstein noted that candidates are not expected to write a brand new cover letter for each job, the letters need to be tailored to specific schools.  Responding directly to the job requirements found in a job advertisement demonstrates true interest in the position and shows search committees that a candidate has actually attempted to learn about the institution to which they are applying.

The question-and-answer session following the presentations reflected some of the larger anxieties of the current history job market, but I think that panel chair Philippa Levine’s reminder that this is very much an impersonal process is an important point for job seekers to take to heart, as difficult as that may be, if they are disappointed by the outcome of their search for employment in academia.  One essential fact is that the number of job seekers far outstrips the number of available tenure-track positions.  However, these sorts of panels do a good service for the profession by partially demystifying what is for many an often confusing, frequently disappointing process.

I’m excited for Friday’s full schedule of sessions – and, of course, also for the book exhibit.  As with other conferences of this size, I have upwards of ten panels which I would like to see simultaneously.  This is ultimately not a bad problem to have.  More to come!

Bowen: The Historical Profession is “abjectly terrible at talking about the academic job market.”

This morning’s post by Mike Bowen resonated with many readers of The Way of Improvement Leads Home and struck a chord with folks attending the Annual Meeting of the AHA in Denver. Read it here.  In this post, Bowen offers some thoughts on the history job market.  –JF

Writing about the academic job market from the inside is very difficult. No one likes a braggart, and no one likes a complainer. If you stray too far in either direction your message can get lost amidst the visceral reactions emanating from the comment threads.

I tried writing about the job market once, back in the heady, pre-recession days of 2008. Frankly, the article is embarrassing and I wish I had never published it. It is too inflammatory and should have been more constructive and conciliatory. The response to the piece is why I spent the next nine years away from the topic.

Based on their reaction at the graduate student/junior scholar job panel two days later, the AHA staff didn’t appreciate my contribution. There was no subsequent dialogue about any of the points I brought up. The AHA staff rediscovered a couple of those points in 2011 or 2012 on their own, and others have drilled down on the communication issue on non-academic sites, but there hasn’t been any substantive movement towards fixing the lack of communication or late notices for interviews.

More alarming to me were the grumblings among the job-seeking community. You can see that a little bit of dialogue happened on the IHE comment thread, but the readership of the Chronicle forums was severely underchuffed. For the first time in my life, I was called a “special snowflake.” Someone said that I was “entitled.” God knows what would have happened if Twitter had been around back then.

The point of bringing all of this up…the profession is abjectly terrible at talking about the academic job market. Everyone knows that there is a major concern that needs to be addressed, but no one will actually make even a half-hearted effort to try. That has compounded the problem.

As the organization most closely associated with the job market, the AHA bears some responsibility for this situation. However, in late 2014, the executive director of the AHA wrote in Perspectives that the AHA is not here to help people find jobs in academia. I am legitimately, with no sarcasm intended, glad that he admitted this and has turned the organization to career diversity initiatives. I don’t need the help (see below), but I know others do.

The remaining stakeholders generally fall into one of four camps. One small group outside the faculty wants to put everyone on five-year contracts and do away with tenure. Some proposals have been more radical than that.  Another, slightly larger, group of contingent faculty wants to unionize. That may be a viable solution in some circumstances but, given today’s political climate, I can’t envision a movement becoming so widespread that it works at every institution. The third camp is composed of job seekers who hope to God that they can land on their feet next academic year and are otherwise powerless.

The larger fourth camp is generally the rest of the profession, and they tend to ignore the situation. Job seekers make faculty members uncomfortable largely because, while many want to help, they can’t do much. You can’t really blame them either. Is it worth going to battle with a college administration, risking potential blowback down the line, to try to get more lines? More often than not, in an age of disinvestment in higher education and the dominance of STEM, the answer is no. So rather than confront the problem, the faculty retreats inward and worries about itself.

The net result is that we continue on the same path we have been on, motivated largely by inertia. We are a profession composed of highly-educated, socially-aware people, yet we have collectively thrown our hands up at a problem that we find too difficult to solve. I wish that we could engage in an honest discussion about this without politicizing it. Our discipline is fading, and the job crisis is part of the reason why.

Postscript: I have received  e-mails from people offering to help me transition out of academia. I appreciate the contacts, but it isn’t necessary. When I received notice of my non-renewal, I connected with a local job coach and subsequently landed a very good job in the editorial department at a research and publishing firm. I now manage a great team, am surrounded by wonderful co-workers, and have a supportive boss.

More importantly, I was able to get on with my life. That distance is what is allowing me to write these blog posts. I still teach one night class a semester as an adjunct at JCU to keep a foothold in the field and to supplement my income but, barring a miracle, the new job is my first priority now. It has to be. Do I want to get back into history full-time? Absolutely. I feel that teaching history is my vocation, but my past experience tells me that it is highly unlikely that there is a place for me to do so. That is just the reality.

What Happens After 9 Years as a Visiting Assistant Professor?

I appreciate that Mike Bowen will be writing for us from Denver this week as part of our coverage of the 2017 Annual Meeting of the American Historical Association.  Bowen is an adjunct instructor in history at John Carroll University and the former assistant director of the Bob Graham Center for Public Service at the University of Florida.  He is the author of The Roots of Modern Conservatism: Dewey, Taft, and the Battle for the Soul of the Republican Party (University of North Carolina Press, 2011).  -JF

Here is the first of his #aha17 posts:

I consider myself a veteran of the AHA annual meetings. My first was the 120th, held in January 2006 in Philadelphia. I was just a pup then…one semester away from defending with three chapters left to write. Like many, my goal was the elusive tenure-track line. I didn’t succeed in Philly, but that spring I worked something out in the secondary market, finished those chapters, and defended.

My AHA attendance has been sporadic since Philly, usually dependent on the prospects for a job interview. Those prospects have declined dramatically in recent years and became non-existent at the end of the 2014-15 academic year when, after nine consecutive one-year VAP/administrative appointments scattered across three states, my VAP line was terminated early. I was collateral damage to the administrative fallout from an accreditation decision. I remain an adjunct in good standing at that same institution and remain hopeful that there will be a full-time opportunity of some sort for me there. Even though I continue to apply to everything I can, there doesn’t seem to be much left for me as a working historian.

Barring a miracle of some sort, then, this will be my last trip to an AHA annual meeting. I don’t know what to expect, really. I am presenting what I imagine will be my last academic paper (Friday at 3:30, for those of you who are interested in moderate Republicans in the 1970s. I’ll be the one with the Southern accent). It is the fourth conference paper on the broad topic that I had planned to cover in my second book. Also, one of my former undergrads who is now a political organizer in Denver is going to meet up with me. That’s all I know. I plan to watch, observe, and ruminate on the job environment, the state of my field as I see it, and how the annual meeting has changed in my eleven years on the job market.

I will be writing from a position of tacit acceptance. Unfortunately, we have been besieged in recent years by what scholars have come to call QuitLit. My posts will not be QuitLit because I do not want to quit, even though I likely will not be continuing as a historian. I am also not looking to trash the academy or the profession, because, even though I disagree with a number of their standards and practices, I would love to remain a member in good standing. I hope any criticisms I make will be taken in the constructive spirit in which they are offered. If my posts from the AHA can make people examine how they act when they are on search committees or can dispel some notions and biases that have worked against me, then I will have done a service.

Above all, I recognize that I am far luckier than most to have lasted almost a decade as a full-timer in this business. I do not want sympathy from the profession…I learned long ago that the profession generally has little sympathy for those not on the tenure track. If anyone wants to offer up an opportunity, though, I would gladly listen.