If Kanye West does not become president, you can blame the iPhone. It is, after all, “notoriously faulty.”


Here is Amber Neely at AppleInsider:

West’s lawyer is now claiming that the “notoriously faulty” clock of the iPhone may be one of many reasons his campaign team missed Wisconsin’s strict 5:00 p.m. ballot deadline.

Kanye West’s campaign team reportedly filed his Wisconsin presidential campaign ballot 14 seconds late on August 4, preventing his name from being added to the state’s presidential ballot. His campaign team and lawyer are now pushing back against Wisconsin in an attempt to get West added to the ballot.

A West campaign aide, Lane Ruhland, said that she had arrived at Wisconsin’s Elections Commission office before 5:00 p.m., though the door was locked, putting her behind schedule. By the time she reached the door, it was 14 seconds after the filing deadline.

West’s lawyer, Michael Curran, points out that a Democratic Party staffer had timed her entrance with an iPhone video and that iPhone clocks are “notoriously faulty.”

Ruhland claims that the elections specialist had not provided any official timestamp, either.

“The elections specialist did not show us a clock, timer or recording showing the time of 5:00:14, nor did the filings receive a time stamp,” Ruhland said in her affidavit, highlighted by the Milwaukee Journal Sentinel.

Read the rest here.

The Author’s Corner with Daniel B. Rood

Daniel B. Rood is Associate Professor of History at the University of Georgia. This interview is based on his book, The Reinvention of Atlantic Slavery: Technology, Labor, Race, and Capitalism in the Greater Caribbean (Oxford University Press, 2020).

JF: What led you to write The Reinvention of Atlantic Slavery?

DR: Arguments over the role of enslaved people in the growth of modern western capitalism had always intrigued and inspired me, but I found contemporary scholars often lapsed into abstract phrases when actually making the case.  Sugar plantations were “industrial,” planters were “rational” and “innovative,” there were railroads and machines in slave societies, etc., etc. I felt like, in depending on these loaded terms, there was a bit of a black box effect going on. What does “industrial” mean, exactly?  What is that machine in the artist’s rendering of a plantation?  What is it doing there?  Why do we care? So, I wanted to open that box back up and re-build arguments about slavery and capitalism from the ground up, i.e. from examining and reflecting upon the micro-processes of labor, technology, and ecology on plantations and in workshops, factories, warehouses, transport systems, and markets.

JF: In two sentences, what is the argument of The Reinvention of Atlantic Slavery?

DR: In an age of industrial growth and expanding antislavery movements, ambitious planters in the Upper US South, Cuba, and Brazil forged a new set of relationships with one another to sidestep the financial dominance of Great Britain and the northeastern United States. Hiring a transnational group of chemists, engineers, and other “plantation experts,” they sought to adapt the technologies of the Industrial Revolution to suit “tropical” needs and maintain profitability, while depending on the know-how of slaves alongside whom they worked.

JF: Why do we need to read The Reinvention of Atlantic Slavery?

DR: First, my book shows that a cotton nexus connecting the Deep South to Lancashire mills and Liverpool banks was far from the only story to tell about antebellum slavery and capitalism. I also demonstrate that sustained attention to how commodities are made and moved around can generate broader insights into the histories of slavery, the African diaspora, and race. Among other things, the book shows that changes in racist ideology were profoundly entangled with changes in capitalist productive technologies.  Modern “white” commodities like sugar and flour emerged together with transformed “white” and “black” racial categories in the same mid-19th century Atlantic World matrix. It is a flashy thing to assert, but I work hard to substantiate it. I think the journey is worthwhile for the reader, whether or not they are always convinced.

JF: When and why did you decide to become an American historian?

DR: I’m not sure I ever did. I was an English major as an undergrad. I only remember taking one history class; I mostly remember reading lots of Keats and Wordsworth.  A faculty mentor encouraged me to do American Studies at NYU, which was a deeply generative, if sometimes cringe-inducing, time for me. That was when I first spent a lot of time with Marx’s writings, and where I was introduced to scholars like Eric Williams, CLR James, and Sidney Mintz who centered the African Diaspora in the making of the modern world. I was fascinated by the questions they were asking, and wanted to explore more.  Becoming a historian, and becoming an Americanist, happened accidentally on the way.

JF: What is your next project?

DR: I am currently writing a book on the history of plantations from 1500-present. I have also been working sporadically over the past few years on a micro-history of post-emancipation black landowners in and around Athens, Georgia. Finally, I plan to write a history of southern forests from pre-Columbian times to the present. After that it’s back to Keats and Wordsworth.

JF: Thanks, Daniel!

Any College Professor Can Relate to What Happened in Iowa Last Night


You spend hours preparing. You get to class early so you can project your PowerPoint and cue up your videos. You are going to utilize that digital pedagogical tool you learned about in the required professional development workshop you took last May.

But when the time comes to execute, the technology doesn’t work.

Here is The New York Times:

The app that the Iowa Democratic Party commissioned to tabulate and report results from the caucuses on Monday was not properly tested at a statewide scale, said people who were briefed on the app by the state party.

It was quickly put together in just the past two months, said the people, some of whom asked to remain anonymous because they were not authorized to speak publicly.

And the party decided to use the app only after another proposal for reporting votes — which entailed having caucus participants call in their votes over the phone — was abandoned, on the advice of Democratic National Committee officials, according to David Jefferson, a board member of Verified Voting, a nonpartisan election integrity organization.

Late Monday night, that chain of events came to a head when results from the Iowa caucuses were significantly delayed. While vote counts in the past have typically been reported earlier in the evening, the Iowa Democratic Party held a conference call with representatives from each campaign at around 10:30 p.m. Eastern time to tell them that roughly 35 percent of precincts had reported, but that it would provide no other details about the results.

Read the rest here.

What Kind of Technology Do Undergraduates Want?


Messiah College participated in this survey

According to the EDUCAUSE Center for Analysis and Research, undergraduates:

  • want mostly face-to-face learning environments.
  • want lectures, student presentations, question-and-answer sessions, and class discussions to take place in a face-to-face learning environment, as opposed to homework, exams, and quizzes.
  • really like degree audits and degree planning tools.
  • want Wi-Fi in the library and classrooms.
  • think that their professors do a good job in using technology to enhance their learning.
  • who have disabilities are unhappy with their access to technology on campus.

Dig deeper here.

Lock Up the Phones!


This morning we published a post about smartphones in school.

Over at Newsday, education reporter John Hildebrand writes about the evening smartphone policy at The Stony Brook School on Long Island.  (Full disclosure: we lived in a dorm at The Stony Brook School from 1995 to 2000, where Joy was Assistant Dean of Students and Director of Residential Life. I taught AP U.S. History there during the 1999-2000 school year.)

Here is a taste:

Separating teenagers from their cellphones — a difficult thing, as many parents can attest — is a weeknight reality at a prep-school campus on Long Island’s North Shore.  

On Monday through Thursday nights, student boarders at The Stony Brook School lock up their cellphones in special neoprene pouches during study hours from 7:45 to 9:45 p.m. The 96-year-old private academy also prohibits use of the phones during lunches, dinners and twice-a-week chapel services. 

The partial phone ban, which started with the opening of classes in late August, is designed to help students concentrate on work that includes rigorous college-level courses such as Advanced Placement Latin. 

Read the rest here.

Humanities in a “Tech World”

Mike Walden is a William Neal Reynolds Distinguished Professor and Extension Economist at North Carolina State University.  In this piece he explains why the humanities are needed in a “tech world.”

Here is a taste:


There’s another reason for the relevance of humanities in our current world. Some thinkers say the application of the next level of technology to human use will require a cultural change, and developers of new technology will have to understand this cultural shift in order to be successful.

Robots and driverless vehicles are good examples. Although it’s fun to think of these tools in the abstract, when they become a reality, how will we react? Robots and driverless vehicles mean a shift in control and power from humans to machines that we have never experienced before. How will we react? Will robots and driverless vehicles be commercial successes or financial flops because people couldn’t adapt to them?

Obviously developers and manufacturers want to know, and who better to guide them than individuals who have studied human culture – that is, those who have studied the humanities.

There have already been studies indicating a newfound appreciation of humanities experts in today’s high-tech economy. Many companies have discovered that humanities majors make excellent managers and decision-makers.

So in the race between the STEMS and the HUMIES (my short-cut for the humanities), it may be too early for us to decide who will come out on top!

Read the entire piece here.

Teaching is a Human Act


For several years now Jonathan Rees has been railing against MOOCs and other forms of automated teaching.  I appreciate his insights.  I am a regular reader of his blog More or Less Bunk.

Rees’s most recent reflection on teaching and technology appears today at The Chronicle of Higher Education.

Here is a taste of “You Can’t Automate Good Teaching”:

A few years ago, I spilled an awful lot of pixels over at my blog trying to come to grips with the implications of Massive Open Online Courses (or MOOCs). They were supposed to be the innovation that would not only make most college professors obsolete, but force countless colleges to close as every student would prefer to hear Harvard’s best lecture rather than get their  course content from the community-college professor in their neighborhood.

Of course, any college professor who cares one whit about teaching understands that education involves a lot more than just conveying information. There’s the teaching of particular skills. There’s applied learning. There’s the unpredictable relationship between two humans whenever they try to accomplish anything complicated.

In other words, good teaching is just one long series of “edge cases.” You may come into class with the same lecture notes every semester, but unless you spend all your time staring up at the ceiling, how your students interpret the material you’re teaching is going to affect the way you choose to teach it. They don’t even have to stop you and ask questions while you’re talking. So long as you and they are in the same room — with you conveying information in real time — you will see how your material is going over and can adjust your presentation accordingly.

Even if you really could deliver the same exact lecture every time, you will never get the same result twice because the learning process is never entirely predictable. If we automated learning, information would still travel from the brain of the professor to the brain of the student, but we’d never know exactly how well students understood it. You might as well just hit “play” on a tape of someone else’s lecture, then leave the room to do something else.

Read the entire piece here.

Why Computer Scientists Should “Stop Hating” the Humanities

This issue keeps coming up.

Yesterday during a faculty meeting I listened to a colleague explain digital humanities to a group of more traditional-minded humanists.  He discussed the digital humanities as an effort to bridge the divide between computer science and humanistic inquiry.

Last weekend we dropped Episode 21 of The Way of Improvement Leads Home Podcast.  Our guest was Scott Hartley, a venture capitalist who came of age in Silicon Valley.  Hartley’s new book, The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, argues that liberal arts graduates usually have the most creative and successful business ideas.

Now Wired magazine is getting into the act.  Check out Emma Pierson’s piece “Hey, Computer Scientists! Stop Hating on the Humanities.”

Here is a taste:

As a computer science PhD student, I am a disciple of big data. I see no ground too sacred for statistics: I have used it to study everything from sex to Shakespeare, and earned angry retorts for these attempts to render the ineffable mathematical. At Stanford I was given, as a teenager, weapons both elegant and lethal—algorithms that could pick out the terrorists most worth targeting in a network, detect someone’s dissatisfaction with the government from their online writing.

Computer science is wondrous. The problem is that many people in Silicon Valley believe that it is all that matters. You see this when recruiters at career fairs make it clear they’re only interested in the computer scientists; in the salary gap between engineering and non-engineering students; in the quizzical looks humanities students get when they dare to reveal their majors. I’ve watched brilliant computer scientists display such woeful ignorance of the populations they were studying that I laughed in their faces. I’ve watched military scientists present their lethal innovations with childlike enthusiasm while making no mention of whom the weapons are being used on. There are few things scarier than a scientist who can give an academic talk on how to shoot a human being but can’t reason about whether you should be shooting them at all.

Read the rest here.


Pope Francis Gives a TED Talk

Pope Francis’s 17-minute videotaped talk was shown today at the international TED conference.  Read all about it in Colby Itkowitz’s article at The Washington Post.

Here is a taste of Itkowitz’s article:

Pope Francis used a world forum dedicated to promoting cutting-edge ideas to spread his own revolutionary message: “We all need each other.”

“When there is an ‘us,’ there begins a revolution,” the world’s most powerful religious leader told the room of scientists, academics, tech innovators, investors and cultural elites in a surprise videotaped message at the international TED conference Tuesday evening.

Keeping with the intent of the week-long conference to share strategies to make the world better, Francis’s contribution to that conversation was to urge the people gathered here to use their influence and power to care for others.

“How wonderful would it be if the growth of scientific and technological innovation would come along with more equality and social inclusion,” he said to applause. “How wonderful would it be, while we discover faraway planets, to rediscover the needs of the brothers and sisters orbiting around us.”

When Francis appeared on screen, the room erupted in applause, and one woman exclaimed, “No way.” Though he wasn’t standing center stage in front of TED’s signature red block letters, but rather seated at a desk at the Vatican, his speech had all the hallmarks of a TED Talk. It began with a personal narrative and wove in big ideas around hope, inclusion and starting a “revolution of tenderness.”

Read the rest here.


Smallpox: The Video Game

Smallpox.  No disease in history has taken more lives.  Sam Kean, writing at Humanities, describes its devastating influence on the history of the world and the vaccine that triumphed over its deadly power.  He also informs us of a group of humanists, led by historian Lisa Rosner at Stockton College (NJ), who are working with a grant from the National Endowment for the Humanities to create a video game, “Pox in the City.” The role-playing game will bring the 18th- and 19th-century wars over the use of the smallpox vaccine to general audiences.

Here is a taste:

The game immerses players in early 1800s Edinburgh, a prestigious medical center and a major front in winning acceptance for vaccines. It offers the chance to play one of three roles: a doctor trying to open a vaccine clinic; an immigrant worker trying to avoid smallpox; or, unusually, a smallpox virus trying to infect the masses. To recreate classic Edinburgh neighborhoods, Rosner’s team will draw on contemporary images from the archives of the College of Physicians in Philadelphia and from visits to Edinburgh. She hopes that players can someday even, say, duck into an eating hall and hear people singing Robert Burns’s poems. For now, her team is concentrating on building the basic levels for the character of Doctor Alexander Robertson.

The real Alexander Robertson wrote an outstanding thesis on vaccine science in 1799, says Rosner. After that, he disappears from the historical record, but “he’s absolutely the kind of young physician who would have taken up vaccination in an entrepreneurial way,” she adds, therefore making him an appropriate character. Rosner drew on diaries of Edinburgh doctors and other primary sources to flesh out the milieu in which a Robertson would have worked.

At its most basic level, the game requires Robertson to persuade people to try vaccines, and he has to tailor his pitch to whomever he encounters. With a young Irish washerwoman, Robertson might do well to drop her priest’s name. For a striving merchant, Robertson could establish his scientific credentials, or mention that vaccination is all the rage in London. Other aspects of game play are more like quests, with multiple goals and subgoals along the way. For instance, one proposed subplot involving a corrupt doctor might require wheedling information from a drunken bar patron, haggling with journalists, and sneaking into the crooked doctor’s office to gather evidence.

Check out the “Pox in the City” blog to see how the development of the game is progressing.

Some Technology in Education Articles You May Have Missed

These are hilarious.  Thanks, Kerry Soper.

Here are a few of my favorites:

“Department chair supports ‘digital humanities’ without knowing exactly what it is”

“Widely published senior colleague still unable to detect Nigerian e-mail scams without help of college IT guy”

“Middle-aged history professor gets Twitter account; makes one tweet before losing login information”

“Full professor reprimanded a third time by Wikipedia administrators for attempting to create an entry on himself”

Say Goodbye to the Bullpen Phone

Paul Lukas reports on the end of the bullpen phone, a baseball tradition that has been around since the 1930s.  Sad.

Here is a taste of his piece at the new New Republic:

The call to the bullpen is one of baseball’s time-honored rituals, right up there with spitting tobacco juice and arguing with the ump. And for the past 80 years or so, that call has always been made on a traditional, hard-wired land line.

It’s not clear when the first phone was installed in a baseball dugout, but baseball historian Peter Morris says direct lines from the dugout to the bullpen have been in use at least since 1930. But with the rest of the world moving to cell phones, Major League Baseball has decided to get with the program. In a classic example of “It wasn’t broken, but we fixed it anyway,” MLB has struck a deal with T-Mobile that will result in each team’s dugout being outfitted with a kiosk containing four Samsung Galaxy S III phones, which managers and coaches will be able to use to call the bullpen.

The whole arrangement is supposedly very high-tech and secure, but you already know what’s going to happen. For example:

• A manager will grab the cell phone and be unable to get a signal.
• A call will be dropped just as the manager is telling the bullpen coach who should start warming up.
• Some 13-year-old kid will figure out how to hack the system and will then call the bullpen every single inning with the instructions, “Tell Cy Young to start loosening up.”
• The players will use the phones to watch internet porn in the dugout.

Such practical considerations notwithstanding, there’s also something unseemly about a crusty old skipper or pitching coach using a cell phone. Does anyone really want to see Tigers manager Jim Leyland fumbling around with the latest high-tech phone designed for kids a quarter of his age? It’ll be like watching your grandfather trying to navigate a video game while he gets a tattoo—undignified at best, cringe-inducing at worst.

Wilfred McClay: The Tocquevillean Moment for Higher Education

In a recent essay in The Wilson Quarterly, social critic Wilfred McClay uses Alexis de Tocqueville and Democracy in America to make sense of the current changes in American higher education.

First, McClay lays out the challenges faced by higher education today:

To say that we are living through a time of momentous change, and now stand on the threshold of a future we could barely have imagined a quarter-century ago, may seem merely to restate the blazingly obvious. But it is no less true, and no less worrisome, for being so. Uncertainties about the fiscal soundness of sovereign governments and the stability of basic political, economic, and financial institutions, not to mention the fundamental solvency of countless American families, are rippling through all facets of the nation’s life. Those of us in the field of higher education find these new circumstances particularly unsettling. Our once-buffered corner of the world seems to have lost control of its boundaries and lost sight of its proper ends, and stands accused of having become at once unaffordable and irrelevant except as a credential mill for the many and a certification of social rank for the few. And despite all the wonderful possibilities that beckon from the sunlit uplands of technological progress, the digital revolution that is upon us threatens not only to disrupt the economic model of higher education but to undermine the very qualities of mind that are the university’s reason for being. There is a sense that events and processes are careening out of control, and that the great bubble that has so far contained us is now in the process of bursting.

Then he introduces us to Tocqueville’s understanding of liberal education:

But more than anything else, Tocqueville praised Americans for their embrace of the principle of self-interest rightly understood. It was a foregone conclusion, in his view, that self-interest had replaced virtue as the chief force driving human action. To tell an American to do virtuous things for virtue’s sake, or at the authoritative direction of priests, prelates, or princes, was futile. But the same request would readily be granted if real benefits could be shown to flow from it. The challenge of moral philosophy in such an environment was to demonstrate how “private interest and public interest meet and amalgamate,” and how one’s devotion to the general good could also promote one’s personal advantage. Belief in that conjunction—that one could do well by doing good—was exactly what was meant by the “right understanding” of self-interest.

Hence, it was imperative to educate democratic citizens in this understanding, to teach them how to reason their own way to acceptance of the greater good. The American example made Tocqueville hopeful that the modern principle of self-interest could be so channeled, hedged about, habituated, and clothed as to produce public order and public good, even in the absence of “aristocratic” sources of authority. But it would not happen of its own accord.

“Enlighten them, therefore, at any price.” Or, as another translation expresses it, “Educate them, then.” Whatever else we may believe about the applicability of Tocqueville’s ideas to the present day, we can be in no doubt that he was right in his emphasis upon education. But not just any kind of education.  He was talking about what we call liberal education, in the strictest sense of the term, an education that makes men and women capable of the exercise of liberty, and equips them for the task of rational self-governance. And the future of that ideal of education is today very much in doubt.

And finally, he uses Tocqueville to defend the traditional liberal arts:


So we must be Tocquevillean. That means we should not be too quick to discard an older model of what higher education is about, a model that the conventional four-year residential liberal-arts college, whatever its failures and its exorbitant costs, has been preeminent in championing. And that is the model of a physical community built around a great shared enterprise: the serious and careful reading and discussion of classic literary, philosophical, historical, and scientific texts. 

What we may need, however, is to be more rigorous in thinking through what we want from such a model of education, and what we can readily dispense with. Perhaps we do not need college to be what it all too often has become: an extended Wanderjahre of post-adolescent entertainment and experimentation, played out in the soft, protected environment of idyllic, leafy campuses, less a rite de passage than a retreat to a very expensive place where one can defer the responsibilities of adult life.

At the very least, such an education ought to help us resist the uncritical embrace of technological innovation, and equip us to challenge it constructively and thoughtfully—and selectively. There is, for example, no product of formal education more important than the cultivation of reflection, of solitary concentration, and of sustained, patient, and disciplined attention—habits that an overwired and hyperconnected way of life is making more and more difficult to put into practice. If we find it increasingly difficult to compose our fragmented and disjointed browsings into coherent accounts, let alone larger and deeper structures of meaning, that fact represents a colossal failure of our educations to give us the tools we need to make sense of our lives. Colleges and universities should be the last institutions to succumb to this tendency. They should resist it with all their might, because that is precisely what they are there for.

Read the entire piece.

Historians Skyping

In the last couple of years I have been invited to Skype with history classes about my books or some other topic related to my work and/or blog.  Once I became familiar with the technology, I realized that this could be a lot of fun.  Last year I met, via Skype, with an AP U.S. History class taught by a former student of mine, Kim Johnson.  I fielded questions about the First Great Awakening and the Enlightenment.  It must have gone well, since I have been invited to do it again in a few weeks. 

I also did John Turner’s American Religious History class at the University of South Alabama (although if I remember correctly, I don’t think we actually used Skype).  The topic was Was America Founded as a Christian Nation?: A Historical Introduction.  This year I will be doing Barton Price’s American Civilization course at Grand Valley State in western Michigan. 

The Skype format is especially useful for classes that have read something I have written.  The session can be focused on a Q&A with the author.

I found this session between Civil War Memory‘s Kevin Levin and students in a course at Skidmore College to be a nice example of how Skype can be used effectively in a college classroom.  It also provides some insight into the origins of Levin’s popular blog:

Skyping With Skidmore from Kevin Levin on Vimeo.

The National Museum of American History Remembers Steve Jobs

The blog of the National Museum of American History includes several reflections from the staff of the Lemelson Center for the Study of Invention and Innovation on the life and legacy of the late co-founder of Apple Computer.

My favorite reflection comes from Monica Smith, an exhibit project manager at the Lemelson Center:

An Iconic Place of Invention
Few places are as iconic in the lore of modern invention as the “Apple garage.” This humble attachment to Steve Jobs’ parents’ ranch-style home in Los Altos, California, is where he famously teamed up with Steve Wozniak to develop and sell the Apple personal computer. They formed the Apple Computer company here on April Fools’ Day (clearly they had a sense of humor) in 1976.

I believe the enduring fame of this place of invention—a pilgrimage site for computer history buffs—is based on its ordinariness. A lab or factory may seem alien, but most of us know the look, smell, and feel of a garage. To think that our individual personal computers have their roots in such a simple suburban location is somehow endearing. No matter how rich and famous Jobs became, he seemed accessible in part because of the garage story and how it has inspired generations of budding innovators.

Are You on the Job Market? Think Hard About Your Web Presence

In this day and age many academic job seekers are wondering how much of a web or social media presence they should have when they are on the market.

As someone who has served on several search committees, I would advise job seekers to think about having a presence online, but to do so with caution.  Ramp up the privacy controls on your Facebook account.  Enhance your LinkedIn page to highlight your academic connections.  Be very careful about what you write on your blog so that your opinions do not alienate anyone on the search committee.

If you do have a website or a blog, it should focus on scholarship and teaching.  Stay away from politics, your favorite music groups, your desire for a writing shed in your backyard, or your pets.  A blog or website can be a great place to display your vita and give a committee a sense of your research.

Jentery Sayers, writing at ProfHacker, wonders why more job candidates don’t have websites.  He weighs the pros and cons of having a web presence while on the market.  Here, according to Sayers, are some of the pros:

In the interest of full disclosure, I’ve had my own website since 2003, about a year or so before I started graduate school in 2004. As I was preparing for the job market, I decided to revise the content of my site from its status as an occasionally updated blog to a more professional academic site, including my bio, CV, teaching philosophy, and portfolio. As such, many of the perks mentioned below emerge from my own experiences and biases in the humanities. I should also mention that I have never served on an academic job search committee.

When people ask me why bother with a dedicated site, my first response is usually that it allows me to document and exhibit the work—or better yet, the processes—involved in what’s ultimately presented as my CV. That is, a website is not only less formal (or less standardized) than a CV; it can also be a portfolio for “middle-state publishing,” described by The New EverydayKari Kraus, for drawing my attention to this term.) For example, you may be working on a digital project, a static glimpse of which you want to share without offering audiences full access. In your portfolio, you could provide a screenshot of the project, together with an abstract and/or a development timeline. As another example, you might wish to include photos, videos, or audio recordings of you teaching a course or a workshop. Such use of evidence could reinforce claims made in the teaching philosophy you send to search committees… MediaCommons project as “a web publication that exists ‘between a blog and a journal.’” (Thank you,

 Some other perks to having your own site during the job search include:

  • Sharing your work with audiences you may not expect (e.g., those who stumble upon your site through Google or Bing),
  • Constructing a well-organized database of your work that exceeds your own memory, or a database that can be searched when you cannot recall dates, titles, locations, and other details,
  • Learning enough about e-portfolios and websites that you can help students and colleagues do the same,
  • Sending a URL (instead of DVD or CD) when a portfolio or evidence of digital research is requested, and
  • Letting the site grow with your career, or adding material after the job search is finished, in order to keep colleagues up-to-date about your work.

Pew Study on Online Education

The Pew Research Center has just released a new study, “Digital Revolution and Higher Education.”  Here are some of the findings:

  • 29% of the American public believe that online courses “offer equal value compared with courses taken in the classroom.”
  • 51% of college presidents “say online courses provide the same value” compared with courses taken in the classroom.
  • 77% of college presidents “report that their institutions now offer online courses.”
  • 89% of public colleges and universities offer online classes, but only 60% of private four-year schools offer online classes.
  • Nearly 25% of four-year college graduates have taken a course online.  Of those students, 39% believe their online course had the same “educational value” as a course taken in the classroom.

Why are college and university presidents more likely than the students who have taken online courses to believe that online education is comparable to traditional classroom instruction? Any thoughts?

Here is another interesting finding related to mission:

College presidents’ beliefs about the mission of higher education are linked to their views and experiences with online learning. Among those who believe the most important role college plays is to prepare students for the working world, 59% say online classes provide the same educational value as in-person classes. Among presidents who say the role of college is to promote personal and intellectual growth, only 43% say online learning offers an equal value.

Technology and Mission

Over at ProfHacker, Ryan Cordell asks a very interesting set of questions: “Does your institutional culture lend itself to certain technologies?”  Or, does your institutional culture argue against certain innovations?  How should a college’s technological decisions be shaped by local or institutional factors?

These are very important questions.  At my institution–an almost entirely residential, Christian, liberal arts college–we have been having a lot of discussions lately about how to engage more fully with social media, technology in the classroom, online courses, etc….  But we don’t talk enough about how our adoption of technology meshes with our mission as a school that values community, face-to-face engagement, and a commitment to the liberal arts as a means of promoting human flourishing.