My grandmother worked in a school cafeteria. My mother taught second grade. Nearly two decades ago, I resolved to enter public education, too, but with plans to rise even higher. I would become a college professor, advancing the scholarship of my discipline, free from the petty bureaucratic concerns that hamstrung my mother's career.

From 1998 until 2012, I pursued that objective with extraordinary focus. I graduated from college at 19. I went to law school and passed the bar exam. At 24, I was admitted to the history PhD program at the University of Pittsburgh. There, I made connections with brilliant academics, won prestigious fellowships and grants, and, at the age of 29, just five years after starting graduate school, I landed a tenure-track job.

I can't overstate how rare this opportunity is: Tenure-track jobs at large state universities are few and far between. Landing one without serving a postdoctoral appointment or working as a visiting assistant professor is about as likely as landing a spot on an NBA team with a walk-on tryout — minus the seven-figure salary, naturally.

I had read all of the doom-and-gloom think pieces about the status of the American university system, of course, but it felt like none of that applied to me. I had a full-time position, secured early in my career — the possibilities were endless. Although a legal historian by training, I viewed myself as beyond such simple labels: I was a cultural historian, in command of critical theory and immersed in the latest and best work on gender and sexuality. Activism informed my teaching; I exhorted my students to transcend and transform the status quo. I coached my university's legal debate team to a national championship bid and served on nearly a dozen PhD and EdD dissertation committees. I launched several digital humanities initiatives and curated a museum exhibit about professional wrestling, attracting mainstream attention in the process.

I had not just survived the academic Hunger Games — I had emerged triumphant.


Then it all began to fall apart.

First there was sniping from peers and administrators. Critiques of my teaching and debate team coaching, often made through backchannels and delivered to me secondhand or not at all, centered on my easygoing personal style (He doesn't use the title "doctor"! He teaches in T-shirts!), my effusive student evaluations (If he's pleasing them, he must be doing something wrong!), and my relatively calm demeanor (If a young academic doesn't seem stressed beyond capacity, he's not working hard enough!).

Then there was official pushback and politics. A proposal to create interactive teaching tools from archival materials was derided as bewildering and gimmicky. I learned that the public outreach in which I engaged — that is, publishing in popular magazines — had ruffled certain feathers. I watched administrators and donors who had championed my career be shown the door, or at least quietly sidelined, by an incoming presidential administration — proving that the autonomy I had imagined upon entering academia really was an illusion.

Finally, I realized that not even the students were invested. When my best friend visited my campus to give a talk, he observed one of my lectures. I've got many shortcomings as an academic, but lecturing isn't one of them. I've been on TV, radio, podcasts — you name it. By professor standards, which admittedly aren't that high, I could rock the mic. But while my friend sat there, semi-engrossed in the lecture, he found himself increasingly distracted by the student in front of him. That student, who like all in-state students was paying $50 per lecture to hear me talk, was watching season one of Breaking Bad. In a class with no attendance grade, where the lectures were at least halfway decent, he was watching Breaking Bad.

Later during that same visit, my friend asked me, in total sincerity, "Why aren't you doing something meaningful with your life?"

"This is important," I insisted. But there was no passion behind my words. I was a priest who had lost his faith, performing the sacraments without any sense of their importance.

Op-eds about the failings of higher education are like certain unmentionable body parts: Everybody's got one. Professors are or aren't afraid of their liberal students, adjuncts are underpaid and exploited, grade inflation is rampant, college graduates can't find jobs, student loan debt will doom us all.

But these are just parts of a larger and even more troubling story. After spending four years working in higher education, trying to effect piecemeal improvements, I'm convinced that the picture is more dire than most people realize: There's no single problem to fix, no villain to defeat, no buzzword-y panacea that will get things back to normal.

And so now, after devoting nearly 20 years to this life, I've decided to walk away. I'm quitting my tenure-track position; by May of next year, I'll be out of this side of academia forever.

Here are some departing thoughts.

1) Too many people go to college

As recently as a year ago, I remained willing to work inside that fractured system of pay-to-play higher education. If students wanted to take out federal loans to buy degrees, who was I to stop them? Let the chips fall where they may; graduate them all and let the invisible hand sort them out.

But that system is unsustainable. Liberal arts programs, and the humanities in particular, have become places to warehouse students seeking generic bachelor's degrees, not out of any particular interest in the field but in order to receive raises at work or improve their position in a crowded job market.

Once upon a time, in a postwar America starved for middle managers who could file TPS reports, relying on the BA as an assurance of quality (proof of the ability to follow orders and complete tasks) made perfect sense. But in today's world of service workers and coders and freelancers struggling to brand themselves, wasting four years sitting in classes like mine makes no economic sense for the country or for the students — particularly when they're borrowing money to do so.

Every so often, we're treated to an essay about how liberal arts majors can prepare students to make creative contributions to an employer's bottom line. Do you know how else you can prepare to make these vague creative contributions, much more cheaply and efficiently? By sitting around in your parents' basement and reading great works of literature. Yes, lectures and classroom discussions might help open your mind to new possibilities, but so will skillfully produced videos that are freely available on YouTube. Expert oversight is valuable — but how valuable is it really? I imagine most people wouldn't fork over $50 an hour for the privilege, regardless of their respect for the stellar minds whose contributions to society can rather easily be accessed and understood for free.

2) Online education isn't the solution

Although my department boasts more than 20 full-time faculty with solid research and teaching credentials, a majority of history students never come anywhere near their classrooms. Instead, they're remote students, enrolled in online degree programs.

For some, online degree programs are a solution to the cost and time problem. If there's mass demand for BAs, but the time and expense of real college doesn't make sense for most people, why not provide a similar service digitally? Online classes could unite knowledge seekers from around the world, advocates say, allowing them to get a version of the university experience more compatible with the demands of the modern world.

But in practice, online education isn't a solution — it's a Band-Aid on an infected wound.

In place of the thought-provoking video chats and genuinely creative software applications that the theory promises, most online students get Blackboard — a cumbersome and inefficient program that only a bureaucracy could love. The "lectures" amount to little more than uploaded PowerPoints that may or may not be accompanied by instructor narration. Usually a single module serves as the university-wide template for an entire mandatory subject, such as US history to the Civil War, allowing professors to be replaced by "graders" capable of administering these courses for even less than the pittance paid to adjuncts. At my university, for example, a grader for one of our online courses supervises approximately 30 to 50 students for an entire course. For that, the grader typically makes $700.

Meanwhile, online classes are — in defiance of all reason — generally longer and more involved than in-person classes. To make up for the lack of in-person instruction, they gorge on assignments, sometimes featuring as many as 60 quizzes in a term. The consequence is cheating as often as learning: if you've got a willing partner or three, you can divide up the coursework and hope the underpaid grader doesn't notice.

Completion rates for online courses are dismal as well, especially at places such as the University of Phoenix Online, which has invested heavily in front-end services like financial aid advising but far less in teachers and student support.

All of this makes perfect sense from an economic standpoint: University administrators are rational actors, and what they're incentivized to maximize are paid student enrollments. There's still no real penalty for failing to graduate students, so why not chase that easy federal money and focus all the effort on upfront enrollment? But what's clear is that this system does not offer a viable, sensible alternative for students; it just allows administrations to exploit the crisis in education to make even more money with even less effort or investment.

3) Tenured professors pity adjuncts. But we can't help them.

We all went into this business with the best of intentions. Those of us who sought PhDs in overpopulated and declining fields knew that the market was not only rough but absolutely brutal; dark humor about the impossible odds facing PhD seekers is part and parcel of the whole grad student experience.

Among the handful of academics who do land tenure-track jobs, one finds little sympathy for the less fortunate. Lip service, to be sure, but academia is a bloodless, endless game of Survivor in which every winner is saying to himself or herself, "There but for the grace of God go I" — or, more likely, "Sucks for them, but what can you do?"

As someone who has sat in department meetings, served on hiring committees, and powwowed with other "real" academics at conferences, I can offer the following statement with confidence: No matter how bad things are for the adjuncts, they're effectively non-people to their ostensible colleagues. We won't save you. It's not that we full-timers don't care; it's that we can't. The rules of the game for tenure are simple and terrible — "do twice as much as you think you need to do" — and there's no time to worry about the fallen when your own pay lags well behind the national average.

Life for liberal arts adjuncts, who surely deserve better, is only getting worse as enrollments climb. University administrators maximize the bottom line, and the bottom line at most non-elite schools is tuition-paying customers. If you can pay someone $15,000 to teach five history classes or pay someone else $60,000 to teach those same five classes, why bother with the latter? People complain, but there's no real evidence that lost business from students turned off by less-qualified instructors comes anywhere close to offsetting the savings.

The incentives are especially destructive in the humanities. When administrators do decide to invest in faculty, they tend to favor STEM professors. Those guys rake in the valuable grant money, and thanks to the miracle of co-authored papers, they produce far longer CVs with far better citation counts, a valuable asset when chasing a higher school ranking and the cash that comes with it.

The situation has become dire enough that I often think the only feasible solution would be to eliminate tenure altogether. Morally, such a plan would be repugnant: Academics deserve the freedom to work at their own pace and without the fear of too much administrative interference. But economically, it might be the only thing that allows for real labor market flexibility, forcing out elderly and ineffective professors and driving a rise in the standard of living for the many talented adjuncts who are unable to find work under prevailing conditions.

4) "Alt-academia" isn't a solution — it's surrender

So if not to the wretched life of an adjunct, whither our underpaid, overeducated PhDs? The notion of "alternative academic" careers has become a rallying cry for many, particularly those whose alternative academic position involves finding alt-ac jobs for other PhDs.

Briefly put, "alternative academia" is a catchall term for the process wherein individuals, unsuccessful in their quest to become university professors or disillusioned with that sort of work, seek alternative employment at places like libraries, nonprofits, university presses, and private sector think tanks.

These positions are typically filled by people with master's degrees or other terminal credentials; those with doctorates, goes the reasoning, would be able to use their critical thinking skills to excel in such fields, which lack many of the pressures associated with the tenure track but still offer opportunities to undertake meaningful, exciting work.

The concept is good enough in theory, but in practice it's just another way of phrasing the problem: There's not enough room in academia. Go find a job in a different field.

Some blame scholars themselves for the problem — claiming that today's PhD holders aren't as capable or as qualified as generations past. But after sitting on hiring committees and reading hundreds of CVs and writing samples, I refuse to blame the earnest applicants whose sole crime was being told scholarship was a worthwhile pursuit and believing it. If anything, market pressures have produced some of the finest scholarship in generations; even many adjuncts have a handful of strong publications under their belts. The problem is that the system is more than happy to take their money and use their services from undergrad all the way to doctoral graduation, but when it comes time to pay it off with a real job? Sorry — best look somewhere "alternative."

Recently, an article circulated that urged PhD seekers to view their degrees as a six-year, time-limited job, after which they should expect to move on to something else. That's all well and good, but like my $50-a-pop lectures, is that something you'd want to invest in? When presented with such stark questions, I'd imagine most people would say no. Forcing people to master multiple languages, paleography, archival research, and coding, all the while reminding them they need to be ready to retool as academic advisers or advertising executives, isn't a solution to the academic crisis — it's outright surrender to it.

5) The students and professors aren't the problem; the university system is

All of these issues lead to one difficult-to-escape conclusion. Despite all the finger-pointing directed at students ("They're lazy! They're oversensitive! They're entitled!"), and the blame heaped on professors ("Out of touch and irrelevant to a man"), the real culprit is systemic. Our federally backed approach to subsidizing higher education through low-interest loans has created perverse incentives with disastrous consequences. This system must be reformed.

When I started out, I believed that government regulation could solve every problem with relatively simple intervention. But after four years of wading through this morass, I'm convinced that such interventions should be reevaluated constantly. If they're not achieving their objectives, or if they're producing too much waste in the process, they ought to be scrapped. We can start with federal funding for higher education.

The quickest and most painful solution to the crisis would involve greatly reducing the amount of money that students can borrow to attend college. Such reductions could be phased in over a span of years to alleviate their harshness, but the goal would remain the same: to force underperforming private and public universities out of business. For-profit universities — notorious for their lack of anything resembling good academic intention — should be barred altogether from accessing these programs; let them charge only what consumers in a genuinely free market can afford to pay for their questionable services.

Without the carrot of easy access to student loans, enrollments would shrink. Universities would be forced to compete on a cost-per-student basis, and those students still paying to attend college would likely focus their studies on subjects with an immediate return on investment. Lower tuition costs, perhaps dramatically lower at some institutions, would still enable impoverished students eligible for Pell Grant assistance to attend college. Vocational education programs, which would likely expand in the wake of such a massive adjustment, would offer inexpensive skills training for others. The liberal arts wouldn't necessarily die out — they'd remain on the Ivy League prix-fixe menu, to be sure, and curious minds of all sorts would continue to seek them out — but they'd no longer serve as a final destination for unenthusiastic credential seekers.

In the time that's allotted to us in life, we have to make many choices. Choosing to pursue an unmarketable career solely because one loves it is an option. But that decision has consequences. In a university system like ours, where supply and demand are distorted, many promising young people make rash decisions with an inadequate understanding of their long-term implications. Even for people like me, who succeed despite the odds, it's possible to look back and realize we've worked toward a disappointment, ending up as "winners" of a mess that damages its participants more every day.

Had I known sooner, I would've given up on this shrinking side of academia many years ago, saving myself plenty of grief while conserving the most valuable commodity of all: time. No one should have to wait so long or sacrifice so much of it for a system like this. Time is money, and we must spend it wisely. Until something is done — something that isn't just a quick fix, something that looks long and hard at the structure of the present university system and tears it up from the foundation, if that's what it takes — the academy is no longer an investment of time worth making.

Oliver Lee is an attorney and assistant professor of history. His writing has appeared in the Atlantic, VICE, Salon, Mic, and Al Jazeera America.


First Person is Vox's home for compelling, provocative narrative essays. Do you have a story to share? Read our submission guidelines, and pitch us at firstperson@vox.com.


A few years ago, when I was a graduate student in English, I presented a paper at my department’s American Literature Colloquium. (A colloquium is a sort of writing workshop for graduate students.) The essay was about Thomas Kuhn, the historian of science. Kuhn had coined the term “paradigm shift,” and I described how this phrase had been used and abused, much to Kuhn’s dismay, by postmodern insurrectionists and nonsensical self-help gurus. People seemed to like the essay, but they were also uneasy about it. “I don’t think you’ll be able to publish this in an academic journal,” someone said. He thought it was more like something you’d read in a magazine.

Was that a compliment, a dismissal, or both? It’s hard to say. Academic writing is a fraught and mysterious thing. If you’re an academic in a writerly discipline, such as history, English, philosophy, or political science, the most important part of your work—practically and spiritually—is writing. Many academics think of themselves, correctly, as writers. And yet a successful piece of academic prose is rarely judged so by “ordinary” standards. Ordinary writing—the kind you read for fun—seeks to delight (and, sometimes, to delight and instruct). Academic writing has a more ambiguous mission. It’s supposed to be dry but also clever; faceless but also persuasive; clear but also completist. Its deepest ambiguity has to do with audience. Academic prose is, ideally, impersonal, written by one disinterested mind for other equally disinterested minds. But, because it’s intended for a very small audience of hyper-knowledgable, mutually acquainted specialists, it’s actually among the most personal writing there is. If journalists sound friendly, that’s because they’re writing for strangers. With academics, it’s the reverse.

Professors didn’t sit down and decide to make academic writing this way, any more than journalists sat down and decided to invent listicles. Academic writing is the way it is because it’s part of a system. Professors live inside that system and have made peace with it. But every now and then, someone from outside the system swoops in to blame professors for the writing style that they’ve inherited. This week, it was Nicholas Kristof, who set off a rancorous debate about academic writing with a column, in the Times, called “Professors, We Need You!” The academic world, Kristof argued, is in thrall to a “culture of exclusivity” that “glorifies arcane unintelligibility while disdaining impact and audience”; as a result, there are “fewer public intellectuals on American university campuses today than a generation ago.”

The response from the professoriate was swift, severe, accurate, and thoughtful. A Twitter hashtag, #engagedacademics, sprang up, as if to refute Kristof’s claim that professors don’t use enough social media. Professors pointed out that the brainiest part of the blogosphere is overflowing with contributions from academics; that, as teachers, professors already have an important audience in their students; and that the Times itself frequently benefits from professorial ingenuity, which the paper often reports as news. (A number of the stories in the Sunday Review section, in which Kristof’s article appeared, were written by professors.) To a degree, some of the responses, though convincingly argued, inadvertently bolstered Kristof’s case because of the style in which they were written: fractious, humorless, self-serious, and defensively nerdy. As writers, few of Kristof’s interlocutors had his pithy, winning ease. And yet, if they didn’t win with a knock-out blow, the professors won on points. They showed that there was something outdated, and perhaps solipsistic, in Kristof’s yearning for a new crop of sixties-style “public intellectuals.”

As a one-time academic, I spent most of the week rooting for the profs. But I have a lot of sympathy for Kristof, too. I think his heart’s in the right place. (His column ended on a wistful note: “I write this in sorrow, for I considered an academic career.”) My own theory is that he got the situation backward. The problem with academia isn’t that professors are, as Kristof wrote, “marginalizing themselves.” It’s that the system that produces and consumes academic knowledge is changing, and, in the process, making academic work more marginal.

It may be that being a journalist makes it unusually hard for Kristof to see what’s going on in academia. That’s because journalism, which is in the midst of its own transformation, is moving in a populist direction. There are more writers than ever before, writing for more outlets, including on their own blogs, Web sites, and Twitter streams. The pressure on established journalists is to generate traffic. New and clever forms of content are springing up all the time—GIFs, videos, “interactives,” and so on. Dissenters may publish op-eds encouraging journalists to abandon their “culture of populism” and write fewer listicles, but changes in the culture of journalism are, at best, only a part of the story. Just as important, if not more so, are economic and technological developments having to do with subscription models, revenue streams, apps, and devices.

In academia, by contrast, all the forces are pushing things the other way, toward insularity. As in journalism, good jobs are scarce—but, unlike in journalism, professors are their own audience. This means that, since the liberal-arts job market peaked, in the mid-seventies, the audience for academic work has been shrinking. Increasingly, to build a successful academic career you must serially impress very small groups of people (departmental colleagues, journal and book editors, tenure committees). Often, an academic writer is trying to fill a niche. Now, the niches are getting smaller. Academics may write for large audiences on their blogs or as journalists. But when it comes to their academic writing, and to the research that underpins it—to the main activities, in other words, of academic life—they have no choice but to aim for very small targets. Writing a first book, you may have in mind particular professors on a tenure committee; miss that mark and you may not have a job. Academics know which audiences—and, sometimes, which audience members—matter.

It won’t do any good, in short, to ask professors to become more populist. Academic writing and research may be knotty and strange, remote and insular, technical and specialized, forbidding and clannish—but that’s because academia has become that way, too. Today’s academic work, excellent though it may be, is the product of a shrinking system. It’s a tightly packed, super-competitive jungle in there. The most important part of Kristof’s argument was, it seemed to me, buried in the blog post that he wrote to accompany his column. “When I was a kid,” he wrote, “the Kennedy administration had its ‘brain trust’ of Harvard faculty members, and university professors were often vital public intellectuals.” But the sixties, when the baby boom led to a huge expansion in university enrollments, was also a time when it was easier to be a professor. If academic writing is to become expansive again, academia will probably have to expand first.
