Tenure: Enjoy It While It Lasts
“The single most important factor preventing change in higher education is tenure.” Wow. That was the sentiment expressed in 2010 by Mark C. Taylor, then chair of Columbia’s Department of Religion, and every critic of higher education in the United States seemed to agree with him. Tenure, they charged, was the place where deadbeat faculty could go for a rest cure, protected from critical standards, working as little as they could—and generally sending a once world-renowned system to the backwater, behind the rising tide of Asia and Europe.
Not quite. The idea of tenure—promoted by John Dewey, the Columbia University philosophy professor who in 1915 founded the American Association of University Professors—meant only that a faculty member couldn’t be dismissed without evidence of incompetence, professional misconduct or program discontinuance because of serious financial difficulty at the school.
And even in 2010, the tenure critics were beating a dead horse. The number of tenure-track faculty was dropping like a stone, from 57 percent of faculty at its peak in 1975 to just above 30 percent today. The current American higher education workforce is more than two-thirds part-time, adjunct or limited-contract hires. Tenure is going, going—and probably in another 50 years, with the exception of those 100 top colleges and universities that compete with each other for faculty—gone.
The idea of tenure was born of trustee, donor and presidential abuse, the destruction of the German and European universities by Hitler, the extraordinary transformation of American higher education after World War II from mediocrity to world-dominating excellence and the enormous demand for talent that took America to the top of the academic mountain in the 30 years of the Golden Age of research, from 1945 to 1975. Tenure was part of that Golden Age. Let’s take a look at the history.
Freedom from Intolerance
In Puritan America, if you were a faculty member at one of the theocratic colleges like Harvard, Yale, Brown or Princeton, it was certain that you adhered to Congregationalist, Calvinist, Baptist or evangelical Presbyterian doctrine, whichever was the only theological system permitted on your campus. Tolerance of other doctrines was not a characteristic of 17th-century college life. Faculty knew when to shut up.
When Thomas Jefferson created the University of Virginia in 1819—free, he hoped, from the religious intolerance of earlier colleges—the terrified faculty slept with pistols under their pillows for fear that they would be murdered at night by a band of drunken students. Tenure was the last thing on their minds.
Toward the end of the 19th century, when America adopted the German model of the research university and private philanthropists named Rockefeller, Vanderbilt, Carnegie, Mellon, Stanford, Hopkins and Cornell provided the financial resources, these benefactors believed that they and their families had the right to determine who would serve on the faculty—and who could remain. In 1900, Leland Stanford’s widow, sitting on the Stanford board, ordered Professor E.A. Ross to resign or be fired for attacking the railroad industry. His colleagues helped pack his bags.
Sometimes presidents did it themselves. Nicholas Murray Butler led Columbia from 1902 to 1945 and was never a friend of independent spirits. In 1910, he fired the literary scholar Joel Spingarn for defending a colleague at a faculty meeting. Butler was also an ardent supporter of American entry into the Great War; any faculty member who spoke in opposition was gone.
During the McCarthy era of the early 1950s, 69 faculty who thought they had lifetime appointments at their colleges around the country, as well as hundreds of junior faculty, were fired by compliant boards that joined in the hunt for communists on campus. At Tufts, no one lost a job. Four senior faculty members—Marston Balch, Newman Birk, George Halm and Albert Imlah—signed a letter published in the Tufts Weekly on June 16, 1953, condemning the congressional probe, and President Nils Wessell vowed that no loyalty oath would be imposed. At nearby Harvard and MIT, meanwhile, there was enormous pressure to sign—or get out.
But the McCarthy purge was an aberration in the course of tenure, which had been gaining support since 1945. At the end of World War II, American higher education exploded out of the starting gate. The vaunted European universities lay in intellectual ruins, having been corrupted by Hitler’s racial science. We got the refugee scientists, poured billions into research and development and opened the college doors to returning GIs and everyone else who wanted an education.
A Nice Perk
A faculty shortage loomed over this system that was growing by leaps and bounds. For the next 30 years, any faculty member could go anywhere. California’s new system of junior colleges, colleges and universities was a powerful academic magnet, as was the new SUNY/CUNY system in New York.
The only way competing institutions could hope to attract and keep scarce faculty was through the enticement of a lifetime job. No one thought tenure was about academic freedom: by the 1960s, we all thought we had it, and faculty involvement in the Vietnam War debates was loud and visible. Tenure was conceived earlier as necessary for protection against nasty trustees or autocratic presidents, but it became a reality because of faculty supply and demand. That economic reality lasted only 30 years.
The need to offer permanent jobs in higher education began to disappear in the late 1970s, when the economic boom ended, expansion slowed, we were producing too many Ph.D.s and the academic Golden Age gave way to a Silver Age, with the gradual disappearance of tenure-track positions at all but the most competitive institutions, which are always looking to grab faculty from somewhere else.
Ironically, along with the economic reality, the old need for protection returned. Faculty, now accustomed to opening their mouths on all subjects, drew pushback from trustees and alumni who didn’t like some of the opinions spread all over this thing called the Internet, and once again professors were fired for unpopular views.
In another 50 years, there will be no more tenure. Faculty will have learned where the borders of “anything goes” speech are, and will have conformed to the reality of an age in which over-the-top comments on explosive subjects are simply avoided. Then, in another century perhaps, after some awful abuse by a domineering board of trustees or regents, an opinionated spouse of a founding philanthropist or an autocratic president who does not tolerate criticism, there will be a faculty movement to reestablish the idea of due process. Another John Dewey will arise to plant the seeds of tenure in the minds of faculty.
Right now, those seeds are slightly shriveled and could use some watering at the other 4,000 colleges and universities that hire the overwhelming majority of academic labor.
With or without tenure, this anarchic madhouse called American higher education will never be supplanted by anybody else’s system. What we have is messy and often ungovernable; American faculty really don’t believe they work for anyone. But the intellectual freedom they have attained is the reason no other nation—not China, Germany, India or Brazil—can push us off the top of the mountain. It was tenure that got us here.
Sol Gittleman, the Alice and Nathan Gantcher University Professor, has been a professor of German, Judaic studies and biblical literature and is a former provost of Tufts University.
This article first appeared in the Summer 2015 issue of Tufts Magazine.