In Short

  • Americans’ use of higher education to build the nation-state, fight wars, and pursue social equity has entailed ongoing negotiations about what higher education is for and who should pay for it.

  • The 20th-century Cold War created conditions for the globally unprecedented expansion of the academic social contract; the close of the Cold War and subsequent social change suggest the exhaustion of that contract.

  • Revisiting the political circumstances of 1947 and contrasting them with those of the present day offers fresh perspectives on the dynamic.

In the middle of the 20th century, the nation’s academic and political leaders made an historic bargain: colleges and universities would enjoy generous public subsidy and a good deal of autonomy over their own affairs in exchange for national service in the form of worker training, citizen uplift, and progress in basic and applied science. The Truman Report exemplifies this bargain. Called by President Harry Truman to rethink the role of science and education in an ambitious nation in the wake of a wartime victory, the so-called Truman Commission released a six-volume report between December 1947 and March 1948 that culminated in “the U.S. government’s first effort to set national goals for higher education.”

The Servicemen’s Readjustment Act of 1944, popularly known as the GI Bill, had already planted seeds for massive postsecondary expansion by providing generous subsidies for war veterans to pursue college degrees. The Truman Report challenged the nation to grow its colleges and universities exponentially further. As the political settlements that concluded World War II transmogrified into the global Cold War, the Truman Report offered a powerful vision for how postsecondary expansion might ensure a virtuous, self-correcting, broadly participatory capitalist society. Not for nothing was the report’s blunt title, Higher Education for American Democracy.

While the six-volume report scarcely predicted them, the historical events immediately preceding its publication created conditions for what sociologists Christopher Jencks and David Riesman aptly called an academic revolution. Generous public subsidy for tuition and research would transform the national postsecondary ecology from a loose constellation of mostly provincial institutions serving specific religious and regional communities into a juggernaut of scientific productivity and a key arbiter of social status and economic opportunity in America. At the height of their influx into the U.S. higher education system, in 1947, fully half of the nation’s college students were military veterans. Within a decade, the GI Bill alone would send over two million people to college—making boundaries of class, religion, region, and even race at least somewhat more permeable for eligible recipients. Research in post–World War II American universities grew similarly. Sustained public funding for the expansion of higher education in the 1950s and 1960s, fueled substantially by myriad Cold War competitions, brought increasing prominence to higher education as an institutional sector.

In its explicit linkage of higher learning, scientific research, and democratic citizenship, the Truman Report was ahead of its time, but its authors likely could not have foreseen the amount of money that would ultimately be spent to expand higher education in subsequent decades. Consider the National Science Foundation (NSF), established by Congress in 1950 after a multiyear battle pitting New Deal senator Harley Kilgore, who favored a centralized federal science policy, against Vannevar Bush, who wanted a decentralized system run by academic scientists. Bush’s vision won out, and the NSF commenced a still-flowing river of grants from Washington to universities nationwide. Bush’s famous 1945 report on the “frontiers of knowledge” concisely distilled a longstanding U.S. national ambition for intellectual self-sufficiency: “We cannot any longer depend upon Europe as a major source of … scientific capital,” he wrote. By the end of the Korean War in 1953, NSF expenditures for academic research approached $5.5 billion a year.

Considerably more public monies flowed into universities via the Atomic Energy Commission and the National Institutes of Health, other federal entities charged with pushing Bush’s “endless frontier” of science in the name of global U.S. influence. The Soviets’ launch of the Sputnik I satellite into Earth’s orbit in 1957 added urgency, and the National Defense Education Act of 1958 supported graduate fellowships and research centers on the Soviet Union and other “strategic areas” of the globe, as well as behavioral and social science research on a wide range of stateside problems. Humanists within and beyond the academy benefitted too. Writers, dancers, visual artists, and historians received financial support and the status attending government patronage through national endowments for the arts and humanities, all part of a diffuse national project to nurture and showcase the might and influence of a rising American civilization on the international stage.

In the wake of this series of historical contingencies, much of the organizational architecture called for in the Truman Report had been put into place by the end of the 20th century. Yet the close of the U.S./Soviet Cold War, the advances of civil-rights and subsequent tax-revolt movements, and myriad shifts in the position of the United States in the global political economy would render the relation between higher education and American society continually open for debate.

This special issue of Change recognizes the 75th anniversary of the publication of Higher Education for American Democracy not to revalorize or even further revisit that peculiar period of academic and political history. The authors assembled here seek instead to remind readers and ourselves of the enduring legacy of negotiation between universities and government—and, by extension, the American people—about the worthy purposes of public investment in higher education. This legacy of negotiation is what we call the academic social contract: the dynamic bargain universities strike with their publics to exchange research, teaching, and cultural prestige for financial support and a good measure of autonomy over academic affairs.

Always diffuse and often opaque even to those it most benefits, the academic social contract has enabled colleges and universities to lend capacity to a highly federated polity that is constitutionally suspicious of “big” government. Aiding the nation in fighting wars, pushing frontiers, enabling social mobility, and abetting democracy have been no small academic–government collaborations. As the United States and its postsecondary ecology have reciprocally coevolved, the academic social contract has changed in turn. If we examine our national history from the perspective of the academic social contract, we see the modern research university continuously caught between the benefits conferred by its surrounding state and society, and the demands state and society have made on higher education in return.

Evolution of the Academic Social Contract

In the 18th century, Ezra Stiles, the seventh president of Yale, announced that “[m]uch of all universities consist of endowments by individual persons and public communities or States or Princes.” We might today think of Yale, one of the original colonial colleges, as the quintessential private university. But in fact, like most of the earliest institutions in America, it did not fit neatly into a public/private dichotomy. Many of the eight “private” schools that now make up the Ivy League sought public charters permitting them to grant degrees, in some cases royal support from the crown, and land and cash grants to sustain their fledgling institutions.

And they wanted the autonomy to create their own governing structures and agendas. These arrangements worked because academic patrons viewed colleges as largely serving the public good, and colleges were willing to curtail some of that freedom in return by, for example, seating a local representative on the corporation board. As John S. Whitehead wrote in The Separation of College and State: Columbia, Dartmouth, Harvard, and Yale, 1776–1876, “College leaders realized that the alternative to a recognition of public interest would be a schoolhouse presided over by a cleric with a small group of students. This was hardly an institution that would have an effect on the society of the time” (p. 52). Another way to say this is that responsibility was the price of relevance and funds.

The relationships among colleges, their leaders, and their patrons in early America were dynamic, as were prevailing views on governance and funding. Consider the very different views of government patronage put forth by two Harvard leaders, Samuel Eliot and his son Charles. In 1848, the elder Eliot declared Harvard’s commitment to the state when he announced that the “most truly democratic mode … would be for the government itself to provide … all those schools, colleges … in order best to provide for the physical, intellectual, and moral wants that are felt throughout the community.” By 1873, the younger Eliot, the more famous of the two and Harvard’s longest-serving president, was cutting ties to the public purse, declaring that “the habit of being helped by the government, even if it be to things good in themselves—to churches, universities, and railroads—is a most insidious and irresistible enemy of republicanism.”

Between these two Boston Brahmins’ pronouncements, the Civil War brought higher education the first Morrill Act (1862), which envisioned a new kind of state-funded higher education delivered in a uniquely American way: through plurally funded and locally governed institutions emphasizing the mechanical and agricultural arts. The so-called land-grant colleges mark another beginning of a people’s university. But the earlier history reminds us that the distinction Americans often make between public and private colleges and universities has long been hard to parse. For most of U.S. history the reality has been a more fluid interaction between private and public resources, goods, and ends. Indeed, Higher Education for American Democracy is emblematic of this mixing, calling as it does for massive public investment in a postsecondary ecology populated by many officially private schools.

In 19th-century America, a college education was perceived by donors, presidents, and the public sector as preparation for public service, whether in churches, schools, philanthropic organizations, or government office. It was accordingly supported by low tuition rates and scholarship aid. Many colleges promised that students preparing for the ministry would be exempted from fees. In 1887, students who pledged to teach in the public schools were granted free tuition at the University of North Carolina—a kind of Teach for America avant la lettre.

Most officially public institutions began charging tuition and fees around 1900, but the commitment persisted to keep tuition low and to provide direct aid to students who could not afford it. Donors such as John D. Rockefeller Jr., the philanthropist whose father’s benefactions had seeded both the University of Chicago and Rockefeller University, had long supported the idea that the purpose of universities was to train public servants. But by the 1920s, that implicit connection between contribution and service waned, and with it the belief in the social benefit of the arrangement that underwrote student aid. In 1927, when Rockefeller campaigned to charge students the full cost it took to educate them—and move from an aid system to one of loans—debate about the terms of the academic social contract shifted to a new key.

The Contract After the Cold War

Scholars tend to think wistfully about what was in many ways a golden age for publicly funded academic research during the 20th-century Cold War. But that golden age had unsavory dimensions: systematic exclusion of women and people of color from the most elite cadres of science, greatly disproportionate funding for a handful of almost-exclusively White universities employing especially well-connected research faculty, and explicit contract service to a worldwide military empire. As in the previous two hot wars, scholars arguably gave up too much, including the transparency of research projects and autonomy of interests, as was noted by the Columbia University sociologist C. Wright Mills and other skeptics at the time. Alvin Weinberg, a physicist at Oak Ridge National Laboratory and originator of the term “big science,” observed at the height of cultural discontent in 1968 that it was difficult “to tell whether the Massachusetts Institute of Technology [MIT] is a university with many government research laboratories appended to it or a cluster of government research laboratories with a very good educational institution attached to it.”

In search of new vocabulary to describe the post–World War II university, critics have invented terms like “multiversity,” the “corporate university,” the “instrumental university,” and the “neoliberal university”—missing the strong continuity of schools and their leaders negotiating with deep-pocketed outside parties to secure patronage. That history is important because we currently live in an academic era that bears traces of the one Ezra Stiles predicted more than two centuries ago. Over 70 percent of the officially private Stanford University’s research and development budget came from the federal government in 2012. Only about 10 percent of the academic budget at the officially public University of Virginia (UVA) comes from state appropriations, and the share is even lower, 5.8 percent, when the calculation includes all of UVA’s many organizational divisions. Public and private interests and money have long commingled in American higher education, rendering a nominal distinction between public and private institutions more deceptive than descriptive.

We propose that a better way of thinking about U.S. higher education is to see it as a dynamic contract between government, universities, and third-party patrons of wide description. The terms and parties of the contract change over time, but they are always plural, often contested, and perhaps even contradictory. While Higher Education for American Democracy emphasized the public funding sources and civic purposes of the mid-20th-century contract, any comparable document written today would need to account for additional revenue streams and value propositions.

Americans now hold $1.7 trillion in student loan debt, more than they have on their credit cards. Tax-subsidized universities nationwide now compete fiercely to admit the children of affluent families in China and India. The U.S. postsecondary ecology is increasingly populated by venture capital–backed businesses purveying a wide array of credentials, even while they are not technically schools: Cisco, Coursera, and General Assembly are but a few examples. Beyond instruction, universities now have serious rivals in knowledge production: Alphabet, Alibaba, Facebook/Meta, and Microsoft are among the mega-firms that collectively have amassed more capital and arguably more data than any government research program in world history.

Under conditions such as these, how should the terms of the academic social contract be written now?

The articles assembled in this special issue provide fresh takes on this question. Historians Elizabeth Tandy Shermer and Ethan W. Ris each shine different lights on the political currents that flow through Higher Education for American Democracy. Shermer surfaces just how audacious the document’s ambitions were, given the circumstances of a postwar America deeply riven by caste-like racial segregation and inequality, the exclusion of women from entire domains of learning and work, and the reciprocal competition and hostility among Protestants, Catholics, and Jews. The report’s authors offered a vision of a very different nation, in which broad access to higher education would enable the development of civil equality and critical dialogue across lines of difference. That vision was not without vocal detractors, some of whom appended written dissents from the report’s recommendations about desegregation and federal funding of higher education. Nevertheless, the continuities between progressive conceptions of the salutary role of higher education in liberal democracy then and now are strong.

Ris’ contribution takes a very different critical angle, aptly noting the extent to which Higher Education for American Democracy served the parochial interests of academics and university leaders. By framing publicly funded expansion of higher education as the linchpin of a virtuous society, the authors of the Truman Report wrote themselves into history and the government’s largesse. Ris shows what a deft rhetorical accomplishment this was, especially in the authorship of George Zook, who was president of the American Council on Education (the higher education sector’s primary lobbying agency in Washington then and now), chair of the Truman Commission, and lead author of its six-volume output. In that sense the report was a stunning act of academic statecraft—or what Ris wryly calls “personnel politics”—an effort to tug at the levers of government to pursue sacred and profane ambitions all at once.

History, of course, kept moving. Sociologists Kelly Nielsen and Laura T. Hamilton rely on their rich comparative study of the University of California campuses at Riverside and Merced to paint a sobering picture of the evolving fate of state-supported institutions at very different moments in postwar history. The campus at Riverside (UCR), founded in the heady early days of postsecondary massification for the sorts of students imagined to be drawn to a “Swarthmore of the West,” underwent an identity crisis as the environmental degradation of Riverside County spurred White flight and a demographic transformation in the region.

UCR leaders ultimately leveraged these crises to create a whole new kind of institution, laser-focused on enabling social mobility among racially minoritized groups. It worked. Founded under a regime of austerity politics in 2003, the campus at Merced (UCM) serves a student population that is sociologically comparable to Riverside’s but must do so without the advantage of UCR’s legacy of physical and reputational endowments from Cold War–era public munificence. UCM has instead been forced to rely on untested financial instruments to leverage its future in optimistically named “public–private partnerships” with commercial firms more interested in their own profits than academic public goods. Taken together, the fates of the two campuses betray both the great promise of higher education as a mechanism for forwarding progressive social goals and the starkly different racial and economic politics that have shaped higher education in the Cold War and present eras.

Sociologists Jeffrey L. Kidder and Amy J. Binder make the provocative point that the massification of higher education over the last 75 years has remade the character of U.S. national politics. University campuses are now essential features of the national public sphere and training grounds for every next generation of political actors and activists. Drawing on a multi-site qualitative study of the organization of activism among university students on the political left and right, Kidder and Binder depict very different architectures and pathways of political engagement and socialization. Right-wing students often feel like outliers in generally liberal campus cultures, but they also enjoy the systematic tutelage of national organizations whose specific purpose is to equip young conservatives with the tools to mobilize on their own campuses and with pipelines into post-collegiate careers of lobbying and government service. Left-wing students may enjoy a sense of cultural ownership on campuses where faculty and administrators tend to share their progressive political views, yet they have far fewer supports than their fellow students on the right for translating campus activism into careers of political service. The consequences of this asymmetry for the character of the academy and the broader polity, and the rising conflict between these two, are haunting.

Read together, these contributions speak to the increasing complexity of the academic social contract as U.S. history unfolds. Higher Education for American Democracy presented a vision for how government-funded expansion of the postsecondary order might create a more perfect union. That expansion created unprecedented opportunities for millions of Americans of every race, class, and creed, but it did far less to remediate inequality and disunion than to refract them in unanticipated ways. This melancholy conclusion does not diminish our wonder at how profoundly the 20th century’s academic revolution has remade American democracy, or our responsibilities as educators to continually appraise the value and consequences of our work for our fellow citizens.


Notes on contributors

Emily J. Levine

Dr. Emily J. Levine is Associate Professor of Education and (by courtesy) History at Stanford University. She is the author of Allies and Rivals: German-American Exchange and the Rise of the Modern Research University (University of Chicago Press, 2021), and Dreamland of Humanists: Warburg, Cassirer, Panofsky, and the Hamburg School (University of Chicago Press, 2013), which was awarded the Herbert Baxter Adams Prize by the American Historical Association. Levine has published in the New York Times, Washington Post, LA Review of Books, and Foreign Policy, as well as in top scholarly journals.

Mitchell L. Stevens

Mitchell L. Stevens is Professor of Education and (by courtesy) Sociology at Stanford University, where he also co-leads the Stanford Pathways Lab (pathwayslab.stanford.edu). His most recent books are Remaking College: The Changing Ecology of Higher Education (Stanford University Press, 2015) and Seeing the World: How US Universities Make Knowledge in a Global Era (Princeton University Press, 2018).

Further Reading

  • Adam, T. (2020). The history of college affordability in the United States from colonial times to the Cold War. Lexington Books.
  • Geiger, R. (2015). The history of American higher education: Learning and culture from the founding to World War II. Princeton University Press.
  • Jencks, C., & Riesman, D. (1968). The academic revolution. Doubleday.
  • Kirst, M. W., & Stevens, M. L. (Eds.). (2015). Remaking college: The changing ecology of higher education. Stanford University Press.
  • Leslie, S. W. (1993). The Cold War and American science: The military-industrial-academic complex at MIT and Stanford. Columbia University Press.
  • Levine, E. J. (2021). Allies and rivals: German-American exchange and the rise of the modern research university. University of Chicago Press.
  • Loss, C. P. (2012). Between citizens and the state: The politics of American higher education in the twentieth century. Princeton University Press.
  • O’Mara, M. P. (2005). Cities of knowledge: Cold War science and the search for the next Silicon Valley. Princeton University Press.
  • Thelin, J. R. (2019). A history of American higher education (3rd ed.). The Johns Hopkins University Press.
  • Whitehead, J. S. (1973). The separation of college and state: Columbia, Dartmouth, Harvard, and Yale, 1776–1876. Yale University Press.