An illustrated research timeline
Illustration: Cristina Spano
Science & Tech

A Brief History of U.S. Research Funding
Before World War II, the federal government didn’t fund research. But after scientists with the Manhattan Project helped win the war, the U.S. was convinced that university research was a great national investment. Until now. A look back at how we got here.

By Will Bunch ’81 / Fall 2025
September 21, 2025

It was August 2023 and Casey Harrell and his family were weeping tears of joy. Harrell, who has ALS, was able to speak the first words his five-year-old daughter had ever heard him say, thanks to a brain-computer interface that translates brain signals into speech with up to 97 percent accuracy. The breakthrough that allowed Harrell to speak was the work of a consortium of scientists called BrainGate, led by Brown researchers and School of Engineering alumni. The funding came from the federal government, specifically grants from the National Institutes of Health (NIH).

Twelve years earlier, in 2011, Brown had hired Lawrence Larson as the first dean of the newly elevated School of Engineering. Larson had been a department chair at the University of California-San Diego. Then, as now, UC San Diego was one of the nation’s top schools for securing federal research grants, and Larson said it was no secret that Brown hoped he could bring some of that same mojo to College Hill.

“I was basically brought to Brown in order to try to up our research profile,” Larson, now a Brown engineering professor, said in a recent interview, recalling his dozen years as dean. Thanks to a generous era of grants from federal agencies such as the National Science Foundation (NSF), Larson largely accomplished his mission. The School of Engineering roughly doubled its outside research funding, hired top academic talent, and built the $88 million Engineering Research Center, with spacious new labs in cutting-edge fields such as nanotechnology.

It was this growing and fruitful partnership with Washington that sparked the breakthrough that allowed Casey Harrell to speak to his daughter for the first time ever. “Not being able to communicate is so frustrating and demoralizing. It is like you are trapped,” Harrell said at the time. “Something like this technology will help people back into life and society.”

Harrell’s computer-aided speech added to the long list of innovations credited to the 80-year partnership between the federal government and America’s top universities that—at its best—has cured or diminished diseases, made America the global leader in a so-called knowledge economy, and turned the United States into a magnet for top scientists from all over the world. But this poignant moment of discovery may also have foreshadowed the end of an era.

Building an American empire of knowledge
Before the January arrival of a new administration in Washington precipitated seismic shifts in the landscape for federal research funding, the history of the modern research ecosystem in American academia was almost literally a story of acceleration from zero to 60. Zero was essentially the level of federal research support before the start of World War II, while in 2023, $60 billion flowed from Washington to America’s universities to fund research of all kinds.

“The quest for knowledge could replace the vanishing American frontier as the new source of American freedom and creativity.” 

It’s a partnership that has been so mutually beneficial that it’s hard to believe it might never have happened. The role that university-based scientists played in the Allied World War II victory—most famously with the Manhattan Project that produced the atom bomb, but also with critical innovations from advanced radar to penicillin—sparked a national conversation as the war ended in 1945 about how to sustain that effort. And two critical, intertwined decisions—to support basic science and to do so through grants to top schools like Johns Hopkins or MIT, rather than through a centralized government agency—were the choices that launched an American empire of knowledge.

“They said, essentially, that we want to be efficient but we don’t want to be too socialist,” says Luther Spoehr, Brown senior lecturer emeritus in education, who teaches the modern history of universities. Led by the father of the government-academia partnership—the MIT professor Vannevar Bush, who supervised the Manhattan Project and pressed for the creation of the NSF—a forward-looking Washington urged top scientists to compete for federal grants with their best ideas.

Brown was not at the head of this parade. In those postwar years, it boosted its national reputation more through an emphasis on undergraduate teaching and curriculum than through research superstars. But while it is still not ranked near the top in federal research dollars, it has steadily—perhaps inevitably—gotten on board. Now, the growth of research at Brown’s School of Engineering has been overtaken by federal investment in the life sciences, which these days account for nearly 70 percent of Brown’s research portfolio through grants centered at the Warren Alpert Medical School and School of Public Health. In all, federal grants to Brown reached roughly $250 million a year through the 2020s. 

The virtuous cycle of upward investment that drove the Cold War “space race,” a government war on cancer, the creation of new computers, and an internet that powered Silicon Valley was so popular and so noncontroversial that it mostly didn’t occur to its key players that someday a federal administration might express open hostility toward science and cultural grievances against higher education. Or that a White House might use its huge financial leverage to demand radical change.

Illustration of a researcher at work
Illustration: Cristina Spano

And yet polls report that growing numbers of voters fear that colleges are steeped in what some have claimed is liberal indoctrination. Since the change in federal administrations in January, the government has stunned academia with sudden and often unexplained funding freezes at core agencies like the NIH and NSF, as well as targeted attacks on campuses from Harvard to the University of Virginia over issues like diversity programs, transgender athletes, and the conduct of protesters during 2024’s pro-Palestinian encampments.

Along with a handful of leading research universities, Brown has found itself in the middle of this maelstrom. In April, University officials braced against a threatened halt in $510 million in federal funding as the government investigated Brown’s approach to combating antisemitism and to diversity, equity, and inclusion (DEI) programs. In addition, ongoing pauses in the NIH grants that make up 70 percent of Brown’s federal research dollars had a devastating impact. In late June, in extending a campuswide hiring freeze, University President Christina Paxson said Brown was losing $3.5 million a week in unpaid bills from existing NIH grants. Then, on July 30, she announced an agreement to restore the flow of funds (see “Brown cuts a deal,” at right).

Columbia and the University of Pennsylvania made deals before Brown did, while Harvard, Northwestern, Cornell, and Princeton were, at press time, among the institutions still facing funding threats. With possible courtroom fights ahead involving some of these institutions, the federal pullback is likely to remain the biggest story at college campuses across the nation as the new academic year begins. But the ongoing crisis also raises questions about the past. How did we get to this place where America’s top universities became so dependent on research dollars from Washington—a Jenga block increasingly propping up the research enterprise in higher education, now threatened by a federal administration toying with pulling it out?

The story is very much an affirmation of the cliché that necessity is the mother of invention. In 1943, the necessity was defeating the Nazis and winning World War II.

Exploring a new American frontier
Vannevar Bush was one of the most significant Americans of the 20th century whom most Americans have never heard of. A top physicist at MIT, Bush was a man who had ideas ahead of his contemporaries—as a 1930s computing pioneer who developed concepts that later evolved into the internet—and a great sense of timing. Moving to Washington in 1939 to head the Carnegie Institution for Science, Bush became a leading adviser to President Franklin Roosevelt when World War II erupted. As chair of the National Defense Research Committee, he became Washington’s point man for the Manhattan Project, helping recruit America’s top university physicists—most notably the University of California’s J. Robert Oppenheimer—to build the world’s first atomic bomb in the New Mexico desert.

The contrast between the new American model of academic freedom and competitive grants and the Soviet system of rigidly controlled government-directed research was a tale that D.C.’s new spin doctors wanted to tell.

While the Manhattan Project and Oppenheimer became legends, they were also the leading edge of a bevy of scientific breakthroughs, driven by the urgency of war, that gave U.S. troops and their allies an advantage in both Europe and the Pacific. These included the mass production of penicillin, which saved thousands of wounded soldiers, the development of advanced radar, and the proximity fuze, which triggers a bomb to explode as it nears its target.

In 1944, as the tide turned in the Allies’ favor, Roosevelt was increasingly frail yet seeking an unprecedented fourth term. He made two consequential decisions that changed the future of higher education in America. He supported and signed the law known as the G.I. Bill, which offered a free college benefit for veterans that kicked off decades of rising college enrollment. But he also asked Bush to craft a report on how the government could continue to foster scientific innovation after the war—partly out of hopes that new technology could continue to sustain a suddenly booming U.S. economy.

“The fear as the war was coming to a close was that maybe the depression would resume,” says G. Pascal Zachary, author of the biography Endless Frontier: Vannevar Bush, Engineer of the American Century. What’s more, Bush had noted that foreign-born scientists had played a key role in the Manhattan Project, fueling his desire to train more American experts. Zachary says Bush saw “the need for the U.S. to produce, to create, and spawn more scientists and researchers in their own country.”

Bush’s landmark report—Science: The Endless Frontier, published in July 1945, just days before the first A-bomb was dropped on Hiroshima—built on his longtime ideas that, as Zachary wrote, “the quest for knowledge could replace the vanishing American frontier as the new source of American freedom and creativity.” Bush argued that—in the postwar world—this would mean putting faith in the top academic experts and backing their ideas with government grants, sponsoring basic science instead of narrowly targeted projects.

Illustration of a researcher with results
Illustration: Cristina Spano

Bush’s elevated prestige after the Manhattan Project all but assured that he would have little trouble convincing new President Harry Truman to back his plan, although Truman did insist that the proposed National Science Foundation not have an independent board but come under administration control—a decision that would have consequences 80 years later. Even so, it took five long years for Congress to finally establish and fund the NSF; in the meantime, other agencies like the Office of Naval Research and the Atomic Energy Commission, which grew out of the Manhattan Project, filled the gap as the Cold War with the Soviet Union heated up on World War II’s smoldering embers.

In many ways, that conflict with the USSR shaped the first decades of Washington’s growing relationship with universities, not just in the substance of the research—from building even more powerful atomic weapons to the accelerating race to send satellites and humans into space—but also in the approach. The Cold War was in many ways defined by increasing use of propaganda, and the contrast between the new American model of academic freedom and competitive grants and the Soviet system of rigidly controlled government-directed research was a tale that D.C.’s new spin doctors wanted to tell.

“Science in the United States was supposed to be basic science,” says Audra J. Wolfe, the author of Freedom’s Laboratory: The Cold War Struggle for the Soul of Science. “It wasn’t supposed to be applied. It wasn’t supposed to be materialist. It wasn’t driven by material needs of the people as you have under Communism. It was driven by the world of ideas, and that science in the United States could be intensely international and cooperative.” That narrative was meant to appeal to the world’s developing nations, many on the precipice between communism and capitalism.    

Becoming a global beacon for scientific progress
Despite its historical significance, federal funding of campus research in the early 1950s was just a fraction of what it would become by its 2024 peak, and it was heavily concentrated in a handful of schools such as Johns Hopkins, MIT, Stanford, and Berkeley. But the race to expand research and compete for government grants would touch almost all of academia, including Brown. During the 1950s, Brown’s engineering department began a long era of expansion, and the University attracted top émigré scientists such as William Prager, who had left his native Germany in 1933 and launched Brown’s applied mathematics program. The flood of foreign scientists served as confirmation that America was becoming the global beacon for scientific progress.

The appeal of federal dollars funding new scientific frontiers outweighed any concerns that forging closer ties with the government might someday threaten academic freedom.

Not surprisingly, some tension began to emerge between colleges’ traditional core mission of educating undergraduates and the new zeitgeist of recruiting academic heavyweights more focused on groundbreaking research. At Brown, Henry Wriston, who served as university president from 1937 to 1955, coined the phrase “university-college” in 1948, at the dawn of the postwar research boom, to describe the institution. He described Brown as an “institution which puts primary emphasis upon the liberal arts, bringing to their cultivation the library, laboratories, and personnel resources of a university.”

But the appeal of federal dollars funding new scientific frontiers outweighed any concerns that forging closer ties with the government might someday threaten academic freedom, and the naysayers were few. That did change somewhat in the late 1960s and early 1970s, when student unrest over the Vietnam War sparked protests over university research ties to U.S. weapons like napalm, culminating in the 1970 bombing of the Army Mathematics Research Center at the University of Wisconsin, which killed a researcher. But the tragedy did not prevent an era of ever-expanding growth.

The USSR’s 1957 launch of the Sputnik satellite spurred an alarmed Congress to pass the National Defense Education Act the following year, greatly boosting federal research aid while also establishing the infrastructure for student loans aimed at producing more scientists and engineers. In the mid-1960s, President Lyndon B. Johnson’s Great Society agenda greatly expanded the reach of federal grants to include more public universities from coast to coast, creating a new archipelago of gleaming research centers.

What’s surprising is that even after the dour economy of the 1970s and a rising strain of conservatism that led to Ronald Reagan’s 1980 election as president, government support for university research actually grew sharply in the 1980s. The Bayh-Dole Act of 1980, a reform backed by free-market conservatives that gave universities, and not the government, patent rights to discoveries made with federal funding, spurred those institutions to place even greater emphasis on research—the start of what some would call “Big Science.”

Roger Geiger, a Penn State professor emeritus who has written a number of books on the modern history of higher education, notes that federal dollars increasingly went to the politically popular cause of biomedical research, as well as to the late-20th-century revolution in computer technology.

“The neoliberal era began around 1980—and [with it] the realization of how important academic research was for American technology,” Geiger says. “And so that provided a lot of interest to keep federal funding. And also, federal funding was then oriented toward cooperation with industry”—especially the rising economic engine that became California’s Silicon Valley.

Illustration of a researcher at work
Illustration: Cristina Spano

In 1983, a young academic named Subra Suresh arrived on the Brown campus as an assistant professor of engineering at the dawn of a career that saw him become head of the NSF during the Barack Obama administration and also the Vannevar Bush Professor of Engineering at MIT. During those years, he saw how funding from an alphabet soup of federal agencies—the Department of Energy, Defense Advanced Research Projects Agency, Army Research Office, Office of Naval Research, and Air Force Office of Scientific Research—allowed Brown to create a materials science lab that would become a bedrock of its School of Engineering.

Suresh, who returned to Brown in 2023 as professor at large in the School of Engineering, said in an email interview that the flow of federal dollars has allowed Brown to build new laboratory space and a computing infrastructure that “trained several generations of students at Brown who have subsequently had major roles in academia, industry, and government throughout the U.S. and abroad.”

A “research university” 
The momentum for this emphasis on research loomed large over the 2001-2012 tenure of Brown President Ruth Simmons, who kept the phrase “university-college” in the school’s mission statement but leaned even more heavily into the university part, punctuated by the creation of the School of Engineering in 2010 and, just as President Paxson took the reins, the creation of the School of Public Health in 2013. During the 2000s, federal research grants to Brown grew by 57 percent.

Larson, who came to Brown at the end of this era, says he believes that the stronger focus on research grants ultimately benefited undergraduates seeking to become the next generation of William Pragers or Subra Sureshes. 

“I think that the way to think about it is that we want our undergraduates to be taught by the best people in the world—the most active and intellectual leaders of their field,” Larson says. “And having great research at Brown, not only do we have an amazing societal impact with the research that we do, but we also get our undergraduates incredibly excited about research themselves, and becoming leading edge intellectuals.”

The change in approach to research science came gradually—a generation of attacks on climate science that were funded by Big Oil, for example—and then all at once. As recently as 2022, then-President Joe Biden advocated for a “cancer moonshot” that aimed to send the government-university partnership into a new orbit. This year’s rapid reversal of fortune has stunned campus leaders and scientists, who already are talking about a golden age as if it were a bygone era.

“During the 1960s through early 2000s, a significant fraction—as much as 80 percent or more—of computer science research in American universities, which positioned the U.S. as the unquestioned leader in semiconductor, mobile, and internet technologies, was funded by the National Science Foundation,” notes Suresh, the former NSF leader. He says NSF-backed projects also led to 268 American Nobel prizes.

The national crisis in federal research funding has top researchers anxious for the future of science in America, stunned by what looks like a sudden divorce after an 80-year relationship—and asking whether they should have seen it coming. “It’s a huge self-inflicted wound,” Larson said of the impasse. “It’s terrible.”

Will Bunch ’81 is national opinion columnist for the Philadelphia Inquirer and author of several books including 2023’s After the Ivory Tower Falls.
