In recent years, divestment movements have gained momentum at universities. Many students and activists believe that academia should be a place to pursue knowledge without the potentially corrupting influence of commercial interests and industry ties. After decades of protests and complaints, large sums have been divested from the fossil fuel industry, and many protests now aim to do the same with regard to the Israel-Palestine conflict.
For these activists, there’s a white whale driving the military-academic-industrial complex they hold responsible for funding the causes they protest against: the Department of Defense.
The United States has remained the world leader in domestic R&D investment, with over $700 billion spent in 2020. That's more than the United Kingdom, Japan, Germany, South Korea, France, India, and Russia combined. In 2022, the Pentagon spent at least $100 million each at 14 universities on next-generation technology research, and as much as $1.4 billion in the case of Johns Hopkins University.
With so much taxpayer funding being directed into these military projects on college campuses, it’s worth diving into the history of military-academia relationships to understand how institutions of higher learning play a role in US war and defense strategies.
To some degree, academia has always been involved in government projects across the world, but America lagged behind Europe in this area for much of the 19th century. While European nations were accustomed to funding basic science through public institutions and progressed rapidly as a result, America was more concerned with practical industrial innovations that helped grow and expand its territory.
In fact, AT&T and General Electric were among the first pioneers of major industrial research organizations, with AT&T giving birth to the famous Bell Labs that was later responsible for major innovations in the computer software and engineering industries. Aside from a few colleges focused on agricultural research, federal support was sorely lacking for both university researchers and private industrial research organizations.
At the beginning of the 20th century, wartime initiatives became especially powerful in incentivizing funding for research to improve military operations. The increases in spending, whether from additional tax-based revenue streams or from borrowing money on bonds, led to innovations in military-related technology, with broad public support for doing whatever it took to win a conflict. However, the transition between the very different priorities of military officers and science researchers was a rough one.
During World War I, advances in submarines, aircraft, radio communications, sonar, and other wartime technologies were made with the support of physics researchers, but that happened in spite of considerable mistrust between academics and public officials. Scientists were skeptical about creating weapons of destruction that militaries might use in all kinds of future conflicts, justified or not.
Meanwhile, many in the military believed that war would never change and would always be fought by infantry with bayonets, so working with researchers was unnecessary. According to former Harvard President James Conant, the American Chemical Society offered its help to the US military, given the extensive use of poison gas and explosives in the war, but the Secretary of War at the time claimed that the military already employed one chemist and that would be enough.
Vannevar Bush, who would later head the US Office of Scientific Research and Development (OSRD), wrote that military labs were “dominated by officers who made it utterly clear that the scientists and engineers employed in these laboratories were of a lower caste of society” and “did not have a ghost of an idea concerning the effects of science on the evolution of techniques and weapons.”
During World War II, advances in technology played a more substantial role in the progress of the war than in any conflict before, thanks to the widespread industrialization of economies across the world and the many new inventions carried over from World War I.
Bush had observed the lack of coordination between scientists and the military during World War I. In 1940, he proposed the creation of the National Defense Research Committee (NDRC) to President Franklin Roosevelt, through which, in his words, “a small company of scientists and engineers, acting outside established channels, got hold of the authority and money for the program of developing new weapons.”
The NDRC created the Radiation Laboratory at MIT, which contributed to radar, and the Underwater Sound Laboratory in Connecticut, which contributed to sonar. It brought many civilian scientists into military laboratories through government contracts, an effort boosted by the autonomy and leadership roles those scientists were given.
More significantly, the NDRC's work laid the groundwork for the Manhattan Project, which developed nuclear weapons for the United States. In the same year that plutonium was first isolated at the University of California, the NDRC began research into uranium and plutonium to test the viability of an atomic bomb. The University of California also operated the Los Alamos laboratory where the Project developed the first nuclear weapons (dramatized recently in the biographical movie Oppenheimer).
Other branches of the military made similar efforts to work with researchers, such as the US Navy, which coordinated projects in weapons development and intelligence gathering.
Fido, the codename for an anti-submarine homing torpedo that saw critical use in the Atlantic sea lanes in 1944, was the result of a collaboration between engineers at Bell Labs and the Harvard Underwater Sound Lab. Many of the civilian scientists involved had no prior experience in weapons development, but their backgrounds in physics and engineering contributed enormously.
As another example, Harvard-educated planktologist Dr. Mary Sears became one of the first oceanographers employed by the US Navy and directed a unit of 12 women and 3 men. Like the teams behind Fido, none of them had prior experience with military planning, but they made great strides in transforming oceanographic data into intelligence reports for military operations.
When the war ended, some colleges wanted to return to the pre-war status quo, in which their research motivations were kept separate from commercial interests and national politics. But many scientists had become excited by the successes made possible by Uncle Sam's deep pockets and by the ability to see their work make real-world impacts through government policies and programs.
In 1945, Vannevar Bush proposed a national strategy (Science, the Endless Frontier) for scientific progress involving coordination between public agencies, private companies, and academic researchers. Given widespread recognition of the need to maintain supremacy over the Soviet Union in the emerging Cold War, Congress created the Office of Naval Research (ONR) to fund basic science at universities.
While many researchers embraced the incentives provided by the Navy, others remained reluctant to be involved with military applications even as they enjoyed the benefits of working with a federal program. Another initiative Bush pushed for led to the creation of the National Science Foundation (NSF), which kept its distance from the DoD and allowed funds to be distributed to non-military projects. Together, these agencies gave the federal government ways to motivate scientific development across the nation, both through the military and beyond it, even in times of peace.
Throughout the Cold War, the United States viewed scientific progress as a matter of national security, a way to maintain both practical and symbolic dominance over the Soviet Union. The Manhattan Project became the permanent Atomic Energy Commission. The ONR took over administration of the Naval Research Laboratory. The Army and the Air Force also developed their own research divisions.
Funding from the Department of Defense transformed many university laboratories and even entire disciplines. Although the DoD's initial goal was to broadly support “physical research,” colleges like Stanford and MIT developed specializations such as nuclear physics, materials science, and aerospace engineering as their labs prioritized different projects.
Although the National Advisory Committee for Aeronautics (NACA) had existed since 1915, the USSR's launch of Sputnik in 1957 started a space race that called for revamping the US approach to investing in aerospace technology. NACA gave way to the NASA we know today, which landed a man on the moon.
The spending on the physical sciences was accompanied by an increase in spending on the social sciences. Although the initial intention behind funding anthropological, sociological, and psychological research was to devise methods to slow or halt the spread of Communist influence into cultures across the world, researchers were able to develop complex theories that are still discussed today in critical analyses of societal institutions.
Perhaps the most significant achievements were in the field of computer science. Throughout the '50s and '60s, both the Navy and the Air Force were interested in developing new systems involving automation and improved communication. The research into circuits and miniaturized computers that they funded was later commercialized by IBM, Control Data Corporation, and Honeywell for industry use.
In 1958, the DoD created the Advanced Research Projects Agency (ARPA) to work on missile defense and nuclear test detection, and it soon turned to computer networking technology as a way to increase computing power and facilitate information-sharing. The Cuban Missile Crisis in 1962 nearly resulted in nuclear war in part because of the failure to effectively share data across military systems.
Through the ARPANET project, which built on concepts of distributed communication networks such as packet switching, the infrastructure for the modern internet was born. In 1969, a computer at UCLA's Boelter Hall made contact with a computer at the Stanford Research Institute.
From then on, the computer industry in the US began to thrive with a strong partnership between a defense industry looking to modernize its weapons, a military trying to maintain global dominance, and an academic environment excited to push the boundaries.
Today, the DoD is responsible for 70% of all government funds directed towards research and development, while the NSF is responsible for less than 5%. In absolute terms, the US has higher military spending right now than at any point during the Cold War. Although withdrawals from the Middle East led to a downsizing of the Pentagon budget in the early 2010s, spending began to increase again around 2016, driven by rivalries with China and Russia and, more recently, by Russia's invasion of Ukraine.
Now, artificial intelligence looks to be the next frontier for military technology, creating a prevalent sense of urgency among military and public officials to stay ahead of the international competition. Here are just a few highlights among many:
In 2020, UT Austin launched a new robotics lab through a partnership with the Army Futures Command.
Texas A&M's University Consortium for Applied Hypersonics has a kilometer-long tunnel to test the Pentagon's hypersonic missiles.
As mentioned earlier, in 2022 the Pentagon spent at least $100 million each at 14 universities on next-generation technology research, with Johns Hopkins University receiving as much as $1.4 billion.
In 2023, the Department of Defense announced the creation of the first university-affiliated research center (UARC) sponsored by the US Air Force, which would also be the first UARC established at a Historically Black College or University (HBCU). It focuses on tactical autonomy, which involves the use of artificial intelligence.
At the 2024 Billington Cybersecurity Summit, the Pentagon's Deputy Chief Digital and AI Officer Margaret Palmieri said that “AI is kind of like electricity. It's not a specific thing. It's an enabler of a bunch of different mission areas.” She went on to say that the DoD has over 1,000 AI applications in use, backed by about $1.5 billion in funding.
As an artificial intelligence specialist myself, I also had a contract with the DoD to investigate wildfire information-sharing technology in response to the worsening natural disasters we've experienced in California and other regions of the world.
In all likelihood, the relationship between the military and academia will continue to grow stronger, especially through the adoption of AI. For many commercial applications, intelligent algorithms might be overkill that isn't worth the investment, but the massive scale of data that modern intelligence apparatuses work with could benefit enormously from AI.
Many have concerns about the ethics of using AI in war, from malfunctions and unintended consequences to the facilitation of war crimes with brutal efficiency. Organizations like the International Committee for Robot Arms Control (ICRAC) have called for an international conversation about these potential outcomes, but countries and companies seem to be plowing ahead anyway.
Every country is concerned that forcing its AI industry to abide by stricter regulations and seriously grapple with ethical considerations might handicap it in the global race for better technology. As long as that's true, you can count on the military racing ahead to develop the next 1,000 AI applications in partnership with universities and the defense industry.