Vaccines and the Military: A Beneficial Relationship

Before World War II, soldiers died more often from vaccine-preventable diseases than from battle injuries.
[Image: Normandy D-Day (Precision Vaccinations)]

War and disease have marched arm in arm for centuries. Diseases posed risks not only to military readiness but also to civilian populations, wrote Dr. Kendall Hoyt in an article published in NavyTimes on June 6, 2019.

Excerpts from this article are below:

‘Wars magnify the spread and severity of disease by disrupting populations. As large groups of people move across borders, they introduce and encounter diseases in new places.

Before World War II, soldiers died more often from disease than from battle injuries.

The ratio of disease-to-battle casualties was approximately 5-to-1 in the Spanish-American War and 2-to-1 in the Civil War.

Improved sanitation reduced disease casualties in World War I, but it could not protect troops from the 1918 influenza pandemic. During the outbreak, flu accounted for roughly half of US military casualties in Europe. 

As the Second World War raged in Europe, the U.S. military recognized that infectious disease was as formidable an enemy as any other they would meet on the battlefield. So they forged a new partnership with industry and academia to develop vaccines for the troops. 

Vaccines were attractive to the military for the simple reason that they reduced the overall number of sick days for troops more effectively than most therapeutic measures. 

This partnership generated unprecedented levels of innovation that lasted long after the war was over. 

As industry and academia began to work with the government in new ways to develop vaccines, they discovered that many of the key barriers to progress were not scientific but organizational. 

In 1941, fearing another pandemic as it braced for a second world war, the U.S. Army organized a commission to develop the first flu vaccine. The commission was part of a broader network of federally orchestrated vaccine development programs. 

These programs enlisted top specialists from universities, hospitals, public health labs, and private foundations to conduct epidemiological surveys and to prevent diseases of military importance. 

These programs were not a triumph of scientific genius but rather of organizational purpose and efficiency. 

Scientists had been laying the groundwork for many of these vaccines, flu included, for years before. It was not until World War II, however, that many basic concepts were plucked from the laboratory and developed into working vaccines. 

The newly formed flu commission pulled together knowledge about how to isolate, grow and purify the flu virus and rapidly pushed development forward, devising methods to scale up manufacturing and to evaluate the vaccine for safety and efficacy. 

Under the leadership of virologist Thomas Francis Jr., the commission gained FDA approval for its vaccine in less than two years. 

It was the first licensed flu vaccine in the US. In comparison, it takes eight to fifteen years on average to develop a new vaccine today. 

Wartime programs, like the flu commission, developed or improved a total of 10 vaccines for diseases of military significance, some in time to meet the objectives of particular operations. 

Some of these vaccines were crude by today’s standards. 

In fact, some might not receive broad FDA approval today, but they were effective and timely.

The government used “No loss, no gain” contracts that covered the cost of research and, occasionally, indirect costs, but did not provide a profit. 

Under normal circumstances, universities would have resisted this technocratic reorganization of their research agenda, but the threat of war softened opposition.

Manufacturers also began to work on projects with little to no profit potential.

Because vaccines were recognized as an essential component of the war effort, participating in their development was seen as a public duty.

At the time intellectual property protections were less of a barrier to information sharing than they are today. Without these restrictions, teams were able to consolidate and apply existing knowledge at a rapid rate.

This cooperative, duty-driven approach to vaccine development persisted into the postwar era. 

Don Metzgar, a virologist who began working in the vaccine industry in the 1960s, explained to me in an interview that, “pharmaceutical companies looked at vaccine divisions as a public service, not as huge revenue generators.”

Whether at war or in peace, timely vaccine development is vital.

Scientific obstacles can be formidable, as our continued struggle to develop vaccines for tuberculosis, malaria and HIV demonstrates.

Mobilizing federal resources on a massive scale, as we did in the 1940s, is not a sustainable solution, but we can still take a page out of the World War II playbook.’

Dr. Kendall Hoyt is an Assistant Professor at Dartmouth’s Geisel School of Medicine and a lecturer at the Thayer School of Engineering at Dartmouth College, where she teaches courses on technology and biosecurity. She serves on the National Academy of Sciences Committee on the Department of Defense’s Programs to Counter Biological Threats and on the advisory board of the Vaccine and Immunotherapy Center at Massachusetts General Hospital.