War and disease have marched arm in arm for centuries. Wars magnify the spread and severity of disease by disrupting populations. As large groups of people move across borders, they introduce and encounter disease in new places. Often, they move into crowded, resource-poor environments that allow diseases to thrive.
Before World War II, soldiers died more often of disease than of battle injuries. The ratio of disease-to-battle casualties was approximately 5-to-1 in the Spanish-American War and 2-to-1 in the Civil War. Improved sanitation reduced disease casualties in World War I, but it could not protect troops from the 1918 influenza pandemic. During the outbreak, flu accounted for roughly half of US military casualties in Europe.
As the Second World War raged in Europe, the US military recognised that infectious disease was as formidable an enemy as any it would meet on the battlefield. So it forged a new partnership with industry and academia to develop vaccines for the troops. Vaccines were attractive to the military for the simple reason that they reduced the overall number of sick days for troops more effectively than most therapeutic measures.
This partnership generated unprecedented levels of innovation that lasted long after the war was over. As industry and academia began to work with the government in new ways to develop vaccines, they discovered that many of the key barriers to progress were not scientific but organisational.
World War II sped the development of flu vaccine
In 1941, fearing another pandemic as it braced for a second world war, the US Army organised a commission to develop the first flu vaccine. The commission was part of a broader network of federally orchestrated vaccine development programmes.
These programmes enlisted top specialists from universities, hospitals, public health labs and private foundations to conduct epidemiological surveys and to prevent diseases of military importance.
Wartime vaccine programmes expanded the scope of the military’s work in vaccines well beyond its traditional focus on dysentery, typhus and syphilis. These new research initiatives targeted influenza, bacterial meningitis, bacterial pneumonia, measles, mumps, neurotropic diseases, tropical diseases and acute respiratory diseases. These diseases posed risks not only to military readiness but also to civilian populations.
These programmes were not a triumph of scientific genius but rather of organisational purpose and efficiency.
Scientists had been laying the groundwork for many of these vaccines, flu included, for years before. It was not until World War II, however, that many basic concepts were plucked from the laboratory and developed into working vaccines.
The newly formed flu commission pulled together knowledge about how to isolate, grow and purify the flu virus and rapidly pushed development forward, devising methods to scale up manufacturing and to evaluate the vaccine for safety and efficacy.
Under the leadership of virologist Thomas Francis Jr, the commission gained FDA approval for its vaccine in less than two years. It was the first licensed flu vaccine in the US. In comparison, it takes eight to fifteen years on average to develop a new vaccine today.
Flu vaccine, as the Army later discovered, required annual tweaking to match circulating strains of the virus, and it still does today. Even so, the timeline from development to use was a remarkable achievement.
Military needs drove vaccine development
Wartime programmes, like the flu commission, developed or improved a total of 10 vaccines for diseases of military significance, some in time to meet the objectives of particular operations.
For instance, botulinum toxoid was mass-produced prior to D-Day in response to (faulty) intelligence that Germany had loaded V-1 bombs with the toxin that causes botulism. Japanese encephalitis vaccine was developed in anticipation of an Allied land invasion of Japan.
Some of these vaccines were crude by today’s standards. In fact, some might not receive broad FDA approval today, but they were effective and timely.
How did these programmes develop so many vaccines, so fast?
Scientists often conducted research at their home institutions, which allowed the military to gain access to valuable expertise and facilities in the civilian sector.
The government used “No loss, no gain” contracts that covered the cost of research and, occasionally, indirect costs, but did not provide a profit. Under normal circumstances, universities would have resisted this technocratic reorganisation of their research agenda, but the threat of war softened opposition.
Manufacturers also began to work on projects with little to no profit potential. Because vaccines were recognised as an essential component of the war effort, participating in their development was seen as a public duty.
With industry as a willing partner, wartime programmes forged a new research format that effectively translated laboratory findings into working products.
At the time, intellectual property protections were less of a barrier to information sharing than they are today. Without these restrictions, teams were able to consolidate and apply existing knowledge at a rapid rate.
Borrowing management techniques from industry, flu commission head Francis and his fellow project directors exercised top-down authority, transferring people, resources and ideas to the most compelling projects.
Project directors also managed development in an integrated fashion, coordinating activities across disciplines and developmental phases so that everyone involved understood the upstream and downstream requirements for vaccine candidates.
Working together for the greater good
This cooperative, duty-driven approach to vaccine development persisted into the postwar era, even after the urgency and structure of wartime programmes dissolved. This contributed to high rates of vaccine innovation through the middle of the 20th century.
Don Metzgar, a virologist who began working in the vaccine industry in the 1960s, explained to me in an interview that “pharmaceutical companies looked at vaccine divisions as a public service, not as huge revenue generators.”
When the military requested limited-use vaccines, such as those for meningococcal meningitis and adenovirus, industry obliged. But a series of legal, economic and political transformations in the 1970s and 80s disrupted this military-industrial partnership. Without industry cooperation, new vaccine development stalled and some existing vaccines were discontinued.
Whether at war or in peace, timely vaccine development is vital. New diseases with pandemic potential occur regularly: SARS in 2003, bird flu in 2005, swine flu in 2009, and Ebola in 2014. Our current vaccine development capabilities are not keeping pace.
Scientific obstacles can be formidable, as our continued struggle to develop vaccines for tuberculosis, malaria and HIV demonstrates. But many vaccines languish in the pipeline for reasons that have nothing to do with science.
Mobilising federal resources on a massive scale, as we did in the 1940s, is not a sustainable solution, but we can still take a page out of the World War II playbook.
In a crisis, such as the West African Ebola outbreak, industry demonstrated that it still has the capacity to partner for the greater good, even when the business case for a particular vaccine is not compelling.
We need to leverage this capacity by reintroducing highly integrated research practices that accelerate the translation of laboratory findings into working vaccines. Let’s not wait for history to teach us that lesson again.
This article was originally published on The Conversation.