A History of Malaria Chemoprophylaxis
Among the many dreaded fevers that plague man, “marsh fever” was distinguished by its periodicity, enlarged spleen, and pattern of attacking those exposed to wet, warm, boggy areas. In the mid-1700s, it became broadly known by its Italian name, mal’aria (“bad air”), referring to bad or foul air emanating from swampy lands. The discovery and development of drugs to treat and prevent malaria have been driven for centuries by the desire and need to protect travelers, military personnel, explorers, and the commercial interests of imperial powers when they went into the malaria-endemic tropics.
Historical Milestones in Antimalarial Chemoprophylaxis
- 1620s: Jesuit missionaries living in Peru learned of the healing power of powdered bark, often known as “Jesuits’ bark,” from the cinchona trees growing in the high forests of Peru and Bolivia.
- 1820: French chemists isolated quinine, the most active compound in cinchona bark, which allowed for quinine’s expanded availability and use.
- 1854: The Scottish surgeon William Balfour Baikie gave 6–8 grains (1 grain = 65 mg) of quinine, half in the morning and half in the evening, dissolved in sherry, to all the ship’s crew during a 118-day expedition up the river Benue in modern-day Nigeria. No men died. This unprecedented accomplishment gradually led to acceptance that malaria could be prevented by chemoprophylaxis.
- 1861: Quinine first saw widespread use as a prophylactic agent in the American Civil War when both the Union and Confederate Armies, plagued with malaria, used massive quantities to prevent the disease.
- 1880: A French military physician working in Algeria, Alphonse Laveran, discovered parasites in the blood of a French soldier. The identification of the infectious agent and its life cycle was an essential step in opening the way to the development of effective prophylactic agents.
- 1914–1918: During World War I, militaries on both sides, the Allies and the Central Powers, suffered terribly from malaria. German armies were denied access to quinine, as most was held by a monopoly of Dutch growers on the island of Java.
- 1920s: German chemists pursued a synthetic route to new antimalarial drugs to circumvent the Dutch monopoly, achieving spectacular success with the introduction of pamaquine and mepacrine during the 1930s.
- 1941: With America’s entry into World War II, trade between the United States and Germany ceased, and mepacrine was no longer available to the Allies.
- 1942: Japanese armies overran the Dutch cinchona plantations on Java, cutting off the supply of quinine and leaving the Allies with no source for antimalarial drugs.
- 1943: American scientists quickly devised a manufacturing process for mepacrine, and the drug was given as a treatment and as a prophylactic under the name Atabrine. Meanwhile, the Allies, especially the Americans, launched the largest antimalarial drug discovery and development program the world had seen. By 1945, several new antimalarial drugs had been introduced, including chloroquine and proguanil. Chloroquine went on to assume a role both in prophylaxis for travelers and as a treatment drug for endemic areas.
- 1959: Chloroquine-resistant Plasmodium falciparum malaria was first reported.
- 1965: Large-scale American military involvement began in South Vietnam, reaching almost 200,000 soldiers by the end of 1965. Chloroquine-resistant malaria caused illness and death in US forces.
- 1967: A second massive US government–sponsored antimalarial drug discovery effort centered at the Walter Reed Army Institute of Research began and led to the discovery of mefloquine.
- 1971: During studies of volunteers with experimentally induced malaria infections, it was observed that tetracycline, administered to treat intercurrent bacterial infections, appeared to exert blood schizontocidal activity against chloroquine-resistant P. falciparum. A few additional studies were performed with tetracycline, doxycycline, and minocycline, but no formal development of the drug as an antimalarial was pursued.
- 1977: Although it was not yet available in the United States, CDC recommended sulfadoxine-pyrimethamine as an alternative for primary prophylaxis.
- Early 1980s: The Wellcome Research Laboratories developed atovaquone as an antimalarial drug. Clinical trials in uncomplicated P. falciparum malaria as monotherapy were disappointing, with early treatment failures due to the emergence of atovaquone-resistant parasites. However, combining atovaquone with proguanil led to an efficacious combination therapy.
- 1982: The fixed combination of pyrimethamine and sulfadoxine (Fansidar) became available in the United States. CDC recommended a combined regimen of chloroquine and Fansidar to prevent chloroquine-resistant P. falciparum.
- 1985: CDC added cautionary language to the recommendation for pyrimethamine-sulfadoxine after its use as weekly malaria chemoprophylaxis resulted in fatal cases of Stevens-Johnson syndrome. CDC added doxycycline as an alternative regimen for antimalarial prophylaxis.
- 1986: CDC removed amodiaquine as a recommended alternative for malaria prevention.
- Late 1980s: The US Army conducted several field and human challenge clinical trials demonstrating the efficacy of doxycycline as malaria prophylaxis.
- 1987: CDC added chloroquine-proguanil as an alternative regimen for malaria prevention.
- 1988: Although it was not yet available in the United States, CDC recommended mefloquine as an alternative regimen for primary malaria prophylaxis.
- 1989: The Food and Drug Administration (FDA) approved mefloquine (Lariam).
- 1990: Pyrimethamine-sulfadoxine was removed as a recommended alternative for primary chemoprophylaxis. Mefloquine was recommended as the drug of choice to prevent malaria in areas with chloroquine-resistant P. falciparum.
- 1992: Pfizer, at the request of FDA, submitted a supplemental new drug application for doxycycline for malaria chemoprophylaxis.
- 2000: The fixed-dose combination of atovaquone-proguanil (Malarone) was approved by FDA.
- 2001–2002: CDC added atovaquone-proguanil as an alternative regimen for malaria chemoprophylaxis and removed chloroquine-proguanil as a recommended alternative.
- 2003–2004: CDC listed atovaquone-proguanil, doxycycline, and mefloquine as equally recommended first-line options to prevent malaria. A recommendation for primary prophylaxis with primaquine in “special situations” was also added.
- 2010: CDC added that primary prophylaxis with primaquine was recommended for areas with mainly P. vivax.
- 2014: CDC added recommendations for carrying a “reliable supply.”
The 3 largest antimalarial drug development efforts of modern times occurred during and after major conflicts of the 20th century—World War I, World War II, and the Vietnam War. The modern history of developing drugs to prevent malaria grew almost entirely from the need to protect military personnel and keep them healthy to conduct combat operations in malaria-endemic areas. Those massive government-funded efforts produced the drugs now used to keep modern civilian travelers safe from the sickness and death that mark malaria’s march through time.
- Greenwood D. Conflicts of interest: the genesis of synthetic antimalarial agents in peace and war. J Antimicrob Chemother. 1995 Nov;36(5):857–72. [PMID:8626269]
- Kitchen LW, Vaughn DW, Skillman DR. Role of US military research programs in the development of US Food and Drug Administration-approved antimalarial drugs. Clin Infect Dis. 2006 Jul 1;43(1):67–71. [PMID:16758420]
- Smith DC. Quinine and fever: The development of the effective dosage. J Hist Med Allied Sci. 1976 Jul;31(3):343–67. [PMID:780420]
- Smith DC, Sanford LB. Laveran’s germ: the reception and use of a medical discovery. Am J Trop Med Hyg. 1985 Jan;34(1):2–20. [PMID:2578751]
- Sweeney AW. Wartime research on malaria chemotherapy. Parassitologia. 2000 Jun;42(1-2):33–45. [PMID:11234330]
Paul M. Arguin, Alan J. Magill