Three-quarters of the war deaths among American troops came not from wounds but from disease.
“It is among soldiers in the field that filth and bad provisions abound. It is among that class of men that dysentery appears with all its hideous forms.” - An American army surgeon during the War of 1812
An American soldier during the War of 1812 was far more likely to die from disease than in battle. In fact, fully three-quarters of the war deaths resulted from disease, most commonly typhoid fever, pneumonia, malaria, measles, typhus, smallpox and diarrhea.
Medical practices were rudimentary and often harmful, in part because the underlying nature of disease was not well understood. The notion that infection could spread through organisms invisible to the naked eye, for instance, did not gain widespread acceptance until the latter half of the nineteenth century.
Even with their limited understanding of the science of germs, however, military doctors appreciated certain risk factors. Most had observed the deleterious effects of poor camp sanitation. Unfortunately, an army surgeon could usually only advise his commander to enact practices thought to reduce disease—separating latrines from cooking areas and water sources, enforcing cleanliness through routine bathing with soap, and restricting alcohol abuse, a major cause of sickness among soldiers.
Military surgeons often resorted to so-called “heroic” treatments, which often seem crude and sometimes barbaric to modern eyes. Bleeding, the deliberate opening of a vein to drain blood from a patient, was thought to reduce blood volume and thereby ease fever and infection. Blistering, the practice of deliberately raising blisters on a patient’s skin, was thought to produce pus that would carry away infection. Other physicians deliberately induced vomiting in an attempt to combat disease. Such practices were seldom helpful and often made the patient’s condition worse.
Among the items found in a surgeon’s medicine chest were opium and alcohol, useful for pain management, and quinine, found to be effective in treating malaria. But many drugs were either unhelpful or, in the case of the mercury used to treat syphilis, quite toxic.
Army medicine also suffered from some basic organizational shortcomings. The War Department was ill prepared when the conflict broke out in 1812. It had no standardized system for accounting for or replenishing medical supplies, or for evaluating the competency and training of its medical staff.
But as the conflict wore on, army medicine improved noticeably. Congress created the post of surgeon general and outlined professional qualifications for selecting surgeons. In addition, Congress attempted to improve cleanliness among soldiers through better camp sanitation, and tried to alleviate hospital overcrowding. Over time, the contents of the surgeon’s medicine chest became standardized, and a better system of hospitals emerged. Permanent hospitals were located well to the rear, away from the fighting, and linked to more mobile “flying hospitals” closer to the front lines.
But in many ways, the most intractable problem remained the scientific unknowns. Solutions to the fundamental puzzles—the nature of disease, how it was transmitted, and how to prevent infection—remained several decades away. More often than not, army doctors found themselves groping in the dark for answers.
Last updated: August 15, 2017