Women in Public Health and Medicine

Women have always been central to the history of health and medicine. As doctors, nurses, midwives, activists, and public health experts, they have worked to heal patients, study diseases, and improve access to health care.

We honor the service of all health care workers. Here are just a few of the women who have shaped American health history and places associated with them.

More Women of Public Health and Medicine



    Last updated: July 6, 2021
