History of women in healthcare
Historically, medicine has been a male-dominated profession, but over the last century the number of women pursuing medical careers has risen significantly, and within the next decade women doctors are expected to outnumber male doctors for the first time. Until the late 1800s, all doctors working in the UK were male; however, after campaigns and high-profile cases such as that of Dr Elizabeth Garrett-Anderson, an act allowing women to train as doctors was finally passed in 1876. Early women doctors were often ridiculed by their male counterparts, and many found this added hostility very difficult to cope with.