Medical Myths About Gender Roles Go Back to Ancient Greece. Women Are Still Paying the Price Today

We are taught that medicine is the art of solving our body’s mysteries. And we expect medicine, as a science, to uphold the principles of evidence and impartiality. We want our doctors to listen to us and care for us as people. But we also need their assessments of our pain and fevers, aches and exhaustion, to be free of any prejudice about who we are. We expect, and deserve, fair and ethical treatment regardless of our gender or the color of our skin. But here things get complicated.
