Dermatology (from Greek derma, "skin") is the branch of medicine concerned with the skin, including its structure, functions, and diseases. A physician who specializes in dermatology is called a dermatologist.