Gender and Life

Are women instinctively more nurturing?
Answered by Bambi Turner
Throughout history, women have traditionally taken the reins on child-care duties, while men have relied on their superior physical strength to hunt or simply to earn a living to support their families. Despite technological innovations that allow women to perform many of the same jobs as men, these traditional gender roles more or less persist. Many attribute this to women's natural nurturing abilities, or to some sort of maternal instinct. Surveys repeatedly show that the average person believes women are more nurturing than men, but no scientific evidence exists to support a biological or genetic explanation for feminine gender roles [source: Hollingworth].

In fact, researchers have found enough variation in gender roles across different cultures to suggest that the idea that women are naturally more nurturing than men is largely a social construct [source: University of Nebraska-Lincoln]. Some sociologists go so far as to describe this type of gender stereotype as a means of perpetuating the male-dominated status quo. Presenting nurturing as an innate, biological trait creates a "legitimizing myth" in which women believe they are destined to serve as caregivers because of their genes [source: Cole and Jayaratne]. After all, you can't fight nature, right?

A University of Michigan study lends a great deal of support to this theory. Researchers asked men and women why society often views women as the more nurturing of the two genders. Male participants were much more likely than female participants to attribute this difference in nurturing abilities to genetics. In the same study, however, men and women were about equally likely to attribute traits such as math skills or violent behavior to genetics [source: Cole and Jayaratne].

The authors of this study suggest that men respond the way they do because they have more to lose. If women are naturally predestined to serve as nurturers, men are not only relieved of much of the child-rearing burden, but also face less competition in the workplace. Removing the biological ties binding women to nurturer roles could turn traditional gender roles inside out, leaving both men and women facing equal opportunities and responsibilities in child-rearing and the workplace.

Of course, the lack of genetic evidence doesn't change the fact that women are typically seen as the more nurturing of the sexes. Many social science theories support the idea that women are more nurturing, caring and affectionate than men. These theories simply attribute those qualities to social conditioning rather than biology.

Still Curious?
  • Are some men really still like cavemen?
    Answered by Curiosity
  • What's the difference between gender and sex?
    Answered by Discovery Fit & Health
  • What are the advantages of changing biological sex?
    Answered by Planet Green
