Wednesday 14 November 2012

Rape Culture Defined

A rape culture is defined as a culture "which is a complex set of beliefs that encourages male sexual aggression and supports violence against women… [and] condones physical and emotional terrorism against women as the norm" (Brendan, 2012).

Every time someone uses the word "rape" in a derogatory way, or casually in day-to-day conversation, they are feeding into rape culture and making it stronger.
