I've been told I've been "brainwashed" into my beliefs by my father or a significant other. Women have claimed I'm "anti-women" because I am pro-life. Girls in class have told me that I must hate myself for not being a feminist. Even professors have tried to tell me I don't understand my own gender or women's history in the United States. They've said there is a patriarchy that still needs to be fought, and that I'm simply blind to it.