I hate romantic comedies and romance movies. They undermine everything feminism fights for, and they are targeted specifically toward women. The way women are portrayed makes it seem like we need a man, and marriage, for our lives to be complete. Why is it so hard to believe that a woman can be successful, independent, and happy without a man?