The term 'feminism' sometimes carries a negative connotation. After taking 'Theories of Feminism' last semester, I've realized that feminism is not what society sometimes makes it out to be. Feminism is clearly and simply the advocacy of equal rights regardless of one's gender. It should be associated only with positive and hopeful emotions, because that is how society will evolve. I've made it a goal of mine to educate people who have the wrong view of feminism and explain to them what it really is. If we all educate each other, we will all learn and become better people.
At the beginning of the semester, our professor actually asked this same question, and I think my viewpoint on feminism has changed a bit since then. I think feminists get a really bad reputation. Granted, even at the beginning I didn't think negatively about feminism. I support women's empowerment and a woman's ability to pursue her career choices without discrimination. I also agree with Chelsey that it is subjective; it's something that people can interpret in their own way. To me, feminism helps fight against sexism and the typical gender norms that women are expected to uphold.
After reading this article, I've gained a different understanding of feminism. I'm learning that feminism is extremely complex, and every time I think I understand it, I realize I don't. Feminism has many different factors that make it what it is. Ultimately, I think feminism is subjective; it is whatever a person feels it is to them. To me, feminism is a woman having the freedom to be who she wants to be without judgment. I consider myself a feminist, and although my beliefs are very different from those of other feminists, I don't think that makes me any less of a woman or any less of a feminist.