Women and Men Should Be Equal in Society
December 18, 2015
Feminism is a word that still has a negative connotation for some people. They believe feminists are man-hating women who want to be superior to men.
That’s not true. Feminism is support for the idea that women should be equal to men in all aspects of society, whether economic, social, or familial.
Feminism matters now more than ever. We need more highly educated women contributing to society, because the world constantly needs leaders who can make it a better place. To achieve this, men, not just women, must advocate for women leaders.
We need women to fill positions once held only by men so that everyone can contribute their knowledge. Women are joining previously male-dominated career fields, yet they are still not taken seriously. Imagine the women doctors who will find cures for diseases, the women lawyers who will defend the innocent and keep criminals behind bars, and the women scientists who will make discoveries to feed our insatiable hunger for knowledge. To make this possible, families must encourage their daughters to work hard and go to college.
We also need strong women who can raise and nurture new generations of potentially amazing people. Promoting education and independence among women can make our world and our future far brighter. This is why feminism is important.
We need girls to see that they can compete in every aspect of society, because women, too, have voices and minds that can lead to great things.
Once we get past the unfortunate domestic roles society has imposed on our gender, we can start seeing each other as equals. When women stand alongside men at the same level of importance, life will be fairer for everyone. That way, humanity can address oppression one issue at a time and give women the rights and respect they deserve.