Well, it's come up many times in conversations lately, at least in ones I've been in. So I want to know: what are your views on gender roles? To elaborate, gender roles are the roles that a certain gender is supposed to follow in life — the male supports the family, and so on. Do you think gender roles really have any place in the more liberal world we seem to be heading into?
My personal opinion is that they have no purpose. Gender kind of got left by the roadside many years ago for me. Gender is meaningless to me; I wear girls' clothes because they look cooler and I like them more... so what? Society as a whole seems to look down on people like me for pushing the gender lines and breaking from the gender roles. Your thoughts?