As a woman, I have grown up in a society that has convinced me to believe that looks are everything. There are movies, there's the media, there's everything in between. Growing up, I was given Barbie dolls to groom and fake makeup to play with. I loved it because it was what I saw on TV. This brings me to the question: when did I become so high maintenance? When did I start believing that the way I looked was more important than what came out of my mouth? I cannot fathom the idea of having to appreciate compliments based solely on my appearance. What about my brain, my thoughts? My drive to succeed? My work ethic? What about how I love, how much I care about those around me, my optimism? My intelligence? When did something as simple as what I was born with become the thing I should be most proud of? I narrowed it down, did a little research on the Huffington Post, and it all made sense.
Body hair: Women are mocked for daring to bare it. From the 16th to the 19th century, women didn't shave at all. IT WAS ALL NATURAL. But advertisements have warned women to shave because not being as smooth as a baby is considered unsightly and disgusting. Shaving has been made an expectation because exposing the body's natural hair is seen as unfeminine. Hair-removal brands such as Veet have even come up with campaigns like "don't risk dudeness."
Skin color: For a long time, lighter skin was seen as more beautiful. Black women began bleaching their skin because society made them believe it would let them occupy higher positions socially and commercially, and that they would marry better men. Ads portrayed lighter skin as lovelier skin. BUT then sunbathing became popular. It started off as something doctors prescribed for illnesses, but then beauty companies began selling suntan lotions. LIKE, so you're saying being darker is more attractive now? Since when is a skin color a fad that comes and goes?
Cellulite: Back in the 1830s, larger women were considered very attractive; painters celebrated curves in their work, cellulite and all. After a period of time, cellulite was demonized, and it is certainly still looked down upon. If you don't have it, GREAT! But if you do, that should be okay too.
Manicured nails: Since when do unmanicured nails have to be considered unsightly? Why does the phrase "you can tell a lot about a woman by the way she keeps her nails" even exist? Like, I don't have time to worry about my nails every week; I barely even have time to comb my hair. Some women simply don't care about things like that, and it DOES NOT make them less of a woman. It all began in the 1920s, when manicured nails became a way to display wealth and elegance and to prove that you were above the working class.
You see, guys, I could go on; there are so many things women have grown up expected to care about. This post is not meant to say that women shouldn't care about their appearance. I am solely pointing out that appearance should not define us, and that not caring about these things should not make us any less of a woman. I am more than my appearance, and I want to be seen as something more than a pleasure for the eyes.