With the recent rise in popularity of sites like OnlyFans and SeekingArrangement, and the whole phenomenon of the Instagram influencer, I have thought a lot about how nuanced selling your body for money really is. People have strong opinions about this topic, and I’d like to delve a bit deeper into it myself.

Of course, using sexual appeal to sell things is a prominent marketing strategy that has proven to be successful. So selling your own body is only a step away from the established norm. Rather than a woman being an object to sell another object, she is selling herself: the object.

Ultimately though, in capitalist societies, the workers are, in a way, selling their bodies. We exchange our time for money – meaning that we’re getting paid for our bodies to be in places. And although the majority of jobs are not sexual in nature, there are certain parallels between the objectification of women in sex work and the objectification of humans in a capitalist society.

An argument I hear a lot when it comes to sex work is that the women doing this work are not doing it of their own accord. Maybe they’ve been sex trafficked. Maybe they need money and feel there is no other option. Whatever the case may be, if women end up in this field because they have to rather than because they want to, then this is a serious issue.

However, I think there are some women who genuinely want to enter this line of work, and I personally don’t see an issue with that. I do think, though, that it contributes to the sexist idea that women’s value comes from their bodies.

At the same time, though, it could be viewed as women working the broken system in their favor. If men have bought into the idea that women are mere commodities and have used it to sell their products, then women have the right to take that money back and put it into their own hands.

If creepy men leer at us for free, then maybe it’s actually feminist to ask them to pay for it.

At its core, I think feminism is about expanding the options women have instead of limiting them. The question, though, is this: does embracing sex work expand women’s list of options in life, or does it simplify women down to one-dimensional beings, sex objects for male gratification?

2 Comments

  1. To be perfectly honest, and to tell you a few things that you won’t find in American history books: as America was headed west, some of the wealthiest people in the Western territories were women, because they owned brothels next to hotels in the quickly established cities built as the railroad expanded. Most of the “cowboys” in the old West were not white John Wayne types; they were freed slaves, Mexicans, and Chinese folks who were building the railroads to the coast. Brothels kept a lot of citizens in towns as the West was being built up, and the women who ran the brothels constructed more buildings and expanded towns with their money. So, in a way, the West was won by rich women.


    1. That’s very interesting. I had no idea that was the case. The fact that brothels are illegal – or that even a group of women living together is not legal – is crazy. It was probably a way to take that power away from women.


