Not really a question, but an observation: I think reality shows are anti-women and against feminism in their cruel and often superficial depiction of women. Many would argue that men are exploited too, but men are not the ones who have had to fight for the right to vote, hold down a job, be taken seriously, or run for Congress... I would really love to see reality TV just disappear. I am sick of seeing women being degraded and dehumanized on a weekly basis. Men can be dehumanized and exploited till the end of time for all I care, but stop degrading the far superior female species!!! Feel free to comment on this.
boulderv6 · 2006-10-17T08:21:39Z
Favorite Answer
I would love to see them disappear, but for a different reason: they just suck! I also think that your attitude is causing more harm to the feminist movement than any reality show, in the same way that environmental alarmists are doing damage to the environmental movement.
The women know what they're getting into when they go on these shows. That being said, I agree that certain reality shows do exploit women (and men, to a lesser extent). Fear Factor definitely comes to mind: they always seem (or seemed, if it's off the air) to cast beautiful, big-breasted women and then put them into some kind of stunt where they have to wear a bathing suit and jump into ice-cold water (I'm not complaining, just commenting on what they do/did).
Other shows don't exploit anyone at all, as far as I can see. The Amazing Race, for example, has cast beautiful women (and male-model men) in more than one season, but it doesn't do anything to exploit them.
On a different note, are you really that bitter, or were you just trying to get responses?
There are some reality shows that degrade women, I agree. There are some that don't; my favorite, Project Runway, is a good example.
Reality shows aren't going anywhere any time soon. The networks save so much money by not hiring actors or signing long-term contracts that they'll keep reality TV going as long as the ratings are up. Gone are the days of the $1-million-per-episode sitcom.
Sorry to disagree with you, but the people who go on reality shows know what they are getting into, so the women you are so concerned about are not being taken advantage of; they are doing it to themselves. There are too many reality shows on TV for them not to have a clue as to how low the producers of such shows will go to get ratings. I personally think all reality shows should go away, and I think they degrade everyone who watches them, but that is the bad choice those contestants made, and it is up to them. I do wish there were not so many reality shows on, making me believe the entertainment industry thinks the whole world only wants to watch something so mindless.

And just a short comment to you personally about your lack of concern for anyone who is not female: like other minority groups, women have had many of the rights you are speaking about for decades. Yes, there is still work to do in the area of equal rights, but it is nowhere near the struggle you are describing. Get over your feminist rhetoric and work with others to make changes, rather than sitting on the sidelines griping about it.
Reality TV in general sucks. Sorry ya feel that way about all men, though. Kinda makes me wonder what happened to ya, but that is of course your business.
I'm sure that shows like Survivor encourage the women on the show to dress scantily, but in the end it is the women who do it to themselves. They don't have to go running around in string bikinis and use sexuality as a weapon, yet there have been a few who have done that. Not all, but some.