How awesome were the '60s really, compared to how they're portrayed in the media?
In shows like Mad Men and The Wonder Years and films like X-Men: First Class and Austin Powers, the '60s always look like one of the greatest decades to be alive!
Everyone had a good job with vacation time and great wages, everyone knew their roles and did their jobs well, the girls were sexier, men were men, and kids were always safe outdoors and could take their guns out just for fun to shoot woodland animals. Nobody cared about guns at all, for that matter; you could walk into a gun store and, upon receipt of cash, immediately have a new gun thrust into your hand!
Everybody had sex everywhere, all the time. Nobody had to worry about AIDS, political correctness, or social issues of any kind, all while smoking and drinking openly and freely everywhere, even while driving and at the office.
As someone born in the '90s (your decade upside down), I have to ask:
If you were alive in the '60s, does that represent the average American life? Do you like things better now or then? If you could, would you change the direction America has gone in since that era? Do you feel like we have more or fewer freedoms as a people? Or have we become overly responsible to the point of being a nanny state?