Growing up, I watched a lot of shows on TV: cartoons, movies, soap operas, you name it. It didn't matter what the genre was; they all seemed to perpetuate the same societal 'opinions' and norms, and they all spoke about the world we live in from the same perspective. In my opinion, that is not by accident.

But as I grew into an adult, I began to experience and see things in life that differed dramatically from the opinions and ideas the media perpetually puts out. And the more I spoke to other people, the more they told me the same thing. On TV, couples tend to have sex a lot more often than people actually do in real life, and then people - women especially - doubt themselves or think something is wrong with them because they don't have sex as often as what is portrayed of women or couples in the media.

That is just one example; there are plenty of others. So, are you brainwashed by television, or are you able to draw a boundary between what the media puts out and what actually happens in society?