I have talked and talked and talked about this with my friends. Back in the day when I was a young child - well, I'm not that old lol. That sounded bad. But when I was younger there was an abundance of fictional TV shows: dramas, mysteries, comedies, thrillers. All sorts of fictional shows, and a lot of them were good. But in this new era of television it seems as if reality TV has taken over. Wherever you turn, there's a reality TV show on, and washed-up celebs are getting their own reality shows lol. Reality TV seems to be like Marmite for many people. They either love it or they hate it. So, in your opinion, has reality TV ruined TV?