I think it's safe to say that the media plays a big part in how we view the world, each other, and ourselves. A lot of people base their entire lives around how the news portrays the world, fearful of the never-ending dangers and supposed chaos. In my opinion, the media has firmly planted itself into our subconscious, telling us to buy stuff we don't need, live a certain way, and hate needlessly. These ideologies (along with many others) are what bring about ignorance, ill health, and just general negativity. Maybe I'm just being a pessimist. What do you think?