
So this Memorial Day Weekend (which isn't over yet!) has been relaxing so far. With another day off ahead, I have decided to reflect and add a media-related idea! While down in New Jersey for my cousin's graduation party, I was mingling with family, catching up, and whatnot. One of my cousins works for the NYPD up in the Bronx, in narcotics, and he usually has a crazy story or two for us. This got us into a discussion about the television show The Wire. He says it is the closest television portrayal of his job that he has ever seen.

My boyfriend has seen seasons 1-4 (there are 5 altogether), and I have seen none. So this conversation has now launched my own Wire marathon, which started today. I am halfway through season 1. I keep asking my boyfriend, "Is Baltimore really like that?" and I keep having to confirm that I understand what is going on, but so far so good.

This leads me to my question (finally!): are television shows that portray what is really happening better for the American psyche, or should we just stick to Glee? I know I've picked two polar-opposite shows, and I have raised a question that is multifaceted, but if you do narcotics work 40+ hours a week, do you really want to go home and watch a TV show that mimics your life for entertainment?