Lately, it seems like TV shows (especially sci-fi shows, or maybe that's all I watch) have been promoting the idea that violence is good when used against the proper foe, or at the very least that non-violence isn't necessarily a good thing.
The most obvious example I can think of is the show V. Here, visitors from another planet have come in peace. Of course, they're actually lizards and will probably want to start eating humans at some point, but only a handful of people on the show realize that. Everyone else is copacetic with the aliens visiting Earth (rather unrealistic, in my opinion) and wants to peacefully accept them. The point of the show seems to be:
- People who preach peace and non-violence are not to be trusted, and
- A non-military response to outside threats is a bad idea.
Another TV show I noticed this on was Alice on Syfy (why didn't they spell it Alyc?). The Queen of Hearts lives in a casino and feeds off the emotions of the real-world people whom she kidnaps and forces to gamble. Gah, what a plot. In any case, the Queen only wants happy, pleasing emotions; darker ones like anger and fear are verboten. So what does Alice do to make the Queen's kingdom crumble? If you answered "shoot up the casino and threaten everyone's lives until they snap out of it," then you probably watched the show.
Thus, the theme of Alice seems to be that negative emotions are actually good, and that doing things like firing randomly into a crowd isn't really such a bad idea. In the end, the violence in Alice seemed to have only positive consequences, and the rebellion to overthrow the Queen (for no reason I can really see, to be honest) is portrayed as heroic.
Do you agree that violence is being presented in a more positive light on TV these days, or am I just a psycho? Wait, maybe don't answer that last part.