I’ve recently been wondering why films that are fundamentally unintelligent try to teach us things about our lives. Movies like Transformers preach that we should be less judgmental and give others a chance to make an impact (those others happen to be 20-foot robots, but they’re used as a metaphor). What’s weird is that half of the ones we’re supposed to give a chance to are actively trying to kill us. I guess we’re supposed to know instinctively who’s good and who’s evil.

Another example is the new Planet of the Apes movie. Its message: we shouldn’t test Mother Nature, or else orangutans will rip our faces off (I’m hypothesizing; I haven’t seen the movie). My question is: who cares, and who will learn from these warnings? Are there scientists out there just about to give chimpanzees triple-digit IQs who’ll see Rise of the Planet of the Apes and go, “Well, I’ll abandon my life’s work now”? Of course not.

How I Met Your Mother is a sitcom that tries very actively to display the ideal life. The show’s creators seem to genuinely believe that life is supposed to go a certain way, and only then will it be good. You will marry the love of your life, and then you will be happy. Until then you will get drunk, you will sleep around, and you will have fun, but this fun will be fleeting. This fun will be meaningless.

So why are these movies and shows like this? My guess is simple: themes. All movies have to have a theme, otherwise no one will care about them. You can’t just make a film that’s all explosions and have it be good. Well, a lot of these movies and sitcoms that have themes shoehorned into them are shit too.

So why not abandon the didacticism? Just give us completely meaningless explosions and jokes. We do not care about your (usually) terrible advice, Hollywood. Knock it off.
