Soooooo, we all know vampires took over the TV scene following the
incredibly popular (perhaps overrated, but I digress) movie, Twilight. But did
you know that zombies have taken over!? Surpassing its vampire counterparts "True Blood" and "The Vampire Diaries", "The Walking Dead" is rated the
number one supernatural TV series in America.
The show premiered its first season in 2010 with an impressive 5.35 million viewers, and went on to draw a whopping 16.11 million for its sixth season! *applause*
The Walking Dead is an American horror drama television
series developed by Frank Darabont. It’s based on Robert Kirkman’s comic book
series of the same name, but quickly moves past its familiar premise, telling the story of what
happened after the apocalypse, and the struggle to remain human after society’s
collapse.
The gritty drama explores the onset of the undead apocalypse through the eyes
of a sheriff, Rick Grimes, who awakens from a months-long coma. As he stumbles
out of the hospital, he discovers that society has fallen and his family is
missing. In search of his wife and son, he bands together with a group of
other survivors in monster-ridden, rural Georgia.
With this diverse crew of
rednecks and city slickers figuring out how, or whether, to work together, they
tackle the impossible mission of surviving and adapting to a world infested by
zombies.
At least, we’re told it’s impossible…
Now, you may have noticed that the zombies are portrayed in a vastly different light in this show compared to "Warm Bodies". This is not an unusual occurrence. Pop culture often twists and molds particular things to fit its aims. However, I think we can all agree that an army of the undead would most definitely not be desirable, nor particularly friendly.
Keep an eye out for our next post where we FINALLY begin to
share with you our top tips for surviving a zombie apocalypse!