Big Daddies roam the universe of Bioshock (2K).
Up, LB, X, LB, LT, RT. In the eyes of a snobbish moviegoer, this combination of letters would be met with a sneer. To those of you who have not been inducted into the world of videogames, it's really, really confusing. Allow me to explain. The above combination might represent a central character reloading a gun, taking cover and then launching from their position to fire at an enemy. In a film such as Die Hard (John McTiernan, 1988), such an action is taken for granted. But in the world of, for example, the recent Splinter Cell: Conviction, there's much more at stake. Die Hard, as brilliant and revolutionary as it is, is still a piece of popcorn entertainment. Playing Splinter Cell is an incredibly tense and atmospheric experience: the shadow-soaked environments cloak your hero, equipped with just a pistol, from the MP5-wielding goons just a metre away. You're actually there, in the moment, with your life on the line.

How can a film match that? Well, easily. For one, films aren't made of pixels. They allow you to invest in a world that's instantly recognizable, and filmmakers can build tension through performance, editing and score. Consider last year's State Of Play (Kevin MacDonald). Tubby journalist Cal McAffrey (Russell Crowe) is investigating the apartment of a suspected killer. He believes the suspect not to be home, only to come face-to-face with the stone-eyed assassin. I can still remember the shiver that darted down my spine. And of course, genres such as comedy, documentary and drama would be hard, albeit pointless, to create on a gaming console. I'm of course initiating an old debate: videogames vs films. Which one is superior? Well, let's find out...
First, let's create an even debating field. Much of the argument employed by moviegoers is that videogames can't be art. And while it's true that Ninja Blade, Afro Samurai and Section 8 are nothing more than linear, button-bashing actioners, it's difficult to discern the difference between them and Transformers (Michael Bay, 2007), Quantum Of Solace (Marc Forster, 2008) or 2012 (Roland Emmerich, 2009). Videogames and films can both be trashy entertainment, and they can both be art. There are enough examples of cinematic art littered throughout this blog (8½, Federico Fellini, 1963, for example), but gaming's examples include Mass Effect, Fable II and the above-pictured Bioshock, which create unique, expansive, living worlds that can be explored for hours on end. And the level of design a videogame undergoes, plus the fact that its world is built from scratch, normally means it looks stunning:
The beautiful world of Fable II (Lionhead Studios).
Cinema is a 122-year-old art form which has benefitted from being a worldwide phenomenon since its birth. Filmmakers from all over the world have innovated their craft over the past century, and there has always been an audience to embrace it, whether it be the popcorn or arthouse crowd. Based solely on the facts that every genre can be explored through film, that both visual and aural techniques are employed in its creation and that, no matter the language, its messages are universal, cinema should be the outright winner. But we now live in a different world to that of the Cathode Ray Tube Amusement Device, the first acknowledged electronic game, designed in 1947 by Thomas T. Goldsmith, Jr. and Estle Ray Mann. We now live in a world where never-before-seen universes are at the fingertips of technicians and imagineers. If you can think of it, they can make it.

The problem is that, despite this, a majority of games feel incredibly retrograde. It's rare that a product feels totally original or captivates in the way a film can. Perhaps this is why gaming hasn't yet reached the level of recognition that cinema has. Bioshock is perhaps one of the most absorbing, intelligent and important sci-fi works of the past decade, but it will be looked down upon by bookworms and movie snobs alike. Why? It's very simple: because games are now inextricably linked to the fear-mongering of the national press. Whenever someone is assaulted on the street, it's the warped, morally corrupting influence of videogames that did it. Of course, they target movies too (http://e-filmblog.blogspot.com/2010/05/panic-on-streets-of-london.html), but over the years GTA and Manhunt have picked up a lot of tabloid heat. Unfairly so, too, because as with movies, no artistic work can force a person to do what isn't already in their nature. A videogame allows you to live out a fantasy or do something that isn't possible in real life - if anything, therefore, reducing angry impulses.
Of course, that isn't to say that people's fantasies should be to roam the streets gunning down thousands of civilians. But games are just a way for people to kick back and forget about the worries of the day - to be silly in a made-up world that clearly isn't promoting violent activity. The games themselves don't even promote the killing of civilians. A game like GTA IV has an actual story with characters that develop - that is the point of the game. But the option is there to go crazy with a machine gun on a busy street, because the game is realistic. In today's world, that's an all too scary possibility in reality too. I don't think anyone needs to be reminded of the tragic Columbine High School Massacre...
The main difference between cinema and videogames is obviously control. A film is a pre-packaged product, designed to entertain or provoke for however long - traditionally around 120 minutes. A game comes in a linear format too (despite the ability to free-roam, games still have a very strict structure), but there is also the ability to divert: to drive around the city, get a burger, go to the gym, fly a helicopter and so on. This can be fun and certainly adds value to an incredibly expensive price tag (a cinema ticket costs £7; a videogame costs £40). But not every game is like GTA or Fable. Take a look at a long-established gaming franchise - Tomb Raider. The average Tomb Raider game (nowadays) will take 6-8 hours to complete on a medium difficulty setting - £5 per hour in the best-case scenario. The latest Splinter Cell (Conviction - twice as cinematic, half as stealthy) takes around 5-6 hours, working out at around £7 an hour for a disappointing product. And these games are mission-orientated in a contained world, i.e. not free-roam. You start at point A and solve the puzzles / kill the bad guys until you get to point B. Sure, no gaming experience will be exactly the same; you may go about a mission in a slightly different way to your friend. But let's think in terms of value for a second. With Tomb Raider: Underworld, £7 buys you around 75 minutes of on-rails linear action (pushing blocks and shooting enemies) with a feeling of familiarity. For Lara Croft: Tomb Raider (Simon West, 2001), that same £7 buys you a slick, sexy, pumped-up blockbuster on a big screen with surround sound, which you can discuss with your mates afterwards.

The thing is, the only reason this argument has become more prominent in the last five years is that gaming has become more cinematic. Think of Call Of Duty 4: Modern Warfare - the opening level on an enemy ship sees you running, trying to escape as the ship sinks.
It sways from side to side, water pipes burst and explosions go off all around you. Think of the game's mid-way twist and the epic countdown finale. It owes everything to cinema. Yet it still doesn't offer the same immersive experience. Why? Because of loading screens, health bars, glitches, dying and restarting the level. No matter how far videogames come (and however closely they model themselves after cinema), they still retain the basics of an old Sega Megadrive game - point A to point B. The difference is that we've jumped from side-scrolling pixels to HD widescreen - and, of course, a living, breathing world. This is also why most films adapted from videogames fail: the source material has no structure or rhythm.
As I said earlier, games can be artistic. They can be exciting and engaging. They can be jaw-dropping and addictive. But in the war between movies and videogames? They just don't cut it... yet.