In the last week, I've spent far more hours than I expected firing arrows and using gadgets to overcome humongous creatures. I've explored dark caves and a vast open world through lush forests and towering mountains. I've stopped and enjoyed gorgeous vistas, marveled at detailed inclement weather, and slowly learned how to survive in a video game version of the wild. Wild... where have I heard that before?
Wait! Not that! Is that what you thought I was talking about? I mean, that one is great, but I've hardly played it since I've been sinking all my free time (and then some) into this:
I really do like the new Zelda, and I am excited to put some more time into it, but I have just been completely enamored with Horizon: Zero Dawn. It would be easy for me to go on and on about why, but given how much has already been written about it, I don't need to toss another positive review on the pile.
What does seem worth writing about is an odd discrepancy. The recent Game Informer brings this easily into view; in the magazine, The Legend of Zelda: Breath of the Wild received a perfect score of 10, and Horizon: Zero Dawn received a mostly glowing review of 8.75. Such a mild discrepancy likely doesn't mean much at face value, right? But let's follow through a bit. Few games have ever received a perfect score, much less from several review sites. Perfect scores stick in our heads and help elevate the perception of a game whether we ever play it or not. Much like corrected news headlines, if folks determine later that they were exaggerating or even wrong, it isn't the correction that sticks but the initial news (or review).
I'm in no way implying Breath of the Wild doesn't deserve the accolades it has received and continues to receive. I haven't finished it (or Horizon, for that matter), but I have played enough to agree with the praise. Both are excellent games in my opinion! My frustration is in the arbitrary notion that, quantitatively speaking, a publication's critics somehow determined that one of these is approximately 12.5% better than the other.
How should an actual review score be tabulated? Should a narrative-focused game be reviewed differently? Can Dear Esther be measured against Mario Kart 8? And The Witcher III is one of my favorite games of all time, but it is almost more of an interactive visual novel, so how do I assign it a number to compete with another of my favorites, the shmup Gate of Thunder? If it is by cover art, my favorite shmup just lost.
What about a multiplayer-centric game? Is Overwatch two points better a game than, say, League of Legends?
What about puzzles? Is Candy Crush Saga a 20% lesser game than Columns? And should games built on monetization and "pay-to-play" mechanics be reviewed under different criteria? Should "gated" content and patches or updates be reviewed separately?
Should the age of a game factor into scores? Can we objectively compare Space Invaders to Final Fantasy VI to God of War to World of Warcraft? I guess that would depend on which version of Pong we're comparing.
Pretty much every game has a number attached to it, a review score from certain trusted sources. We take it for granted without really considering how absurd that is. We borrow the same flawed model from other media; Pink Floyd's album Dark Side of the Moon, the movie Casablanca, and the Lord of the Rings novels all have numerical or letter ratings attached to them as well. It doesn't make any more sense in those mediums. I can't really use a one-to-ten scale to compare Saving Private Ryan against Guardians of the Galaxy. There are plenty of worthwhile things to say about each, but nothing that universally equates to a magically applicable "final score."
Most of us who use video game review scores do so as shorthand to determine if a game is worth our time and attention, and that seems pretty innocuous. Like any form of prejudice, it is considered a great time-saver. The obvious problem, as with any prejudice, is explained in the word itself: pre-judging. In this case we are pre-judging a game based on someone else's opinion, experience, mood, expectation, and countless other factors. These may or may not be relevant to an accurate critique, but they undoubtedly influence whatever arbitrary number is used in a review score.
This is bad for the medium as a whole and shows how immature we are about it. How does any review score make sense in a larger context? I do find I like the color green more than red by at least 20%. Therefore, red is obviously inferior. In this odd, commercially competitive system, there is a perceived need to show "mine is better than yours," be it opinions, perspectives, or media preferences.
The problems with this perceived meritocracy are manifold, starting with how it often doesn't work. Gamers like myself scour fan sites looking for "hidden gems," games we would want to play but are unaware of, because they can be lost in the deluge of popular (but not inherently better) games. The "best," particularly the best for an individual, doesn't necessarily rise to the top of the crowd. Not to mention that some games are released in specific regions and not in others. Or how different costs and a game's length factor into buying decisions. Or how a critic publication's reliability and financial (or other) incentives may be in question.
And lest we forget, there is more at stake than cultural relevance or just finding a good game. Salaries, bonuses, even jobs have been cut or lost over Metacritic scores. Some rather cruel individuals have purposefully tanked review numbers and skewed averages over everything from political statements to grievances with game producers. I once read a review that gave a zero out of 100 to a game because the reviewer's game console broke, completely unrelated to the game itself. Even if every other review gave the game 100, it doesn't take a statistician to understand the problem this causes.
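A quick sketch makes the damage concrete. Using entirely hypothetical numbers (these are not real review scores for any game), here is how a single spite-driven zero drags down an aggregate mean:

```python
# Hypothetical review scores: nine glowing reviews and one unrelated zero.
scores = [100, 98, 95, 95, 92, 91, 90, 90, 90, 0]

# The simple mean, which score aggregators typically report.
mean = sum(scores) / len(scores)
print(mean)  # 84.1 -- one outlier pulls "universal acclaim" into the mid-80s

# Drop the single lowest score and the picture changes dramatically.
trimmed = sorted(scores)[1:]
print(round(sum(trimmed) / len(trimmed), 1))  # 93.4
```

Nothing about the game changed between those two numbers; only one reviewer's broken console did. A threshold-based buyer (say, "9.0 or above only") would skip the first game and buy the second, even though they are the same game.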
I definitely agree that it is valuable to critique and review video games. This stuff is expensive! I also have a strong appreciation for the greater cultural capital of our medium. We need healthy critique. But to critique accurately is to do more than contrast, compare, and analyze. In games, there is a need to review sublime gameplay, or that elusive and subjective "fun factor," or even artistic intent in design and narrative. The range of things to consider is even more varied than the number of genres to which each title belongs.
The best critique of books, music, and art doesn't seek to plaster a number on it. There are technical and skill considerations, of course. There are differences to appreciate. There is depth to mine or loftiness for the sake of enjoyment. There are creators communicating, and there is fun for fun's sake. Sometimes there is a wonderful combination.
I'm not pushing for a lack of reviews, but lamenting how available reviews are often lacking. We should be pursuing a better system: a discussion of why a video game is worth our attention. It doesn't necessarily take a professional; I don't need to read a food critic to know I enjoy Arby's. However, if I'm in the mood to look up a new restaurant for an expensive steak, I'm personally going to research past the online ratings and find out what folks are saying about the actual food. Some of those five-star ratings are because of an attractive waitress, and some of those one-star ratings are because someone had a bad break-up and decided to go out to dinner. Neither tells me whether that restaurant's steak tastes like shoe leather or is routinely so undercooked it could graze on my salad. I'm looking for a review that describes the important elements and experience of spending money on food there. If I read a game review, I want to know more than "good graphics, 9/10, average gameplay, 7/10." Whatever that means. (Don't get me started on how a 5/10 isn't average but poor, and somehow a 7/10 seems just passing.)
The problem with quantitative metrics in video game reviews is summed up by a good friend and how he goes about all of this. He reasons that his spare time is limited and that there are countless video games to cull through in order to determine what he wants to play. So he usually only buys games that have scored a 9 or above on aggregators like Metacritic. In his thinking, that means he only spends his time playing the "best of the best."
It would make sense, generally speaking. Except here's a short list of games he would have missed by going with this method: Halo Wars and Halo Wars 2, Axiom Verge, Shovel Knight, Super Meat Boy, Plants Vs. Zombies and Plants Vs. Zombies: Garden Warfare, Mario Kart 8, and TowerFall: Ascension. As you may have guessed, I didn't pick those games at random; these are some of his favorite games, and he only played them because he came over while our household was already playing them. He won't even let me show him Horizon: Zero Dawn at the moment, as he's concerned it will be added to the list after hearing me go on and on about it.
I find this to be evidence that "shooting for a high score" in game reviews is more than just a problem for the games-as-art connoisseur. It exemplifies why the industry and gamers should move past a system that never really worked in the first place. Our medium deserves a more thorough critique method, including reviews that prove a player's experience can't be summed up with a number or letter grade picked out of the air.
Not all games are created equal, nor are gamers. So let us give better voice to how we experience interactive entertainment. It is time to move the conversation, and our favorite medium, forward.