I personally don’t believe in using review scores when reviewing video games. One reason is that, without a score, people have to actually read what I write rather than skipping to the number at the end and guessing at everything in between. The main reason I don’t use them, though, is that doing so would open me up to a world of scrutiny.
Take the following scenario: two games for me to review. Game 1 is good, but there are a number of points I can’t overlook, so I give it a score of 7/10. Game 2 isn’t as good but is still enjoyable, and it’s consistent in its enjoyment, so I also score it 7/10. Imagine the flak I’d take from Game 1 fans, who can overlook the points I couldn’t, for giving their precious Prodigal Son of a game the same score as a dirty pleb like Game 2.
Attaching a score to a review also gives readers a tangible number that can be compared with the numbers assigned to reviews of other games, games that couldn’t be compared in any other way. Going back to my previous Game of the Year winners: the experience I got from playing What Remains of Edith Finch is totally different from the one I got from playing God of War. As such, I couldn’t say definitively that one is better than the other. Yet as soon as you assign a score to each, suddenly there’s an avenue of comparison. My reviews are based on complex opinions (shut up, they are!!!), and rounding such complexities down to a single number doesn’t do justice to the review, to the reader or to myself.
It’s also true that opinions change over time. A review score, however, despite being a product of the age in which it was written, will stand the test of time, meaning scores end up being compared across generation gaps. For example, on Metacritic’s Best Video Games of All Time list, both Tony Hawk’s Pro Skater 2 and Red Dead Redemption have the same score. Does that mean they’re as good as each other, despite being released 18 years apart? No, it doesn’t.
So why do mainstream reviewers such as IGN or GameSpot use scores if they’re so unreliable? Because the scores aren’t for the consumer but for the developers and publishers. A lot of AAA developers issue employee bonuses depending on review scores. Publishers are also known to put clauses in contracts with developers granting bonus payments if a game achieves a minimum review score. This is why a lot of studios are known to “butter up” the larger video game review outlets with events, swag and even parties for journalists. Reddit user u/cronedog took all 208 games from 2016 that had at least 20 reviews and found that the average score was 7.8 out of 10. If scores were roughly symmetric around that average, that would mean about half of those games — 104 titles — scored between 78% and 100%. This leads me to conclude that either 1) 2016 was one hell of a year for video games or 2) review scores are heavily inflated to please developers and publishers. Honestly, I believe the latter is the more likely.
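One caveat on that average-to-count step: a mean of 7.8 only implies that half the games scored 7.8 or above if the distribution is roughly symmetric (i.e. the mean matches the median). A quick sketch with two invented score lists (not real 2016 data) shows that the same mean can hide very different spreads:

```python
from statistics import mean

# Two invented lists of 10 review scores, both averaging 7.8 out of 10.
bottom_heavy = [10, 10, 10, 8, 7, 7, 7, 7, 7, 5]    # only 4 games at or above the mean
top_heavy    = [10, 10, 10, 10, 10, 9, 8, 6, 3, 2]  # 7 games at or above the mean

for scores in (bottom_heavy, top_heavy):
    at_or_above = sum(s >= 7.8 for s in scores)
    print(f"mean={mean(scores):.1f}, games scoring >= 7.8: {at_or_above}/10")
```

Either way, a 7.8 average on a 0–10 scale means scores cluster near the top of the range, which still supports the inflation argument even if the exact "104 games" figure depends on the shape of the distribution.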