Fixing Review Aggregates – From OpenCritic to Metacritic, It Can Be Better

Review Aggregates Don’t Have to Be a Hot Garbage Factory

Editor’s Note: The opinions expressed in this article are the author’s own and do not reflect the views of COGconnected.

Late last year, OpenCritic launched as a competitor to the industry’s aggregate standard, Metacritic. While it has done quite a few things right since its inception, there’s plenty of room to grow. I’m here to help, though, and that’s why I’m going to walk through some simple steps that I feel can fix its major issues. While most of this advice applies to review aggregates in general, it’ll largely focus on OpenCritic, since it specifically spotlights video games.

Step One: Stop Adding Meaning That Isn’t There

Every single gaming site has its own grading scale. That’s why a 70/100 on COGconnected might not mean the same thing as a 3.5/5 elsewhere. Sure, they’ll show up as the same number on an aggregate service, but they can mean something else completely. That’s why these numbers need to stop being lumped together.
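To make that concrete, here’s a minimal sketch of the naive 0–100 normalization aggregates appear to apply (the function is mine, for illustration only): a 3.5/5 and a 70/100 collapse into the same number, even though the scales behind them say different things.

```python
def normalize(score: float, scale_max: float) -> float:
    """Rescale any score to 0-100, the way aggregates appear to."""
    return score / scale_max * 100

print(normalize(70, 100))  # 70.0 -- "above average" on a strict 100-point scale
print(normalize(3.5, 5))   # 70.0 -- might mean "just okay" on a 5-point scale
```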

Worse, some aggregates apply their own meaning to reviews. For example, if you check out the OpenCritic page for Melty Blood Actress Again Current Code, you’ll see my positive review of the game listed as ‘weak’. OpenCritic attaches a negative connotation to my review, even though a 60 means ‘above average’ under COGconnected’s use of the full 100-point scale. Essentially, my positive (but not uncritical) look at the game is now egregiously misrepresented by one simple word. That’s why aggregates need to stop attaching unnecessary descriptors to numbers.
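Here’s a sketch of how one word does that damage. The cutoffs and labels below are hypothetical stand-ins; OpenCritic doesn’t publish its exact boundaries.

```python
def describe(score: int) -> str:
    """Attach a one-word descriptor to a normalized score, aggregate-style.
    These cutoffs are hypothetical stand-ins, not OpenCritic's real ones."""
    if score >= 80:
        return "strong"
    if score >= 65:
        return "fair"
    return "weak"

# COGconnected defines 60/100 as 'above average', but a one-size-fits-all
# bucket relabels that exact same score anyway.
print(describe(60))  # "weak"
```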

[Image: Review Aggregates 01]

Step Two: Don’t Put Words in Critics’ Mouths

Another huge issue with OpenCritic is that it counts any score of 75 or higher as ‘recommended’. So, if you take a look at my critic page, you’ll see that I apparently recommend 45.16% of games. What does that even mean? Who am I recommending these games to, and why is the label based on an arbitrary cutoff?

Since I gave Digimon Story: Cyber Sleuth a 74/100, my review counts as ‘not recommended’, but if I had given it one more point, it would be recommended. This system makes no sense, and the word ‘recommended’ really should be taken out. If you want to simply present the percentage for what it is, ‘45.16% of games scored 75 or higher’, then go for it, but don’t try to add meaning that isn’t there. An aggregate site should focus on hard numbers, not put words in the mouths of critics.
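Here’s how blunt that cutoff is in practice, in a rough sketch (the function name is mine; going by my Digimon example, 75 itself counts):

```python
def counts_as_recommended(score: int) -> bool:
    """Classify a review by its score alone, the way the site appears to."""
    return score >= 75

print(counts_as_recommended(74))  # False -- my Digimon Story: Cyber Sleuth review
print(counts_as_recommended(75))  # True  -- one more point flips the label
```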

[Image: Review Aggregates 02]

Step Three: Separate Your Games

Another issue is that OpenCritic doesn’t separate different versions of games, an area Metacritic handles properly. It’s not rare for a release (even a high-profile one like Batman: Arkham Knight) to have issues on certain platforms, yet for some reason OpenCritic groups all of those scores together. That leads to weird situations like this:

[Image: Review Aggregates 03]

According to OpenCritic, Jeff Gerstmann gave Fallout 4 a 67/100: an impossible number for a critic who scores on a 5-point scale, where converted scores can only land on multiples of 20. Averaging the scores together is an asinine solution; they should be listed separately.
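To show the arithmetic (with hypothetical star ratings; I’m not claiming these are the actual per-platform scores), averaging converted 5-point scores lands on numbers no single review on that scale can produce:

```python
# Hypothetical per-platform star ratings, purely to show the arithmetic.
platform_stars = [4, 3, 3]

# On a 5-point scale, converted scores can only be multiples of 20...
converted = [stars / 5 * 100 for stars in platform_stars]  # [80.0, 60.0, 60.0]

# ...yet the average lands between them.
average = sum(converted) / len(converted)  # 66.66...
print(round(average))  # 67 -- a score no single 5-star review can produce
```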

This leads to another issue: sites will sometimes have two different critics review different versions of the same game. Check out this mess:

[Image: Review Aggregates 04]

As you can see above, I really liked the PlayStation 4 version of Assault Android Cactus. Meanwhile, one of my colleagues thought it was just okay. That’s great, as differing opinions are what make game criticism so interesting and necessary. What’s not so great is how OpenCritic has lumped our scores together.

I reached out to OpenCritic several times on this matter, but they were unable to confirm how these averaged-out scores affect individual critics’ pages. How can we take the critic pages seriously if mine shows a 65 for a game I gave an 80? Different versions of games need to be separated.
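For what it’s worth, the fix isn’t complicated. Here’s a minimal sketch (names and scores are hypothetical) of the difference between lumping reviews per outlet and keying them by critic and platform:

```python
from collections import defaultdict

# Hypothetical reviews of one game from the same outlet.
reviews = [
    {"outlet": "COGconnected", "critic": "Critic A", "platform": "PS4", "score": 80},
    {"outlet": "COGconnected", "critic": "Critic B", "platform": "PC", "score": 65},
]

# Lumped per outlet (the current behavior): the scores average together.
by_outlet = defaultdict(list)
for review in reviews:
    by_outlet[review["outlet"]].append(review["score"])
print({outlet: sum(s) / len(s) for outlet, s in by_outlet.items()})
# {'COGconnected': 72.5}

# Keyed by (critic, platform): each review stands on its own.
separated = {(r["critic"], r["platform"]): r["score"] for r in reviews}
print(separated)  # {('Critic A', 'PS4'): 80, ('Critic B', 'PC'): 65}
```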


It’s Not All Bad

Here’s the good news: OpenCritic recently confirmed to me that they’ll be taking up one of my suggestions and separating platforms. OpenCritic’s Matthew Enthoven told me that “we are planning to get rid of this problem by separating reviews for different platforms/authors.” That’s a great first step, and it will make the site much better.

OpenCritic has done a lot right so far (such as adding pages for individual critics and noting which critic actually wrote each review for an outlet), but there’s still room to grow. Here’s hoping they keep changing for the better and continue to implement changes based on feedback. What’s your take on review aggregates like OpenCritic and Metacritic? We’d love to hear more of your thoughts in the comments below.