“For quality games media, I continue to believe that the best form of stability is dedicated reader bases to remove reliance on funds, and a hybrid of direct reader funding and advertisements. If people want to keep reading quality content from full time professionals, they need to support it or lose it. That’s never been more critical than now.”
The games media outlets that have survived, with the exceptions of GameSpot and IGN, have just about all switched to this model. It seems to be the only model that survives.

Surprisingly so. There’s a huge difference in online advertisements pre- and post-Fyre Festival.
They liked the game more than you. I promise you it is that simple.
I’m not talking about my personal preference or rating; I’m talking about broad community reviews.
Cyberpunk 2077 is a notorious example. It got generally favorable scores from critics, while the public release was a completely broken mess on console. Reviewers weren’t even given the console version, yet still scored it positively because the experience on PC was decent. How can we trust reviewers if they can’t actually try the game most people will be playing? The terms of that review embargo alone should have pushed reviewers toward negative coverage, since they weren’t able to test the console versions at all.
For a stark mismatch between critic and user scores, look at Starfield: around 85 on Metacritic, while Steam user reviews sit closer to 55-60% positive, and it got hit hard by independent reviewers shortly after launch. That’s a pretty big gap.
GTA V was pretty close to a perfect critic score, but player reception is a bit lower (around 80% positive on Steam right now). That’s not a huge difference, and some of it could be frustration over not having a sequel for more than a decade, but it does seem that some studios get more favorable reviews, or more of a pass, than others.
That said, a lot of the time critic reviews land pretty close to the eventual community response. It just seems that reviewers overhype certain games. I haven’t seen much evidence of critics scoring a game well below where community reception ends up, but I have seen cases where critic scores are quite a bit higher than the eventual community response.
Maybe there’s nothing suspicious going on, it just sometimes feels that way.
Reviews will typically mention which version was played, but in general there are very few differences between platforms these days, unlike back in the 6th gen or early 7th gen. Games like Cyberpunk are outliers.
Starfield is not a bad game. In a lot of ways, it’s a very good one. My biggest complaints with it, personally, are all the ways it should have modernized but refused to, falling back on what worked a decade before it came out without turning an eye toward its contemporaries and the improvements they’ve made to the same formula. I find Steam reviews to be a valuable data point among plenty of others, but user reviews landing that much lower than the critic average doesn’t mean the critic score is the problem.
For examples of games that critics reviewed less favorably than users did, see Mad Max or Days Gone. Those might be explained as games whose initial sales weren’t strong; people who found them later, often at a discounted price, were pleasantly surprised compared to their reputation. There’s also the likes of SkillUp’s review of Ghost of Yotei. That game has largely reviewed very well with other outlets, but he found his take to be out of sync with his audience. If you’re a reviewer who plays dozens of games per year, your opinion of a formulaic open-world game might be very different from that of someone who plays three games per year and hasn’t gotten sick of the formula. Both are valid points of view.