
Gaming Isn’t Ready For 8K Resolution Yet

The hot new innovation in display technology at CES this year is the introduction of several 8K televisions from top brands. Samsung, Sony, and LG: all of the big companies have brand-new TVs with unprecedented numbers of pixels waiting for customers to buy them up. Forget that 4K isn't even ubiquitous in many homes yet. The future is coming, and apparently we have to double that number (and quadruple the pixel count) for no discernible reason.
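To put the resolution labels in perspective, here's a quick back-of-the-envelope sketch of the standard 16:9 pixel dimensions behind each name. The takeaway: "8K" only doubles each axis of 4K, but that means four times the pixels, and sixteen times 1080p.

```python
# Pixel-count comparison of common 16:9 display resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = 1920 * 1080  # 1080p pixel count as the baseline
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels // base}x 1080p)")
```

Run it and 8K comes out to over 33 million pixels per frame, every one of which the GPU has to shade.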

Thankfully, gaming consoles aren't following that trend. While both the Xbox One X and PS4 Pro support 4K, Microsoft and Sony understand that performance comes first. You can't push insane resolutions on paltry hardware, and it would be crazy for the PS5 or the new Xbox to support 8K displays when our current consoles can't even reliably render 1080p.

Oh… wait… Microsoft’s Phil Spencer showed off a picture of the Xbox Series X’s CPU and a small “8K” logo is engraved on it. Hoo boy.

As technology moves forward, game developers and publishers have become increasingly fascinated with graphical fidelity over performance. Look at most console games and you'll find titles with near photo-realistic graphics running at 30 FPS or lower. Some of the biggest games from Triple-A devs are like this, including The Last of Us, Battlefield, Ghost Recon Breakpoint, The Division, Assassin's Creed Odyssey, and more.

For whatever reason, visual quality wins out over performance. It always has, and seemingly always will. The one saving grace for a lot of games this generation is that PCs exist. Thanks to constant hardware upgrades and fewer restraints on raw power, PC ports can push past the limitations of console hardware and provide much smoother gameplay.

One recent example is Monster Hunter World. On consoles, Capcom did the best it could to provide a compromise between visual fidelity and performance. Opting to cap the framerate at 30 FPS, the refreshed PS4 and Xbox One play the game somewhat well. On base consoles, though, the game is practically a nightmare. Long load times, horrible frame stutter, and ugly screen tearing: it really doesn’t make a good impression.

On PC, though, even a relatively old machine can run Monster Hunter World at a crisp 60 FPS. There's no input delay, the graphics can be tweaked to emphasize what you personally like, and everything looks stellar. It's practically a night-and-day difference compared to its console counterparts. You'd think that would be what Microsoft and Sony want to focus on next generation, but I guess not.

The release of both the Xbox One X and PS4 Pro seemed kind of odd. While 4K is unquestionably the future, market penetration of the technology wasn't really that high a few years back. In 2017, a little less than 30% of US homeowners had a 4K-capable device. In 2018, that number rose to 32%. For the most part, people have not stepped away from their 1080p televisions.

Even if the number were somehow above 50%, the more telling fact is that our current devices can't even run 1080p games well. It makes sense that as a console ages and graphics continuously improve, performance will drop; you can only do so much with fixed hardware, after all. But that doesn't explain why some of the latest releases run at 900p on PS4 and only 720p on Xbox One. Aren't these devices supposed to be more powerful than the previous generation? Some PS4 Pro games can't even render above 1080p, while the Xbox One X struggles to perform well at its intended 4K output.

You can find various analyses from Digital Foundry that prove the point. Team Sonic Racing manages native 4K on the One X but performs noticeably worse than the PS4 Pro version. Kingdom Hearts III is another good example. Thanks to a tweak users can make on Sony's console, the PS4 Pro version can output at 1080p and run smoothly, while the One X version is locked at a higher resolution and suffers constant frame drops and screen tearing. It just plays badly when going for "fidelity."

Obviously, better hardware will result in better general performance, but we're still struggling to pull off proper 4K playback. Even massively expensive PCs don't consistently hold 4K at 60 FPS, let alone higher resolutions. There aren't many data points to go on, but tests of graphics cards rendering at 8K show results at or below 30 FPS, and those are on machines the average consumer likely can't afford.

One of the best examples I can give is the esports scene. Most professional players prefer higher framerates and will even strip back a game's visual features to achieve them. That demand has helped push high-refresh-rate displays into the market, but even a non-pro can feel the difference. Games just play better when they are free of performance hiccups and glitches.
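Why framerate matters so much comes down to simple arithmetic: every extra frame per second shrinks the time budget the CPU and GPU have to finish each frame. A quick sketch of those budgets at common refresh rates:

```python
# Frame-time budget: milliseconds available to render each frame
# at a given target framerate.
for fps in (30, 60, 120, 144, 240):
    budget_ms = 1000 / fps
    print(f"{fps} FPS -> {budget_ms:.2f} ms per frame")
```

At 30 FPS a frame can take 33 ms; at 144 Hz the whole pipeline has under 7 ms. Pushing resolution to 8K multiplies the per-frame workload while that budget stays fixed, which is exactly the trade-off pros refuse to make.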

Maybe Microsoft is taking a massive loss on the new Xbox to give gamers the future. I doubt it, but either way it would be better for developers to focus on improving performance across the board instead of pushing up pixel counts. Gaming is not yet ready for 8K resolution, and we shouldn't be pushing it as a standard.
