You must forgive PC gamers for banging on about Crysis all the time. It feels like an age since a game came along that was so punishing to run on existing PC hardware that graphics cards effectively had to be redesigned to cater for it. But we have had a few close run-ins with impossibly demanding games these past few years, and that's got me wondering: which will be the next game to grind our PCs to a halt with its obnoxiously demanding system requirements?
If you're simply after the games primed to show off an all-singing, all-dancing rig right now, we've got you covered with our list of the best games to show off your new graphics card. Here, though, I want to look ahead to what's coming and ask whether we'll ever reach that Crysis point again, because I'm not yet convinced we will.
But a few upcoming releases spring to mind as candidates. Starfield, for one. It's a new entry from Bethesda, but built very much in the vision of the company's biggest successes, The Elder Scrolls and Fallout: a fresh open world (or rather, open universe) running on a brand new version of Bethesda's well-used and famously quite janky Creation Engine. This is sure to be a gorgeous, if very demanding, game.
The Starfield trailer from earlier in the summer only offered a glimpse of a rocky planet or moon, yet even those lumpy space rocks looked impressively detailed. The actual gameplay upon release could be a lot different to what we've seen so far, especially as the game's release has been pushed back into 2023. However, the shadows and ambient occlusion alone in that trailer look demanding enough to make a graphics card whimper.
Perhaps the vast emptiness of space will be easy on the CUDA Cores. In space, no one can hear your graphics card’s fans scream.
Then there’s The Witcher 4. Though that’s not confirmed to be the name, we know CD Projekt Red is working on the next instalment right this moment, and isn’t wasting any time with its own REDengine on this one. It’s instead choosing to side with Epic’s Unreal Engine 5 (UE5), which will mean it joins the legions of games in development for that engine. The game will undoubtedly be gorgeous, but I wonder if its impact on hardware will actually be minimised by the use of a more widespread game engine.
“Players can go in whatever direction they want, they can handle content in any order that they want, theoretically,” CD Projekt Red’s Slama said earlier this year. “To really encapsulate that, you need a really stable environment where you can be able to make changes with a high level of confidence that it’s not going to break in 1,600 other places down the line.”
Already shown to be impressive in its breadth and detail, UE5 feels like a great choice for the much-anticipated Witcher game, and here's hoping it will offer a much improved launch experience over CD Projekt's last game, Cyberpunk 2077.
Cyberpunk 2077 was the most recent game to really push the graphics hardware of its time, but was that because of its impressive expanse or down to a not-so-optimised engine? It's a mix of both, perhaps more the former at times, but the lack of optimisation really stung this game's performance. There's an important distinction to be made between a game that's demanding for the right reasons and one that's demanding for the wrong ones.
Perhaps the closest we've come to a Crysis-style watershed moment for graphics hardware has been the adoption of ray tracing in modern gaming, so I'd take a guess that whatever game we're waiting on to become the next benchmark of graphics performance will use it to impress to some extent.
Bouncing a ray for every pixel on-screen is thirsty work, even for the graphics cards designed with that in mind. The RTX 30-series manages to lessen the load somewhat with more capable RT Cores than the RTX 20-series, and since then we've seen AMD join in with its own RDNA 2 Ray Accelerators, which are moderately decent at the job. But there's still a fairly significant price to pay for pretty reflections and shadows.
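To put rough numbers on that thirst, here's a back-of-envelope sketch of my own (illustrative arithmetic, not figures from Nvidia or AMD): casting just one primary ray per pixel at common resolutions and a 60fps target already adds up to hundreds of millions of rays every second, before a single bounce, shadow ray, or reflection is counted.

```python
# Back-of-envelope illustration: primary rays per second at one ray per
# pixel. Real ray-traced games trace far more (bounces, shadows, GI).

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
TARGET_FPS = 60  # a modest target by modern PC standards

for name, (width, height) in RESOLUTIONS.items():
    pixels = width * height
    rays_per_second = pixels * TARGET_FPS
    print(f"{name}: {pixels:,} pixels -> {rays_per_second / 1e9:.2f} billion rays/s")
```

At 4K that's roughly half a billion rays per second for primary visibility alone, which is why dedicated RT hardware exists in the first place.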
Just look at F1 22. It certainly looked the part, with ray-traced reflections, shadows, and ambient occlusion glistening off the side of Sainz's Ferrari, but even an RTX 3080 struggled to make ray tracing in any way worthwhile. That's with Nvidia's Deep Learning Super Sampling (DLSS) or AMD's FidelityFX Super Resolution (FSR) working their butts off to improve the final image and performance.
Right, yes, upscalers. Upscalers might change everything. Are extreme demands for more cores, more VRAM, and faster clock speeds being swept under the upscaling rug? I'd argue that upscaling does, and will continue to do, as much for PC performance on a grand scale as faster GPUs.
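As a rough illustration of why (my own sketch, using internal resolutions that approximate typical 'Quality' and 'Performance' upscaling modes rather than any vendor's exact figures): rendering internally at a lower resolution and reconstructing the rest slashes the number of pixels the GPU has to shade each frame.

```python
# Illustrative sketch: how much per-pixel shading work an upscaler skips
# by rendering internally at a lower resolution than the output.

def shaded_pixel_saving(output_res, internal_res):
    """Fraction of shaded pixels saved by upscaling internal_res to output_res."""
    output_pixels = output_res[0] * output_res[1]
    internal_pixels = internal_res[0] * internal_res[1]
    return 1 - internal_pixels / output_pixels

# Approximate 'Quality'-style mode: 4K output from a 1440p internal render.
print(f"{shaded_pixel_saving((3840, 2160), (2560, 1440)):.0%} fewer shaded pixels")
# Approximate 'Performance'-style mode: 4K output from a 1080p internal render.
print(f"{shaded_pixel_saving((3840, 2160), (1920, 1080)):.0%} fewer shaded pixels")
```

Over half the shading work vanishes even in the 1440p-to-4K case, which goes a long way to explaining why upscalers so often make ray tracing playable at all.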
That leads us back to the next Witcher game. Seeing as CDPR is opting for UE5 rather than its own engine, it will likely come with support for UE5's built-in upscaler, Temporal Super Resolution (TSR), not to mention any other upscalers CDPR decides are worth integrating into the game. That could be many of them by then, as Nvidia's framework for integrating even competing upscaling techniques, known as Streamline, could be in frantic use by the time that game's release rolls around.
Considering console development, and what feels like a shift away from PC-exclusive development, it also appears that the days of madcap schemes to push PC hardware over the edge may be dwindling. With compatibility required across many PC-like consoles of varying power and capability, any developer will be keen to at least maintain steady performance across most platforms.
That doesn't necessarily close the door to extreme presets on PC, but I'd expect it to at least make them rarer.
It comes down to what we class as the next ‘Crysis’—in the sense of a PC-breakingly demanding title. It’s not just a game that struggles to run at 120Hz at 4K on a high performance graphics card. We have loads of those already. It’s a game that is so frankly absurd in its adoption of bleeding-edge graphics technologies and techniques that your PC gets coil whine just installing it.
Looking for a game as absurd as Crysis once was, I turn to the big games of the past two years and those coming in the next two, and I just don't see anything that fits the bill. Even Crysis Remastered wasn't a match for its older self in this regard, though it didn't run particularly well at launch, either. That was down to poor CPU utilisation rather than some new-age graphics technology, however, and it was later patched for proper performance.
The next generation of games is going to be beautiful, I have no doubt about that, but perhaps the reason we won't see another Crysis moment is simply that the business of game development has learned to do a whole lot more with a whole lot less. Times have changed: Crysis arrived when PC performance wasn't measured in hundreds of frames per second but in the mere expectation of a steady 30fps. Developers weren't bringing their console exclusives to PC like they are now, either, and today we're scoring big first-party games on PC like God of War. The goalposts have moved and people expect a lot more from their games. I don't know if a game like Crysis, with demands so high that they closed the door on the majority of gamers, would be met with such awe today as it once was. A publisher certainly wouldn't be keen on the idea, either: someone spent a lot of time and money making that game.
It’s for the best, really—while it was a fascinating and exciting time in graphics development, not being able to play Crysis with any semblance of a decent frame rate was also rather annoying for those of us without the hot new graphics card.