It's time to admit it: Unreal Engine 5 has been kind of rubbish in most games so far, and I'm worried about bigger upcoming projects


More like Unreal Stuttering, am I right?

STALKER 2 - cutscene | Image credit: GSC Game World, VG247

Stalker 2 has made me look back and realize that maybe it was a mistake to let Epic Games' Unreal Engine 5 become the industry standard for the next decade.

Sure, GSC Game World's big comeback FPS/survival game is a 'worst case scenario' of a UE5-powered game being broken as s**t at launch, but I've played my fair share of UE5 games four years into this gen, and mayhap Epic's powerhouse of an engine isn't as good for everyone as it was made out to be after those early demos and The Matrix Awakens.

Some context ahead of my rant: I don't know much about the ins and outs of game engines, programming, 3D modelling, or whatever. I dabbled in Bethesda's Creation Engine back in the day, and that was it. I've never shown much interest in tinkering with software and professional tools beyond an 'advanced user' level. I am, however, very curious (for obvious reasons) about all those processes and the work that goes into making all sorts of video games and interactive experiences.

Of course, I'm also someone who plays far more games than he should over the course of a year, which is a good way to start picking up on things, good and bad. Combine that with self-taught hardware knowledge and OS tinkering, and as a consumer, you start to make sense of performance issues beyond saying 'this runs like wet ass' and asking for a refund (which I encourage everyone to do more often).

Fortnite UE5.1 update | Image credit: Epic Games

Anyway, do you remember that big late-2022 Fortnite update which ported the entire thing over to UE5.1 to make good use of Nanite, Lumen, and all that stuff? After a couple of years of hyping up its new engine and trying to get developers to move past the PS4 and Xbox One's limitations, it felt like Epic's big triumphant moment with the new tech: getting everyone and their respective mothers to experience, at no cost, all the shiny, amazing new visuals that came with UE5, implemented in a fully functional AAA online game.

Unsurprisingly, things didn't go exactly as planned. Your average Fortnite pro had been using the lowest possible settings for years to maximize their K/D ratio, and those of us packing beefy enough hardware made the jump only to find even more stuttering than in UE4's later versions and an overall performance hit that wasn't worth the hassle. Two years later, the situation hasn't changed much. Jumping into a Fortnite match right after updating drivers or the game itself means you won't do great, as shaders have to be recompiled from scratch and all that heavy lifting happens on the go. Not ideal.

Remnant 2 - boss fight | Image credit: Arc Games

For those who don't know: shaders have to be compiled into code specific to your exact GPU and driver combination, so each hardware configuration must (or should) prepare them ahead of regular play for quick loading later. That's why consoles, with their fixed hardware and precompiled shaders, aren't as affected by these woes, and why modern gaming on PC can be kind of rough as of late, at least until your PC has 'acquainted' itself with the latest AA/AAA beast. Different engines (and developers) handle this in different ways. In the case of UE5, 'stutter struggle' is a very real thing, especially when traversing huge levels/worlds, and the lack of good, proper shader compilation upon launch in some titles only makes matters worse.
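
To make that concrete, here's a toy sketch in Python (entirely illustrative, with made-up timings and nothing resembling actual Unreal Engine code) of why the first trip through an area hitches, why the second one doesn't, and why a driver or game update brings the hitches right back:

```python
import time

# Toy model of a shader cache: the first request for a shader "compiles" it
# (slow, and if it happens mid-gameplay, that's your hitch); later requests
# hit the cache and return instantly. Illustrative only, not engine code.
shader_cache: dict[str, str] = {}

def get_shader(name: str) -> str:
    if name not in shader_cache:
        time.sleep(0.1)  # stand-in for an expensive, hardware-specific compile
        shader_cache[name] = f"compiled({name})"
    return shader_cache[name]

def traverse_area(shaders: list[str]) -> float:
    start = time.perf_counter()
    for shader in shaders:
        get_shader(shader)
    return time.perf_counter() - start

area = ["rock", "water", "foliage", "cloth"]
print(f"first visit (compiling on the fly): {traverse_area(area):.2f}s")  # hitchy
print(f"second visit (everything cached):   {traverse_area(area):.4f}s")  # smooth

# A driver or game update invalidates the cache, so the pain starts over:
shader_cache.clear()
print(f"after an update (cache wiped):      {traverse_area(area):.2f}s")
```

A proper 'compiling shaders' step on the loading screen is essentially just running that first loop before gameplay starts instead of during it, which is exactly what some UE5 titles skip.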

Even when UE5 has been used reasonably well (see Remnant 2), with Lumen and Nanite adding lighting and a level of granular detail to scenes that seemed impossible just a few years ago, the accompanying performance hit, even on expensive PC hardware, isn't worth it for the average gamer. Most people are just looking for smooth, painless experiences, especially in particularly stressful and demanding games.

Immortals of Aveum - close-up | Image credit: EA

The solution that's taken over the industry even quicker than UE5 itself? Aggressive AI-powered upscaling and frame generation. Both AMD and Nvidia are all over this, with the latter locking its (admittedly more robust) frame generation tech behind the RTX 40 series and above. Since devs can now conjure frames out of thin air, graphical fidelity feels like it's moving faster than the actual hardware being used to run all that jazz, with ray tracing leading the charge. The final result? Most big studios are trying to be Crytek in 2007, pushing for ridiculous visual quality that current hardware can only carry at high framerates thanks to crutches such as DLSS, FSR, and whatnot.
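
If you're wondering why upscaling caught on so fast, the raw pixel math makes it obvious. A quick back-of-the-envelope sketch (the internal resolution below is the commonly cited target for 4K 'Quality' upscaling modes, two-thirds scale per axis; exact figures vary per game and preset):

```python
# Why upscaling is tempting: the GPU shades far fewer pixels internally,
# and the upscaler reconstructs the rest of the output image.
def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)         # what your display shows
quality_internal = pixels(2560, 1440)  # what the GPU actually renders

print(f"native 4K:          {native_4k:,} px per frame")         # 8,294,400
print(f"'Quality' internal: {quality_internal:,} px per frame")  # 3,686,400
print(f"shading work:       {quality_internal / native_4k:.0%} of native")  # 44%
```

Less than half the shading work per frame, and that's before frame generation even enters the picture. It's easy to see why studios have started budgeting their visuals around it.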

And you know what? I think that tech works pretty well and is showing even more promise with each passing year. I love how my 4070 Ti can poof frames into existence. But even when you're getting all those sweet boosts... some games remain a stuttery, uneven mess, and I'm not happy to say that UE5 continues to be the worst offender. A good recent example is 2023's ill-fated Immortals of Aveum, a game that, even after several patches, remains largely broken on a technical level to this day. Despite its FSR 3 and DLSS 3 support, it's a wobbly mess prone to crashing. The same goes for several other 2023 games, such as the surprisingly okay Lords of the Fallen reboot, which has gained quite the cult following but suffers second-long freezes out of nowhere no matter your settings and remains rough on consoles.

Hellblade 2 - cave | Image credit: Xbox Game Studios, VG247

These are just some examples, but you can see the pattern. If you don't believe me, let the fine folks at Digital Foundry convince you with more data and deeper research than I could ever pretend to provide. Also, if you're an avid gamer, check this list for recent games you've played and complained about performance-wise. The big exception, at least in my experience, seems to be Hellblade 2 (unsurprising, considering how much time Ninja Theory put into the audiovisual presentation versus everything else), which was shockingly smooth and stutter-free, bar the rare crash related to PC woes I was suffering around the time it launched. But Hellblade 2 also happens to be a very linear game, so make of that what you will.

As we look to the future, with UE5-powered behemoths like the next Witcher, Mass Effect, and Star Wars Jedi entries looming nearer and nearer with each passing day, I can't help but be worried about all the major studios that have ditched their own tech to rely on Epic Games' engine, which so far feels quite underwhelming in real use outside of wild tech demos and projects where tons of time and resources were allocated to work out its kinks.

Stalker 2 might've been the straw that broke the camel's back, and while we could place a good amount of the blame on GSC Game World for not taking extra time with it, I can't help but think of how super smooth Dragon Age: The Veilguard is across a very wide range of hardware, all while running on an engine (Frostbite) that was deemed unfit for anything but first-person shooters not too long ago. Were we all deceived by Tim Sweeney again? Uh-oh...
