How Epic got such amazing Unreal Engine 5 results on next-gen consoles


After last week’s underwhelming showcase of in-development Xbox Series X software, our expectations for the coming generation of game consoles were somewhat tempered. But those expectations got a significant boost this week with Epic Games’ unveiling of a jaw-dropping real-time demo of its Unreal Engine 5 technology.

The engine, which Epic says will launch next year, was shown running on PlayStation 5 development hardware, though the company said the same general level of fidelity should be possible on the Xbox Series X and high-end PCs as well. And it shows how the state of the art in graphical quality on those platforms is quickly advancing past resolution and frame-rate counts and toward new levels of lighting and modeling detail.

Lighting

The first pillar of Epic’s new engine is what it’s calling the Lumen system, a “fully dynamic global illumination solution that immediately reacts to scene and light changes.”

Those kinds of effects aren’t all that new in the gaming space. Nvidia has been pushing the real-time ray-tracing capabilities of its PC graphics card line for years now, and those effects are currently making even games like Minecraft look spectacular with the right PC hardware. But we haven’t yet seen anything like this kind of real-time lighting on console hardware, much less with what Epic says is “diffuse inter-reflection with infinite bounces and indirect specular reflections.”

Epic says this kind of dynamic lighting will save developers time by removing the need to design light-maps by hand and to wait for the engine to “pre-bake” that static lighting during development, as in current engines. Instead, artists will simply be able to place a light source with the engine and immediately see how the game will look when running on console hardware.
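To make the baked-versus-dynamic distinction concrete, here is a minimal sketch (not Epic's Lumen code, and greatly simplified to a single Lambertian diffuse term): with baked lightmaps, moving a light means re-running an offline bake, while with dynamic lighting the same shading function simply runs again on the next frame.

```python
# Minimal illustrative sketch, NOT Lumen: a single-bounce Lambertian
# diffuse term, recomputed per frame for a movable light source.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def diffuse(surface_point, surface_normal, light_pos, light_intensity=1.0):
    """Lambertian diffuse: brightness falls off with the angle between
    the surface normal and the direction toward the light."""
    to_light = normalize(tuple(l - p for l, p in zip(light_pos, surface_point)))
    n_dot_l = sum(n * l for n, l in zip(surface_normal, to_light))
    return max(0.0, n_dot_l) * light_intensity

point, normal = (0, 0, 0), (0, 1, 0)

# Light directly overhead: full brightness.
print(diffuse(point, normal, (0, 5, 0)))  # 1.0

# Move the light to the side: brightness drops on the very next frame,
# with no bake step in between -- the workflow change Epic describes.
print(diffuse(point, normal, (5, 5, 0)))
```

Real global illumination adds indirect bounces and specular terms on top of this direct term; the point here is only that nothing is precomputed against a fixed light position.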

“We’re really trying to empower developers here to make next-generation experiences that are unbelievably realistic but also economical and practical to create without a 1,000 person team,” Epic founder and CEO Tim Sweeney said in a “Summer Game Fest” interview attached to the demo’s unveiling.

Using dynamic lighting also means that developers can move light sources around more easily during the course of gameplay and see light bend around obstructions in ways that don’t really work with the static light sources found in current engines. Besides just looking nice, Epic sees this kind of lighting leading to new gameplay ideas—light from a player-held candle bending around a corner to reveal a hidden enemy, for instance.

Real-time lighting also means an entire universe of user-generated content can take advantage of these kinds of effects without special coding. With Lumen, “things change beyond the release of the game,” Epic CTO Kim Libreri said. “Things can grow and change, and that can all be calculated live.”

Scaling

While real-time lighting effects are getting more common these days, Unreal Engine 5’s most unique features seem focused on the idea of massive, automatic scalability of in-game art assets. What Epic is calling the Nanite system is designed in part to reduce the “cost for an artist to manufacture a major new piece of content for a game,” as Sweeney puts it.

In current game design, game artists usually squash their in-game 3D models down to a few different levels of detail (LODs), each with fewer polygons and smaller textures than the last. Those low-poly models can be subbed in when an object is far away from the in-game camera, saving both memory space and rendering time in complex scenes, without much loss in apparent frame quality.
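The hand-authored workflow described above can be sketched as a simple distance-based lookup. This is a hypothetical illustration, not any particular engine's API; the model names and distance thresholds are made up.

```python
# Hypothetical sketch of a hand-authored LOD scheme: artists export
# several versions of each model, and the engine swaps them in based
# on distance from the camera. Names and thresholds are illustrative.

LODS = [
    (0,   "statue_lod0_2M_tris"),   # full detail, used up close
    (20,  "statue_lod1_200k_tris"),
    (100, "statue_lod2_20k_tris"),
    (500, "statue_lod3_2k_tris"),   # far away: coarse silhouette only
]

def pick_lod(distance):
    """Return the coarsest model whose distance threshold has been passed."""
    chosen = LODS[0][1]
    for min_distance, model in LODS:
        if distance >= min_distance:
            chosen = model
    return chosen

print(pick_lod(5))    # statue_lod0_2M_tris
print(pick_lod(250))  # statue_lod2_20k_tris
```

Every entry in that table is an asset an artist had to author (or at least review) by hand — which is exactly the cost Nanite's automatic LOD generation aims to remove.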

With Nanite, Epic says artists can instead just directly import raw “movie quality” 3D models from a library like Quixel Megascans into the engine, without the need to specifically design lower LOD models for use in the game. The engine can then automatically create LOD models that are appropriate for the scene and the target hardware, from a next-gen console on down to a mobile phone. The idea, as Epic VP of Engineering Nick Penwarden puts it, is to “build content once and deploy it across all devices.”

“The artist doesn’t even have to think about it,” Libreri added. “You pick a high-resolution asset you want, you load it in, you don’t have to think about making levels of detail or normal maps or any of the usual stuff that goes with making game content. It’s a pretty freeing experience for anybody trying to tell a story or make an experience.”

Streaming

On high-end hardware, Epic says the top end of those LOD models can now be detailed enough so that each polygon only takes up a single pixel on the screen. At 4K resolutions, that level of “Render Everything Your Eye Sees” detail can require tens of billions of polygons for even a moderately complex scene (though Epic concedes that its demo drops to 1440p at points to preserve performance).
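The arithmetic behind that claim is worth spelling out: a 4K frame has roughly 8.3 million pixels, so only about that many polygons can be visible at once. The "tens of billions" figure refers to the source geometry in the scene, which the engine reduces down to roughly pixel-sized triangles before rendering.

```python
# Back-of-the-envelope arithmetic for "one polygon per pixel."

width, height = 3840, 2160            # 4K UHD
pixels_4k = width * height
print(f"{pixels_4k:,} pixels at 4K")  # 8,294,400

width_qhd, height_qhd = 2560, 1440    # the 1440p the demo drops to
pixels_qhd = width_qhd * height_qhd
print(f"{pixels_qhd:,} pixels at 1440p")  # 3,686,400

# Dropping to 1440p cuts the per-frame pixel (and thus visible
# triangle) budget to about 44% of the 4K figure.
print(f"{pixels_qhd / pixels_4k:.2%}")
```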

On existing hardware, putting that many polygons in a frame would overload the system memory and overwhelm the “draw count” budget needed to render the scene in time for the next frame. Unreal Engine 5 gets around those limitations by making use of the ultra-fast bulk storage SSDs that will be standard on next-generation consoles.

The idea of streaming graphics content from storage to RAM has been around for decades now—the original Crash Bandicoot utilized that basic method to create environments with 20 to 30 times the level of detail of contemporary PlayStation games. In the past, though, such streaming usually meant loading in the next area as the player approached and using game design tricks to hide the “pop in” effect of that loading from the player.

With ultra-fast SSDs as a standard, that concept can now be extended to stream in art assets as the camera pans and turns around a single scene, essentially getting the polygons and textures players need to see into RAM just in time for display. “There are tens of billions of triangles in that scene, and we simply couldn’t have all of them in memory at once,” Penwarden said. “You end up needing to stream in triangles as the camera is moving around the environment.”
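A rough sketch of that just-in-time idea, under the simplifying assumption that the world is divided into a grid of chunks (the chunk size, radius, and function names here are all illustrative, not Unreal's): only chunks near the camera stay resident in RAM, and each frame the engine loads what just came into range and evicts what just left it.

```python
# Illustrative sketch of just-in-time streaming on a 2D chunk grid:
# keep only chunks near the camera resident, loading from SSD as the
# camera moves and evicting what falls out of range.

STREAM_RADIUS = 2  # chunks kept resident in each direction (made up)

def chunks_near(cam_chunk):
    """The square of chunks that should be resident around the camera."""
    cx, cy = cam_chunk
    return {(cx + dx, cy + dy)
            for dx in range(-STREAM_RADIUS, STREAM_RADIUS + 1)
            for dy in range(-STREAM_RADIUS, STREAM_RADIUS + 1)}

def update_streaming(resident, cam_chunk):
    """Return the new resident set plus the load/evict work for this
    frame -- the part whose latency the SSD must hide."""
    needed = chunks_near(cam_chunk)
    to_load = needed - resident    # fetch from SSD
    to_evict = resident - needed   # free from RAM
    return needed, to_load, to_evict

resident = chunks_near((0, 0))
resident, loaded, evicted = update_streaming(resident, (1, 0))
print(len(loaded), len(evicted))  # 5 5 -- one column in, one column out
```

The reason a spinning disk can't keep up is that `to_load` must arrive within a frame or two of the camera moving; with seek times in the milliseconds and a new batch needed almost every frame, the "pop in" the article mentions becomes unavoidable.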

Freeing developers from “loading data off a spinning mechanical device that had its roots in the 1950s… will have a much larger impact on gaming than people are expecting right now,” Sweeney said. “The world of loading screens is over. The days of geometry popping up in environments is ended.”

(This also means that PC players may have to upgrade to high-end NVMe drives in their gaming rigs to get the best graphical detail from games going forward.)

Tempering expectations

For now, we’re trying not to get too excited about the potential shown in this short demo. There will likely be some constraints and tradeoffs necessary to get from even a playable demo to a full game complete with AI, physics simulations, multiple animated actors, and more.

It’s also not clear just how much storage is necessary for these high-quality streaming assets or how big games can comfortably get on relatively expensive SSD storage. And despite the promise, we’ll have to wait until next year for outside developers to report on just how much the engine lives up to Epic’s “build content once and deploy it across all devices” promise.

All that said, this is the first showcase that has gotten us really excited for the potential advantages of the next generation of gaming hardware. The improvements in environmental detail, lighting, and loading speed on display here are definitely more interesting than the usual resolution pissing contests and careful frames-per-second splicing that have characterized next-gen gaming “advancements” in recent years. We can’t wait to see how this potential translates to actual games when Unreal Engine 5 actually rolls out to outside developers next year.