Better than reality: New emulation tech lags less than original consoles
We’ve previously written about how difficult it is to perfectly emulate classic video game consoles even with powerful modern computer hardware. Now, the coders behind the popular RetroArch multi-emulator frontend are working to make their emulation better than perfect, in a way, by removing some of the input latency that was inherent in original retro gaming hardware.
While early game consoles like the Atari 2600 sample and process user inputs between frames, consoles since the NES usually run that game logic while a frame is being rendered. That means the game doesn’t show its reaction to a new input until the frame after the button is pressed, at the earliest. In some games, the actual delay can be two to four frames (or more), which adds up to noticeable lag at the usual 60 frames per second (about 17ms per frame).
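To put numbers on that, here’s a quick back-of-the-envelope calculation in plain C (nothing emulator-specific, just the 60fps arithmetic from above):

```c
/* Frame-time arithmetic only; no emulator code involved. */
#include <stdio.h>

int main(void)
{
    const double frame_ms = 1000.0 / 60.0;   /* ~16.7 ms per frame at 60 fps */

    /* Two to four frames of built-in lag works out to roughly 33-67 ms. */
    for (int lag_frames = 1; lag_frames <= 4; lag_frames++)
        printf("%d frame(s) of lag = %.1f ms\n", lag_frames,
               lag_frames * frame_ms);

    return 0;
}
```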
An experimental Input Lag Compensation mode being rolled out in new versions of RetroArch fixes this by fast-forwarding through a few hidden frames behind the scenes, then displaying the first “reaction” frame in the slot where a frame would normally appear. So in a game like Sonic the Hedgehog, which has two frames of input lag, the emulator quickly runs two additional, hidden frames after every new input. It then shows the third post-input frame (where Sonic first shows a visible reaction) at the moment the first post-input frame would naturally appear, cutting out the delay a player would usually see.
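To make the mechanics concrete, here is a small, self-contained toy in C. A fake “game” with a two-frame input pipeline stands in for a real emulator core, the save state is just a struct copy, and the run-ahead wrapper peeks at the hidden frames before rolling the logic back. It illustrates the idea only; none of this is actual RetroArch code.

```c
/* Toy demonstration of run-ahead; not RetroArch code. The "game" reacts
 * to inputs LAG frames late, mimicking a title with built-in input lag. */
#include <stdio.h>
#include <string.h>

#define LAG 2   /* the toy game reacts LAG frames after an input arrives */

typedef struct {
    int pipeline[LAG];   /* inputs seen but not yet reacted to */
    int on_screen;       /* what the game draws this frame     */
} game_state_t;

/* Advance the toy game by one frame; returns what it displays. */
static int emulate_frame(game_state_t *g, int button)
{
    g->on_screen = g->pipeline[0];
    memmove(&g->pipeline[0], &g->pipeline[1], (LAG - 1) * sizeof(int));
    g->pipeline[LAG - 1] = button;
    return g->on_screen;
}

/* Run-ahead: advance the real state one frame, then show the frame that
 * sits LAG hidden frames ahead, and roll the logic back afterward. */
static int run_ahead_frame(game_state_t *g, int button)
{
    int shown = emulate_frame(g, button);  /* real state advances normally */
    game_state_t scratch = *g;             /* "save state" (a struct copy) */

    for (int i = 0; i < LAG; i++)
        shown = emulate_frame(g, button);  /* hidden look-ahead frames     */

    *g = scratch;                          /* "load state": roll back      */
    return shown;
}

int main(void)
{
    game_state_t normal = {{0}, 0}, runahead = {{0}, 0};
    int presses[6] = { 0, 0, 1, 1, 1, 1 };   /* button goes down on frame 2 */

    for (int f = 0; f < 6; f++)
        printf("frame %d: normal shows %d, run-ahead shows %d\n", f,
               emulate_frame(&normal, presses[f]),
               run_ahead_frame(&runahead, presses[f]));
    return 0;
}
```

Run as-is, the normal path doesn’t show the reaction until two frames after the press, while the run-ahead path shows it on the same frame the press arrives, which is exactly the delay the RetroArch feature hides.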
The result is classic emulated games that can actually run with less input lag than the original hardware, as seen in this super-slow-motion Super Mario Bros. sample video. It’s a major accomplishment and the culmination of a lot of theorizing for an emulation community that has been maniacally focused on latency mitigation for many years.
This extra emulation work obviously comes with a lot of extra processing overhead, but that’s not a big concern when emulating older systems on modern, multi-core PCs (slower processors in the Raspberry Pi and other microconsole boxes might run into more problems). Save states keep the internal game logic from getting out of sync during the “run-ahead” process, while running audio output on a separate core keeps the music and sound effects from ending up wonky.
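Under the hood, libretro cores already expose the hooks this needs: retro_serialize_size(), retro_serialize(), and retro_unserialize() for in-memory save states, plus retro_run() to advance one frame. The sketch below shows roughly how a frontend could lean on them for the rollback; it’s a simplified illustration rather than RetroArch’s actual run-ahead code, and it skips the video suppression and per-frame audio handling a real frontend does (in practice the retro_* calls are also reached through function pointers loaded from the core, not linked directly).

```c
/* Sketch of frontend-side rollback using the libretro save-state calls.
 * Not RetroArch's actual run-ahead code; error handling, video hiding,
 * and audio muting for the hidden frames are omitted. */
#include <stdbool.h>
#include <stdlib.h>
#include "libretro.h"          /* declares retro_run() and the serialize calls */

static void  *state_buf = NULL;
static size_t state_cap = 0;   /* allocated bytes  */
static size_t state_len = 0;   /* bytes last saved */

/* Take an in-memory snapshot of the core's complete state. */
static bool snapshot_core(void)
{
    size_t needed = retro_serialize_size();    /* can vary from frame to frame */
    if (needed > state_cap) {
        free(state_buf);
        state_buf = malloc(needed);
        state_cap = state_buf ? needed : 0;
    }
    if (state_buf && retro_serialize(state_buf, needed)) {
        state_len = needed;
        return true;
    }
    return false;
}

/* Advance one real frame, peek `lag` hidden frames ahead, then roll back. */
static void frame_with_runahead(int lag)
{
    retro_run();                          /* the "real" frame                   */
    if (!snapshot_core())
        return;                           /* core can't serialize: run normally */

    for (int i = 0; i < lag; i++)
        retro_run();                      /* hidden look-ahead frames           */

    /* The last retro_run()'s video output is what the player actually sees. */
    retro_unserialize(state_buf, state_len);   /* restore the real game state */
}
```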
Work on “LAGFIX,” as the feature is being called by RetroArch developers, has been going on since early March and continues to be refined and debugged by the community through regular open-source builds. The community is also working to document the “natural” lag in long lists of classic games in order to calibrate the new feature for as many pieces of software as possible.
This isn’t the first time emulators have tried to improve on the raw experience of the original source hardware. Many emulators use overclocking and memory management tricks to remove the sprite flickering and/or slowdown that plagued many classic games. Others try to “upgrade” those blocky pixel graphics with automatic HD upscaling or polygonal 3D graphics, with varying levels of success.
But better-than-real-hardware input response times are arguably the biggest improvement to the experience the retro gaming community has yet seen. Now all that remains is the inevitable debate with authenticity-obsessed enthusiasts who argue that a few frames of input latency was the “intended” and therefore “better” experience.
Listing image by Tyler Loch / YouTube