Emulators for playing older games are immensely popular online, with regular arguments breaking out over which emulator is best for which game. Today we present another point of view from a gentleman who has created the Super Nintendo emulator bsnes. He wants to share his thoughts on the most important part of the emulation experience: accuracy.

It doesn't take much raw power to play Nintendo or SNES games on a modern PC; emulators could do it in the 1990s with a mere 25MHz of processing power. But emulating those old consoles accurately—well, that's another challenge entirely; accurate emulators may need up to 3GHz of power to faithfully recreate aging tech. In this piece we'll take a look at why accuracy is so important for emulators and why it's so hard to achieve.

Put simply, accuracy is the measure of how well emulation software mimics the original hardware. Apparent compatibility is the most obvious measure of accuracy—will an old game run on my new emulator?—but such a narrow view can paper over many small problems. In truth, most software runs with great tolerance to timing issues and appears to be functioning normally even if timing is off by as much as 20 percent.

So the question becomes: if we can achieve basic compatibility, why care about improving accuracy further when such improvement comes at a great cost in speed? Two reasons: correctness and preservation.

First, correctness. Let's take the case of Speedy Gonzales. This is an SNES platformer with no save functionality, and it's roughly 2-3 hours long. At first glance, it appears to run fine in any emulator. Yet once you reach stage 6-1, you can quickly spot the difference between an accurate emulator and a fast one: there is a switch, required to complete the level, where the game will deadlock if a rare hardware edge case is not emulated. One can imagine the frustration of instantly losing three hours of progress and being met with an unbeatable game. Unless the software does everything in the exact same way the hardware used to, the game remains broken.

Or consider Air Strike Patrol, where a shadow is drawn under your aircraft. This is done using mid-scanline raster effects, which are extraordinarily resource intensive to emulate. But without the raster effects, your aircraft's shadow will not show up, as you see in the screenshot below. It's easy to overlook, especially if you do not know that it is supposed to be there. But once you actually see it, you realize that it's quite helpful. Your aircraft has the ability to drop bombs, and this shadow acts as a sort of targeting system to determine where they will land, something that's slightly more difficult without this seemingly minor effect.
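To see why such effects are so expensive, here is a minimal sketch under a simplified model (the register behavior, dot count handling, and function names are invented for illustration, not taken from any real emulator): a fast renderer samples video state once per scanline, while an accurate one re-samples it at every dot, multiplying the work per frame.

```c
/* Sketch: a game flips a video register partway through a scanline (a
 * raster effect).  A line-based renderer samples the register once per
 * line and misses the change; a dot-based renderer samples it at every
 * pixel and catches it, at a much higher cost. */

#define DOTS_PER_LINE 341   /* the SNES draws 341 dots per scanline */

/* the game changes this register halfway through the line */
static int ppu_register(int dot) {
    return dot < DOTS_PER_LINE / 2 ? 0 : 1;
}

/* fast: one register sample for the whole line; mid-line changes are lost */
static void render_line_based(int out[DOTS_PER_LINE]) {
    int v = ppu_register(0);
    for (int dot = 0; dot < DOTS_PER_LINE; dot++)
        out[dot] = v;
}

/* accurate: re-sample state at every dot; far more work per frame */
static void render_dot_based(int out[DOTS_PER_LINE]) {
    for (int dot = 0; dot < DOTS_PER_LINE; dot++)
        out[dot] = ppu_register(dot);
}
```

In the line-based output the second half of the scanline never reflects the register change, which is exactly how an effect like the aircraft shadow silently disappears.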

The second issue is preservation. Take a look at Nintendo's Game & Watch hardware. These devices debuted in 1980, and by now most of the 43 million produced have failed due to age or have been destroyed. Although they are still relatively obtainable, their scarcity will only increase, as no additional units will ever be produced. This same problem extends to any hardware: once it's gone, it's gone for good. At that point, emulators are the only way to experience those old games, so they should be capable of doing so accurately.

But this accuracy comes at a serious cost. Making an emulator twice as accurate will make it roughly twice as slow; double that accuracy again and you're now four times slower. At the same time, the rewards for this accuracy diminish quickly, as most games look and feel "playable" at modest levels of emulator accuracy. (Most emulators target a "sweet spot" of around 95 percent compatibility with optimal performance.)

There's nothing wrong with less accurate but speedy emulators, and such code can run on lower-powered hardware like cell phones and handheld gaming devices. These emulators are also more suited for use on laptops where battery life is a concern. But there's something to be said for chasing accuracy, too, and it's what I've attempted to do in my own work. Here's why it matters to me.

Doing it in software

Back in the late '90s, Nesticle was easily the NES emulator of choice, with system requirements of roughly 25MHz. This performance came at a significant cost: game images were hacked to run on this emulator specifically. Fan-made translations and hacks relied on emulation quirks that rendered games unplayable both on real hardware and on other emulators, creating a sort of lock-in effect that took a long while to break. By and large, people at the time didn't care how the games originally looked and played; they just cared how they looked and played in this arbitrary, artificial environment.

These days, the most dominant emulators are Nestopia and Nintendulator, requiring 800MHz and 1.6GHz, respectively, to attain full speed. The need for speed isn't because the emulators aren't well optimized: it's because they are a far more faithful recreation of the original NES hardware in software.

Now compare these to UltraHLE, an older emulator for the far more powerful Nintendo 64, whose system requirements were a meager 350MHz Pentium II. To the casual observer, it can be quite perplexing to see Mario 64 requiring less processing power to emulate than the original Mario Bros.

My experience in emulation is in the SNES field, working on the bsnes emulator. I adored the ideals behind Nestopia, and wanted to recreate that level of accuracy for the Super Nintendo. As it turns out, the same dedication to accuracy pushed requirements up into the 2-3GHz range, depending on the title.

Nestopia caught on because its system requirements were paltry by the time it was released, but I have no doubt that releasing it in 1997 would have been disastrous. Since my emulator ultimately required more computing power than half the machines on the market could offer, I've seen first-hand the effect of high system requirements and the backlash they cause. It's easier to blame the program than to admit your computer isn't powerful enough, but the reality is that faking an entire gaming console in software is an intensive process.

Why accuracy matters

So if an emulator appears to run all games correctly, why should we improve upon it further? The simple answer: because doing so fixes problems we don't yet know about. This is particularly prominent in less popular software.

As an example, compare the spinning triforce animation from the opening of The Legend of Zelda: A Link to the Past on the ZSNES and bsnes emulators. On the former, the triforces complete their rotations far too soon, because the emulated CPU runs well over 40 percent faster than a real SNES. These are little details, but if you have an eye for accuracy, they can be maddening.

I've encountered dozens of titles with obscure quirks. Sometimes the correct, more accurate emulation actually produces a "wrong" result. Super Bonk's attract-mode demo desynchronizes, causing Bonk to get stuck near a wall on most real systems. And Starfox suffers from significant slowdown throughout the game. These are certainly not desirable attributes, but they are correct nonetheless. We wouldn't round pi down to 3 simply because irrational numbers are inconvenient, right?

I don't deny the advantages of treating classic games as something that can be improved upon: N64 emulators employ stunning high-resolution texture packs and 1080p upscaling, while SNES emulators often provide 2x anti-aliasing for Mode7 graphics and cubic-spline interpolation for audio samples. Such emulated games look and sound better. While there is nothing wrong with this, it runs contrary to the goal of writing a hardware-accurate emulator. In fact, these enhancement techniques typically make it harder even to offer accurate emulation as an option.

Another major area where accuracy is a benefit is in fan-created works from translators, ROM hackers, and homebrew developers. Few of them have the means to run code on real hardware, so they often develop their software on emulators. Unfortunately, speed-oriented emulators will often ignore hardware limitations. This is rarely a problem for a commercially developed game: mandatory testing on real hardware would quickly expose such a bug, and it would be fixed. But if you can only test on a specific emulator, such bugs tend to persist.

I can name a few examples. The fan translations for Dragon Quest 1&2, Dual Orb 2, Sailor Moon: Another Story and Ys 4 all suffered invisible text issues as a result of writing to video RAM while the video processor had it locked out for rendering the screen. Only half of these titles have subsequently been fixed.

This hardware limitation has been known since 1997, and emulating it amounts to a one-line code fix, yet the most popular emulator still does not support this behavior. As a result, translations made solely for this emulator continue to cause problems and lock-in. Who would want to use a more accurate emulator that couldn't run a large number of their favorite fan translations?
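The failure mode is easy to model. Here is a hypothetical sketch of the write-lock behavior (the struct, flag, and function names are invented for illustration): on real hardware, VRAM writes during active display are simply dropped, so text uploaded at the wrong time never appears, while a fast emulator that skips the lock accepts the write and hides the bug.

```c
#include <stdbool.h>

/* Sketch: video RAM is locked out during active display.  An accurate
 * emulator drops such writes, as the hardware does; a fast emulator
 * accepts them, masking the "invisible text" bug in fan translations. */

enum { VRAM_SIZE = 0x8000 };

struct ppu {
    unsigned short vram[VRAM_SIZE];
    bool in_vblank;             /* true only during vertical blanking */
    bool emulate_write_lock;    /* accurate emulators honor the lock */
};

static void vram_write(struct ppu *p, unsigned addr, unsigned short data) {
    if (p->emulate_write_lock && !p->in_vblank)
        return;                 /* hardware ignores the write entirely */
    p->vram[addr % VRAM_SIZE] = data;
}
```

A translation developed only on the lock-free emulator writes text during active display, sees it appear, and ships; on real hardware or an accurate emulator, that same write vanishes.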

It doesn't stop there, though. The original hardware imposed a delay between requesting multiplication or division results from the math unit and those results becoming available. Again, any commercial game ever released would respect those delays, but fan hacks led to a Zelda translation's music cutting out and to the Super Mario World chain-chomp patch going haywire.
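A minimal sketch of that delay, under a simplified model (the struct layout, function names, and the exact cycle count are illustrative, not a register-level description of the real chip): a result read before the delay elapses returns a stale value, which is precisely the kind of bug an emulator with instant math results can never surface.

```c
/* Sketch: the math unit takes several cycles to produce a result.
 * Code that reads the result registers too early gets a stale value
 * on real hardware, but a correct value on a shortcut-taking emulator. */

struct math_unit {
    unsigned a, b;
    unsigned result;
    int busy_cycles;    /* cycles remaining until the result is valid */
};

static void start_multiply(struct math_unit *m, unsigned a, unsigned b) {
    m->a = a;
    m->b = b;
    m->busy_cycles = 8;     /* illustrative delay before the result settles */
    m->result = 0;          /* stale value readable in the meantime */
}

/* advance the emulated hardware by one cycle */
static void step(struct math_unit *m) {
    if (m->busy_cycles > 0 && --m->busy_cycles == 0)
        m->result = m->a * m->b;
}

static unsigned read_result(const struct math_unit *m) {
    return m->result;       /* reading early yields the wrong value */
}
```

An emulator that computes the product instantly makes the early read "work," so a fan hack that forgets to wait runs fine there and breaks everywhere else.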

Or an emulator might ignore the fact that the sound processor writes echo samples into shared RAM. Not a problem until you wind up with hacks that use wildly unrealistic echo buffer sizes, which in turn end up overwriting the entire audio program in memory, crashing and burning in spectacular fashion. This one issue single-handedly renders dozens of Super Mario World fan-made levels unplayable.
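The mechanics can be sketched in a few lines, assuming an invented memory layout (the addresses and sizes below are hypothetical, not the map of any real game): the echo buffer is a ring inside the same 64KB of sound RAM as the audio program, so an oversized buffer wraps around and tramples the program.

```c
/* Sketch: the sound CPU's echo buffer shares its 64KB RAM with the audio
 * program.  A sanely sized buffer stays clear of the program; a wildly
 * oversized one wraps through it and destroys it. */

enum { APU_RAM_SIZE = 0x10000 };

static unsigned char apu_ram[APU_RAM_SIZE];

/* write one echo sample; the echo region is a ring inside shared RAM */
static void echo_write(unsigned base, unsigned size,
                       unsigned offset, unsigned char sample) {
    apu_ram[(base + (offset % size)) % APU_RAM_SIZE] = sample;
}
```

An emulator that keeps echo samples in a private side buffer never writes into shared RAM at all, so these self-destructive hacks play perfectly there and crash on anything faithful to the hardware.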