The "allowing texture samples from 0x0" fix is really an interesting read. Makes perfect sense if you think about it - it should always work just fine on the console. Terribly poor readability for the code though.
On this platform, NULL is actually a valid memory location only from the GPU's point of view, not the CPU's. And yeah, this is very likely to simply be an uncaught bug.
Traditional game console architectures don't usually have much memory protection for the GPU - in fact, both the 360 and the 3DS were at various points in time hacked by subverting the GPU into writing to global memory.
This is going back a few years now, but I'm pretty sure NULL was valid to read from the Wii's CPU (though not write, I don't think).
It was where the product code was stored, as I recall - so every now and again you'd find some junk in some random string, starting with your title's product code, and it would be time to track down another missing NULL check...
Technically, there's no requirement (or guarantee) that NULL is address 0; there have been multiple architectures with a non-zero NULL. And every architecture out there allows accessing physical address 0 (the OS may map logical address 0 to be inaccessible). So they could well have knowingly decided to load from address 0.
This is the quote any emulator dev should pin above their desk:
> <the emulator> should never, ever trust the software it is running to be sane.
Console games always do weird things, and the only way to check is on real hardware. That's why writing a good emulator is hard: you must have a way to quickly test a particular behavior on hardware.
We like to romanticize our favorite games as being written by people as passionate as we are about said games.
But the reality is that most (though certainly not all!!) of them are primarily there for a paycheck. They're expected to work on new systems with poor debugging capabilities, and get games out as fast as possible. So naturally, they end up having lots of gross hacks and erroneous code just to get things working and ship the game.
My favorite bug so far was a game that read from a write-only register in a loop until a certain bit was set. That bit could never have been set under normal conditions, so the game was freezing in emulators. It turns out the game had a DMA channel running during each scanline, and after a few million iterations of the loop, the stars would align: the DMA channel would fetch a byte from memory that happened to have the required bit set, leaving that value floating on the bus, and the CPU's read of the undefined memory location would follow immediately. Since nothing responded to that address, the read returned the stale DMA byte, and the loop exited.
Even more fun, to test on real hardware, I was using a copier device. But said copier device pulled the data bus lines low whenever an unmapped region was accessed, which effectively broke the game on the copier as well. It took me two weeks of full-time work to track down that bug and exhaust all other possibilities for what was happening.
I can't begin to imagine why that loop was in the shipped game, nor why it only ever triggered when pressing one particular button out of many (one required to finish the level) in stage 6-2 of the game.
I've never heard of this project - very interesting. I wonder if this will start to happen more frequently in the near future: software simply becoming too old to realistically maintain and keep updating to run on current operating systems.
I'm also always amazed at the hardware reqs for emulators. They suggest at least an i5 or i7 and recommend a dedicated gpu! For a gamecube! Released 15 years ago!
I've used it, but not recently. It was always a novelty to connect an actual Wii controller with bluetooth, set up some LEDs for motion tracking, and play Wii on my laptop.
As for performance: using software to implement behavior equivalent to specific hardware is almost inherently inefficient. JIT-style processing of the opcodes helps, but you're still going to spend multiple operations on the host CPU for each one you execute from the guest code. On top of that, you have to emulate device synchronization that was originally done in hardware and has no direct equivalent on the host.
Emulation has been used for mainframe line-of-business type stuff for quite a while, perhaps the longest. But games get much more attention, since emulating the system is an extremely popular way to preserve access to the experience. That said, not all systems are created equal, and high-accuracy treatments of old consoles (e.g. bsnes/Higan) can also be heavyweight.
> I'm also always amazed at the hardware reqs for emulators. They suggest at least an i5 or i7 and recommend a dedicated gpu! For a gamecube! Released 15 years ago!
But it works well enough for most games even with an integrated Intel GPU, as long as you don't try to improve on the base game too much (resolution, filtering, etc.).
Emulating something as humble as an NES authentically takes way more computing power than that. We're talking about trying to faithfully recreate the characteristics of a digital computer with multiple CPUs that are highly sensitive to timing delays as well as a good amount of analog circuitry on top of that.
The emulators that exist today don't just have to emulate a single CPU (that would be trivial) but all the surrounding hardware as well.
Here's an example of someone trying to emulate a 6502-based computer (BBC Micro) and the problems they encountered when trying to run Elite, just one game: https://www.youtube.com/watch?v=RiE5pTisEd8
Older games used any optimization they could get, and they'd often exploit quirks in the hardware if it made things work better. Elite flips video modes partway through the rendering of the screen, since it was made in the era of CRTs. Trying to reproduce that effect in a framebuffer-type system is hell.
So most of the emulators out there have to fake a lot of things in order to get software and games to run properly, or at all.
> So most of the emulators out there have to fake a lot of things in order to get software and games to run properly, or at all.
Not really. Emulators for recent systems don't try to emulate the hardware anymore; they emulate the outcome of the system calls, most of the time converting them to OpenGL (high-level emulation, HLE).
That's actually necessary for something like the PSP. Sony made multiple revisions of the hardware over the course of the PSP's life, and therefore relied on such a system anyway to keep the output identical despite the hardware changes.
> I'm also always amazed at the hardware reqs for emulators. They suggest at least an i5 or i7 and recommend a dedicated gpu! For a gamecube! Released 15 years ago!
And that's still nowhere near the requirements for cycle-accurate emulation. I'm not sure you can even run a cycle-accurate N64 emulator at full speed on a standard machine.
This is one of the few emulators that will run _well_ below the suggested specifications. I've played it on much older hardware with a very low-end GPU, and it runs quite well (depending on the game, of course).
It supports the Wii as well - not that that's a new console or anything particularly demanding. Software emulation is always pretty hard; I remember playing N64 games on a PowerPC iMac and having them chug.
Not surprised at the system requirements: it has to emulate a 486 MHz PowerPC CPU. I have no idea why it would require a discrete GPU, though. It only needs to output SD-quality video. Maybe shader performance is a limiting factor in the GPU emulation?
That requirement is a bit outdated. At native (not higher than a real gamecube) resolution, I find that the vast majority of games play just fine on Intel HD 4000 integrated graphics, and I imagine newer chipsets perform even better. When Dolphin was first released though, Intel graphics were pretty much garbage, so the discrete graphics requirement made sense.
> Not surprised at the system requirements: it has to emulate a 486 MHz PowerPC CPU.
Yeah - to anyone who has tried to optimize a CPU interpreter before, dynamic recompilers are nothing short of magical.
You'd have to be quite talented to run even a 50 MIPS core from a sufficiently awkward CPU at 100% speed with an interpreter on modern processors - let alone all the other parts of the emulator, and the synchronization between components.
Of course, dynamic recompilers sacrifice huge amounts of synchronization precision in return for that raw speed. But it's very much a necessary evil -- otherwise we wouldn't have playable modern emulators at all. The developers of dynarecs deserve extra special praise for being willing to give up both idealism and portability when writing them.
If you sacrifice synchronization, you can also hit pretty good performance with an interpreter. Until fairly recently, CEMU was interpreter-only, and they managed to hit 100% speed in a few games. The trick: they were running 50K instructions at a time without checking external interrupts or scheduling other components. At least, that's what I can tell from reverse engineering...
The graphics are output at arbitrary resolutions, which can be demanding. Also IIRC the GameCube GPU had an architecture different from mainstream PC hardware of the time.
Twilight Princess works fine on a new-ish i5 with the Intel graphics. Smash Brothers Brawl works fine too. So the older Gamecube stuff is alright on modern computers.
For those interested: the next-generation Nintendo emulator, Cemu (for Wii U emulation), is also making rapid progress, with many games fully playable at near-native speeds.
I think the claim of "using stolen code" is a bit strong. Do you have any evidence of that?
I'm completely willing to believe that they asm2c'd quite a lot of system libraries instead of properly reimplementing them. And I wouldn't be surprised if that's a prime reason why their software is not open source. But that's a step away from "stolen code" in my opinion.