God Ginrai said:
I'm curious, what are you testing that may possibly speed up things like mupen?
-God Ginrai
I have been reverse engineering the GLSL shader compiler, successfully removing redundant cycles.
In the future I hope to have my own assembler, which would minimize shader generation and compile time, make it possible to work with more exact values, and possibly even give access to features which the GLSL ES spec doesn't even allow.
I also do some GPGPU (General Purpose processing on the GPU) in the PSP emulator and I think this could save some time in the N64 emulator too.
Unfortunately the SGX driver doesn't work in debug mode, so it's hard to benchmark anything, especially with the output corrupting (and the performance going down A LOT) whenever an X11 window is moved out of the display region or rescaled. But I think that a lot of time is currently wasted on the CPU anyway - lots of small draw calls which slow the whole thing down. Since I have similar problems on the PSP, I'm looking into a higher-level fix: a wrapper around the GLES calls that increases performance by removing redundant state changes and batching draw calls. Additionally, texture conversions and similar things could be moved to the SGX. While it might be stressed already, I doubt that the gles2n64 shaders stress the SGX USSE.
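To give an idea of what such a wrapper would do, here is a simplified sketch (nothing from my actual code) of how redundant state changes can be filtered out before they ever reach the driver:

```c
#include <GLES2/gl2.h>

/* Simplified sketch: cache the last texture bound to GL_TEXTURE_2D and drop
 * glBindTexture calls that wouldn't change anything. A real wrapper tracks a
 * lot more state (program, blend mode, enabled vertex attribs, ...) the same way. */
static GLuint bound_texture_2d = 0;

static void wrap_glBindTexture(GLenum target, GLuint texture)
{
    if (target == GL_TEXTURE_2D && texture == bound_texture_2d)
        return;                          /* redundant - skip the driver call */

    glBindTexture(target, texture);
    if (target == GL_TEXTURE_2D)
        bound_texture_2d = texture;
}
```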
MonkeyChops said:
very interesting, thanks for the update. I'm not holding my breath for playable commercial games but this is indeed a very interesting project. Is there even a PSP emu for PCs right now? :huh: Nice work! So I have to ask, how does Ridge Racer run in the current state of your PSP emu, frame rate wise?
The Pandora emu is particularly interesting. How does it work? I see it doesn't do SGX, does it emulate the rest of the hardware? Big time noob here so sorry if these are stupid questions. Does it basically just emulate an ARM processor and then you just run Angstrom on it? Or does it have to emulate our specific processor + other hardware? Did you release that emu for others to take advantage of?
btw, you da man!
Since I'm only emulating Ridge Racer's graphics calls I can't say anything about performance at this stage. I can only guess that it would be horribly slow at the moment, even if the CPU emulation were perfect. The problem with most PSP games is that they submit TINY draw calls with, let's say, 3 triangles - and that about 200 times in a row.
This kills rendering performance, and it even takes about a minute to render the frame on my GTX265. (Note that I'm forced to initialize all geometry, shaders and textures for the frame, as the cache from a previous frame doesn't exist when rendering a single frame - and since a dump can easily grow to a few hundred MB, I only have single frames.)
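For the batching part, the idea is simply to collect those tiny triangle lists as long as the render state stays the same and push them to the GPU with a single call. Again just a rough sketch, not what any of the emulators actually does:

```c
#include <GLES2/gl2.h>
#include <string.h>

/* Rough batching sketch: gather small triangle lists that share the same
 * state into one big vertex array and submit them with a single draw call. */
#define MAX_BATCH_VERTS 4096

static GLfloat batch[MAX_BATCH_VERTS * 3];    /* x, y, z per vertex */
static GLsizei batch_count = 0;

static void batch_flush(void)
{
    if (batch_count == 0)
        return;
    /* assumes attribute 0 is the position attribute of the current program */
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, batch);
    glEnableVertexAttribArray(0);
    glDrawArrays(GL_TRIANGLES, 0, batch_count);   /* one call instead of ~200 */
    batch_count = 0;
}

static void batch_add(const GLfloat *verts, GLsizei vert_count)
{
    /* assumes vert_count is small (a handful of triangles per packet) */
    if (batch_count + vert_count > MAX_BATCH_VERTS)
        batch_flush();                            /* batch full - submit it */
    memcpy(batch + batch_count * 3, verts, (size_t)vert_count * 3 * sizeof(GLfloat));
    batch_count += vert_count;
}
```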
The Pandora emulator has its own topic: http://www.gp32x.de/board/index.php?app=forums&module=forums&section=findpost&pid=828915 and it will end up in this SDK: http://www.gp32x.de/board/index.php?app=forums&module=forums&section=findpost&pid=867823 (I definitely need more coders to help with the SDK though. I'll publish more details on that when I have enough time to organize these things.)
But yeah - the SGX is missing because the emulator doesn't emulate hardware, but software. It does this on a fairly high level - it runs usermode applications (the CPU is emulated by qemu).
So you can use GLES2, BUT the calls get passed from the emulated ARM system to the host system. The host usually won't have a native GLES2 implementation, just some sort of GLES2 emulator. Normally this is the one provided by PowerVR, the maker of the SGX. Their emulator has some problems though: first of all, it's 32-bit only, and second, it's buggy.
Besides these two major problems, it gets even worse when you consider that it doesn't talk to the graphics hardware directly but goes through the host's GL2 implementation.
In my case that is provided by the NVIDIA drivers.
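Very roughly, the pass-through works like in the following sketch: the GLES2 functions the emulated program calls don't render anything themselves, they just pack up their arguments so the host side can replay the call on its own implementation. This is heavily simplified - the call ID, the packet layout and send_to_host are made up for illustration, the real transport is more involved:

```c
#include <stdint.h>
#include <string.h>

/* Heavily simplified: one ID per forwarded GL call; send_to_host() is a
 * made-up placeholder for the actual guest-to-host transport. */
enum { CALL_glClearColor = 1 };

static void send_to_host(const void *packet, uint32_t size)
{
    (void)packet; (void)size;    /* placeholder - hand the packet to the host */
}

void glClearColor(float r, float g, float b, float a)
{
    uint8_t packet[4 + 4 * sizeof(float)];
    uint32_t id = CALL_glClearColor;

    memcpy(packet,      &id, 4);
    memcpy(packet + 4,  &r,  4);
    memcpy(packet + 8,  &g,  4);
    memcpy(packet + 12, &b,  4);
    memcpy(packet + 16, &a,  4);
    send_to_host(packet, sizeof(packet));   /* host unpacks & calls the real glClearColor */
}
```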
Since there are many complex things in shaders, such as driver-specific shader extensions, the PVR wrapper / emulator has problems converting GLSL ES shaders to GLSL shaders (so you have to add strange code to your GLES2 shaders to work around problems in the PVR wrapper / emulator - and remove it again if you want to run your stuff on native hardware).
Additionally, the NVIDIA hardware has a different precision for float values (actually, the PVR wrapper / emulator doesn't even support different precisions, while the real GLES2 implementation on the SGX supports the three precisions defined in the GLES2 spec).
So when doing general-purpose processing on the GPU you might encounter overflows or underflows whenever you try to do bitwise work or similar things - hence code might work on the real hardware but not in the PVR wrapper / emulator (nor in my Pandora emulator, which is based on it).
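Just to illustrate the kind of mismatch (nothing from the actual emulators): a shader that splits a 16-bit value into two bytes with float arithmetic gives different results depending on how much float precision it really gets:

```c
/* Illustrative only: on the PC wrapper every float ends up as a full 32-bit
 * IEEE float, while on the SGX mediump only guarantees a ~10-bit mantissa and
 * a range of roughly +/-2^14 (GLSL ES 1.00 spec). With value = 65535.0 the
 * two sides therefore compute different "bytes" - a shader tested on one can
 * break on the other. */
static const char *frag_src =
    "precision mediump float;\n"
    "uniform float value;                 /* e.g. 65535.0 */\n"
    "void main() {\n"
    "    float hi = floor(value / 256.0); /* high byte */\n"
    "    float lo = value - hi * 256.0;   /* low byte  */\n"
    "    gl_FragColor = vec4(lo / 255.0, hi / 255.0, 0.0, 1.0);\n"
    "}\n";
```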