PS4 OpenGL ES2 -- Hardware Accelerated Graphics Rendering WIP

So a while back I found a reference to OpenGL ES (GLES) in one of the PS4 libraries, libScePsm.sprx, which piqued my interest. You see, OpenGL is a widely adopted 2D & 3D rendering solution, and having access to it on the PS4 would make porting over emulators and games, and even building nice UIs for games/homebrew, much easier. Not only that, it also means we'd have access to hardware accelerated graphics rendering.

This blog won't be all that technical; it will just walk you through our attempt to get GLES to work. While progress has been made, we still haven't managed to get OpenGL ES working, well, not to a useful state at least.

Quick thanks to @masterzorag for the help with the reverse engineering; since he had the hackable console, he did all the testing.

So straight to the point: I spent the next day trying to find the revision used on the PS4 so I could compile a sample to test. It turns out it was OpenGL ES 2.0, and that was the initial result.

(What does this mean? Well, it means that OpenGL is accessible and we can make OpenGL calls successfully.)

However, the picture above doesn't tell you half the story. You see, that code was run through the PS4 browser as a payload, and the browser already uses OpenGL (my best guess is for video rendering, or perhaps even rendering the page itself), so the calls executed without any issue. Later, when we tried to use the Playroom trick, the call would immediately cause a crash.

To get Piglet to work in the Playroom, we had to set up Piglet (Piglet is the name Sony gave to their OpenGL renderer):

scePigletSetConfigurationVSH(someUnknownData)

Now, it took some time to figure this one out. You see, the Piglet configuration call takes a whopping 1024-bit, or 128-byte, structure (on new FW that's now 133 bytes). Figuring this out was a bit of a PITA; in the end, we ended up getting the values used by the browser, and they worked, kinda, so we were happy to continue.

    //FW 1.76
    struct pigconfig
    {
        int32_t unk1[8]; //
        int32_t unk2[8]; //
        int32_t unk3[8]; // DISPLAY
        int64_t unk4[4]; //
    };

    struct pigconfig config = {0};

    config.unk1[0] = 128;       // ? struct size
    config.unk1[1] = 0x28;
    config.unk1[6] = 0x800000;

    config.unk2[2] = 0xe400000;

    config.unk3[0] = 1920;      // display width
    config.unk3[1] = 1080;      // display height
    config.unk3[5] = 0x20000;
    config.unk3[6] = 0x4000;
    config.unk3[7] = 2;

After that, we ran into an issue creating a window to draw into. The EGL call is

EGLSurface eglCreateWindowSurface(EGLDisplay display,
    EGLConfig config,
    NativeWindowType native_window,
    EGLint const * attrib_list);

The issue here is that we didn't know how to get the NativeWindowType, or its value, on the PS4 in order to pass it and create a valid window, so for motivation we looked at how it's done by other apps on the PS4.

(Screenshot of decompiled code: eglCreateWindowSurface being called with a reference to a static value.)

This initially caught us off guard: why would NativeWindowType be a reference to a static value, 8246337208320LL, when this is usually a window/surface handle ID?
Well, when you pay close attention you'd realise that 8246337208320LL = 0x78000000000LL, and on closer inspection that 0x78000000000LL as a 64-bit value can be broken down into two 32-bit values: 0x00000000 and 0x00000780... where 0x00000780 = 1920.
So the NativeWindowType is the screen width? Not quite. You see, the & in &qword_2999D4 means we're not passing the value but its location; to be more precise, it's passing the location of the start of NativeWindowType, and based on that we concluded this is what the PS4 NativeWindowType looks like:

typedef struct native_window_t
{
    uint32_t unk0;   // 0
    uint32_t width;  // 1920
    uint32_t height; // 1080
    uint32_t unk1;   // 0
} native_window_t;

And guess what? It works!! This was somewhat expected but odd; you see, we're only passing measurements here and not an ID to a window, which is usually how it works on PC. But the PS4 doesn't really deal with the concept of windows, so it was happy just getting the resolution.

So what's next? If you're guessing something else refusing to work, you're right!! :P

Creating the context is failing. Well, what is a context and why is it failing?
The context is probably the most important part of OpenGL; to summarise its function, it's the glue that connects the dots.

At the time of writing we still haven't figured out what's causing this to fail; the PS4 has 4 different EGL configs and they all fail the same way.
My best guess as to why it's failing is a lack of memory allocation, which I'll briefly touch on below.

So, while stuck with no progress in sight, @masterzorag had this to say:

Btw we see scevideocoresetinitialize
We miss that too
Can be the videoout <-> egl missing piece I was referring some day ago

So we went after sceVideoCoreSetInitializeInfo(int &val) to see what it does; it's a simple call that takes a reference to a value, and that's about it. Unfortunately it didn't solve the context issue, but it did get us further ahead, in the browser at least, since the browser already has a context which we can just reuse.
(Screenshot: the result of the glClearColor() test in the browser.)
So, what's significant about this picture? It shows that glClearColor(0.0f, 0.0f, 5.0f, 1.0f); works, an OpenGL command actually works!! However, until we can get a context working outside the browser, this won't be of much use for homebrew. As I mentioned, my best guess is that context creation is failing due to a lack of memory allocation; we've found references to the calls int scePigletAllocateVideoMemory(...) and int scePigletAllocateSystemMemory(...), which we tried to play with briefly. That didn't get us anywhere, but they're worth revisiting in the future, as that's where I believe the issue is.

So... is that it?
Nearly. After this, you'd still need to get shaders working. Shaders are the code that runs on the GPU, and it seems that for some reason Sony decided to strip away the runtime shader compiler:

Shader source is not a precompiled PSSL binary and no runtime shader compiler available

and they're forcing us to use their proprietary shader compiler that comes with their official SDK.
Initially this was a bridge we decided not to worry about until we needed to cross it, and I also knew I could count on @AlexAltea to write us an open source compiler once we got this far, and he nearly did. However, I then found this function shCompile, which seemed odd at first, so I did a quick Google search, and boy was I happy I did: it confirmed my suspicion that shCompile = Shader Compile.
Yup, while Sony had removed the default runtime compiler, they didn't remove this one, which was part of a GLSLANG implementation.

I found that the Adobe header for ANGLE was the closest to the one found on the PS4: https://github.com/adobe/angle/blob/master/include/GLSLANG/ShaderLang.h
Google and Microsoft also have implementations of ANGLE, and the Khronos Group has a shader compiler (GLSLANG), but the closest matching header was Adobe's, probably because it hasn't been updated in 6 years XD

What's ANGLE and what's GLSLANG? ANGLE is a reimplementation of OpenGL using other graphics rendering backends; in this case, I believe Sony reimplemented OpenGL ES on top of their own API, GNM. (Now, why Sony didn't natively implement OpenGL is beyond me; I'd imagine it would have been easier, and they could have had the desktop version of OpenGL instead of ES, which stands for Embedded Systems, a more limited version of OpenGL.) GLSLANG is a shader compiler/generator that's also used with ANGLE; you can also have a standalone version of it.

With all of that said, this might not be an ANGLE implementation but merely just GLSLANG.

Anyway, so do we have shaders working? No :(
But this is my attempt at it.
I defined a new glShaderSource() that would pass all the needed data to GLSLANG to compile the shader:
https://bitbucket.org/masterzorag/gles_test/src/429de2637289903f26bbc661454f4a607729c6c7/source/test2.c
However, it was failing during the shCompile() call, and I'm not really sure why. It's a pain trying to debug things on the PS4, even more so doing it through someone else, so this is where I called it quits (the code has been updated with debug code since then, but I haven't tested it yet).

If you're wondering why I would expect shCompile() to exist when the default runtime shader compiler was removed: well, because I found references to it being used in the WebKit code, and tracing that code I found arguments suggesting that non-precompiled (plain source) shader code was being used.

I really hoped I would have gotten it to work before publishing this article, but we've been working on this for some time, albeit with much slower progress than I had hoped, and at this stage I'm spent. As such, this will probably be it from me, but I do hope that someone will be able to pick up where we left off and get OpenGL working.

But I'd advise anyone who continues with this to have UART enabled, because as it stands all the useful errors are probably printed there, and we're only guessing why things are failing and trying to fix them. That works sometimes, but most of the time is just spent running around in circles.

Side notes:

  • If you looked at my ps4sdk branch you might have noticed that the include paths are wrong, i.e. I used #include <GLES/egl.h> and #include <GLES/gl.h> when it should have been #include <EGL/egl.h> and #include <GLES2/gl2.h>. I was initially excited to start testing, so I threw everything together, and then I was just too lazy to update the SDK and the samples, but they're easy enough fixes; anyone should manage just fine.
  • The OpenGL implementation has been updated in the 4.01 diff, and there have been references to a GLES shader compiler being added to the SDK tools. (Sony might be considering making OpenGL a supported rendering solution? Or maybe it's being used in dynamic themes.)
  • OpenGL is used by the Mono implementation for rendering, i.e. the UI is rendered using OpenGL, so it might be worthwhile looking at the C# code as well.

Our initial testing was done on 1.76; however, the linked SDK has been updated to work with 4.05, since this is now the new standard. Since we don't make any kernel calls, this should work even on newer FW without an issue; however, no tests have been run on anything newer than 1.76 as of this post.

ps4sdk with GLES/GLSLANG headers: https://github.com/Zer0xFF/ps4sdk/tree/gles2
Playroom/kernel hook tests: https://bitbucket.org/masterzorag/eboot_plugin-egl_test
browser payload tests: https://bitbucket.org/masterzorag/gles_test

Final thoughts: I don't know how much more work is needed to get it all working; as it stands, though, context creation and shaders are the only visible issues.

Update 1:
A potential issue with our current Piglet config is that we're using the config we retrieved from an active Piglet session, which means the values could have been set/changed after the configuration step. Below is an example of a config that's actually getting passed to Piglet, which comes from libSceOrbisCompatForVideoService.sprx (FW 1.01-1.76):

		union __attribute__((aligned(32))) m256i
		{
			int8_t m256i_i8[32];
			int16_t m256i_i16[16];
			int32_t m256i_i32[8];
			int64_t m256i_i64[4];
			uint8_t m256i_u8[32];
			uint16_t m256i_u16[16];
			uint32_t m256i_u32[8];
			uint64_t m256i_u64[4];
		};

		union m256i config[4] = {0};
		union m256i* v20 = &config[0];
		union m256i* v21 = &config[1];
		union m256i* v22 = &config[2];
		v20->m256i_i64[3] = 0x2000000LL;
		v21->m256i_i64[1] = 0xA000000LL;
		*(int64_t *)((char *)&v22->m256i_i64[2] + 4) = 0x20000000200000LL;
        
		//1.01-1.76
		v20->m256i_i64[0] = 0x800000080LL;
		//4.05-4.55
		v20->m256i_i64[0] = 0x800000088LL;

		v20->m256i_i32[2] = 0;
		*(int64_t *)v22->m256i_i8 = 0x43800000780uLL;
		v22->m256i_i32[7] = 2;

Besides v20->m256i_i64[0], everything seems to have stayed the same since 1.01.
Note: the 0x80 => 0x88 value refers to the size to be copied, yet I didn't notice any change in the structure size.
That should be usable as is, and config is what you'd pass to scePigletSetConfigurationVSH(&config).