OpenGL 4.1 Has Full Support For ES


I guess this is probably a dumb question, but PowerVR SGX supports OpenGL 4.1 then, right?

Does this mean that current games and apps that use OpenGL will be easier to port? Or only stuff that uses OpenGL 4.1?
 
quadomatic said:
I guess this is probably a dumb question, but PowerVR SGX supports OpenGL 4.1 then, right?

Does this mean that current games and apps that use OpenGL will be easier to port? Or only stuff that uses OpenGL 4.1?
No, you have this in reverse. PowerVR SGX supports ES (1.1 and 2.x).
PowerVR has special SDKs you can install on Linux or Windows that let you compile and run OpenGL ES programs...

OpenGL 4.1 means that you won't have to install this SDK, as the ES subset will now be fully supported. This is good, because getting the SDK running on 64-bit Linux was a little tricky...

It also means you can start writing GL apps using only ES code, and they will be supported on the desktop too (provided people are up to date with their drivers).
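For example, something like this should compile unchanged against an ES 2.0 driver on the Pandora and against an up-to-date 4.1 desktop driver (a minimal sketch; window/context setup left out):

[code]
/* The same GLSL ES 1.00 shader source is accepted by an ES 2.0 driver
 * and, via the ES compatibility in GL 4.1, by a desktop driver too. */
#include <GLES2/gl2.h>   /* on the desktop you'd use a GL loader instead */

static const char *vs_src =
    "#version 100\n"
    "attribute vec4 a_position;\n"
    "void main() { gl_Position = a_position; }\n";

GLuint compile_vs(void)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vs_src, NULL);
    glCompileShader(vs);

    GLint ok = 0;
    glGetShaderiv(vs, GL_COMPILE_STATUS, &ok);
    return ok ? vs : 0;   /* 0 on compile failure */
}
[/code]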
 
I never got why we can't support the full OpenGL anyway.

Can someone explain it to me?
 
We can, it's just that no one has written the drivers, at least not for OMAP3. It's a lot more work writing, say, an OGL 2 driver than an ES 2 one.
 
Pardon me for pointing out something which might be obvious, but wouldn't writing that driver be worth pursuing? The benefit would be huge! Think of all the games which could be ported.
 
Worth pursuing for us? You can't write a driver without a thorough low-level understanding of how the hardware works. Maciek Urbanski was reverse engineering it, but he didn't publish his findings. It's an incredible undertaking.
 
This is great news, now I just have to buy a totally new computer and I can use fixed-point math in regular OpenGL!
>:/

I don't want to type out the whole "Lulzfish tries to OpenGL but fucks it all up Saga", but it goes sort of like this:

The Pandora has weak floating-point math from what I have heard
Luckily OpenGL ES supports fixed-point math
I don't even really like floating-point so I wrote a fixed-point class and made a platformer tech demo (not a game really) based on it.
I was totally about to make some shit happen in OpenGL, but

GLSL represents vertices as floating-point vectors.

I have no idea how to reconcile this. I don't want to write a bunch of fixed-point stuff and then suddenly have my shaders fuck it up.
I don't want to write a bunch of floating-point stuff that runs slowly on the Pandora.

I'm just tired of writing shit like fixed-point classes and matrix classes* that should already exist.

* Some video on YouTube said I should write my own matrix class for OpenGL ES, because GL matrices are fixed-point or something. I don't even know what's going on.
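(The class itself is the easy part, by the way. Boiled down, mine is basically this; simplified and C-ified, not my exact code:)

[code]
#include <stdint.h>

typedef int32_t fx;                 /* 16.16 fixed point */
#define FX_ONE 0x10000

static fx fx_from_int(int n) { return (fx)(n * FX_ONE); }
static fx fx_mul(fx a, fx b) { return (fx)(((int64_t)a * b) >> 16); }
static fx fx_div(fx a, fx b) { return (fx)((((int64_t)a) << 16) / b); }
[/code]

ES will even take GL_FIXED vertex arrays directly, so feeding it the data isn't the problem; it's the shader side that's all float.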

I have lost all faith in OpenGL. It reminds me of X11 now. It's so fucked up, and I want somebody to bury it under something like Qt so I don't have to fuck with it anymore.

tl;dr: This doesn't help me, it just reminds me of agony.

@SomeGuy99: We can't write drivers because the hardware is undocumented; all the popular video chip guys like money and IP.
I think someone made a compatibility library that bridged between normal OpenGL and ES, but even that wouldn't solve the problems I had in the spoiler.

Edit: Thanks for the responses, I'll look into it again some time. Right now I don't have anything set up for OpenGL, really... I never wrote that animation tool I wanted.
 
Exophase said:
Worth pursuing for us? You can't write a driver without a thorough low-level understanding of how the hardware works. Maciek Urbanski was reverse engineering it, but he didn't publish his findings. It's an incredible undertaking.

What about the wrappers that are being used? Can they be applied to OpenGL games?
 
Those are compile-time wrappers, but the same principle should work I think. From a performance perspective it's not ideal.
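Something along these lines; hypothetical macros for illustration, not any particular wrapper library:

[code]
/* desktop-only calls redirected to their ES equivalents at build time */
#ifdef USE_GLES
#  include <GLES/gl.h>
#  define glOrtho(l, r, b, t, n, f) \
       glOrthof((GLfloat)(l), (GLfloat)(r), (GLfloat)(b), \
                (GLfloat)(t), (GLfloat)(n), (GLfloat)(f))
#  define glClearDepth(d) glClearDepthf((GLclampf)(d))
#else
#  include <GL/gl.h>
#endif
[/code]

Anything with no ES counterpart (immediate mode, display lists) can't be papered over this cheaply, which is where the real porting pain is.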
 
lulzfish said:
I have lost all faith in OpenGL. It reminds of X11 now. It's so fucked up, and I want somebody to bury it under something like Qt so I don't have to fuck with it anymore.
An abstraction layer for OpenGL in Haskell or OCaml... *drools just thinking about its potential*

*Goes to Hackage, Hoogle and Hayoo entering "OpenGL" as the search query*
 
Haha, yeah, a pure functional language and a graphics API that's built entirely around implicit state, sounds like the perfect coupling.
 
Exophase said:
Haha, yeah, a pure functional language and a graphics API that's built entirely around implicit state, sounds like the perfect coupling.
Creating a scene-graph-based API that gets sent to an imperative-style isolated renderer would be the way to go, of course. It'd probably even be possible to wrap [acronym='OpenSceneGraph']OSG[/acronym] and use it for the back end, because it's already using the isolated-state model that functional languages need (and that all code should be using to have a hope of running well on resource-limited platforms as well as scaling well on multi-core systems).

EDIT: Why would it be worth it? Because the abstraction that would be achieved would be incredible. As you all know, I like the idea of the compiler doing the platform-specific optimization work for me. Why should a programmer have to do bit-twiddling for individual target platforms? The programmer should be writing completely portable code (without #defines, inline ASM, or platform-specific makefiles), and it should be the task of the build tools to do the platform-specific work. Being able to use languages that let you write (mathematically proven to be) bug-free code is an advantage too, of course...
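To illustrate the shape of it on the C side (all types invented for illustration, not any real API):

[code]
#include <stddef.h>

/* the application builds nothing but plain data... */
typedef struct {
    float    transform[16];   /* column-major model matrix */
    unsigned mesh;            /* handle into renderer-owned resources */
} scene_node;

typedef struct {
    const scene_node *nodes;
    size_t            count;
} scene;

/* ...and this is the only place that ever touches the GL context,
 * so all the implicit state stays isolated in one imperative module */
void render_scene(const scene *s);
[/code]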
 
Unless I read the article wrong, this does nothing for us. The OGL 4.1 spec is backwards compatible with an OGL|ES 3.x spec which our hardware doesn't support.
 
greendots said:
Unless I read the article wrong, this does nothing for us. The OGL 4.1 spec is backwards compatible with an OGL|ES 3.x spec which our hardware doesn't support.
Slashdot always reads things wrong.
Well, so I'll have OGL 2 on a desktop and OGLES 2 on a Pandora.
Just as expected.

I wonder if parallel-universe-lulzfish has to put up with this in DirectX.
 
lulzfish said:
greendots said:
Unless I read the article wrong, this does nothing for us. The OGL 4.1 spec is backwards compatible with an OGL|ES 3.x spec which our hardware doesn't support.
Slashdot always reads things wrong.
Well, so I'll have OGL 2 on a desktop and OGLES 2 on a Pandora.
Just as expected.

I wonder if parallel-universe-lulzfish has to put up with this in DirectX.
4.1 probably brings EGL to the desktop environment too, meaning if you choose to, you can write your games purely in GLES and run them straight on your desktop. It benefits the developer more than the end user.
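Something like this would then be the same source on both machines (a sketch; it assumes the desktop driver actually exposes EGL, which is not a given):

[code]
#include <EGL/egl.h>

EGLDisplay init_display(void)
{
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(dpy, NULL, NULL);

    static const EGLint attribs[] = {
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint n = 0;
    eglChooseConfig(dpy, attribs, &cfg, 1, &n);
    return dpy;   /* surface and context creation omitted */
}
[/code]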
 
Yeah, this is all well and good .. but .. ermm .. yawn. Wake me up when there are OpenGL 4.1 drivers/library packages actually *shipping* ..
 
Exophase said:
....<snip> Maciek Urbanski was reverse engineering it, but he didn't publish his findings. It's an incredible undertaking.
Do you think begging/pleading and/or offering a CraigIX bounty would help?
 
I mean, it's all well and good and everything, don't get me wrong .. but how realistic is it that we're going to get drivers/packages for Angstrom that will give us all this? Not very, and in the meantime, OGL ES 1.1 and 2.0 are there and can be used ..
 