torpor said:It may not be so helpful for me to suggest this at this stage, since you seem pretty committed to SDL, but if you want some example code that works to get the GL ES 1.1 setup done quickly and easily, check out the code in Pandora WakeBreaker: http://w1xer.at/pandora/
If you look at main.cpp you can see all the GL ES setup stuff, then check out the Texture class for details on how to set up a quad and all that for texture displays. WakeBreaker uses a couple of low-res textures for the splash screen and so on - these could easily be abstracted to handle any texture size, etc. You'll need a .png/texture-file loader, however.
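For reference, the GL ES 1.1 side of that boils down to something like the sketch below - a generic, hypothetical illustration rather than WakeBreaker's actual Texture class, assuming an EGL context is already current and that the pixel data comes from your own image loader.
[code]
/* Hypothetical illustration, not WakeBreaker's actual code: creating a
 * texture and drawing a textured quad with the GL ES 1.1 fixed-function
 * pipeline.  Assumes an EGL context is already current and that `pixels`
 * holds power-of-two RGBA data from your own loader. */
#include <GLES/gl.h>

static GLuint upload_texture(const void *pixels, int w, int h)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    return tex;
}

static void draw_quad(GLuint tex)
{
    /* x,y and u,v pairs for a two-triangle strip covering the viewport */
    static const GLfloat verts[] = { -1,-1,  1,-1,  -1,1,  1,1 };
    static const GLfloat uvs[]   = {  0, 1,  1, 1,   0,0,  1,0 };

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, verts);
    glTexCoordPointer(2, GL_FLOAT, 0, uvs);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}
[/code]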
notaz said:
[code]
ofbset -fb /dev/fb1 -pos 0 0 -size 800 480 -mem 307200 -en 1
fbset -fb /dev/fb1 -g 320 240 320 480 16
export SDL_VIDEODRIVER=fbcon
export SDL_FBDEV=/dev/fb1
op_runfbapp ./your_app_here
ofbset -fb /dev/fb1 -pos 0 0 -size 0 0 -mem 0 -en 0
[/code]
What it does:
- allocates OMAP DSS layer, asks video output to be 800x480 at position 0,0 (could set it to 640x480 at 80,0 instead to get centered 2x scaling of 320x240). 307200 bytes of video memory are allocated for 2 320x240 16bpp screens (for doublebuffering).
- sets video mode to 320x240@16bpp, virtual resolution 320x480 for doublebuffering.
- asks SDL to use the new layer through /dev/fb1
- runs your app and hides X (since SDL is now in framebuffer mode)
- cleans the video layer on exit
After this, you can act as if the screen was 320x240 in your code.
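With the layer and environment set up like that, the application side can stay plain SDL 1.2 code; a minimal hypothetical sketch (whether SDL_DOUBLEBUF really page-flips on the fbcon driver is exactly the open question raised further down):
[code]
/* Minimal sketch, assuming SDL 1.2 and the fbcon setup above (hypothetical,
 * not taken from any of the programs mentioned in this thread). */
#include <SDL/SDL.h>

int main(void)
{
    SDL_Surface *screen;
    int frame;

    SDL_Init(SDL_INIT_VIDEO);
    screen = SDL_SetVideoMode(320, 240, 16, SDL_HWSURFACE | SDL_DOUBLEBUF);

    for (frame = 0; frame < 600; frame++) {
        SDL_FillRect(screen, NULL,
                     SDL_MapRGB(screen->format, frame & 0xFF, 0, 0));
        SDL_Flip(screen);   /* ideally pans between the two 320x240 pages */
    }

    SDL_Quit();
    return 0;
}
[/code]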
Hopefully in future we'll have SDL handling this transparently
*looks at paeryn's post above*
paeryn: I'm slowly documenting kernel stuff at http://pandorawiki.org/Kernel_interface , could probably expose OMAP DMA for GP2X blitter-like functionality.
Do you think this video output number could be put into an option menu for emus? This way the user could decide which method to use for the scaling rather than it being hardcoded in. So if a user wanted straight 2x in Picodrive he would set this number to 640 x 448 to get it. Fullscreeners would set to 800 x 480 etc.
Are you referring to GL_IMG_texture_stream2? It appears to be in the current drivers. But the only meaningful information I could find about it was this TI code: https://texinst1.gforgegroup.com/gf/project/gleslayer/scmsvn/?action=browse&path=%2Ftrunk%2FPackages%2FOMAP3_Graphics_SDK%2FGLESLAYER_SGXSINK_20%2Fsgxsink_api.cpp&revision=13&view=markup
Exophase said:The only feasible option is TI's texture stream extension, and I'm not aware if that's supported on Pandora yet or not. I hear that even with that the performance is lacking from what you'd expect, although I haven't seen extensive testing on it.
Unfortunately I'm yet as pandora-less as the best of them. But I've been using paeryn's services to run some code on the panda. Apropos, if you unspoiler the second spoiler in that post you might see interesting things about the current SGX driver edge.Exophase said:Yeah that. Do you have a Pandora? If so, are you willing to write a test program using it and measuring performance?
Here's a reference to someone who used it: http://bloggingthemonkey.blogspot.com/2009/10/texture-streaming.html
No for current SDL (which runs on unmodified x11 driver now). AFAIK nobody worked on SDL for OMAPs so far..Exophase said:- Does SDL's double buffering work with this?
- And if it does, will SDL wait for vsync using the proper method? If not (or no double buffering at all) then this is a good reason to not use SDL for video, for the time being.
yes they can be called (zodttd does that), no need for root, pnd is always started with regular user rights. ofbset and fbset are just frontends for some driver ioctls, you are free to use those directly instead.Exophase said:- Is it safe to call ofbset and fbset while the program is running using system or some other method? Do they need root? (I have no idea if the PND run script is ran with root or whatever)
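For illustration, here is a rough, hypothetical sketch of doing the fbset half (mode, virtual size, page flipping) straight through the generic fbdev ioctls; the ofbset half (plane position, output size, memory) goes through OMAP-specific ioctls in <linux/omapfb.h> and is not shown.
[code]
/* Hypothetical sketch, not from any released program: set 320x240@16bpp with
 * a 320x480 virtual resolution on /dev/fb1 and flip between the two pages.
 * Error handling omitted for brevity. */
#include <fcntl.h>
#include <stdint.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/fb.h>

int main(void)
{
    int fd = open("/dev/fb1", O_RDWR);
    struct fb_var_screeninfo vi;
    int page, frame;

    ioctl(fd, FBIOGET_VSCREENINFO, &vi);
    vi.xres = 320;  vi.yres = 240;                 /* visible resolution      */
    vi.xres_virtual = 320;  vi.yres_virtual = 480; /* two pages for flipping  */
    vi.bits_per_pixel = 16;                        /* RGB565, see below       */
    ioctl(fd, FBIOPUT_VSCREENINFO, &vi);

    uint16_t *fb = mmap(NULL, 320 * 480 * 2, PROT_READ | PROT_WRITE,
                        MAP_SHARED, fd, 0);        /* 307200 bytes, as above  */

    for (frame = 0, page = 0; frame < 600; frame++, page ^= 1) {
        uint16_t *back = fb + page * 320 * 240;
        /* ... draw the new frame into `back` here ... */
        (void)back;
        vi.yoffset = page * 240;                   /* show the page just drawn */
        ioctl(fd, FBIOPAN_DISPLAY, &vi);
    }

    munmap(fb, 320 * 480 * 2);
    close(fd);
    return 0;
}
[/code]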
RGB565, not at the momentExophase said:- What's the pixel format of the 16bpp framebuffer? Is this in any way configurable?
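For reference, a generic helper (not from any particular program) for packing pixels into that RGB565 layout - 5 bits red in the high bits, 6 bits green, 5 bits blue:
[code]
#include <stdint.h>

/* Pack 8-bit-per-channel RGB into a 16bpp RGB565 framebuffer pixel. */
static inline uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3));
}
[/code]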
Driver supports some, but I've never tried them.Exophase said:- Are other formats like paletted 8bpp options? Not that I personally care but someone might want them.
Yes, but I'm thinking about reserving it for system functions like TV-out.Exophase said:- Can you allocate the second scaler, say to fb2?
priority is always fb0 < fb1 < fb2 (fb2 on the top, then 1 and 0). There is transparency color keying support.Exophase said:- And if you can, how exactly do the two interact? You said they're both overlaid on the screen. Does this mean that they're on top of /dev/fb0; for instance, could you write directly to /dev/fb0 to draw a border? If they overlap, is there any means for transparency or blending? How is priority determined?
Yes, anyone can mmap any framebuffer and fight for it.Exophase said:- Can other apps steal the framebuffer from me? Could the system potentially? It'd be good to know that one day we could have a system menu which steals the framebuffer from us (perhaps pauses our program) and pops up on top of it. I want to know if this is a reason not to use it.
Currently 50 and 60 only, 120 can also be done but it's less efficient due to scanning system memory more (do you think it's worth it anyway?)Exophase said:- Is the default refresh rate 60Hz? How can it be changed? Is 120Hz an option?
yes but it's still up to the coder to do it. And it does that 5-tap poly-phase interpolation that you hate so much.DaveC said:Do you think this video output number could be put into an option menu for emus? This way the user could decide which method to use for the scaling rather than it being hardcoded in. So if a user wanted straight 2x in Picodrive he would set this number to 640 x 448 to get it. Fullscreeners would set to 800 x 480 etc.
darkblu said:Unfortunately I'm yet as pandora-less as the best of them. But I've been using paeryn's services to run some code on the panda. Apropos, if you unspoiler the second spoiler in that post you might see interesting things about the current SGX driver edge.
darkblu said:Re that extension, I have the gut feeling it's YUV-centric, but I may be totally wrong about that.
Sounds like SDL could be some kind of bottleneck if you want good performance.notaz said:AFAIK nobody worked on SDL for OMAPs so far..
Thanks for the offer, I'll surely pester you with pointless code build requests. What IRC do pandoreans frequent, btw?Exophase said:Okay, I can help compile things for you too if you need it. It'd be best if we could talk on IRC or AIM or something. I just don't want to have to dig a lot myself because I really suck at OGL - I've been struggling big time to get double buffered PBOs working on my desktop. I had them working and moved them somewhere else and poof, white screen.
Well, the current extension set is not exactly earth-shattering, but I tend to follow the 'half-full' philosophy, so things like:
GL_OES_vertex_half_float
GL_OES_texture_float
GL_OES_texture_half_float
GL_OES_mapbuffer
GL_OES_fragment_precision_high
GL_OES_standard_derivatives
Exophase said:I'm really not very excited about the GL extensions available right now. No depth textures, multiple render targets, depth/stencil readback, and so on. The annoying thing is that TI actually claims you can read back depth buffer and stencil:
http://processors.wiki.ti.com/index.php/Render_to_Texture_with_OpenGL_ES
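Of the extensions listed above, GL_OES_mapbuffer is the one you would call directly from code; a minimal, hypothetical sketch, assuming the extension string has already been checked and a buffer object already exists:
[code]
/* Hypothetical sketch of GL_OES_mapbuffer: filling a vertex buffer object in
 * place.  Assumes `vbo` was created with glGenBuffers/glBufferData.  The
 * entry points are fetched through eglGetProcAddress since they belong to an
 * extension and may not be exported directly by every driver. */
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <EGL/egl.h>
#include <string.h>

static PFNGLMAPBUFFEROESPROC   pglMapBufferOES;
static PFNGLUNMAPBUFFEROESPROC pglUnmapBufferOES;

static void init_mapbuffer(void)
{
    pglMapBufferOES   = (PFNGLMAPBUFFEROESPROC)eglGetProcAddress("glMapBufferOES");
    pglUnmapBufferOES = (PFNGLUNMAPBUFFEROESPROC)eglGetProcAddress("glUnmapBufferOES");
}

static void update_vbo(GLuint vbo, const void *src, size_t bytes)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    void *dst = pglMapBufferOES(GL_ARRAY_BUFFER, GL_WRITE_ONLY_OES);
    if (dst) {
        memcpy(dst, src, bytes);           /* write vertex data directly */
        pglUnmapBufferOES(GL_ARRAY_BUFFER);
    }
}
[/code]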
Could be. I'm still looking at that TI code there. Do you happen to have any other links to resources on the subject?Exophase said:I don't see any reason why it should be tied to YUV - if it does as advertised then it should allow replacing textures directly, and this should mean any texture format. The blog post does say "or RGB", using it for YUV conversion was just the useful application.
I still use it in PicoDrive..Exophase said:That is, unless the OSS emulation can be trusted.
Haven't really looked into that but I know there are some registers for the filter. Currently the driver uses a single static configuration but we should be able to tune that in future.Exophase said:One other thing - is it possible to choose the taps of the polyphase interpolation or are those modeled dynamically based on the resampling factor?
darkblu said:Thanks for the offer, I'll surely pester you with pointless code build requests. What IRC do pandoreans frequent, btw?
darkblu said:GL_OES_vertex_half_float
GL_OES_texture_float
GL_OES_texture_half_float
GL_OES_mapbuffer
GL_OES_fragment_precision_high
GL_OES_standard_derivatives
darkblu said:Lack of depth texture is a compatibility issue more than anything - i.e. instead of the easy way of reading back the depth buffer, shadow code on the panda will have to output its own depth to (half-) float textures. Not a biggie, unforeseen performance hurdles notwithstanding.
darkblu said:Could be. I'm still looking at that TI code there. Do you happen to have any other links to resources on the subject?
notaz said:I still use it in PicoDrive..
notaz said:Haven't really looked into that but I know there are some registers for the filter. Currently the driver uses a single static configuration but we should be able to tune that in future.
Is there no way to avoid the blurriness of this then? No way to shut off the interpolation even if you are just doing a pixel double?notaz said:yes but it's still up to the coder to do it. And it does that 5-tap poly-phase interpolation that you hate so much.DaveC said:Do you think this video output number could be put into an option menu for emus? This way the user could decide which method to use for the scaling rather than it being hardcoded in. So if a user wanted straight 2x in Picodrive he would set this number to 640 x 448 to get it. Fullscreeners would set to 800 x 480 etc.