GP2X Gpu940 - Getting A Capture Of The Screen


efegea

Is it possible to get a copy of the framebuffer? And also to upload it back to gpu940, so you can modify it and have it shown? Is there a way to access the actual buffer?

For example: get a copy of the screen, apply a filter (say, a blur), then alpha-blend it with the screen and swap buffers so it's shown on the screen...

If not, would it be possible to add this feature to gpu940? It could give some nice effects ;)

The filter would be done on the main CPU (not on the 940), so we have 200MHz to do it while the 940 is busy with the 3D operations.

The Nintendo DS can do this, but *I think* it's done in hardware.

If we can get a pointer to the buffer, then it will be easy :)
 
If gpu940 has a manual buffer swap which is not just a buffer-copying routine (I would have thought so, otherwise a lot of time would be wasted), then you should be able to rip the frame buffer straight out of memory, do what you want to it, and then do the buffer swapping yourself.
 
With HW-accelerated OpenGL these effects are done with the accumulation buffer. I don't know if the NDS has that, but I don't think gpu940 does... You will have to do accumulation effects manually...
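Doing it manually basically means keeping a persistent buffer and blending each new frame into it with some weight. A rough C sketch, assuming 8-bit components (adapt to whatever format gpu940 really uses):

Code:
#include <stddef.h>
#include <stdint.h>

/* accum = accum * (1 - alpha) + frame * alpha, with alpha in 0..256.
 * This is the whole "accumulation buffer" idea done by hand. */
static void accumulate(uint8_t *accum, const uint8_t *frame,
                       size_t nbytes, unsigned alpha)
{
    for (size_t i = 0; i < nbytes; i++)
        accum[i] = (uint8_t)((accum[i] * (256 - alpha) + frame[i] * alpha) >> 8);
}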
 
I'm not sure I understood properly what you're asking for.

If you use GL, then probably the answer is no.
If you use direct access to gpu940 commands, it would be pretty easy to do any filtering of the picture before displaying it: you can know when gpu940 is done with a frame (and you know the address of the frame, because you gave it to gpu940), so you can apply your filter before submitting the frame to the display list (i.e. the list of finished frames that are ready to be displayed).

With GL it's more complicated, but you can achieve a similar behavior if you mix low-level GPU access with GL calls.
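In outline it looks like this; note that the three calls below are placeholders for whatever your render loop already does, not real gpu940 function names:

Code:
#include <stdint.h>

/* Placeholders -- not real gpu940 entry points. */
extern void wait_until_940_is_done_with(uint32_t *frame);
extern void apply_my_filter(uint32_t *frame);
extern void append_to_display_list(uint32_t *frame);

/* You know 'frame', since you gave that address to gpu940 yourself. */
void finish_frame(uint32_t *frame)
{
    wait_until_940_is_done_with(frame);  /* the frame is fully rendered   */
    apply_my_filter(frame);              /* blur, blend, etc., on the 920 */
    append_to_display_list(frame);       /* only now let it be displayed  */
}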
 
Well, I'm going to do it by mixing gpu940 commands and OpenGL.

In OpenGL there is the "buffers" struct that contains two buffers, out and depth, so I think "out" is the buffer I want.

What do I do then? I know there is a function gpuBuf_get_loc() that gives me the address of the buffer, but what do I have to do with it? I know it returns a struct with the address, the width and the height, but what do I do with them?

And also, why is the width 9 and the height 250? What do they mean? I'm talking about gpuAlloc(buffer, 9, 250);

What I want is a pointer to the buffer data so I can manipulate it.

What is the buffer format? RGB? RGBA?
 
efegea posted on Feb 15 2007 at 10:27 PM said:
In OpenGL there is the "buffers" struct that contains two buffers, out and depth, so I think "out" is the buffer I want.

Actually it contains 3 sets of buffers (one currently displayed, and two that are worked on).

What do I do then? I know there is a function gpuBuf_get_loc() that gives me the address of the buffer, but what do I have to do with it? I know it returns a struct with the address, the width and the height, but what do I do with them?

The 'address' is in fact an offset in shared->buffers, which is a portion of the upper 32MB mmapped from /dev/mem and accessible to your application.

And also, why is the width 9 and the height 250? What do they mean? I'm talking about gpuAlloc(buffer, 9, 250);

The height is 250 pixels. The width is 2^9 pixels (512).
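In code it boils down to pointer arithmetic. A sketch (I'm passing the values in as plain numbers rather than guessing the exact struct field names; the word-per-pixel and word-counted-offset assumptions also need checking against the sources):

Code:
#include <stdint.h>

/* 'address' and 'width_log' are the values gpuBuf_get_loc() gives you. */
uint32_t *locate_pixel(uint32_t *buffers_base,  /* the mmapped shared->buffers area */
                       uint32_t address,        /* offset returned for the buffer   */
                       unsigned width_log,      /* 9 -> rows of 1 << 9 = 512 pixels */
                       unsigned x, unsigned y)
{
    /* shared->buffers is already mmapped into your process, so 'address'
     * is just an index into it, not something you have to map yourself.
     * (Assuming one 32-bit word per pixel and a word-counted offset --
     * double-check both against the gpu940 sources.) */
    uint32_t *pixels = buffers_base + address;
    return &pixels[((uint32_t)y << width_log) + x];
}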

What is the buffer format? RGB? RGBA?

Have a good time, it's YUV :)
 
I got it working. Well, not entirely... I can modify the buffer, but only that. I have to apply the filters to the 3 buffers on each frame, and that's too CPU expensive. At least on my PC it misses a lot of frames just from looping through each buffer.

The thing is I have to split the data into YUV values, so I have to do a for loop over each pixel to separate it into three vars y, u, v, and when the filtering is finished, merge them back into the buffer data. These loops alone, without doing any filtering, make gpu940 miss frames. Just imagine that AND doing the filtering; it will miss too many frames :rolleyes:

Oh, and to do the filtering it would probably also be necessary to convert to RGB, do the filtering, and then convert back to YUV, so more CPU cycles to waste.

All of that three times per frame :eek:

So no, I can't do the filtering I wanted to do :(
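For reference, this is roughly the per-pixel conversion I mean. I'm assuming standard BT.601-style YCbCr with full-range values here; I don't actually know if gpu940's YUV matches these coefficients:

Code:
#include <stdint.h>

static uint8_t clamp8(int v) { return v < 0 ? 0 : v > 255 ? 255 : (uint8_t)v; }

/* One pixel YUV -> RGB (fixed point, 16.16 coefficients). */
static void yuv_to_rgb(int y, int u, int v, uint8_t *r, uint8_t *g, uint8_t *b)
{
    int d = u - 128;
    int e = v - 128;

    *r = clamp8(y + ((91881 * e) >> 16));               /* 1.402        */
    *g = clamp8(y - ((22554 * d + 46802 * e) >> 16));   /* 0.344, 0.714 */
    *b = clamp8(y + ((116130 * d) >> 16));              /* 1.772        */
}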


EDIT: I've discovered that I don't have to render 3 times per frame... yeah!

I have done an edge-detection filter :D
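It's just differences between neighbouring luma values, run on the Y values after the separation loop I described (so no colour conversion needed). Something like this:

Code:
#include <stdint.h>
#include <stdlib.h>

/* Simple edge detect on a separated luma (Y) plane: output is the absolute
 * difference with the right and bottom neighbours.  'stride' is the row
 * length in pixels (512 in my case, since the width is 2^9). */
static void edge_detect_luma(const uint8_t *y_in, uint8_t *y_out,
                             int width, int height, int stride)
{
    for (int y = 0; y < height - 1; y++) {
        for (int x = 0; x < width - 1; x++) {
            int c  = y_in[y * stride + x];
            int dx = abs(c - y_in[y * stride + x + 1]);
            int dy = abs(c - y_in[(y + 1) * stride + x]);
            int e  = dx + dy;
            y_out[y * stride + x] = (uint8_t)(e > 255 ? 255 : e);
        }
    }
}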




But then, how is it done in Payback? It does something like what I want to do: an HDR-like effect, and I thought they did it this way...
 
Make a table of the known/suspected inputs and outputs of the effects you like, and try to derive a relationship :)
 
Basically HDR means that the colors are calculated within the bounds of reality rather than the bounds of your screen, then clipped. It's not a filter, more a different sort of rendering.
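In other words, something like this per channel; the numbers are made up, it's just to show where the clipping happens:

Code:
#include <stdint.h>

/* Lighting is computed in an unbounded range (here a float), and only
 * clipped to the 0..255 range of the screen at the very end. */
static uint8_t to_screen(float intensity)   /* may be well above 1.0 */
{
    float v = intensity * 255.0f;
    if (v > 255.0f) v = 255.0f;   /* very bright areas simply saturate */
    if (v < 0.0f)   v = 0.0f;
    return (uint8_t)v;
}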
 
Mmm... I'm working on a screenshot capture function for my engine. It's hard to get working, because of the YUV->RGB conversion needed, and because I don't know the exact format, size and address of the buffers.

This is what I've got for now: a black & white (grey) image with tons of weird data around it. I'm using a size of 512x512; _I know_ it's not the buffer size, but at least it works, well, kind of...


No, it's not a Nintendo DS2X :lol:
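This is the shape of what I have so far. The 512x512 size and the one-byte-per-pixel Y assumption are guesses, which is probably why there's garbage around the picture:

Code:
#include <stdint.h>
#include <stdio.h>

/* Dump the buffer as a greyscale PGM, treating each byte as a Y value.
 * Both the 512x512 size and the byte-per-pixel layout are guesses. */
static int save_grey_screenshot(const uint8_t *buf, const char *path)
{
    const int w = 512, h = 512;
    FILE *f = fopen(path, "wb");
    if (!f)
        return -1;

    fprintf(f, "P5\n%d %d\n255\n", w, h);
    fwrite(buf, 1, (size_t)(w * h), f);

    return fclose(f);
}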
 