Controlling World Update Independent Of Frame Rate


rima

Still Fresh
Joined
Nov 17, 2007
Messages
5
So I'm slowly going insane over this, hopefully my error will be obvious to someone.

I have my game running on Windows and the GP2X; the only problem is that movement in the GP2X version is slower than in the Windows version, even though I thought I had made it frame rate independent. I.e. you can walk from the bottom left to the bottom right of the game world faster on Windows than on the GP2X.

I have a tick interval in place in my game loop that I thought was restricting my update function (which moves the player) to only be called every X milliseconds. Here's some simplified code taken from my work:

CODE
const int TICK_INTERVAL = 35;
int lastUpdateTime = 0;
int elapsedTime = 0;

while( gameRunning ) // while the game is running
{
    elapsedTime = SDL_GetTicks() - lastUpdateTime;

    // only perform world updates at given tick intervals.
    if( elapsedTime >= TICK_INTERVAL )
    {
        // record the time we updated at.
        lastUpdateTime = SDL_GetTicks();

        keyStates = InputHandler->RetrieveInput();

        // moves the player around the virtual world based on input.
        world->update( keyStates );
        world->checkCollisions();
    }

    ClearScreen();
    world->Render(); // render the world as fast as possible.
}


I know for a fact there must be a bug somewhere in there, because I monitored the number of times world->update() was called on each platform (this is what moves the player), and on Windows it is called more often than on the GP2X, hence you end up moving faster in the Windows version.
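(For reference, I counted the calls with a rough per-second counter along these lines; the counter names are just made up for the example:)

CODE
// rough per-second counter used to compare the two builds
// (updateCount / lastReport are illustrative names only)
static int    updateCount = 0;
static Uint32 lastReport  = 0;

updateCount++;   // sits inside the if( elapsedTime >= TICK_INTERVAL ) branch
if( SDL_GetTicks() - lastReport >= 1000 )
{
    printf( "updates/sec: %d\n", updateCount ); // needs <stdio.h>
    updateCount = 0;
    lastReport  = SDL_GetTicks();
}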

The only way I can see the Windows version performing more updates than the GP2X is if the GP2X can't manage a full pass through the loop within TICK_INTERVAL, but even if I raise this value to something stupid, 200 for example, I still see a speed difference between the two platforms.
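One thing I notice writing it out: whenever an update runs late, the leftover milliseconds get thrown away when lastUpdateTime is reset, so a slower machine would quietly fall behind. A catch-up style loop (untested sketch, same names as above) would carry the remainder over instead:

CODE
// catch-up variant: keep the remainder instead of discarding it, so a
// slow pass can run several updates in a row and both platforms
// average the same update rate.
Uint32 lastUpdateTime = SDL_GetTicks();

while( gameRunning )
{
    while( SDL_GetTicks() - lastUpdateTime >= TICK_INTERVAL )
    {
        lastUpdateTime += TICK_INTERVAL; // advance by the tick, not to "now"

        keyStates = InputHandler->RetrieveInput();
        world->update( keyStates );
        world->checkCollisions();
    }

    ClearScreen();
    world->Render(); // still rendered as fast as possible
}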

Any help is greatly appreciated, thanks.
 
I'd do it a bit differently. I'd expect to see a little difference and would suspect frame dropping if it's a big difference; it would seem not all clocks are created equal. Done your way the GP2X will always keep up the best it can, but the PC will always spend most of its time waiting. Quasist is probably right.

CODE

unsigned int timeleft( unsigned int x )
{
    static unsigned int next_time = 0;
    unsigned int Now = SDL_GetTicks();

    if ( next_time <= Now ) {
        next_time = Now + x;
        return 0;
    }
    return next_time - Now;
}

do
{
    if ( Gstate == PLAYING )
    {
        Clock++;
        if ( Boosting && Boost < MAXBOOST ) // truck 'celeration
            Boost += 3;
        else if ( !Boosting && Boost )
            Boost--;
        if ( vol )
            racketvolume( vol );
        doevents();
#ifndef GP2X
        if ( !Paused )
#endif
            collide();
        if ( ForkFling )
        {
            flingfork();
            ForkFling = 0;
        }
        drawbg( Screen, lpCow );
        drawdead( Screen, lpCow );
        ...
        SDL_Flip( Screen );
        SDL_Delay( timeleft( RATE ) ); // 1000/18 = 55 ms, 1000/12 = 83 ms
    }
} while ( Gstate );
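The point of the timeleft()/SDL_Delay() pair is that every pass through the loop gets padded out to RATE milliseconds no matter how long the drawing took, so as long as both machines can finish a frame inside RATE they tick at exactly the same speed.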
 
quasist said:
Maybe updating the world is a CPU-consuming operation?

But if I put the tick interval up to, say, 200 milliseconds, something both platforms are capable of keeping up with, so update is only called every 200 milliseconds, everything slows down on both platforms but not equally; the Windows version still moves faster than the GP2X (even though both now move slower) :unsure: Or did you mean something else, quasist?

Sphinxter said:
SDL_Delay( timeleft( RATE ) ); // 1000/18 = 55 ms, 1000/12 = 83 ms

I haven't tried implementing this, Sphinxter, but I take it from the quoted line that you limit the FPS across both platforms (mainly for Windows' sake) in order to achieve balance. I'm trying not to go down this route, as there is no harm in allowing the world to be rendered as fast as possible, but there is a problem if I don't control the updating (movement) equally across platforms.
I have just been doing some further testing on the matter and I've discovered something strange. I did the following and found that the Windows version returns a correct result, 10 seconds almost dead on; however, the GP2X returns just over 12 seconds. So the GP2X has accumulated an extra 2 seconds from somewhere. Any ideas why or how?

CODE

int timer = SDL_GetTicks();

// game is run for what is "supposed" to be 10 seconds, then the loop is broken.
while( gameRunning ) // loop is broken out of after 10 secs
{
    ... same as before
}

int totalTime = SDL_GetTicks() - timer;
 
SDL_GetTicks() can be highly inaccurate. You may want something that has a finer resolution and accuracy.
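For example, on the GP2X (Linux) side gettimeofday() has microsecond resolution. A rough sketch, not a drop-in replacement (on Windows you'd reach for something like QueryPerformanceCounter instead):

CODE
#include <sys/time.h>

// microsecond-resolution ticks via gettimeofday() on the Linux/GP2X side.
// Sketch only; overflow handling is left out for brevity.
unsigned long usecTicks( void )
{
    struct timeval tv;
    gettimeofday( &tv, NULL );
    return (unsigned long)tv.tv_sec * 1000000UL + tv.tv_usec;
}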
 
yaustar said:
SDL_GetTicks() can be highly inaccurate. You may want something that has a finer resolution and accuracy.

Any recommendations, yaustar?
 
rima said:
I did the following and found that the Windows version returns a correct result, 10 seconds almost dead on; however, the GP2X returns just over 12 seconds. So the GP2X has accumulated an extra 2 seconds from somewhere. Any ideas why or how?

Function call overhead could be much greater on the lesser platform and might account for a few ticks. PCs as a general rule make pretty crappy clocks; left alone they all lose or gain time. Considering the drift of both platforms together, being even farther off than that is not too surprising. I predict you'll wind up isolating the controlling values and just ifdef-ing the living shit out of it. Odd thought: are you using the same compiler/flags on both?
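Something like this is all I mean by isolating the controlling values (the numbers are placeholders; you'd tune each build by measurement):

CODE
// per-platform tick tuning kept in one place
#ifdef GP2X
#define TICK_INTERVAL 40   /* placeholder: tune on the GP2X build */
#else
#define TICK_INTERVAL 35   /* placeholder: tune on the Windows build */
#endif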
 
rima said:
Sphinxter said:
SDL_Delay( timeleft( RATE ) ); // 1000/18 = 55 ms, 1000/12 = 83 ms

I haven't tried implementing this, Sphinxter, but I take it from the quoted line that you limit the FPS across both platforms (mainly for Windows' sake) in order to achieve balance. I'm trying not to go down this route, as there is no harm in allowing the world to be rendered as fast as possible, but there is a problem if I don't control the updating (movement) equally across platforms.
The rate of commercial animation starts at 18 fps; anything above that is extra, and higher is not always better. Since the number of sprites, effects, etc. is changing and drawing can take different amounts of time, using a delay like that takes the highs and lows out and makes your animation smooth and consistent.
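Put another way, RATE is just the per-frame budget in milliseconds, e.g.:

CODE
#define FPS  18            /* the 18 fps floor mentioned above */
#define RATE (1000 / FPS)  /* ms per frame: 1000/18 = 55, 1000/12 = 83 */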
 