Linux Optimization Advice


Butterman
Hey guys. I've just got TINCS compiling on Linux, but it won't play nice. The server starts up fine and runs fine, as far as I know. But once you enter the game as a client, you move very slowly. This is probably because the server can't complete its game-logic ticks fast enough, which doesn't make sense: it's threaded, game logic runs in a separate thread, yet while the server is running I'm getting 100% usage on CPU1 and only about 10% on CPU2. The game-logic thread is created after startup too, so it should end up on the second CPU. Then every few seconds the client will lock up for about 20 seconds, general rubbish like that. But when the client is moving about, the stuff that doesn't have to go through the server (which is only turning the camera) runs smooth as hell, so I've ruled out my graphics drivers.

I really think the server is to blame here; everything ran PERFECTLY on Vista x64. But why is this only happening on Linux? Am I doing it wrong? I use g++ and a custom makefile to compile.

Any advice would be great.

UPDATE: Well, that was stupid of me: I had another build running in a different workspace that couldn't connect to the server and was stuck in an endless loop. But the main problem still isn't gone; when more than one person is connected to a server, they both lock up 90% of the time.
 
What graphics card have you got? My instinct says it's a graphics-driver-related problem.
 
Hessiess said:
What graphics card have you got? My instinct says it's a graphics-driver-related problem.
I've got a 9600GT. I'm trying to install the latest NVIDIA drivers now; last time I tried, I had forgotten to disable the 170.X series ones and Ubuntu would no longer boot. I'm just about to do it again now.

I really don't think it's video card related; I think the server is running into trouble. I've just made some quick modifications and I'm going to check them out now. If that doesn't help, I'll go ahead and install those drivers.
 
Are you sure that your C++ threading code is cross-platform? Have you tested it on Vista on a dual-core CPU?
 
Butterman said:
I really think the server is to blame here; everything ran PERFECTLY on Vista x64. But why is this only happening on Linux? Am I doing it wrong? I use g++ and a custom makefile to compile.

Any advice would be great.
Heh... Could be a number of things.

Could be something spinlocking on you- what sort of synchronization primitives are you using?

Could be something working a bit differently than you expected going from Vista to Linux. (Network code doesn't work QUITE the way you expect it to, and it's oftentimes better to use something like ACE (if you're doing low-level network-type code...) or Grapple (high-level-type code...) instead of coding WinSock2 and BSD sockets yourself.)

It could be a piece of cut-n-pasted code that's run away with your server.

It could be a piece of braindead-ness from VC++ and Windows that's run amok (Windows will let you do some of the most STUPID things, like altering the memory contents of a declared constant. Both GCC and VC++ will give you a warning, but Windows will let the write through anyway. Under most other OSes you get an immediate segfault on the attempt, which is what should happen... ;) ).
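Just to make that last one concrete (a totally made-up example, nothing from TINCS): the write below is undefined behaviour everywhere, but on Linux/GCC the constant normally sits in a read-only page, so the attempt segfaults on the spot.

Code:
#include <cstdio>

// A const object with static storage duration; on Linux/GCC it ends up in .rodata.
const int kMaxPlayers = 32;

int main()
{
    // Undefined behaviour: casting away const and writing to a genuinely const object.
    // On Linux this usually segfaults the moment the write hits the read-only page;
    // some Windows/VC++ setups have historically let a write like this slide.
    int* p = const_cast<int*>(&kMaxPlayers);
    *p = 64;   // expect an immediate crash here on Linux

    std::printf("%d\n", *p);   // never reached if the write faults
    return 0;
}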

Without having your code in front of me, I can't be more specific on what might be busted.

I suggest learning how to use oprofile or sysprof to find out where you're eating your cycles- it'll be at least somewhat helpful in finding where it's borked on the Linux side, because it'll give you numbers on what functions/methods are getting run and the percentage of cycles spent in each.
 
sindbad said:
Are you sure that your C++ threading code is cross-platform? Have you tested it on Vista on a dual-core CPU?
I wrote it on Vista on my dual core. It worked fine; it's SDL_threads.
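For what it's worth, the threading is just the stock SDL 1.2 pattern, roughly along these lines (a stripped-down sketch, not the actual TINCS code):

Code:
#include <SDL/SDL.h>
#include <SDL/SDL_thread.h>

// Illustrative game-logic thread: runs a fixed number of ticks and sleeps between
// them. A tick loop with no delay or blocking wait is what pegs one core at 100%.
static int GameLogicThread(void*)
{
    for (int tick = 0; tick < 500; ++tick)
    {
        // ... run one tick of game logic ...
        SDL_Delay(10);   // yield the CPU between ticks
    }
    return 0;
}

int main(int, char**)
{
    if (SDL_Init(SDL_INIT_TIMER) != 0)
        return 1;

    // SDL 1.2 signature: SDL_CreateThread(int (*fn)(void*), void* data)
    SDL_Thread* logic = SDL_CreateThread(GameLogicThread, NULL);

    // ... the server's main/network loop would run here ...

    SDL_WaitThread(logic, NULL);
    SDL_Quit();
    return 0;
}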

Svartalf said:
Could be something spinlocking on you- what sort of synchronization primitives are you using?

[...]

I suggest learning how to use oprofile or sysprof to find out where you're eating your cycles...


I'm getting no warnings when I compile. I'm using SDL_net for handling sockets. I don't know what spinlocking is, though.

I ran sysprof while running two clients and the server. The clients ate up most of the CPU, but on stuff you would expect, like drawing functions. The CPU load never went very high either. The server was spending most of its time doing stuff I would expect too, but even then very little time was spent on it.

I think the problem is entirely client based. It locks up: the camera stops being able to move (which tells me there is something wrong with the client, as camera rotation is done on the client), and animations stop; for example, the gun stops moving. It just locks up. If I'm running two clients, they won't necessarily lock up at the same time either. It happens if only one client is connected too, though it takes about 5 seconds to start happening. I know it's a total lock-up client side as well, because the client can stay frozen for so long that the server drops him.

Again, CPU load never goes above 50% on either CPU, and the profiler is only telling me the obvious: that slightly more time is spent on drawing than anything else.

This is quite interesting: a whole load of time on the client is getting spent in vdso?

This is the only thing in the profiler that could possibly account for the locking up, and I reckon not enough time is getting spent in it for it to be the problem.

[sysprof screenshot: 2po49d0.png]
 
Butterman said:
I think the problem is entirely client based. It locks up: the camera stops being able to move (which tells me there is something wrong with the client, as camera rotation is done on the client), and animations stop; for example, the gun stops moving. It just locks up. If I'm running two clients, they won't necessarily lock up at the same time either. It happens if only one client is connected too, though it takes about 5 seconds to start happening. I know it's a total lock-up client side as well, because the client can stay frozen for so long that the server drops him.
Here's a bit of a differing line of thought for you... What 3D API are you using for Vista, and which are you using for Linux?

Butterman said:
Again, CPU load never goes above 50% on either CPU, and the profiler is only telling me the obvious: that slightly more time is spent on drawing than anything else.

This is quite interesting: a whole load of time on the client is getting spent in vdso?

This is the only thing in the profiler that could possibly account for the locking up, and I reckon not enough time is getting spent in it for it to be the problem.

Is your rendering path separate from the network piece and separate from the UI piece? If so, there may be a synchronization pinch-point in your client code that's hanging on the render pass to OpenGL. If you're doing something that waits until the engine is idle, you're stalling the pipeline and perhaps stalling the other pieces. If you're not doing things in separate threads, you could be REALLY stalling out everything against some "oops" in the rendering pipeline. A notable example of something like this fubaring a game with low CPU utilization and dropping it to < 1 fps rendering rates would be certain AMD driver versions with KotOR or KotOR II. The engine did some iffy reuse patterns for VBOs, stalling the pipeline all to hell on some levels. The reuse itself wasn't the problem, but rather doing the fill, use, remap/re-fill, and re-use all intra-frame instead of inter-frame, which is how you should be pulling that nasty trick.
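Since you said you're on SDL_net: one classic way to get exactly this kind of multi-second freeze is calling the blocking SDLNet_TCP_Recv straight from the frame loop, so one quiet or slow server stalls the camera, the animations, everything. A rough sketch of polling with a zero timeout instead (the names here are invented, not your code):

Code:
#include <SDL/SDL_net.h>

// Illustrative only: poll the server socket once per frame with a zero timeout
// instead of letting a blocking SDLNet_TCP_Recv stall the whole game loop.
bool PumpNetwork(TCPsocket server, SDLNet_SocketSet set, char* buf, int buflen)
{
    // Returns immediately; > 0 means at least one socket in the set has data waiting.
    int ready = SDLNet_CheckSockets(set, 0);
    if (ready <= 0)
        return true;                 // nothing to read this frame, carry on rendering

    if (SDLNet_SocketReady(server))
    {
        int got = SDLNet_TCP_Recv(server, buf, buflen);
        if (got <= 0)
            return false;            // connection closed or error: bail out cleanly
        // ... hand 'got' bytes to the message parser ...
    }
    return true;
}

// Setup, once, right after connecting:
//   SDLNet_SocketSet set = SDLNet_AllocSocketSet(1);
//   SDLNet_TCP_AddSocket(set, server);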

Oh, and spin-locking is where you have a thread or process spinning in a loop, eating CPU cycles, while it waits to acquire the lock.
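Something like the first function below will sit there chewing a full core (which would line up nicely with 100% on one CPU), while the second sleeps inside SDL_CondWait until it's woken. Pure illustration, not your code:

Code:
#include <SDL/SDL.h>
#include <SDL/SDL_thread.h>

// Spin-waiting: burns CPU the whole time it waits.
void WaitSpinning(volatile int* flag)
{
    while (*flag == 0)
        ;                            // busy loop, eats a core doing nothing
}

// Blocking: the thread sleeps until another thread signals the condition variable.
void WaitBlocking(SDL_mutex* lock, SDL_cond* cond, int* flag)
{
    SDL_LockMutex(lock);
    while (*flag == 0)
        SDL_CondWait(cond, lock);    // releases the mutex and sleeps, ~0% CPU
    SDL_UnlockMutex(lock);
}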
 
Make sure that your Linux install is properly using OpenGL. You might have some issue slowing down rendering that's outside the scope of the program. One important thing is to make sure that Linux is using nVidia's OpenGL libraries, not the default X ones.
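A quick way to check from inside a program is to print the GL strings right after creating the context; if the vendor/renderer come back as Mesa or a software rasteriser instead of NVIDIA, the nVidia libGL isn't the one being loaded. (Illustrative SDL 1.2 snippet, nothing TINCS-specific.)

Code:
#include <SDL/SDL.h>
#include <SDL/SDL_opengl.h>
#include <cstdio>

// Print which OpenGL implementation the process actually gets.
int main(int, char**)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    if (SDL_SetVideoMode(320, 240, 0, SDL_OPENGL) == NULL)
    {
        SDL_Quit();
        return 1;
    }

    std::printf("GL_VENDOR:   %s\n", glGetString(GL_VENDOR));
    std::printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
    std::printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));

    SDL_Quit();
    return 0;
}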
 
Try Nexuiz or the Phoronix test suite to make sure your OpenGL works fine.

Could you try commenting out all the rendering code in your client and replacing it with some stdout messages?
 
No need for that much, just install mesa-utils and try the various glx* tools from it.
 
sindbad said:
Try Nexuiz or the Phoronix test suite to make sure your OpenGL works fine.

Could you try commenting out all the rendering code in your client and replacing it with some stdout messages?
My OGL works fine; I was playing Dystopia via Wine last night.

Laurent said:
This is obviously using the nVidia driver, and I can even tell it's version 180.22 :D
How? :huh:
 