The code assumes all chars are signed, but programs compiled for the Pandora get unsigned chars by default. That causes a problem whenever the engine compares a char against a negative value: the test never succeeds, and it throws the engine into an infinite loop of trying to parse the same character over and over again.
Add the -fsigned-char option and bada bing, it runs!
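For the curious, here's a minimal sketch of the failure mode (not GE's actual code, just the shape of the bug). Whether the comparison fires depends entirely on the target's default char signedness:

```c
#include <stdio.h>

int main(void)
{
    char c = -1;  /* e.g. an "end of input" sentinel stored in a plain char */

    /* On x86, plain char is usually signed, so this test fires.
     * On ARM (as on the Pandora), plain char defaults to unsigned: c
     * becomes 255, the test never fires, and a parse loop keyed on it
     * spins forever. gcc -fsigned-char forces the signed behaviour. */
    if (c < 0)
        puts("signed char: sentinel seen, loop exits");
    else
        puts("unsigned char: c == 255, loop never exits");

    return 0;
}
```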
Gimme a couple of hours to tidy it up a bit and I'll post a preliminary PND.
0_o
Possibly a little off topic, but could you please elaborate?
if you were to declare:

char Foo;

Foo would be an unsigned char?
EDIT: Man I should'a read the docs first.
http://gcc.gnu.org/onlinedocs/gcc-4.3.5/gcc/C-Dialect-Options.html#C-Dialect-Options
I feel ashamed, yet I'll admit I didn't even know there was a 'signed' keyword. I thought 'char' without 'unsigned' had always meant signed. *goes to fix his code*
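For anyone else auditing their own code: a quick way to see which default your compiler picked is CHAR_MIN from limits.h, which the standard guarantees is 0 when plain char is unsigned. Just a throwaway test program:

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_MIN is 0 if plain char is unsigned, SCHAR_MIN if signed */
    printf("plain char is %s on this target\n",
           CHAR_MIN == 0 ? "unsigned" : "signed");
    return 0;
}
```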
Surely not?
Is this true for other types? If so, I too have a portability issue: I've used "short"s without "unsigned" in front, for the very first and last time, when I made my screen x/y coords in my RPG. (I hate signed variables and avoid them any time I can; if I see a 2-byte int, I want it to go 0 to 65535. It's etched in my mind.) Every other time, I'm 'positive' I've used unsigned vars* (oh, that was a baad pun).
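For what it's worth, char is the odd one out here: the standard leaves only plain char's signedness up to the implementation, while short/int/long without 'unsigned' are signed everywhere. A tiny illustration:

```c
#include <stdio.h>

int main(void)
{
    short s = -1;  /* shorts are signed on every conforming compiler */
    char  c = -1;  /* plain char is signed or unsigned per target */

    printf("short: %d, char: %d\n", s, c);  /* char prints -1 or 255 */
    return 0;
}
```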
I'd not be surprised if Makslane or the other devs (after the open-sourcing; it used to be non-free, in both freedom and beer: something like $20 AUD for 3 months of updates and, I don't know, $70 for a whole year, which was back when I used GE) did some nasty hacks that rely on the processor handling signed ints directly, which the Cortex-A8 mightn't (though it really should; it's not *that* RISC. Does anyone at all still do it the ye olde fashioned way, besides on old hardware?). Then again, Game Editor has worked on the GP2X for aaages, so I'd doubt this would actually be the issue.
Nice job, btw! I'll let my friend know; he made the 'Game Editor GUI' in the demos section on the website, and continues to use Game Editor. He'll be glad to hear he can run his games on the Pandora now.
It's really a great start for new programmers who wanna get in there and make some simple games, and learn variables (erm, 'see' later), events, you know, the elements. Oh, but don't treat it like C, or when you do real C you'll find yourself saying "but that's not how it was in GE!" over and over. I'm guilty of this.
WizardStan, I know you mentioned it was slow in the other thread, but I don't see the point in posting there too. Speed can't be expected to be great: expect something similar to (bytecoded) Python, or, if the Pawn interpreter in GE uses a JIT these days, more like Java.
EvilDragon, it's great to see you aren't noticing slowdown. But note that most games will be 'actor' (sprite/object) heavy, and the other thing that makes GE slow will have more of an effect: pixel-accurate collision on everything, done in software (unless both of those have changed, which I doubt; only not long ago a friend of mine wrote his own collision detection just to dodge the impact).
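For anyone wondering why per-pixel collision hurts so much, here's a rough sketch of the usual mask-overlap technique (hypothetical types, not GE's real internals). The inner loop touches every pixel of the overlap rectangle, for every actor pair, every frame:

```c
#include <stdbool.h>

/* Hypothetical actor with a byte-per-pixel opacity mask (nonzero = opaque) */
typedef struct {
    int x, y, w, h;
    const unsigned char *mask;  /* w*h bytes */
} Actor;

static bool pixel_collide(const Actor *a, const Actor *b)
{
    /* Cheap bounding-box rejection first */
    int x0 = a->x > b->x ? a->x : b->x;
    int y0 = a->y > b->y ? a->y : b->y;
    int x1 = a->x + a->w < b->x + b->w ? a->x + a->w : b->x + b->w;
    int y1 = a->y + a->h < b->y + b->h ? a->y + a->h : b->y + b->h;
    if (x0 >= x1 || y0 >= y1)
        return false;

    /* The expensive part: compare masks over the whole overlap rectangle.
     * With N overlapping actors this runs for O(N^2) pairs per frame. */
    for (int y = y0; y < y1; y++)
        for (int x = x0; x < x1; x++)
            if (a->mask[(y - a->y) * a->w + (x - a->x)] &&
                b->mask[(y - b->y) * b->w + (x - b->x)])
                return true;

    return false;
}
```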
I think they're using OpenGL now; I vaguely remember it being mentioned. If that's true, it should be much faster than way back when.
However, it's unfortunate, but in this universe you can't take the easy route and get results as good in every regard as the harder method (except when the harder method is just plain wrong).
* except when doing dirty hacks where I'd check for a value below 0 to detect overflow (one thing you might appreciate in ASM is access to an 'overflow' flag on most archs), which I hope I've always changed before release! I don't know whether it's the 8-bit µC programmer in me, but I'm always caught trying to crunch stuff into a single byte, even though it makes it no faster on 16/32/64/128-bit machines.
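On that footnote: in C, the "dipped below 0" trick is doubly fragile, since it can never fire for unsigned types and signed overflow is undefined behaviour, so an optimising compiler is allowed to delete the check outright. A minimal sketch of testing before the operation instead (plain standard C):

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    int a = INT_MAX, b = 1;

    /* "if (a + b < 0)" relies on wraparound that the standard doesn't
     * promise for signed ints; check the headroom before adding instead. */
    if (b > 0 && a > INT_MAX - b)
        puts("addition would overflow");
    else
        printf("%d\n", a + b);

    return 0;
}
```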