ConsoleTom
Member
Hi!
How are #define directives handled? Is it just like copying text?
Example:
#define INT_X 1+2+3+4
When I have this line:
for (i=0;i<=INT_X;i++) { ... }
would the compiler treat it like:
1.) for (i=0;i<=1+2+3+4;i++) { ... } or
2.) for (i=0;i<=7;i++) { ... }
My real question is about defining bits. Which one is better:
#define A 1<<0 // bit 0
#define A 1<<1 // bit 1
#define A 1<<2 // bit 2
or
#define A 1 // bit 0
#define A 2 // bit 1
#define A 4 // bit 2
Or how do good programmers write such code?
Perhaps it's a silly question, but I want to write code that is as fast as possible.
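For context, this is roughly how I intend to use the defines (just a sketch; the FLAG_* names and the flags variable are made up for illustration):

/* the << style, with parentheses so the macros are safe inside expressions */
#define FLAG_A (1 << 0)   /* bit 0 */
#define FLAG_B (1 << 1)   /* bit 1 */
#define FLAG_C (1 << 2)   /* bit 2 */

unsigned int flags = 0;

void example(void)
{
    flags |= FLAG_A | FLAG_C;   /* set bit 0 and bit 2 */

    if (flags & FLAG_B) {       /* test bit 1 */
        /* ... */
    }

    flags &= ~FLAG_A;           /* clear bit 0 */
}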
Greetings
Tobias
SDK: DevkitARM_R8