Why are all of the tutorials for C++?


C-compiler exists for anything[...]


java-harder to get the virtual machine setup going[...]
I guess that XxionxX means apps made with the programming language, not the language itself (and the parts of it like the compiler, interpreter or, in Java's case, the JVM etc.) :)
Exactly! You hit the nail on the head. I am more interested in where you can put your programs than where you can put the languages. A VM sounds like it would just add to the program's size and make it larger (although size is becoming less of a problem these days). I would like to be able to port my programs to any device that I want (if that is possible and the device is open).
 
A VM sounds like it would just add to the program's size and make it larger (although size is becoming less of a problem these days). I would like to be able to port my programs to any device that I want (if that is possible and the device is open).
Just to clarify: The JVM is not a VM in the classical sense. While it IS a virtual machine, it doesn't emulate any operating system. Java programs, when compiled, result in bytecode instead of executables. The Java VM, which you need on every computer a Java program should run on, takes this bytecode and "translates" it for your machine. This Java VM is included in the Java Runtime Environment (JRE), so if you want to run any Java program on your computer, you have to install the JRE first. But it is NOT included in any Java program, so the program size does not increase (as an example, my Visual Cryptography program weighs in at a whopping 27 KB).


The reason for that is that while you have to compile "normal" programs for each and every operating system they should run on and provide downloads for all of them, you have to compile your Java program only once and it will run everywhere the JVM is available. So, if you make a program in C, C++ etc., you'd have to compile it for your Windows, for your Linux, for the Pandora, for the Mac etc. (depending on where you want to run it), having to use a cross compiler or even have those operating systems at your disposal. Yet if you make one in Java, you only have to compile it once and put that bytecode online, and it will run on Mac, Linux, Windows, BSD, Pandora etc. - everywhere Java runs.


If my talent for confusing people instead of clearing stuff up struck again: http://en.wikipedia.org/wiki/Java_Virtual_Machine
 
^ A small note, if the Java program uses native libraries, they of course have to be ported before the program can run. Other than that you're golden :)
 
^ A small note, if the Java program uses native libraries, they of course have to be ported before the program can run. Other than that you're golden :)
There are always exceptions (And, of course, way too many java.lang.Exceptions :D )


One other, more common example would be file handling. If you do stuff like 'new File("C:\blubb\bla.txt")', it will obviously not run on Linux because Linux does not have such a path and does not use a backslash to separate directories. Of course, most of the time you would let the user select the file, or would use a file in the user's home directory or the program's directory etc., all of which can be read in a system-independent way, and there are various ways for which you (as a developer) do not need to know about file path separators, so this is not really an issue if you are aware of it.


 
^ Yes, and I try to point them out as much as I can to not give people the wrong picture :p . My problem with the original argument (about C's superior performance and portability) was its blanket nature, not that it wouldn't be true under certain circumstances. I'm sometimes a bit pedantic in these things :)
 
Ok, so I know nobody was really begging for this, but new programmers are going to be reading this, so I gotta set 'em straight before the situation gets out of hand.


ON JAVA:

^ A small note, if the Java program uses native libraries, they of course have to be ported before the program can run. Other than that you're golden :)
I hate it SOOO much when a Java programmer tries to school you on languages (not you, I don't hate you ;) ), repeatedly making the point that Java is more portable, and faster (bull!)... to then see them write their programs depending on binary blobs (sometimes unaware of this, until after they flame me for politely letting them know it does not run on Linux x86_64 (or x86 for that matter). "YOU NOOB YOU'RE DOING IT WRONG" ...five mins later... "Oh.").


C is actually more portable than Java. On any GNU/Linux system, be it ARM, x86, Z80 (yeah, no, you can't run Linux on a Z80), etc., other than endian differences when doing networking (including sneakernet) between systems of differing endianness, a C program will work. Java? (On exotic platforms) Forget about it, since you gotta port a JVM - and since you don't get the source (you DO get OpenJDK, but as I'm sure you're all aware, that's SHIT), that might not ever happen (you rely on Sun - I mean Oracle - I mean, wait, who the fuck owns it now?). I mean, sure, to be able to run on any platform you have to compile the mentioned C program - but you have to do the same for the Java program - every time you run the program. Just distribute your C program so it self-compiles (perl/python - your time to shine - or just 'make' :p ), and you can easily get more than what Java provides. The main problem is other OSes _er - Windows_ not providing a compiler. True, Ubuntu doesn't by default. But meh.


But sure, C programs need other stuff to do more than text - you need something like OpenGL or SDL - and yeah, then you have a problem. However, SDL is LGPL'd, so if you're up to it there is at least the possibility of making a port.


Java and Flash (yeah, OK, Dropbox too - oh, and my video drivers, but I'll pass those off as firmware) are the only proprietary software on my system. They both fill my kernel log with segfaults. You can't rely on a company to provide decent software to be able to run YOUR software. Because they won't be able to. Personally, I'd prefer you all write your code in .NET (or whatever Mono is), provided you use Mono to run it - because that shit actually IS (comparatively - no in-depth studies taken) stable (opensim (the Second Life server) was used as test material). But yeah, don't use .NET. Why would you? Use C. High-level languages don't make anything actually easier for you (yeah, I'll admit perl and python are awesome... for small scripts! Not games! Dammit, don't do this). And if the power of a C64 (a C64 was of course not used in 1969, but something with that kind of power) can send you to the moon, why can't we have decent, 60fps Mario clones? (OK, so nobody does ASM anymore, for some decent reasons - also video card info is not given out, so we cannot get acceleration on some platforms.)


Yes, Richard Stallman, I have sinned.


And, fuck whatever test results you've seen, C/C++/ASM are, and will always be, faster than Java. In practical applications. Nobody say "but Minecraft runs great!", because Minecraft is drawing a really low-poly scene. Doing that in another language would run better (though at this point it matters less, since it's the video card that's generally the bottleneck - AAAND IIRC Minecraft relies on a binary blob). Err, I forgot what I was gonna add to this.


NOW THAT'S OVER, I THINK I'VE MADE MY POINT (if I haven't, PM me. I'll rant some more. And I might *accidentally* tap my Caps Lock key ;) )


The tutorials are probably for C++ because that's what they teach in most schools now (seriously, try finding a class that teaches pure C!). Also because it's new - not really better (tho I'm biased, I'm a hardcore C fan!), but newer.


I write my games (for SDL) in 'C++', but really it's just C that's compiled with g++ and uses new/delete. I avoid everything else that's new. And I put 'typedef' in front of my structs that would in C actually need that typedef (which actually throws warnings in C++ with -Wall turned on).
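In case it isn't obvious what I mean, here's a minimal sketch (made-up names, not from any of my actual games):


Code:
/* In C, struct names live in a separate "tag" namespace, so without the
   typedef you would have to write 'struct sprite' everywhere; the typedef
   lets you write just 'sprite'. In C++ the class name is already a type,
   so the typedef is redundant there. */
typedef struct sprite {
    int x, y;
    int w, h;
} sprite;

static void sprite_move(sprite *s, int dx, int dy)
{
    s->x += dx;
    s->y += dy;
}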


For anyone starting out, pick some language - whatever's simplest to you (I started with BASIC - not _really_ by choice, but not a bad start). Then, go learn an assembly language. THEN go learn, and then use, C/C++ or your language of choice (search out there for your favourite language - I've tried several; C is my honest favourite).


Klaue: I love exception jokes :) . You made my day.


Ah, the dreaded difference-in-path-slashes. Damn you, Microsoft... But can't Windows' file stuff take forward slashes now? I thought it could.


<offtopic>I did my 'research' just then in Lynx. How awesome am I? ;) jk.</offtopic>


I wish my essays were this good/convincing. (And this wasn't very good :p )


cd ~/;rm .bash_history;exit
 
Oh well, another Java flamewar. Whoop-de-doo. Do you actually search for "Java" just to rant again? I mean seriously, Java was used as an example because it's a high-level language and C++ would be a bit too close to C. And yet, instead of just accepting that, you go on and rant.


I am so SICK of it.


B-ZaR might not bite, but I do. I shouldn't, I know, but I can't help it.


[attached image: dutycallsv.png]



But at least this will be the last I post about this matter. I just do not want a flamewar right now.

I hate it SOOO much when a Java programmer tries to school you on languages (not you, I don't hate you ;) ), repeatedly making the point that Java is more portable, and faster (bull!)... to then see them write their programs depending on binary blobs (sometimes unaware of this, until after they flame me for politely letting them know it does not run on Linux x86_64 (or x86 for that matter). "YOU NOOB YOU'RE DOING IT WRONG" ...five mins later... "Oh.").
I hate it SOOO much when a [choose your poison] feels the need to just bash Java using laughable strawman arguments. No one ever said that Java was generally faster. No one would say his binary-dependent prog was portable if he wasn't a complete moron. Thanks for assuming that all Java programmers are such morons, by the way.

C is actually more portable than Java. On any GNU/Linux system blah blah blah
Guess what? 99% of the time you are not going to make an app for an exotic platform, you are going to make it for Windows, Mac, or Linux. If you want to make a prog that runs on all three of those, you have to be really careful in C and often have to shun the easy solution for a problem and instead do it the hard but system-independent way. And you still might have to port it to the system. And compile it on every one of them. In Java, you do it once, with a minimum of attention to platform independence, and in most cases it will run on all platforms. I only once had to include a binary, and that was just for speed reasons, and I also had a pure Java method of doing the same on all the platforms I didn't have a binary for.

I mean, sure, to be able to run on any platform you have to compile the mentioned C program - but you have to do the same for the java program - every time you run the program.
No. No you don't.

Just distribute your C program so it self compiles(perl/python - your time to shine - or just 'make' :p ), you can easily get more than what java provides. The main problem is other OSes _er - windows_ not providing a compiler. True, ubuntu doesn't by default. But meh.
Not meh. If you have to install a compiler, you actually trash one of the biggest plus points of natively compiled software, namely "no additional software needed".

They both fill my kernel log with segfaults.
Then you probably just made some config or install errors. Or your kernel is buggy. I don't see that happening on my box.

because that shit actually IS (comparatively - no in depth studies taken) stable (opensim (second life server) was used as test material)
Funny, NczAll (a Minecraft Classic server) died once every two hours thanks to Mono.

Use C. High level languages don't make anything actually easier for you
Let's all pretend there's no such thing as, for example, the STL for C++. And let's all pretend that stuff like Maps do not make programming easier. And that it isn't easier when you have a fully optimized Map class at your disposal instead of having to make one yourself.
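To make that concrete, here's a minimal, hypothetical sketch of what "making one yourself" even begins to look like in plain C - made-up names, linear search only, and none of the optimization or safety a ready-made map class gives you for free:


Code:
#include <stddef.h>
#include <string.h>

/* A toy string -> int "map": an unsorted array searched linearly.
   A real replacement for std::map / HashMap would also need hashing or
   balancing, growth, deletion, iteration... */
#define MAP_MAX 64

struct entry { const char *key; int value; };
static struct entry entries[MAP_MAX];
static int count = 0;

static int map_put(const char *key, int value)
{
    for (int i = 0; i < count; i++)
        if (strcmp(entries[i].key, key) == 0) { entries[i].value = value; return 0; }
    if (count == MAP_MAX)
        return -1;                      /* full - a real map would grow */
    entries[count].key = key;
    entries[count].value = value;
    count++;
    return 0;
}

static int *map_get(const char *key)
{
    for (int i = 0; i < count; i++)
        if (strcmp(entries[i].key, key) == 0)
            return &entries[i].value;
    return NULL;                        /* not found */
}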

And, fuck whatever test results you've seen, C/C++/ASM are, and will always be, faster than Java. In practical applications.
Except when it's not. Because the language is not the important thing - how you use it is. You already know that, but heh, as long as you can repeat your incorrect assumptions to bash Java, all's great, right?

NOW THAT'S OVER, I THINK I'VE MADE MY POINT
I guess your point is "Hey I'm bored and I have nothing to do but hunt wild pointers. Let's make an out-of-place off-topic Java flame! Again!"

Also because it's new - not really better (tho I'm biased, I'm a hardcore C fan!), but newer.
New? Seriously? It is 28 years old, ffs!

I avoid everything else that's new. And I put 'typedef' infront of my structs, that would in C actually need that typedef(actually throws warnings in C++ with -Wall turned on).
Because you are supposed to put your typedefs in preprocessor statements.

Then, go learn an assembly language.
Learning ASM nowadays is about as useless as learning ancient Greek. Sure, it's nice to know if you want to feel really elitist, but there's absolutely no need for it.

And this wasn't very good
No argument there.

cd ~/;rm .bash_history;exit
history -c;exit
 
^ Not gonna bite...


*read more*...

Learning ASM nowadays is about as useless as learning ancient Greek.

... ok, you just changed my mind.


Yeah, there is no need for it. One can program in any other language without knowing a bit of ASM.


But knowing ASM is not useless.

  • If you understand ASM, you will learn C (incl pointers) in no time.
  • If you work in embedded programming, you might need to write optimized ASM code blobs for some bizarre architecture.
  • You can have fun stretching the limits of microcontrollers: http://www.linusakesson.net/scene/craft/ :p



Well, the last item is half j/k, but the first one was a big win for me, and the second one is the skill I can credit for getting my current job.
 
Learning ASM nowadays is about as useless as learning ancient Greek. Sure, it's nice to know if you want to feel really elitist, but there's absolutely no need for it.

Wait, so being able to wrap my head around ASM for work makes me elitist? Maybe I should learn ancient Greek too, in case I become a linguistic historian.


sweet :p


(On a more serious note, if there wasn't any need for ASM it wouldn't exist, and you've obviously never had to program directly in machine code - if you had, you would give the simplicity of ASM a great big hug.)
 
Yeah, there is no need for it. One can program in any other language without knowing a bit of ASM.


But knowing ASM is not useless.
I said it was as useless as ancient Greek. Knowing ancient Greek still has its uses, for example for translating really old stuff - it's just that your usual guy would not find any use for it. Same with ASM.
 
I said it was as useless as ancient Greek. Knowing ancient Greek still has its uses, for example for translating really old stuff - it's just that your usual guy would not find any use for it. Same with ASM.
Fair point, but then again your usual guy wouldn't have much use for Java/C :p


Though for the average programmer here (it seems everyone's pretty high-level compared to me) I don't see the point in specifically using a high-level language, though the OO techniques are useful whatever language you do them in (sure, if you tried hard enough you could do fully OO programming in ASM, it'd be a total bitch though).
 
Fair point, but then again your usual guy wouldn't have much use for Java/C :p
The thing is just that if you want to know ASM or if you have a real use for it, fine, use it. But listing it as one of the things someone who wants to learn programming should do is just wrong. If someone expresses the desire to learn a foreign language (as in speech), would you go on and tell him that he first should learn Proto-Germanic?

sure, if you tried hard enough you could do fully OO programming in ASM, it'd be a total bitch though
I doubt it. How would you, for example, use polymorphism? Maybe I'm just ignorant, but I can't see a way to do this if the language doesn't support it.
 
The thing is just that if you want to know ASM or if you have a real use for it, fine, use it. But listing it as one of the things someone who wants to learn programming should do is just wrong. If someone expresses the desire to learn a foreign language (as in speech), would you go on and tell him that he first should learn Proto-Germanic?


I doubt it. How would you, for example, use polymorphism? Maybe I'm just ignorant, but I can't see a way to do this if the language doesn't support it.

1: Definitely - you should only ever learn ASM if you need to.


2: In reality ASM is just a set of aliases for the machine's opcodes; C/C++ are built on top of ASM, so anything you can do in any language can be done in ASM.


So yes, you could technically do shiny OO concepts in ASM, but it's so unrealistic no one is crazy enough to do it.
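To give a rough idea of what the compiler does for you under the hood, here's a minimal, hypothetical C sketch of the usual trick - a table of function pointers per type; in raw ASM it would just be an indirect call through that table:


Code:
#include <stdio.h>

/* "Polymorphism by hand": every object starts with a pointer to a table of
   function pointers (a vtable). Calling through that table picks the right
   code at run time - exactly what C++ or Java does behind the scenes. */
struct shape;
struct shape_ops { double (*area)(const struct shape *self); };
struct shape     { const struct shape_ops *ops; };

struct circle { struct shape base; double r; };
struct square { struct shape base; double side; };

static double circle_area(const struct shape *self)
{
    const struct circle *c = (const struct circle *)self;
    return 3.14159 * c->r * c->r;
}

static double square_area(const struct shape *self)
{
    const struct square *s = (const struct square *)self;
    return s->side * s->side;
}

static const struct shape_ops circle_ops = { circle_area };
static const struct shape_ops square_ops = { square_area };

int main(void)
{
    struct circle c = { { &circle_ops }, 2.0 };
    struct square s = { { &square_ops }, 3.0 };
    const struct shape *shapes[] = { &c.base, &s.base };

    for (int i = 0; i < 2; i++)               /* dynamic dispatch */
        printf("area = %f\n", shapes[i]->ops->area(shapes[i]));
    return 0;
}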
 
Lerning ASM nowadays is about as useless as learning ancient greek. Sure, it's nice to know if you want to feel really elitist, but there's absolutely no need for it.

If you want YOUR points to be taken seriously (and they're mostly valid and should be) then you should be careful about making statements that are eerily similar to the kind you're refuting. In this case you're showing an extreme ignorance - an extremely common ignorance, but ignorance nonetheless.


Sometimes dropping down to ASM is not just advantageous but required. Some machine-specific constructs are simply not accessible in a high-level language, often not even via intrinsics. But that's a pretty superficial reason to use ASM and only applies to sparing cases, so let me give you something a little more worth thinking about...


Like a lot of people here, I do embedded programming for work. Here C is king and as time goes on it only seems to take more and more share, but a lot of people still use ASM in some capacity. For example: in a hard real time system I absolutely must service all interrupts (only important and fairly periodic things cause interrupts in this system) or I'll miss timing requirements. At the same time, I have to support the capability to write to EEPROM. The way the EEPROM interface works, I may space a page's byte writes by no more than 30us or the page programming phase will be entered. This means that I must make sure that interrupts can't take more than 30us.


30us might seem like a lot, but on a 25MHz CPU it really isn't. And I'm cutting it pretty close - after very tightly optimizing the interrupt routines in ASM. If the routines were in C there'd be a ton of requisite boilerplate to interface with a dispatcher (which itself must be in ASM) and there's no way I'd make it, even if the compiler didn't suck.


Here's another example: my bootstrap POST code checks all of SRAM for errors. Try doing a C routine that you can easily guarantee will be entirely register-resident. You can't.


Yet another example: at one point I had to interface a SPI-like peripheral over software (bit-banging). But the only way I could meet timings was to make sure that a cache miss didn't occur while I was in the middle of a clock iteration. The only way to do this was to a) use a diagnostic interface to set cache lines and tags manually (the CPU had no code-prefetch instruction) and b) make sure my code was aligned, compacted, and all around organized just right to get things fitting. Of course, this was before the general optimization that would have needed to be done. The end result: something that was originally written in C was now written in ASM.


And that optimization part is still relevant. For decades now people have believed that compilers can outdo programmers universally, but this is a myth. For one thing, not every compiler is going to be state of the art. GCC does much better on x86 than on most other platforms, but it goes further than that - in one instance GCC 4 is available for the processor I'm using but I had to fall back on GCC 3 because they cut an (arch-specific) feature I needed in 4. And GCC 3 is behind in a lot of ways, but even if it weren't, this fails in ways that are not the GCC devs' fault but still a fault of the compiler; namely, the maintainers of the port to this architecture failed to properly tell GCC anything about the timing of the instructions (or if they did, GCC just doesn't care). As a result the scheduling is terrible and it fails on simple things like working around load-use, FPU latency, and condition code interlocking. Of course, like usual, it also has sub-par register allocation and keeping constants in registers - things that don't hurt you as much on x86 but hurt GCC's performance on more RISC-like platforms. Simply put, beating the compiler is not even remotely challenging here, but I think that experienced assembly programmers today can almost always beat it. OoOE processors don't really make this harder, they actually make it easier; it's the multi-issue in-order processors (like Cortex-A8 and Atom) that are difficult to write efficient assembly for.


When you write a lot in assembly you really start to see more just what it means to write this kind of micro-optimized code.. because it's not merely a matter of translating a program to assembly, but it works in the opposite direction too. What I mean is that you will end up subtly modifying your original algorithm to fit better in the assembly space, with the final result being a convergence in between. It's this feedback that's important in lots of optimization problems (for instance, try designing a FIR filter with quantization and you'll get better results if the quantization is part of the iterative approximation and not just applied afterwards). This feedback path is something that compilers just can't participate in.


Simply put, anyone who does hard real-time embedded for a living isn't worth their salt if they don't understand their CPU at the machine code level. They'll reach the situations above and have no idea how to proceed. Or even worse, they'll never even consider them in the first place and be completely oblivious when their verification massively fails. Sadly, the CS industry all but ignores embedded, which is almost a totally separate sub-culture more familiar to the EEs who design the hardware the embedded code runs on. At best it's considered a niche, but the reality is that it runs on literally billions of devices and makes the modern world work in a capacity that IMO outreaches any commodity applications.
 
In all actuality, MOST programmers don't need to use assembly. There are just some fields where it's absolutely important and therefore shouldn't be called pointless. Either way, it's good to know for indirect reasons, hence why most universities still teach it.
 
Lerning ASM nowadays is about as useless as learning ancient greek. Sure, it's nice to know if you want to feel really elitist, but there's absolutely no need for it.
Yeah, because writing emulators is totally useless. Nobody would ever want to do that.
 
Klaue made the same mistake as so many in this thread have: overgeneralized and overbroad statements that only hold water with a certain set of preconditions. Leaving out those little details is what really should be avoided, even if it makes your argument weaker. It only serves to make the conversation less constructive. :(


EDIT: typo
 
First off, I didn't mean to - lol - offend anyone with my Java rant. Note to self: Do NOT go in the dev section after, during, or before a loong (C/C++/ASM) coding session (erm - not at all?). After this message, I will NOT post anything Java-related in this thread - at all. I suggest everyone else does the same - but I won't stop you nor comment. Argh, how do I make this NOT sound like I'm trying to monopolise C/C++? Because that's just not it.


I said at the top of the post that some of the stuff in my post was not going to directly target points raised in this thread (yeah, bad sign :( )


Klaue: I agree, I majorly derailed. And I'm still off the tracks ;) . If you'd like me to just obliterate all traces of Java-flaming, I'd be more than happy to. I really don't see why I feel the need to flame Java, but it's always there. I guess I'm just jealous of those with the epic gaming rigs that can run x, and I can't, on my aging, yet still powerful, computer. I'm not being sarcastic here, I'm not trying to subtly add a point. (If I had some DDR3 I'd probably not have any issues; one of the biggest things that makes C faster is that the RAM isn't getting hammered as much.)


I never assumed all Java programmers were complete morons (I _knew_ it - just kidding :p . Some of you are the smartest people I know). But I know that this has happened (again - by someone who is rather intelligent).


It's gonna be more than 1% for exotic platforms if you're doing anything other than games - there are so many microcontroller projects out there. The PC is no longer the only platform. Furthermore, (IIRC - and it might have changed) Apple's App Store only allows programs written in Objective-C - so there's a platform with millions of users. So tell me now that Java works everywhere useful (though it's rather hypocritical for me to mention this, since I hate Apple, and especially their policies). True, you can't port a C program to the iAnything either without work - for one, you gotta make it compile in an Objective-C compiler. Then you gotta not use any of your own, non-Apple libraries, etc...


And knowing ASM is *very* helpful when optimising. Don't say optimisation is useless. It saves battery life, for one. And makes your music skip less (so does renice, for those having issues with that). It allows your program to *actually do stuff* in some cases. Just using CPU because you can is the worst thing a programmer can do (this is not the same as using CPU because using less would require a lot of extra programming). Plus, learning ASM first makes you appreciate C/C++. It won't make you think C/C++ is the hardest thing on the planet, because you'll know about ASM (I'm not making it out to be the most impossible thing to work in, it's not really _that_ hard, but organisation is, arguably, a lot harder).


Learning ASM is good for improving security. While buffer overflows are pretty much not an issue with things like mudflap, canary values, etc. already set up for you (except on embedded systems, where those protections are not an option due to the overhead), without ASM you wouldn't know how to fix the issue (besides the blatant ones). Or to exploit it - yes, ASM is good for breaking security too: www.smashthestack.org .


Also, knowing ASM allows you to, well, use it. You don't need to know how to cook to live, but it's good to have, so when life gives you roadkill, you can make dinner (ew!). I have performance issues even coding in C, when writing a synth for the Arduino (speaking of roadkill). I had to tighten my code considerably, and more needs to be done (yeah, I'll admit my code sucks!), but I still would not achieve the performance I could with raw ASM. There are a lot of things I can see I could get around in ASM. Some of the changes I made to the C program I would not have known about without knowing (basic) AVR assembly. The interrupt has to complete in 32 microseconds (31250 Hz interrupt, 16 MHz CPU clock). I do 3 multiplications in the ISR (IIRC this is okay-ish since there's a two-cycle multiplication instruction, don't quote me on that; my result goes in a 16-bit value though, which could make that instruction useless), one for each sound channel, as part of the volume control. This is a lot faster than alternatives (like modulo - hey, does anyone know if that would actually work? I tried it first and got noise, so I just assumed I ran out of CPU time, like I predicted I would). That doesn't give me a lot of instructions (well, it actually gives me plenty to do the work, it's all comparative).
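To show the shape of the idea, here's a made-up sketch (NOT my actual synth code - names and channel count are invented, and I'm assuming 8-bit unsigned samples and 8-bit volumes):


Code:
#include <stdint.h>

/* Hypothetical per-channel volume scaling: each channel costs one 8x8
   multiply whose 16-bit result gets shifted down, instead of a divide
   or modulo. */
#define CHANNELS 3

volatile uint8_t sample[CHANNELS]; /* raw oscillator output per channel */
volatile uint8_t volume[CHANNELS]; /* 0..255 volume per channel         */

/* called from the ~31250 Hz timer ISR */
static inline uint8_t mix(void)
{
    uint16_t acc = 0;
    for (uint8_t i = 0; i < CHANNELS; i++)
        acc += ((uint16_t)sample[i] * volume[i]) >> 8;  /* scale by volume */
    return (uint8_t)(acc / CHANNELS);                   /* keep it in range */
}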


Anyone who says you "absolutely" don't need to learn ASM doesn't know ASM. So, as if you were my friend, which is how I will treat you: http://homepage.mac.com/randyhyde/webster.cs.ucr.edu/www.artofasm.com/DOS/fwd/fwd.html. A bit dated but still a good reference. The way I first learned any 'ASM' was in raw machine code (POKE'd in the opcodes) with a (real) C64 and a (real - btw, not being elitist with the mention of 'real') copy of the "Commodore 64 Programmer's Reference Guide", which can be found online. If you wanna try it, get a copy of VICE and poke in the program, starting at, idk, $C000 (hex C000, or dec 49152):



Code:
INC $D020   ; increment the border colour register

JMP $C000   ; jump back to the start, forever

You need to convert the opcodes and addresses into decimal to POKE them. You'll need to split the addresses into two bytes (the C64/6502 is little-endian, which means the least significant byte goes first). For extra vintage flavor, no calculators ;) .
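(If you want to check your arithmetic afterwards - and don't just take my word for it, redo it yourself: INC absolute is opcode $EE and JMP absolute is $4C, so the six bytes are EE 20 D0 4C 00 C0, which you'd enter as POKE 49152,238 : POKE 49153,32 : POKE 49154,208 : POKE 49155,76 : POKE 49156,0 : POKE 49157,192.)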


Then just SYS 49152.


Press keys and note the change of effect as the keyboard interrupt changes timing.


Yay - an ASM tutorial. Outdated, rather useless (tho it's a start for more useful projects - go out there and get an Arduino/AVR dev board (the Arduino is actually inconvenient for ASM - I've never done anything besides inlines, so I can't help)), but now they're not all C++ ;) . (Definitely not trying to snipe the first poster - or anyone!)


There are reasons not to write your entire project, or any part thereof, in ASM - YES, it is more difficult than a high-level language. OK, I was too extreme in my earlier post - you can gain from higher-level languages, but spending a little more of your time can save a lot. Just like when a greenie asks you to walk to work instead of driving your V8 monster. I don't write whole projects in ASM (but when I finally get my parts I'll be running both my Z80 computer and DIY CPU with pure ASM - partly because the systems are too slow (you can't get insane clock rates on protoboard; the sharp rises cause too much interference, and using a slew limiter (RC lowpass filter) would just make stuff too complicated and unreliable (it would need Schmitt buffers); also, delay along the wires becomes a problem in a DIY CPU at high speeds), and also because I can't be bothered to set up/write my own C compiler). I love ASM; as gfrancisdev said (not exactly), the simplicity is refreshing (I do believe he is my brother!). But I love C so much, I write in C - just like Java addicts are Java addicts, I'm a C addict. Like all addicts, I naturally recommend 'trying this shit' over anything else.


I never _meant_ C++ was 'new', but newer. People seem to always jump to the newest thing (like the people saying the Pandora is slow?).


"Hey I'm bored and I have nothing to do but hunt wild pointers. Let's make an out-of-place off-topic Java flame! Again!" - yep.


I left out an important statement: use Java if you will.


I'm not stopping anyone from either ignoring my post or using Java. But I'm recommending (yeah, quite forcefully) against it. I'll post what I wanna post. It's possible to add me to an ignore list, I believe. Feel free, I really don't hold grudges. I'd never be able to contact you anyway, so what's it matter?


The word 'Java', in this context, is connected to a Non-Maskable Interrupt in my brain, which, unlike on x86 (yeah, kinda contradictory, huh), cannot be masked. Idea: make a huge "Don't use Java, it's shit, and why" page on another site, and link to it in my "ISR". I'm not hinting by this analogy that people need to learn more about their hardware; you really don't need to understand this analogy to be a programmer.


Yes, yes, Java is just a language. But have you ever tried compiling (more than Hello World) Java code to an actual binary, and had it work properly? No, because see below. Have you tried the inverse, compiling C to bytecode? Yeah, the result is not exactly a rainbow either (Pawn, while it does everything that it aims to, sucks).


My biggest hatred is not that it's slow, but that it's proprietary (well, mostly. There's OpenJDK, but I've even heard veteran Java programmers complaining (I agree) about that - hey, does Ubuntu still use OpenJDK by default? IIRC I had to enable the partner repo to install Sun Java, which a lot of users will not be doing).

Learning ASM nowadays is about as useless as learning ancient Greek. Sure, it's nice to know if you want to feel really elitist, but there's absolutely no need for it.
Yeah, because writing emulators is totally useless. Nobody would ever want to do that.
... :D


And Klaue: I for one did not learn ASM to feel elitist (I learnt it to get stuff done, because on the C64 I didn't have a C compiler). It's rather wrong for you to 'assume' that all ASM programmers are elitist, isn't it?


And not all C programmers are 'elitist' either. I don't consider myself a better programmer because I know and use C. I didn't want to make it sound that way. We're not better 'people'. We're not more intelligent. I have a friend who uses 'Game Editor' (his games are awesome - yet slow and CPU-consuming - that's what happens when you interpret C), and I hate it when he says I'm elite for making x with only C code. Game Editor is not recommended, but it might be a good start for people who just wanna write a game with minimal coding. I've done this (just remember, if you're gonna do C later, try to get GE out of your mind while C programming). No Pandora target though.


Anything can be done in ASM. Anything. ASM is executed by your processor (well, machine code is, so for example NOP is, on x86, 0x90 in machine code). C++ is not, so your compiler just does what an ASM programmer does in ASM. Just looking at what the hell polymorphism actually is (yes, I really am NOT a C++ programmer. We don't use big words in C, cuz we're stupid n stuff :) . I know what polymorphic code is, as in code that modifies itself, and I have written code that does this (6510 ASM). Doing that in C/C++/Java would be hard/possibly impossible (lol!)) - http://en.wikipedia.org/wiki/Type_polymorphism - you don't really need it. The type of a variable would never change during runtime, and I fail to see where you'd use this. Possibly something like this, which I encountered recently:


In my x86 OS (bad code (I admit it!), not ready for release, but I can give pics, or a disk image), I got video by using a QEMU-specific hunk of code. It worked nicely, I wrote some functions to draw, all was good, I made it draw icons/windows that could be added/removed freely. Then I wanted it to run on hardware, so I wrote some VGA stuff to get a VGA mode. I wanted to make my previous functions work, so I could just set VGA and run my regular graphics stuff.


So I made a function pointer for each draw function, in the spec taking an unsigned long for color. The function pointer would point to the 32bpp drawing code by default. When entering VGA mode, I'd change these pointers to my VGA functions, which take an unsigned char as a color argument (yeah, I can see a problem with this, I need to go study the ASM output when I get the time - no, my code does not work :p . I didn't quite expect it to. I can get around this with ease by using the unsigned long), then call my graphics loop. It works, no faults are raised, but the screen fills with garbage (the colors would of course be wrong - but I think I might have actually just drawn beyond 320x240, screwing up the array - for an OS I really should be paying more attention XD).
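In case the setup isn't clear, here's a minimal sketch of the idea (made-up names, NOT my actual OS code) - with both backends taking the same unsigned long, since pointing the same function pointer at a function with a different parameter type (as I did) makes the call undefined behaviour, which is presumably where my garbage comes from:


Code:
/* Minimal sketch of swapping draw backends through a function pointer. */
typedef void (*putpixel_fn)(int x, int y, unsigned long colour);

static void putpixel_32bpp(int x, int y, unsigned long colour)
{
    /* ...write a 32-bit pixel into the linear framebuffer... */
    (void)x; (void)y; (void)colour;
}

static void putpixel_vga(int x, int y, unsigned long colour)
{
    /* VGA mode: one byte per pixel - truncate the colour inside the
       function instead of changing its signature */
    unsigned char c = (unsigned char)colour;
    /* ...write c into the VGA framebuffer... */
    (void)x; (void)y; (void)c;
}

/* default backend; the rest of the graphics code only ever calls this */
static putpixel_fn putpixel = putpixel_32bpp;

static void enter_vga_mode(void)
{
    /* ...program the VGA registers... */
    putpixel = putpixel_vga;   /* same signature, so the swap is safe */
}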


Yes. Yes you DO have to compile a Java program each time you run it, while running it. That's no secret little thing that Java tries to hide.


Are you sure that that Minecraft server wasn't just dying because the server itself is a little dodgy?


Klaue, my kernel is as stable as Linux has ever been. My RAM is fine, too. I can run memtest all night without any errors. No other programs segfault. Are you on Linux x86_64? I'm using (X)ubuntu's packages, in the partner repo, IIRC. For the record, I have used other Linuxes. If it weren't for my need of several things for


About straw man arguments, reread _your_ post, please.


I hope you weren't hinting that history -c makes you superior. I actually wouldn't do either, my .bash_history is symlinked to /dev/null.


urjaman: YAY - Craft ftw! I'm so jealous of Linus' l33t skillz. Demo awesomeness increases exponentially with limitedness of the hardware :) , even while ignoring the fact that the hardware is limited.


And I agree, knowing ASM makes pointers much easier to understand. But I can't recommend learning ASM just to shake out not knowing pointers, unless it's your _only_ problem. ASM can be overwhelming - more so depending on your architecture - VAX? Ugh. Good thing nobody here is using a VAX (I hope) ;) .


I really do love these forums. They're entertaining. Really, flamewars sure beat what i'm supposed to be doing!


If anyone would like to make a rebuttal, or comment, PM me, since this IS getting out of control. Yeah, I should not have posted..


XxionxX: You should learn ASM, yes, but you should not necessarily use it as your primary language. My biased opinion is that you should use C, of course, and I can say there's definitely nothing wrong with it for game programming. But there's no harm done in using a C++ compiler to compile your games instead of a C compiler. Way back when I first started with SDL, my program refused to compile with a C compiler - I was no doubt doing something stupid, but the compiler was saying there were issues in the library file (SDL/SDL.h), so being a noob, I said wtf and used g++, where my C worked perfectly. Plus doing this would let you tell your boss that you're using C++, to please him, if you ever do programming for work. He often won't know any better. new/delete are actually very nice additions, but things like streams (which look like bitshifts, gets me every time) are not all that useful to a game programmer.


Jump in and try C. If you don't like it, try ASM, then you will (nah, ASM ftw) try alternatives until you find something that you like. I missed the ultimate goal - make an awesome game (but the biggest thing to remember is that your first game will no doubt be crap, and that's normal. Nobody liked Super Hit-Enter-And-Get-an-Error-in-C64-Basic Brothers either (no, I didn't actually name it that). Don't try Tetris first, I really don't get why people recommend it. Go for moving characters around with the keyboard. Then build up on that.)


I'm going to go on a looong C/C++/ASM coding session (see top).


*Add Reply* (Todo: regret - then, realise I don't regret anything)


EDIT: Wtf, how did I miss Exophase's post? That makes my pro-ASM arguments redundant (NO, it wasn't otherwise). Also, hi Exophase!
 