Story of Mel. (Machine Code)


mjohansson

Thought this was funny so I thought I should post it here.

http://www.catb.org/jargon/html/story-of-mel.html

Snippet:

"A recent article devoted to the macho side of programming

made the bald and unvarnished statement:

    Real Programmers write in FORTRAN.

Maybe they do now,
in this decadent era of
Lite beer, hand calculators, and “user-friendly” software
but back in the Good Old Days,
when the term “software” sounded funny
and Real Computers were made out of drums and vacuum tubes,
Real Programmers wrote in machine code.
Not FORTRAN.  Not RATFOR.  Not, even, assembly language.
Machine Code.
Raw, unadorned, inscrutable hexadecimal numbers.
Directly."
 
It's kind of an anti-open-source rant though. Good open source code should be easy to grok, so that future maintainers can easily take over the project and improve it. It's a pretty bad idea in closed source too, since it means you can never move on from your old project and get promoted within the organisation, handing responsibility over to a new maintainer.

All the Jargon File stories are worth reading as historical artifacts though. My personal favourite is "A Story About Magic".
 
Can you imagine anything else being designed like that, though?

"I'm sorry, we can't install a new dishwasher. The old one was integrated directly into the wall, the wiring used was of the exact parameters for this make and model, it's all very elegant and efficient and probably saved a few hundred dollars in construction but it prevents us from changing it."

I'm sure the hand-optimized code ran far better than the compiled code, but at the end of the day it was just a blackjack program; did it really need to run in under a second? Were those few extra cycles worth needing to rewrite it from scratch, and making it impossible to modify correctly later, even for the original developer?

This isn't a tale about an artist ahead of his time, it's about the dangers of premature optimization and an egomaniac that didn't know the difference.
 
"Real programmers don't comment their code. If it was hard to write, it should be hard to understand."

...
 
This isn't a tale about an artist ahead of his time, it's about the dangers of premature optimization and an egomaniac that didn't know the difference.
I disagree; I would certainly call it a tale of an artist, though I wouldn't have said he was ahead of his time. The fact that his software engineering methodology is not commercially viable in the modern world doesn't detract from the artistry of it at all.
- Neelix
 
This isn't a tale about an artist ahead of his time, it's about the dangers of premature optimization and an egomaniac that didn't know the difference.
I disagree; I would certainly call it a tale of an artist, though I wouldn't have said he was ahead of his time. The fact that his software engineering methodology is not commercially viable in the modern world doesn't detract from the artistry of it at all.
- Neelix
Fine. It's both the tale of an artist and of the hazards of premature optimization in the hands of said artist.
 
It's kind of an anti-open-source rant though. Good open source code should be easy to grok, so that future maintainers can easily take over the project and improve it. It's a pretty bad idea in closed source too, since it means you can never move on from your old project and get promoted within the organisation, handing responsibility over to a new maintainer.
Sounds like you're reading a lot into it... what I get from it is simply "check out these cool tricks this guy did; he was really good at programming these weird old computers". The part about real men writing without source code isn't meant as actual advice.

Can you imagine anything else being designed like that, though?

"I'm sorry, we can't install a new dishwasher. The old one was integrated directly into the wall, the wiring used was of the exact parameters for this make and model, it's all very elegant and efficient and probably saved a few hundred dollars in construction but it prevents us from changing it."

I'm sure the hand-optimized code ran far better than the compiled code, but at the end of the day it was just a blackjack program; did it really need to run in under a second? Were those few extra cycles worth needing to rewrite it from scratch, and making it impossible to modify correctly later, even for the original developer?

This isn't a tale about an artist ahead of his time, it's about the dangers of premature optimization and an egomaniac that didn't know the difference.
How do you know that his optimizations only made an unimportant difference? This was back during a time when a computer playing blackjack was a popular attraction at conventions. Things were different then.

It also never said the original developer couldn't modify it; it said that the developer who was brought in afterwards took a while to understand what the code was doing and then lied about not being able to change it. I'm not sure I buy this story that the code only ever existed as hex the person wrote directly, if so yeah that's pretty stupid. More likely the guy who wrote it didn't do a good job passing on whatever source/documentation he had to the company when he left, and even if he did pass it on, he probably could have written better comments/docs while he was there. But that doesn't mean his optimizations were all a waste.

The whole process of writing code in whatever way seems obvious, then profiling it and optimizing the bits that need it, may work best for a lot of software out there, but sometimes "premature" optimization that involves thinking about the high level structure and efficiency of your design before you code it goes a long way. In the real world not everything follows the 80/20 rule, and it's this assumption that gives us things like emulators that'll never really be very efficient and suffer in the real world for it. Back in Mel's day this applied to a bigger class of problems.
 
How do you know that his optimizations only made an unimportant difference?
It was a blackjack program running on vacuum tubes that was then ported (i.e., totally rewritten) to a transistor-based platform. While I can't find any hard numbers on how the RPC-4000 performed, I'd be very surprised if the transistor upgrade didn't bring at least some improvements that would have balanced out any inefficiencies from more readable code. That's just a guess though. Even if not, it's a blackjack game; it's not a death knell if it takes an extra second to play the next card. As a demo application, while it could have been disastrous if it barely ran, I don't believe for a second that the only reason it ran in reasonable time is that it was written entirely in hand-optimized machine code, but I do believe it would have been more beneficial if the source had been kept simple.
It also never said the original developer couldn't modify it
It claims Mel "accidentally" made a mistake so that it would cheat in favor of the computer instead of the human. That's a pretty big mistake to make. Taking it at face value it suggests even Mel couldn't properly modify his code. Whether that's true or Mel intentionally made the mistake in rebellion is up for debate; given the way the author says Mel reuses instructions as operands I would actually wager that no mistakes were made, but we can't know that, we only have the author's narrative to go on.
I'm not sure I buy this story that the code only ever existed as hex the person wrote directly, if so yeah that's pretty stupid.
The story says it is so; that's what I'm basing my assertion on. It says Mel hated compilers, hated optimizers, and knew exactly how long every instruction would take, and in my search trying to find performance data for the RPC-4000 I've found more than a few additional articles on Mel Kaye: he's quite infamous for his optimizations, including multiple instances of the aforementioned "instruction as operand" hack.
sometimes "premature" optimization that involves thinking about the high level structure and efficiency of your design before you code it goes a long way.
Agreed. I wouldn't count writing it in hex and taking advantage of loop quirks as part of that, however.
 
How do you know that his optimizations only made an unimportant difference?
It was a blackjack program running on vacuum tubes that was then ported (i.e., totally rewritten) to a transistor-based platform. While I can't find any hard numbers on how the RPC-4000 performed, I'd be very surprised if the transistor upgrade didn't bring at least some improvements that would have balanced out any inefficiencies from more readable code. That's just a guess though. Even if not, it's a blackjack game; it's not a death knell if it takes an extra second to play the next card. As a demo application, while it could have been disastrous if it barely ran, I don't believe for a second that the only reason it ran in reasonable time is that it was written entirely in hand-optimized machine code, but I do believe it would have been more beneficial if the source had been kept simple.
It also never said the original developer couldn't modify it
It claims Mel "accidentally" made a mistake so that it would cheat in favor of the computer instead of the human. That's a pretty big mistake to make. Taking it at face value it suggests even Mel couldn't properly modify his code. Whether that's true or Mel intentionally made the mistake in rebellion is up for debate; given the way the author says Mel reuses instructions as operands I would actually wager that no mistakes were made, but we can't know that, we only have the author's narrative to go on.
I'm not sure I buy this story that the code only ever existed as hex the person wrote directly, if so yeah that's pretty stupid.
The story says it is so; that's what I'm basing my assertion on. It says Mel hated compilers, hated optimizers, and knew exactly how long every instruction would take, and in my search trying to find performance data for the RPC-4000 I've found more than a few additional articles on Mel Kaye: he's quite infamous for his optimizations, including multiple instances of the aforementioned "instruction as operand" hack.
sometimes "premature" optimization that involves thinking about the high level structure and efficiency of your design before you code it goes a long way.
Agreed. I wouldn't count writing it in hex and taking advantage of loop quirks as part of that, however.
I think you're making a lot of assumptions and not really seeing this in the context in which it was done.

When I say that writing in hex seems silly, all I mean is that there's not much point in using numbers over names for instructions. That's actually not really much of a distinction though. The RPC-4000 has a very regular instruction format: every instruction is nothing more than an opcode, an operand address, and a next-instruction address. So the assembly code actually doesn't read that much differently from hex to begin with, except for using names for opcodes and labels for locations. And even the example code provided uses plenty of hard-coded addresses. I wouldn't totally rule out that Mel was writing something like this that was later lost, leaving them with only a raw instruction listing to program the machine with.
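To make that concrete, here is a rough sketch in Python of what I mean; the opcode values and field widths here are invented for illustration, not the real RPC-4000 encoding:

    # An instruction is just (opcode, operand address, next-instruction address),
    # so "assembly" versus raw numbers is mostly a matter of names for the fields.
    from collections import namedtuple

    Instr = namedtuple("Instr", ["opcode", "operand", "next_addr"])

    MNEMONICS = {0x05: "ADD", 0x0B: "STORE", 0x10: "JUMP"}   # invented values

    def disassemble(word):
        """Unpack a hypothetical packed word into its three fields."""
        opcode = (word >> 26) & 0x1F        # field widths are guesses
        operand = (word >> 13) & 0x1FFF
        next_addr = word & 0x1FFF
        return Instr(MNEMONICS.get(opcode, f"OP{opcode:02X}"), operand, next_addr)

    word = (0x05 << 26) | (0x0123 << 13) | 0x0456
    print(disassemble(word))   # Instr(opcode='ADD', operand=291, next_addr=1110)

Whether you stare at the packed number or the unpacked tuple, it's the same three fields; the "assembly" is mostly just the names.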

It's naive to think that moving from vacuum tubes to transistors meant they could just stop caring about writing efficient code... both machines used magnetic drums, and putting instructions and data in the right locations was extremely important for performance. If you did it wrong you would suffer tremendously, and while you think it doesn't matter if their blackjack program was slower, I think it does; the thing was an attraction, and average playtime translated directly into how many people would see it in the limited window they had to promote the machine.

Yeah, there was an optimizing assembler that tried to put stuff in good locations, but you have no idea how well it really did - this is actually a complex problem, and to do a good job you really need to know what the locality of the data and code is like, something that would have been far beyond the optimizer's capability, especially when you consider how little memory it would have to work with. And besides that, think about the overhead of actually running the assembler. What hardware do you do it on? Does the company have a dedicated (at the time very expensive) mainframe just for this purpose? Is it something you have to spend a considerable amount of time loading the assembler on, entering the program, and waiting for its output? Something which has to be done even if you just change one line? This is nothing like the later era where you could quickly and efficiently enter, store, modify, and rebuild the program on the same machine you were running it on. We're talking about a very real tradeoff, all for something that may do a substantially worse job than what an expert like Mel could do on his own.
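To give a feel for what that placement problem looks like, here's a toy model in Python; the drum size and timings are made up, but the shape of the tradeoff is the same:

    # After an instruction finishes, the drum keeps turning, and you wait
    # until the word you need passes under the head. All numbers invented.
    DRUM_WORDS = 64                           # words per track (made-up size)

    def wait_time(head_pos, target_addr):
        """Word-times spent waiting for target_addr to reach the head."""
        return (target_addr - head_pos) % DRUM_WORDS

    # The head is at word 13 when an instruction finishes: placing the next
    # word just ahead costs almost nothing, just behind costs a full turn.
    print(wait_time(13, 14))                  # 1  -> well placed
    print(wait_time(13, 12))                  # 63 -> wait nearly a whole revolution

Doing that well for every instruction and every piece of data, within the memory the assembler itself had to run in, is the part I'm skeptical it did as well as someone like Mel could by hand.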

Another thing you have to consider is that memory capacity for these machines was very limited. So on the one hand, you really wanted to get the most out of what little space you had. On the other hand, the overall scope you could work within was so limited that you weren't exactly spending a ton of extra man-hours making a more clever program.

Then, for all Mel hated compilers, he didn't mind writing them. I think he just knew how to use the right tool for the right job. You assume that had he done it another way his bug wouldn't have happened or someone else would have fixed it better, but you don't really know. You're not really in much of a place to second-guess what some guy was doing with 60-year-old computers.
 
It's naive to think that moving from vacuum tubes to transistors meant they could just stop caring about writing efficient code
I didn't say "stop caring"; I meant that it should have allowed them to relax the optimization at least somewhat so they could write maintainable code. This is getting way too open for my liking. Can I back up to the original statement? Because this is getting way too meta for me, sorry.
It was a blackjack program intended to demo the system, not some real-time financial calculation for tracking stock prices, nor a hospital monitor, nor anything else that absolutely needed to run within the smallest space and time constraints possible. Blackjack: an array of cards, a random shuffle, and some basic arithmetic for keeping score. I don't believe at all that such a program was taxing on their CPU. It was just a quick little demo for marketing teams to play with at tech shows to prove that it could do things. If it needed to be hand-optimized to perform this task with any reasonable efficiency beyond naive implementations, then no one would have been able to write anything more complex without Mel's level of artistry. This obviously wasn't the case; I'm sure the author and every other non-Mel developer were churning out perfectly reasonable code in a standard language that was then compiled down to machine code that ran in a reasonable length of time.

I'm certain the result of Mel's optimization absolutely had better performance than anything the compilers and optimizers could dish out, but it didn't need to. As the author tells it, the program was unmaintainable and performed far beyond the scope of the project, to the point that in my opinion his optimizations actually detracted from the goal of the program. Adding functionality to the demo should have been trivial; even in FORTRAN, mucking about with the shuffle probabilities should have been a 10-minute job, and Mel presumably got it wrong. There's no mention of how long that took, but the author claims to have spent weeks just analyzing the code Mel created before backing away and claiming it couldn't be done. The optimizations Mel made, while surely impressive, cost at least hours if not days of labour for something that should have been banged out in an afternoon with naive assembly code, and in even less time with something more manageable like FORTRAN.

Was he an artist? Sure. Did his code run better than if it were machine optimized? Absolutely. Did he do the right thing in hand optimizing the code? I don't think so.
 
It has been said before and needs to be said again: this is not meant as instruction in program development. It is a shaggy dog story.

Second thing that needs to be said: when you have a list like that, with stuff in the format "Real Men use...", it is usually not meant as advice or a role model either. It is actually not preferable to be a Real Man, particularly not in a hacker context, where "Real Men" is shorthand for "Braindead Jocks" :D

Third thing: the CACM (Communications of the ACM - the Association for Computing Machinery) has its archives online. You can read the first issue, from 1958. It takes a year or so before you find the first letter to the editor complaining that the kids today don't learn Real Programming, since they're using these newfangled Compilers that produce Inefficient Code, and since they're not writing the opcodes themselves, they really don't understand what their programs do.
 
I didn't say "stop caring"; I meant that it should have allowed them to relax the optimization at least somewhat so they could write maintainable code. This is getting way too open for my liking. Can I back up to the original statement? Because this is getting way too meta for me, sorry.

It was a blackjack program intended to demo the system, not some real-time financial calculation for tracking stock prices, nor a hospital monitor, nor anything else that absolutely needed to run within the smallest space and time constraints possible. Blackjack: an array of cards, a random shuffle, and some basic arithmetic for keeping score. I don't believe at all that such a program was taxing on their CPU. It was just a quick little demo for marketing teams to play with at tech shows to prove that it could do things. If it needed to be hand-optimized to perform this task with any reasonable efficiency beyond naive implementations, then no one would have been able to write anything more complex without Mel's level of artistry. This obviously wasn't the case; I'm sure the author and every other non-Mel developer were churning out perfectly reasonable code in a standard language that was then compiled down to machine code that ran in a reasonable length of time.

I'm certain the result of Mel's optimization absolutely had better performance than anything the compilers and optimizers could dish out, but it didn't need to. As the author tells it, the program was unmaintainable and performed far beyond the scope of the project, to the point that in my opinion his optimizations actually detracted from the goal of the program. Adding functionality to the demo should have been trivial; even in FORTRAN, mucking about with the shuffle probabilities should have been a 10-minute job, and Mel presumably got it wrong. There's no mention of how long that took, but the author claims to have spent weeks just analyzing the code Mel created before backing away and claiming it couldn't be done. The optimizations Mel made, while surely impressive, cost at least hours if not days of labour for something that should have been banged out in an afternoon with naive assembly code, and in even less time with something more manageable like FORTRAN.

Was he an artist? Sure. Did his code run better than if it were machine optimized? Absolutely. Did he do the right thing in hand optimizing the code? I don't think so.
What I'm getting out of this is you saying Mel should have used FORTRAN before a FORTRAN compiler even existed for the computer (as evidenced by the author saying he was hired to write the FORTRAN compiler). The first version of the program for the older computer may well have been written before FORTRAN even existed. Assembly was the only realistic option.

Yeah blackjack may seem trivial now, but we don't really know what it was like coding on this thing and what its I/O was like. What we do know is that it got a lot of interest at conventions and, according to the author, was their most popular program. If it was so trivial, why was it so impressive? Maybe because computer programs weren't awfully interactive back then and a game was an interesting application, and maybe some of that CPU time was needed to manage the I/O properly.

It seems to me that the first-order problem here is that Mel should have documented his code better. Given that the author was eventually able to figure it out, he would probably have gotten it pretty easily with the right documentation.
 
What I'm getting out of this is you saying Mel should have used FORTRAN before a FORTRAN compiler even existed for the computer (as evidenced by the author saying he was hired to write the FORTRAN compiler).
No, he could have written it in assembly. Hell, he could have written it in machine code that didn't depend on esoteric optimizations. Anything that could have been read by someone who didn't craft it originally from a solid piece of marble.
Given that the author was eventually able to figure it out
Except he didn't. He spent two weeks figuring out one loop structure. Then, out of awe, he gave up trying to find the switch that actually changed the logic.
maybe some of that CPU time was needed to manage the I/O properly
I'm not convinced. Even if the I/O was a bottleneck, that still doesn't change anything. If the I/O is a problem, then optimize the I/O until it's not; the inner logic should be straightforward and easy to understand.
Yeah blackjack may seem trivial now,
Not just now; it is and always has been. Shuffling 52 ints and a bit of logic for score-keeping is not a hard problem in any language (that isn't meant to be hard), and even a naive assembly implementation should be able to play a round in a few hundred instructions. It shouldn't be so complex that it takes someone two weeks before they give up. I'm assuming that wasn't two solid weeks, that there were other projects going on, but still, that's too much time spent in thought over something that should have been simple.
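To pin down what I mean by trivial, here's the core of the game as a naive sketch in modern Python - which of course says nothing about doing it on a drum machine with 16 opcodes, only about how little logic is actually involved:

    # A 52-card deck, a shuffle, and some score-keeping.
    import random

    def new_deck():
        return [r for r in range(1, 14) for _ in range(4)]   # 1 = ace .. 13 = king

    def score(hand):
        total = sum(min(r, 10) for r in hand)                 # face cards count 10
        if 1 in hand and total + 10 <= 21:                    # ace as 11 if it fits
            total += 10
        return total

    deck = new_deck()
    random.shuffle(deck)
    player, dealer = [deck.pop(), deck.pop()], [deck.pop(), deck.pop()]
    while score(dealer) < 17:                                 # dealer hits to 17
        dealer.append(deck.pop())
    print("player", score(player), "dealer", score(dealer))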
Mel's hand optimizations, while no doubt elegant and powerful, were a hindrance in this case. Especially in a demo app, where a changing market might mean changing features, keeping the code manageable should have been seen as a priority.
 
I think people in general underestimate the difficulty of working with these old machines. We're still talking about computers from the era when they hadn't quite figured out how to make computers. I did a quick read-up on the LGP-30 and noticed, amongst other things, that it has an instruction set consisting of 16 instructions. None of those instructions incorporated indexed addressing, so in order to access sequential data, you had to write self-modifying code. Or, to quote the manual: "Note that after executing the add instruction for the first time we write into memory our first intermediate result, a0. The following three instructions bring, add, and hold change the add instruction from a 2005 to a 2006. Hence, we note that address modification has not required any new type of order." You do not simply shuffle 52 ints around on this machine without some seriously hairy manipulations.
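If the manual's phrasing is hard to follow, here is the same trick as a toy rendition in Python - the addresses and data are made up, and only the self-modification idea is taken from the quote above:

    # With no index registers, walking an array means literally rewriting
    # the address field of the ADD instruction each time around the loop.
    memory = {2005 + i: v for i, v in enumerate([3, 1, 4, 1, 5])}   # the data
    add_instruction = {"op": "ADD", "addr": 2005}                   # one "instruction"

    accumulator = 0
    for _ in range(len(memory)):
        accumulator += memory[add_instruction["addr"]]   # execute the ADD
        add_instruction["addr"] += 1                     # bring/add/hold: 2005 -> 2006 -> ...
    print(accumulator)                                   # 14

Now imagine every array access in a blackjack program being done that way, with the modification itself costing a few more instructions each time.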

The three CPU registers are also on the drum. They are being recirculated - i.e. they have separate read and write heads, spaced 32 bits apart, that constantly read and rewrite the bits in the register, in sync with the position of the other read heads (the memory ones). This is because it's a bit-serial machine - i.e. it doesn't calculate with words, per se; it gets data as a series of bits. If you want addition, you add serially, bit by bit. Everything is on the drum. The 32nd bit of each word was not used - it was a "deflux area". But do note that "The use of 64 tracks and 64 read-record heads means that any given portion of the drum is available to a read-record head at least 64 times faster than if the memory consisted of a tape governed by one read-record head." So, like, it could be worse :)
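For anyone who hasn't met a bit-serial machine: the adder only ever sees one bit of each operand per bit-time and carries between them, roughly like this sketch in Python (the 31-bit width is just my guess, based on the unused 32nd bit mentioned above):

    # Serial addition, least significant bit first, one bit per "bit-time".
    def serial_add(a, b, width=31):
        result, carry = 0, 0
        for i in range(width):
            bit_a, bit_b = (a >> i) & 1, (b >> i) & 1
            s = bit_a ^ bit_b ^ carry                               # sum bit
            carry = (bit_a & bit_b) | (carry & (bit_a ^ bit_b))     # carry out
            result |= s << i
        return result

    print(serial_add(123456, 654321))   # 777777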

The instruction set is actually quite telling. It has 16 instructions, of which five are arithmetic: add, subtract, divide, and two multiplies (for getting the most and the least significant halves of the result). Two are fetches from memory, with and without a mask (that is, you can fetch from one memory location only those bits that are not set in the mask word, stored in another location). There are also two store-to-memory instructions, with and without clearing the accumulator register (all of these have one explicit address operand, and implicitly target or source the accumulator). These are really calculator instructions - the computer is an automated calculating machine.

For input and output, there are the print and input instructions. Input reads characters (of which there are only 16 - 0-9 and some letters) into the accumulator. You cannot input more than eight characters - then the accumulator is full, and the program must dump the data out to memory before reading more. Print can print more complex things - there is a particular character set which more or less consists of control codes for the printer, and so you give print an address to fetch control codes from. If you give the argument "000000" as the address, you instead start the tape reader, which means you can follow up with input instructions reading from the paper tape. Once again, you can only read an accumulator's worth before you have to turn off the tape reader again to manually store stuff out to memory - no DMA transfers here :D

That is eleven so far - five control instructions left. One is Stop, which doesn't do much. Then there are two jumps: one unconditional jump to the specified address, and one conditional - if the sign bit of the accumulator is set, it will jump. The general way to do a loop is to subtract a target value from the loop value, and jump back as long as the result is negative. And finally, there are the address-modifying instructions. One is "replace address", which means that it replaces the address part of the word at the designated target memory location with the address part of the accumulator. This is how you access sequential data - before you loop, you rewrite the address of the relevant add or fetch or whatever to point to the next data item. And finally there is "return address". This writes the address of the memory location after the next one into the designated address. Why? Well, the next address will contain a jump instruction to some subroutine, and to get back here from the subroutine, you need to write the address to jump back to into an unconditional jump instruction at the end of that subroutine. Or, to spell it out: it is a COMEFROM instruction.
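Transliterated into Python, those last two idioms look roughly like this; all the names are mine, and only the shape of the idioms comes from the manual:

    # A loop done by "subtract the limit and jump back while negative", and a
    # subroutine return done by patching the jump at the end of the subroutine
    # before jumping into it.
    end_of_sub_jump = {"target": None}            # the jump the subroutine ends with

    def subroutine():
        # ... the subroutine body would go here ...
        return end_of_sub_jump["target"]          # "jump" back to the patched address

    def call(return_to):
        end_of_sub_jump["target"] = return_to     # the "return address" instruction
        return subroutine()                       # the unconditional jump into the sub

    count, limit = 0, 5
    while (count - limit) < 0:                    # sign test on the "accumulator"
        count += 1

    print(count, call("address_after_the_call"))  # 5 address_after_the_call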

And that, as they say, is that. The entire language of the machine. I'm not sure I'd agree that it'd be trivial to write a blackjack application for this.

There were compilers for some high-level languages, but when we say high-level language, we should remember that it was high level compared to the machine language described above. FORTRAN in 1958-ish wasn't much help for anything other than sequencing advanced desk-calculator stuff, either. It just helped keep track of the memory locations.
 
Sorry, I got the fetch-with-mask thing wrong. I started wondering how it could deal with two addresses, and of course it couldn't. First you fetch the complete data word from an address to the accumulator. Then the "Extract" operator does a bitwise AND between the word at its address (the mask word) and the word in the accumulator. Can't have a single instruction working on two addresses directly, now.
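In other words, it behaves roughly like this sketch in Python (the values and addresses are invented; the mnemonics just follow the wording above):

    # BRING loads a full word into the accumulator; EXTRACT then ANDs it
    # with the mask word fetched from the instruction's single address.
    memory = {100: 0b10110110, 200: 0b00001111}   # a data word and a mask word

    accumulator = memory[100]        # BRING 100   -> load the data word
    accumulator &= memory[200]       # EXTRACT 200 -> AND with the mask word
    print(bin(accumulator))          # 0b110 -> only the masked-in bits survive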
 
Thanks Moxie for highlighting why it's not really fair to judge someone for how they wrote programs on a 1950s drum computer. Especially without seeing the program and comparing it with this hypothetical easy alternative version that would have been every bit as good. I explained why using the optimizing assembler had its own pretty glaring overheads but I guess that didn't really count for anything.

It reminds me of when I had a course assignment to write a multiplication algorithm for a Turing machine. You go write that and tell me how easy it is to create and change the functionality of that code.
 
Actually, the input bit is interesting as well. The physical machine was like a big desk with a typewriter (a proper cast-iron 50s-style typewriter, no fancy terminal stuff here), but there really was no interfacing at all. The keyboard was connected exactly like (and in parallel to) the tape reader, which meant that when you pressed a key to input something, what you got was the raw keycode rotated into the bottom bits of the accumulator. The machine could read 4- or 6-bit paper tape, but 4 fit the data model better, since you're aiming at filling up the accumulator with 32 bits of data (if reading a program in, or something). Otherwise, you'll waste the first 4 bits of the first 6-bit block. That is also why they talk about the program "only existing in hex" - you wrote the code on your notepad, and then you had to translate it down to paper-tape 4-bit blocks - hex. Which you'd probably punch to tape at a dedicated tape punch station, so as not to waste valuable computer time with slow human input. But I digress.

If you had the machine switched to 4-bit reception, the 0-9 keys on the keyboard actually generated the bit patterns 0000-1010, which is a good thing (although the keyboard didn't have a 1 key - you had to use lowercase L). Due to the wiring of the keybed, you had to use F, G, J, K, Q, W for the 1011-1111 patterns (or something else, since 4-bit meant that different keys would generate the same 4-bit pattern), but still, there was a way to input self-parsing data. If you wanted to input actual textual information, you had to use 6-bit to be able to differentiate between different keys, but then you lost the correspondence to the data, because 6-bit added bits to the least significant side of the pattern - 0000 (for 0) became 000010, for instance. So if you used text input, you got an accumulator filled with 6-bit patterns that had no discernible connection to the data they might represent. 

The machine also showed the state of the registers while running...by displaying the voltage pattern from the register read heads on an integrated oscilloscope. :D

Actually, the story of Mel says that the story happened with the successor to the LGP-30, called the RPC-4000. That was certainly a more advanced machine - 8000 words on the drum, double accumulators, an index register, and 5 bits for the opcode, which means an instruction space of 32 instructions (which is actually 33-ish, since they are faking it a bit by reusing the bit pattern for Halt for another instruction, depending on the state of some other bits). And it does get to be a bit more competent - even if most of the new instruction space is swallowed by the need to have double the amount of arithmetic operators for the dual accumulators, there is still space for explicit comparisons (>, <, =) and a jump based on them. Also indexed fetch-from-memory, and some facilities for sensing the state of the machine's control panel. The new word format also had an explicit jump at the end of each instruction - each instruction consisted of the instruction itself, the address of the data for the instruction, and the address of the next instruction, all packed into one word. Apparently, Mel had learned the ropes on the LGP-30 and didn't want to change much :) However, I'd say that I'd have problems making a blackjack program even on the more advanced RPC machine.

Somewhere, I found out that some guy wanted to run his LGP-30 programs on the RPC-4000, and actually wrote an honest-to-god LGP-30 emulator for the thing, so that you could run LGP-30 tapes unchanged. That must mean that he was one of the very first emulator developers. It also gives some perspective when people complain that this-or-that emulator doesn't reach full framerate :D
 
Moxie's instruction set doesn't seem a million miles away from the 8-bit CPUs I have been more familiar with in the past. The big difference is RAM - the 6502 (for example) only has three registers, each 8-bit (the accumulator and the X and Y registers), and no input and output instructions (the peripherals are mapped into the memory map, so you load and store to input or output). With only those three registers, you can't do much before having to write something out to RAM, but you only have to wait a handful of clock cycles for that, rather than waiting for a physical drum to spin around. So there's much less scope for optimisation to the hardware's peculiarities, and it's much easier to think about good structure, as you can lay out your functions anywhere in RAM (well, within 256 bytes as I recall), and make sure they do one thing and do it well and so on.

I don't really mind these old stories of coding to the metal, as long as people realise that's what they are: old stories. The ability to do anything on these old machines under all those constraints is seriously impressive, as long as everyone realises the world has changed since then, and even if you could write like that these days, there's little reason why you should. I've met a few too many hackers in my time who still seem to think incomprehensible code using every trick they can think of and no documentation or tests is good code, but I guess that's my problem (or theirs) rather than anyone else's ;)

And I can't even begin to think how you'd multiply two multi-bit numbers on a Turing machine with its serial memory. It took me long enough to remember how to do long multiplication as is ;)
 
Thanks Moxie for highlighting why it's not really fair to judge someone for how they wrote programs on a 1950s drum computer
The author himself is commenting on how unusual it was. That's the whole point of the story: there's this one guy who eschews the wisdom of the masses to generate his code, tossing aside the modern tools of the trade (such as they were) to till his proverbial garden by hand. The author isn't telling a tale of "we had a problem and only Mel, with his in-depth knowledge and practiced art, could solve it"; it's "we had a problem, and this was how Mel solved ALL his problems. Now we have two problems, albeit one of them is an amazingly well-crafted engine".

The author says compilers existed, that optimizers existed, and that Mel ignored them entirely. The fact that they existed says it was possible to write structured code that could be documented, that such compilers were in common use, and that they were sufficiently useful that research and development into further languages and compilers was being done. And Mel chose elegance over maintainability without a thought.

You go write that and tell me how easy it is to create and change the functionality of that code.
Did you have some kind of assembly language that you just pushed through a compiler, or, like Mel, did you hand-write all the instructions in order to track the head manually? If it's like the Turing machines I used, it was the latter: I was never aware of an "assembly language" for the Turing "machines" we had to use. The whole point of the exercise was to force us to think about the instructions we were writing, so it could be that one existed and we were intentionally kept in the dark about it, but I doubt it.
 