Story of Mel. (Machine Code)


Oh, no, it most certainly is millions of miles from, for instance, the 6502 instruction set! :) For memory, certainly, but the big difference comes in the data manipulation area. Addressing modes - direct, indirect, indexed, and immediate data (on the LGP-30 there is no way to add 1 to the accumulator - you had to have a 1 somewhere in memory and add that). Stack operations. Bit operations - logical operations, rotations, all that. Calling and returning from subroutines. Tests and conditional jumps of several kinds. Data transfer instructions between memory locations. There really, truly, is no comparison. Yes, it is more or less the same bare von Neumann architecture, and the methodology that follows from that - moving things from memory to the CPU, doing calculations, moving stuff back to memory, repeat - but the space of possible computation in any modern machine language (that is, anything this side of 1965-ish) is so much greater that we might as well call it a completely different machine. You could write a nontrivial program in C and compile it into running code on anything from a '60s PDP, to the 8-bit micros of the '80s, to a modern computer. But you could not do that on those old machines. They quite simply do not support anything close to the same class of computations.
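To make that concrete, here is a tiny sketch (ordinary C, purely illustrative - not actual LGP-30 code, nor a faithful model of it) of an accumulator machine whose only ADD takes a memory address, so even "add 1" needs a constant 1 parked somewhere in memory:

/* Minimal sketch of a memory-operand-only accumulator machine.
 * An illustration of the idea described above, not a model of the
 * real LGP-30 instruction set. */
#include <stdio.h>

enum { MEM_WORDS = 16 };

static long mem[MEM_WORDS];   /* word-addressed memory                  */
static long acc;              /* the single accumulator                 */

/* The only arithmetic modelled: acc += mem[addr].  There is no         */
/* "add immediate" form, so constants have to live in memory.           */
static void add_from_memory(int addr)
{
    acc += mem[addr];
}

int main(void)
{
    mem[0] = 1;               /* reserve a word holding the constant 1  */
    mem[1] = 41;              /* some working value                     */

    acc = mem[1];             /* load the accumulator from memory       */
    add_from_memory(0);       /* "add 1" means "add the word at addr 0" */

    printf("accumulator = %ld\n", acc);   /* prints 42 */
    return 0;
}

A modern instruction set gives you an immediate form for this in a single instruction; here the programmer has to plan where that constant lives.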

And WizardStan: Yes, the point of the story is that Mel is one of a dying breed - The kind of story you tell about your great grandfathers uncle who lived in a small cottage in the woods without electricity and water just because that was how he used to do it. And the same kind of amused awe. The point of the story is not to teach good software development practices. It is rather explicit in telling us that being Mel is a Bad Idea. The point of the story is to tell a story of a masterful hacker on his machine. Which Mel is - He knows his hardware back to front, and can make it do things it never was meant to do. 

And even with that said, I don't think you should press too hard on the existence and use of compilers. Yes, there where compilers. Yes, they were being used by some. Still, at this time that was most certainly not the most common case, and particularly not for systems programmers. It is a time in computing history Very Different from our current.
 
Remember that a compiler at the time didn't take a source code file in any readable form. A run of a FORTRAN program would entail that the programmer wrote his program by hand, on paper (which Mel did, too). Then he'd get to a punching station and punch the program into a set of punch cards, one per program line. Then he'd turn the card pack, with the requisite stop and spacing cards, in to the data center, where the FORTRAN compiler would, eventually, be loaded from (in this case) paper tape, and all the day's card packs would be loaded into the card reader. Then the compiler goes through the programs and compiles them into machine code that is punched to another card deck. That deck, in turn, is then run, the results from the run are printed, the printouts are cut and distributed to their respective card packs, and our programmer might turn up the next day to pick up his result. Possibly it'd use paper tape instead of punch cards, but the basic idea is the same. And FORTRAN itself was, quite like the underlying machine language, quite good at being a calculator, but not nearly as structured and readable as what we'd expect today. It had no function calls, no conditional statements of the kind we have today, no conditional loops... It was, basically, a somewhat tricked-out IBM 704 assembler.
 
Hey, Fortran II was 1958 - predating your arbitrary '65 cut-off by seven years, and introduced subroutines!  That said, I don't think I've ever touched any fortran older than Fortran '77, and always on an interactive terminal of some sort.

Anyway, ALGOL > FORTRAN (or, in FORTRAN: (ALGOL GT FORTRAN), I guess) ;)
 
The author himself is commenting on how unusual it was. That's the whole point of the story, that there's this one guy who eschews the wisdom of the masses to generate his code, tossing aside the modern tools of the trade (such as they were) to till his proverbial garden by hand.
Was he though? The author relates Mel not so much to what "normal" programmers were like back in Mel's day as to the very different present time (this story was written in the early '80s). In many ways he frames Mel more as a frontiersman in much less charted territory. The whole first couple of paragraphs are about establishing this. He also spends a lot of time throughout the story establishing how different the hardware was back then.

He gives the impression that Mel is also exceptional compared to himself, but this is in the context of him being a new hire that's learning a new machine vs Mel who had mastered it. Compared to other company workers who were masters of their respective hardware maybe Mel wasn't so unusual for his time, outside of the fact that there were few people programming at all.

The author isn't telling a tale of "we had a problem and only Mel, with his in-depth knowledge and practiced art, could solve it"; it's "we had a problem, and this was how Mel solved ALL his problems. Now we have two problems, albeit one of them is an amazingly well crafted engine".

The author says compilers existed, that optimizers existed, and that Mel ignored them entirely. The fact that they existed says that it was possible to write structured code that could be documented, that such compilers were in common use, and that they were sufficiently useful that research and development into further languages and compilers was being done. And Mel chose elegance over maintainability without a thought.
You're really over-simplifying this and you're still making a lot of assumptions.

I don't know why you refuse to acknowledge that using even assemblers, much less compilers, had its own cost and set of risks outside of potentially producing (possibly much) less efficient code.

You were very confident before that there'd be plenty of CPU time available for a less optimized implementation, without seeming to know an awful lot about the machine. But the drum word read cycle time is 0.26 ms, meaning that it's effectively like a 3.84 kHz clock speed. kHz, not MHz. And the instructions take at least 4 of these "cycles", meaning at most around 1,000 instructions per second - not millions. A left or right shift takes 8 cycles plus 1 cycle for each bit. Multiplications and divisions take 71 cycles.

The above is for optimally scheduled code, meaning that the operand and then the next instruction are always the next thing to come around on the drum. For poorly scheduled code you can add up to 126 cycles, so with worst-case placement you're down to only a few tens of instructions per second.
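To make the arithmetic explicit, here's a back-of-the-envelope sketch in C using the figures above (the 0.26 ms word time, the 4-cycle minimum, the 71-cycle multiply and the up-to-126-cycle placement penalty are the numbers quoted in this post, not taken from a manual, and treating the penalty as a single hit per instruction is a simplifying assumption):

/* Rough instruction-rate arithmetic for a 0.26 ms/word drum machine.
 * All cycle counts are the ones quoted above; the worst case assumes
 * the full 126-cycle placement penalty lands once per instruction. */
#include <stdio.h>

int main(void)
{
    const double word_time_s  = 0.26e-3;             /* one drum word read  */
    const double cycles_per_s = 1.0 / word_time_s;   /* ~3846 "cycles"/sec  */

    const double best_case_cycles  = 4.0;            /* optimally scheduled */
    const double worst_case_cycles = 4.0 + 126.0;    /* badly scheduled     */
    const double multiply_cycles   = 71.0;

    printf("effective clock:      %.2f kHz\n", cycles_per_s / 1e3);
    printf("best case:            %.0f instructions/s\n",
           cycles_per_s / best_case_cycles);
    printf("worst case:           %.0f instructions/s\n",
           cycles_per_s / worst_case_cycles);
    printf("multiply, best case:  %.0f multiplies/s\n",
           cycles_per_s / multiply_cycles);
    return 0;
}

That prints roughly 3.85 kHz, about 960 instructions per second in the best case, around 30 in the worst case, and about 54 multiplies per second - and it gets worse still if both the operand and the next instruction are badly placed.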

Are you starting to get an idea of just how little CPU time this had to spare? Are you still so confident that Mel was just a stubborn hacker who refused to use the appropriate tools for the job, and that anyone who knew the assembly language could have done just as good a job without the tricks? Or maybe you're starting to see that even for something as simple as blackjack you might be paying a tangible price for using an optimizing assembler that can't do as good a job as you can, because it doesn't know how you'll access the data (or, for that matter, doesn't even know the run times of all of the instructions in your code). And maybe using clever if difficult-to-understand tricks to save an instruction here or there could actually be worthwhile.

Did you have some kind of assembly language that you just pushed through an assembler, or, like Mel, did you hand-write all the instructions in order to track the head manually? If it's like the Turing machines I used, it was the latter: I was never aware of an "assembly language" for the Turing "machines" we had to use. The whole point of the exercise was to force us to think about the instructions we were writing, so it could be that one existed and we were intentionally kept in the dark about it, but I doubt it.
AFAIK (this was a long long time ago) I wrote something like a simple assembler and Turing machine simulator to help with this, but the thing with Turing machines is that you can't really abstract that much without making entirely new virtual computers inside them. And if I were doing it in Mel's time I'm sure I would have been much more hesitant to write my own tools for it.
 
I didn't really trust compilers to write code that was (a) comparatively performant and (b) actually correct until the mid '90s, and even after that I had to crack out the disassembler more than once to check what a compiler had done before even running anything. Compilers in the '80s were often a bit hit and miss IME, so I dread to think what they were like in the '60s.
 
Definitely true some years ago, but given the advances in compilers and the huge increase in program complexity, a program completely "optimized" by hand from the start is an aberration. Given how hard optimization really is on today's architectures, there's a good chance that something that looks like an obvious optimization just ends up being slower than clear code because it prevented the compiler from performing better ones... As an example, I wouldn't play with loop unrolling until I've identified an actual problem and measured my fix, as there's a good chance that the compiler will generate something at least as good. Optimization should only be applied where you've profiled a problem, and what a lot of people defending optimization seem to miss is that "good design" != "optimization".
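To illustrate the loop-unrolling point, here's a small hypothetical example (not from any particular code base): the hand-unrolled version is longer, has an easy-to-botch tail loop, and on a modern compiler at -O2/-O3 is unlikely to beat the plain loop, which the optimizer can unroll and vectorize on its own.

/* Clear version: let the compiler decide how to unroll/vectorize. */
#include <stddef.h>

long sum_plain(const long *a, size_t n)
{
    long s = 0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Hand-unrolled version: more code, a separate tail loop to get wrong,
 * and no guaranteed win - it may even block better transformations. */
long sum_unrolled(const long *a, size_t n)
{
    long s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    size_t i = 0;

    for (; i + 4 <= n; i += 4) {
        s0 += a[i];
        s1 += a[i + 1];
        s2 += a[i + 2];
        s3 += a[i + 3];
    }
    for (; i < n; i++)          /* leftover elements */
        s0 += a[i];

    return s0 + s1 + s2 + s3;
}

Profile both: on current compilers and CPUs the difference is often negligible, and either version can win depending on the target, which is exactly why you measure before committing to the clever one.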
 
Hey, Fortran II was 1958 - predating your arbitrary '65 cut-off by seven years, and introduced subroutines!  That said, I don't think I've ever touched any fortran older than Fortran '77, and always on an interactive terminal of some sort.

Anyway, ALGOL > FORTRAN (or, in FORTRAN: (ALGOL GT FORTRAN), I guess) ;)
Yes, that is true. But that ran on an IBM 704, which was a rather more capable (and rather more expensive) machine than the Royal McBees :) And note that it had subroutines, but not functions - no parameter passing, and no local variable environment.

Algol did have the problem that even though the language spec described a very expressive language, especially for its time, there really were no compilers that managed either to implement the complete language or to produce efficient code. The latter owing quite a bit to the rather limited instruction sets of many machines.

And elwing: Yes, of course. The situation with today's computers is, thankfully, quite different :D
 
You're really over-simplifying this and you're still making a lot of assumptions
Well yeah, I finished the story, had a minor epiphany, and then carried on my way. I didn't expect to be going down this path at all.
I don't know why you refuse to acknowledge that using even assemblers, much less compilers, had its own cost and set of risks outside of potentially producing (possibly much) less efficient code.
I'm not trying to. All I'm trying to explain is my first thought upon finishing: that this story can be seen as a cautionary tale of premature optimization, that Mel, heedless of the tools of the time, took a direct course of action that preferred optimization over maintainability, and the result was at least several hours of lost labour, possibly days or weeks.
 
I'm not trying to. All I'm trying to explain is my first thought upon finishing: that this story can be seen as a cautionary tale of premature optimization, that Mel, heedless of the tools of the time, took a direct course of action that preferred optimization over maintainability, and the result was at least several hours of lost labour, possibly days or weeks.
Except that to take that away from the story posits that using said tools would have produced something just as impressive as his own handcrafted work, and given how primitive the hardware was, and how primitive the software tools for it would also have had to have been, that doesn't seem terribly likely.

- Neelix
 
Optimizing compilation requires resources. Resources that were simply not available yet in the early days.

One example: register allocation. Today we of course let the compiler handle this. But there was a time in which the consensus amongst programmers was that register allocation is something you have to do manually, since after all, it is an NP-complete problem (it's essentially graph coloring) so there is no efficient way to do it automatically. Except there is, if you're happy with approximately optimal solutions, and if your compiler has enough cpu time and memory available.
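As a concrete illustration of the "approximately optimal" approach: below is a minimal sketch of register allocation by greedy coloring of an interference graph. The variables and the graph are invented for the example; real Chaitin-style allocators add spilling, coalescing and a smarter ordering.

/* Minimal sketch of approximate register allocation by greedy graph
 * coloring.  Two variables that are live at the same time "interfere"
 * and must not share a register. */
#include <stdio.h>

#define NVARS 5

static const int interfere[NVARS][NVARS] = {
    /*        a  b  c  d  e */
    /* a */ { 0, 1, 1, 0, 0 },
    /* b */ { 1, 0, 1, 1, 0 },
    /* c */ { 1, 1, 0, 1, 1 },
    /* d */ { 0, 1, 1, 0, 1 },
    /* e */ { 0, 0, 1, 1, 0 },
};

static const char *names[NVARS] = { "a", "b", "c", "d", "e" };

int main(void)
{
    int reg[NVARS];

    for (int v = 0; v < NVARS; v++) {
        int used[NVARS] = { 0 };

        /* Mark registers already taken by interfering, already-colored vars. */
        for (int u = 0; u < v; u++)
            if (interfere[v][u])
                used[reg[u]] = 1;

        /* Pick the lowest-numbered free register (no spilling modeled). */
        int r = 0;
        while (used[r])
            r++;
        reg[v] = r;
    }

    for (int v = 0; v < NVARS; v++)
        printf("%s -> r%d\n", names[v], reg[v]);
    return 0;
}

On this toy graph the greedy pass gets by with three registers for five variables; it isn't guaranteed optimal, but it is cheap, which is the whole point made above.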

Early compilers were extremely simplistic, because they had to be: they were running on machines that had very little computational power. So high level languages were always compiled in a rather naive way, with a relatively direct one-to-one mapping from high level instructions into low level code snippets, in a very local way. Global program analysis is only possible if you have enough time and space.

The widely accepted idea that "high-level languages are less efficient than low-level languages" actually originates from those simplistic early compilers, which did indeed impose a more or less fixed overhead in both time and space (mostly time), and that overhead could be quite significant. It is becoming less and less true though, since compilers have improved (more optimizations), they have more computational resources available, and the target machines are more powerful, which means that the price of being suboptimal tends to decrease (e.g. if your cpu has only a few registers, suboptimal register allocation will be a bigger problem than when it has lots of them).
 
would have produced something just as impressive as his own handcrafted work
Oh no, please don't misunderstand me. I've conceded from the beginning that Mel's code was far more tuned than anything a compiler could have churned out at the time, possibly even today. There is no doubt that reading the code was as awe inspiring as the author says. I'm questioning whether it was necessary. I'm pointing out my belief that his reaching for hand-optimized code to solve every problem is the wrong thing to do. Sure, it ran better than what the compiler could come up with, but it took the author of the story two weeks to understand what he had done with a single loop. Two weeks to realize that Mel was using the index register in a non-standard way.

The point of the story seems to be that Mel went for these hand optimizations every time, and it produced brilliant, efficient, awe inspiring code - code which was then virtually unmaintainable. I'm saying that I took from this story "don't be a Mel". The author went to great lengths to impress upon us that Mel never used assembly and didn't trust optimizers at all; he produced perfectly crafted machine instructions that were placed in specific locations on the drum for maximum benefit. That may have been necessary in the time Mel "grew up" in, but by the time Ed was hired on they had rudimentary tools - Ed was even hired to write some of those tools - and Mel refused to use them; today we have even better tools at our disposal. What Mel produced was technically better, but that better was unmaintainable.

If it had been a story about Mel solving the base problem and then using his art to solve inefficiencies that came up in the course of writing the code, then I would have had a completely different impression. That's not the case though. The story makes it clear that hand-crafted optimization was Mel's first course - a maze of code with the drum head spinning with precision, using and reusing instructions in ways they were never really intended - and that this hand-crafted code resulted in something so awe inspiring that it couldn't be successfully modified.

Optimizing compilation requires resources. Resources that were simply not available yet in the early days.
Except that we're told that optimizers existed. Compilers existed. It was possible to write functional code for these machines without resorting to hand-optimized machine instructions. We know this because the author tells us. He was even hired for the purpose of writing a Fortran compiler for the new system. By the time of this story, the late 50s, writing functional code no longer relied on knowing the details of every instruction, but that was still Mel's first instinct.
 
would have produced something just as impressive as his own handcrafted work
Oh no, please don't misunderstand me. I've conceded from the beginning that Mel's code was far more tuned than anything a compiler could have churned out at the time, possibly even today. There is no doubt that reading the code was as awe inspiring as the author says. I'm questioning whether it was necessary. I'm pointing out my belief that his reaching for hand-optimized code to solve every problem is the wrong thing to do. Sure, it ran better than what the compiler could come up with, but it took the author of the story two weeks to understand what he had done with a single loop. Two weeks to realize that Mel was using the index register in a non-standard way.
When I said just as impressive I wasn't only talking about the code but also the finished product. If you were at a show and had to wait 5 minutes for an interactive display to respond, would it really hold your interest long enough to see the results?
Optimizing compilation requires resources. Resources that were simply not available yet in the early days.
Except that we're told that optimizers existed. Compilers existed. It was possible to write functional code for these machines without resorting to hand-optimized machine instructions. We know this because the author tells us. He was even hired for the purpose of writing a Fortran compiler for the new system. By the time of this story, the late 50s, writing functional code no longer relied on knowing the details of every instruction, but that was still Mel's first instinct.
Except that we're also told that those tools weren't very effective. If you could paint in fine detail by hand, would you even consider using a rendering tool that could at best produce an indistinct and blobby image? Just because those tools existed in a primitive form doesn't mean that using them would have been the right decision at the time.
- Neelix
 
All I know is if I were doing stuff on Mel's hardware at Mel's company I probably wouldn't be using an assembler either, much less a compiler. I'm not going to wait hours to get a machine to spit out a new build of the code, especially when only a small part of it needed to be changed. Maybe for the first time, but not to make incremental changes.

I'd probably also spend a lot of time trying to hand simulate some pieces of the code before loading it on the hardware.
 
Except that we're also told that those tools weren't very effective
Not as effective as Mel, yes, but still effective enough that Mel was a dying breed of programmer as everyone else was shifting to the tools. Ed, the author of this story, was hired on for the express purpose of writing a FORTRAN compiler for the system. Someone obviously believed that compilers were the future, that the system was capable of running code produced by one. Even among Mel's peers it's still heavily implied that he was a world apart, a hacker's hacker, a maestro among masters.

If you could paint in fine detail by hand, would you even consider using a rendering tool that could at best produce an indistinct and blobby image?
I don't think "blobby image" does the analogy justice. They may not have been great compilers but given that they were being adopted I'd bet they consistently did well enough.
All I know is if I were doing stuff on Mel's hardware at Mel's company I probably wouldn't be using an assembler either, much less a compiler
Then you would have been in the minority, if the implications of the story are to be believed. All I know is that I would never use an opcode as an operand.
I still don't understand why you guys think I'm wrong. Mel wrote hand optimized code that couldn't successfully be modified, code that relied on quirks of the system that only the aforementioned master could comprehend. You can't possibly think that that's the best way to tackle every problem, so where is the disconnect?
 
I reckon the only charge you can fairly lay at Mel's door is not documenting his code well enough.  When he wrote that Blackjack code the optimisers and compilers probably didn't exist, or were at least so primitive as to not be that much use if you knew how to code the machine already (and you probably needed to know that to fix the compiler's mistakes not too far down the line).  It seems slightly odd to me not to even use an assembler, but I guess if you need to modify the opcodes dynamically, masking them behind easier-to-remember mnemonics isn't actually that useful.
 
I still don't understand why you guys think I'm wrong. Mel wrote hand optimized code that couldn't successfully be modified, code that relied on quirks of the system that only the aforementioned master could comprehend. You can't possibly think that that's the best way to tackle every problem, so where is the disconnect?
I don't believe any one of us is saying "That's the best way to tackle every problem" and there's the disconnect. What we are saying is that that was the best way for him to solve that problem in those circumstances, whereas you don't seem to be willing to accept that the differing circumstances make a difference.

- Neelix

Edit:

Someone obviously believed that compilers were the future, that the system was capable of running code produced by one.
I believe that in the future I'll be carrying around a Pyra which will one day serve most of my daily computing needs. That's still in development though, so for the time being I make do with what I have.
 
I don't believe any one of us is saying "That's the best way to tackle every problem"
But I am just saying the opposite. The story describes Mel's approach as tackling every problem with clockwork precision. It then goes on to say how Ed spent several weeks trying to understand the code before eventually giving up. If you don't agree with me then what are you saying?
whereas you don't seem to be willing to accept that the differing circumstances make a difference.
No, I don't. I don't believe, at all, that Mel's approach was the one and only way for the problem to have been solved. I have not heard a single argument demonstrating that the hacks he employed first thing were a necessity. But honestly, that's irrelevant. The key is not that he did these clever things; perhaps you are correct and ultimately these machines really couldn't be programmed by anyone who didn't think in fourth-dimensional physics or whatever. Fine, we'll assume that's the case. It still doesn't matter; I've said several times that I'm not taking issue with the fact that he wrote such efficient code. The point I took from the story is that such optimizations were standard course for Mel directly as part of the development process, everything was written with the full intention of being a well-oiled and unchanging machine, and when that machine did need to be changed it was an impossible task, resulting in a significant amount of wasted time. This was a method that wasn't standard practice for anyone else; Ed makes it very clear that Mel is an anomaly, that most everyone else is writing in some kind of human-readable form.

Note also that I'm not saying that Mel was necessarily wrong for his time; I'm saying it's a cautionary tale: here's this thing that was done, here's the result, don't do the thing in the future. It's possible for the thing to seem like the best course of action when it happens, but then turn out to have been problematic later down the road. That is the nature of such tales.
 
trust compilers to write code that was, a: comparatively performant and, b: actually correct until the mid 90s
I remember -O3 code in 1992 that coredumped while -O2 worked fine; C on a Sun SPARC, IIRC.

Nowadays we have many nifty things, like telling the machine to run a task between x and y and letting the scheduler optimize the run order, or databases that track how heavily certain tables are being used and automatically suggest adding keys to a table, making it faster. JITs are also getting better at micro-optimizations.

This reminds me of a piece of software inside a microchip that was optimized with computer assistance, and there was this unconnected section that actually influenced the rest...

In other words, the software optimized a program that worked only due to an unforeseen hardware side effect. I need to find that paper; it was a good read.
 
Typically for imperative languages, optimizing compilation 'only' improves the constant factors, not the complexity. Hand-optimizing 'only' improves that constant a bit. And nowadays, those constant factors tend to be relatively small, and the abundance of cpu power and memory available makes it easy to assume that hand-optimization has never been a good idea.

Except that with the extremely limited hardware of Mel's days, 1) those constant factors were larger (because compilers were stupider), and 2) most of the time, the limits of the hardware were actually quite limiting, hardware was very expensive, and human labor was in comparison cheaper.

E.g. if you just go back to the Commodore 64 (which is of course way more modern hardware than what Mel was working with): I remember that many games needed to load stuff (from floppy disk or tape) in the middle of the game (usually between levels) because they couldn't just fit everything in memory. Of course this is annoying, so if you could reduce the loading time, it was better. Any constant factor improvement would be immediately noticeable, because waiting times were measured in seconds and minutes back then, not in milliseconds.

The C64 floppy disk drive only had one read/write head, so you had to flip the disk to access the other side. Again: human labor (manually flipping the disk) was relatively cheap, hardware (a second read/write head so both sides could be used without flipping the disk) was expensive.

While the C64 had a 'high level' language built-in (BASIC), no serious programmer would actually use that for real games and applications, because of the overhead in time and space.

So I think it is very much possible that even if it was just a 'small' constant factor, the difference between hand-optimized code and compiler output could very well be enough to make the difference between "fits in memory" and "does not fit in memory", or between "makes you have to wait a few seconds to see the next card" and "makes you have to wait a minute to see the next card". And yes, that would be the difference between "works" and "does not work".
 
Typically for imperative languages, optimizing compilation 'only' improves the constant factors, not the complexity. Hand-optimizing 'only' improves that constant a bit. And nowadays, those constant factors tend to be relatively small, and the abundance of cpu power and memory available makes it easy to assume that hand-optimization has never been a good idea.
I dunno, I can't think of anything outside of constant time factors that's different between DeSmuME and DraStic, they're both more or less imperative (DeSmuME is C++ but it is for the most part pretty C-like with singleton classes and some templates), and a lot of what's in the latter I'd consider hand-optimization.. YMMV I guess. (to be fair, they're also not functionally identical, so it's not a totally valid comparison)
 