OK, I'll post some excerpts from the PM thread, I guess, but I am finding it hard to see why you are having so much trouble understanding the reasons why machine code programming fell out of favor. To be honest, I'm posting this more for the others on the board, mainly because I think most of them are beginning to suspect you are either trolling us, or are in fact a chatterbot hooked up to the forum.
----------------------------------------------------------------------------------------------------------------------------------------
[T]here is pretty much never any reason to work in hex or octal representations of machine code directly - unless you choose to. It's always possible, but unless you have the sort of prodigious memory and endless patience of the Real Programmers of old, such as the aforementioned Mel Kaye, trying to is likely to push you to the nut house.
Keep in mind that hex, octal, and even binary are just human-readable representations of the actual electrical signals - they exist in the software, not the hardware (OK, sometimes they are in the ROM, but that's just software that can't be altered). Given that, and given that assembly language is (more or less) a direct analog of the machine code in a form far more easily read and remembered by humans, it makes little sense to use hex programming unless absolutely necessary - which, as I said, should never be the case.
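To make that concrete (this example is mine, added here for illustration, and assumes an x86 target, which never came up in the PM thread): the mnemonic mov eax, 42 and the hex bytes B8 2A 00 00 00 describe exactly the same bit pattern - the base you print it in is purely a choice of notation. A quick Python sketch:

    # The five bytes an x86 assembler would emit for "mov eax, 42":
    # opcode B8 (MOV EAX, imm32) followed by a little-endian 32-bit immediate.
    # Printing them in hex, octal, or binary changes nothing about the program itself.
    code = bytes([0xB8, 0x2A, 0x00, 0x00, 0x00])

    print("hex:   ", " ".join(f"{b:02X}" for b in code))   # B8 2A 00 00 00
    print("octal: ", " ".join(f"{b:03o}" for b in code))   # 270 052 000 000 000
    print("binary:", " ".join(f"{b:08b}" for b in code))   # 10111000 00101010 ...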
Now, if you were writing an assembler (as I am), you would have to know, at the very least, how the different opcode fields are encoded in order to assemble the instructions into machine code, but even then you wouldn't usually need to memorize the individual opcodes themselves - you'd have a table of them, which the assembler would use to map the mnemonics and addressing modes to their encodings, or what have you.
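Just to illustrate what I mean by 'a table of them' (this is a throwaway sketch in Python with a made-up toy instruction set - it is not from the PM thread and is not how my actual assembler works), the point is that the assembler, not the programmer, is what remembers the numbers:

    # Toy opcode table: (mnemonic, addressing mode) -> opcode byte.
    # The instruction set and encodings here are invented purely for illustration.
    OPCODES = {
        ("LOAD",  "immediate"): 0x10,
        ("LOAD",  "absolute"):  0x11,
        ("STORE", "absolute"):  0x21,
        ("ADD",   "immediate"): 0x30,
        ("JMP",   "absolute"):  0x40,
    }

    def assemble(mnemonic, mode, operand):
        """Encode one instruction as an opcode byte plus a 16-bit little-endian operand."""
        opcode = OPCODES[(mnemonic.upper(), mode)]
        return bytes([opcode, operand & 0xFF, (operand >> 8) & 0xFF])

    # The programmer writes LOAD/STORE/JMP; the table supplies the hex.
    print(assemble("LOAD", "immediate", 42).hex())   # 102a00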
Also, you may at times find it useful to use hex, octal, or binary representations of data when working in assembly language, but that's a separate issue entirely.
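(A quick example of that last point, again just my own sketch rather than anything from the thread: hardware-style flag values are far easier to read as binary or hex literals than as decimal, and that holds in any language, not just assembly. The flag names here are hypothetical.)

    # Hypothetical device-status flags: the binary forms make the bit layout obvious,
    # where the decimal equivalents (1, 2, 4, 128) would hide it.
    READY      = 0b0000_0001
    BUSY       = 0b0000_0010
    ERROR      = 0b0000_0100
    IRQ_ENABLE = 0b1000_0000

    status = READY | IRQ_ENABLE
    print(f"status = {status:#010b} ({status:#04x})")   # status = 0b10000001 (0x81)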
----------------------------------------------------------------------------------------------------------------------------------------
trident wrote: Oh Schol-R-LEA, I fell in love for history of Mel Kaye style hex coding!
What are the other stories of Real Programmers of old that developed in hex or octal representations of machine code directly?
Hmmn, I'm going to have to see what I can find. I do know that there is at least one OS developer on this board that works in machine code regularly, and developed his OS entirely in it with some specialized tools he crafted himself. I'm not sure who it is off hand, though. [NB: it apparently is DavidCooper, given his response above.]
----------------------------------------------------------------------------------------------------------------------------------------
trident wrote: Thank you very much.
The Real Programmers of currently develop in what language?
Please speak examples of names of Real Programmers of currently.
Eh? The term 'Real Programmer' is something of a joke, and at the same time something of a badge of honor - it means you were there in the old days when people actually did that kind of programming. There really aren't any current Real Programmers because that's not what the term is for.
----------------------------------------------------------------------------------------------------------------------------------------
trident wrote: The aforementioned Mel Kaye is also a joke?
Oh, no, at least not in the sense of it being made up. Mel Kaye was a real person, and the story that Nather related was very definitely true. When I say it was a joke, I mean that it is something which modern programmers talk about with a combination of awe and humor - sort of like the way someone might speak of a person who memorizes π to the 10,000th digit. I mean, it is an impressive feat, and one that leads to a certain amount of envy or admiration, but at the same time, why would you bother? Now, when Mel Kaye was around, the argument could be made for it - computers were slow, memories were small, and computer time precious enough that even running an assembler was often viewed as a costly move - but the situation today is very different indeed.
In 1954, one could argue, as Johnny von Neumann famously did with one of his students, that computer time was more valuable than programmer time, and that even writing an assembler was wasteful - the MTBF for the machine was shorter than the assembly time for a long program:
http://worrydream.com/dbx/ wrote: John von Neumann, when he first heard about FORTRAN in 1954, was unimpressed and asked "why would you want more than machine language?" One of von Neumann's students at Princeton recalled that graduate students were being used to hand assemble programs into binary for their early machine. This student took time out to build an assembler, but when von Neumann found out about it he was very angry, saying that it was a waste of a valuable scientific computing instrument to use it to do clerical work.
That state of affairs did not last very long, though. Even by the mid-1960s, hand-coding machine code was becoming a lost art, and few regretted that fact. Today, even a moderately optimizing compiler can outperform most hand-coded assembly on a long program, simply because the compiler has a better memory and attention span than the human programmer does - while a talented assembly programmer can outdo a compiler in a small part of the program, humans can be forgetful, impatient, and/or tired, which machines cannot be.
----------------------------------------------------------------------------------------------------------------------------------------
trident wrote: This subject is irresistible to me, I like so much of this issue that get out tears out of my eyes.
Why that state of affairs did not last very long, though?
Because assembly language made it unnecessary, and made it clear how error-prone hand-assembling programs was. As high-level languages became available and effective, it also became clear that there were abstractions which, while certainly expressible in machine code, could not readily be used by someone trying to write in hex or octal directly.
trident wrote: Schol-R-LEA wrote: Even by the mid-1960s, hand-coding machine code was becoming a lost art, and few regretted that fact.
Why few regretted that fact?
Seriously? Well, for one thing, because it made it possible for people of ordinary skill to write complex programs without needing an exceptional memory and monk-like patience to get very far. Machine code is simply a bad way for humans to express ideas, and having the computer do the drudgery of converting high-level abstractions into said machine code was only sensible.
Second, it separated the programmer from the details of the specific system they were writing code for, making it easier to reason about the programs and how they worked without always having to focus on which bit you have to set in order to turn on the disk drive read head. It made it possible to write programs that ran on different kinds of machines, or even machines of the same type with a different configuration.
A third reason is simply that the computer is better at it than we are: [at this point I quoted what I'd said above about compilers not suffering from human weaknesses]
----------------------------------------------------------------------------------------------------------------------------------------
trident wrote: Johnny von Neumann changed his mind about FORTRAN, assembler and machine language?
Not as far as I know, but then he only lived to 1957. By the time he would have had a chance to see the newer languages, he was already terminally ill with cancer, so he probably never changed his opinion.
trident wrote: Because programmers become the machine code in a lost art?
Because assembly language made it unnecessary, and made it clear how error-prone hand-assembling programs was.
trident wrote: I can not understand.
You said that even by the mid-1960s, hand-coding machine code was becoming a lost art, and few regretted that fact.
Then you said that few regretted that fact because assembly language made it unnecessary, and made it clear how error-prone hand-assembling programs was.
How is can that few regretted that fact because even by the mid-1960s, hand-coding machine code was becoming a lost art and at the same time regretted that fact because assembly language made it unnecessary, and made it clear how error-prone hand-assembling programs was?
I understood that few lamented this fact because they continued believing that machine code and assembly language continued being necessary.
Sorry, the phrase 'few regretted the fact' is an English language colloquialism, with the emphasis on the word 'few'. It is in fact an example of understatement; it means, 'almost nobody regretted the fact except a handful of cranks'.
By 1965 or so, no one of any importance thought it was still necessary to write hand-coded machine code in hex or octal, and even assembly language was largely seen as unnecessary except for certain limited purposes. While assembly programming of applications lingered on into the early 1980s for the sake of efficiency, most programming was done in high-level languages, and as compilers got more efficient (and machines grew ever more powerful), regular use of assembly faded into history.