Re: Speed : Assembly OS vs Linux
Posted: Fri May 22, 2015 9:56 am
Hi,
"On the PDP-7, in 1969, a team of Bell Labs researchers led by Thompson and Ritchie, including Rudd Canaday, developed a hierarchical file system, the concepts of computer processes and device files, a command-line interpreter, and some small utility programs.[2] The resulting system, much smaller than the envisioned Multics system, was to become Unix."
To save memory, you need 2 or more different executables using the same parts of the same shared library at the same time; and even then you still might not save memory (as large amounts of bloat may have been removable through link time optimisation).
In rare cases where something actually does reduce memory usage in practice; it still doesn't justify the disadvantages and probably only indicates that the language's standard library is "overly minimal" (e.g. doesn't cover important things most software needs).
Cheers,
Brendan
mallard wrote: This time including such gems as "1960's Unix" (Unix development didn't start until 1970, or late 1969 at the very earliest).

From the History of Unix Wikipedia page:

"On the PDP-7, in 1969, a team of Bell Labs researchers led by Thompson and Ritchie, including Rudd Canaday, developed a hierarchical file system, the concepts of computer processes and device files, a command-line interpreter, and some small utility programs.[2] The resulting system, much smaller than the envisioned Multics system, was to become Unix."
mallard wrote: The idea that a language whose primary appeal is its portability and simplicity would include such things as a GUI library (if it had, you'd never have heard of "C" and instead would be ranting about whatever took its place) is simply absurd.

Sure, just like nobody has ever heard of Visual Basic or Java. If they'd cared about keeping the language modern they would've added a GUI library. They didn't.
mallard wrote: Many attempts have been made to replace plain text as a format for source code, some of which were even moderately successful (e.g. many BASIC implementations store code pre-tokenised), but time and time again, the advantages of plain text (no special tools required to read it, free choice of editor/IDE, easy to write code-generation tools, etc.) have won out. The overhead of parsing is only decreasing as processing capacity increases.

This is only true because people are too lazy to do anything right (imagine if databases and spreadsheets used plain text because developers were too lazy to implement special tools). As long as the file format is open (and mine will be), nothing prevents multiple editors/IDEs; and if you look at the IDEs you'll see they all parse the text and most build their own internal abstract syntax tree anyway (for features like syntax highlighting, intelligent code completion, etc.), so not doing things right just makes things harder for IDE developers. For code-generation tools it's actually easier to deal with abstract syntax trees than with text, especially when you're inserting into existing source code (see the sketch below).
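As a rough illustration of why trees are easier to manipulate than text (all names below are hypothetical, not taken from any real IDE or compiler): inserting a statement into a function body is a pointer splice on the tree, and the "source text" is just regenerated on demand by a pretty-printer.

Code:
#include <stdio.h>
#include <stdlib.h>

/* A deliberately tiny "AST": a function body as a linked list of statements. */
typedef struct Stmt {
    char text[64];          /* what the statement prints as, e.g. "x = x + 1;" */
    struct Stmt *next;
} Stmt;

typedef struct {
    const char *name;
    Stmt *body;             /* head of the statement list */
} Function;

/* Insert a new statement at position pos (0 = start of the body).
   On a tree this is a pointer splice; on plain text it would mean
   re-parsing, tracking byte offsets and fixing indentation. */
static void insert_stmt(Function *fn, int pos, const char *text)
{
    Stmt *s = malloc(sizeof *s);
    snprintf(s->text, sizeof s->text, "%s", text);
    if (pos == 0 || fn->body == NULL) {
        s->next = fn->body;
        fn->body = s;
        return;
    }
    Stmt *p = fn->body;
    while (--pos > 0 && p->next != NULL)
        p = p->next;
    s->next = p->next;
    p->next = s;
}

/* The "editor view": regenerate source text from the tree on demand. */
static void pretty_print(const Function *fn)
{
    printf("void %s(void)\n{\n", fn->name);
    for (const Stmt *s = fn->body; s != NULL; s = s->next)
        printf("    %s\n", s->text);
    printf("}\n");
}

int main(void)
{
    Function fn = { "example", NULL };
    insert_stmt(&fn, 0, "int x = 0;");
    insert_stmt(&fn, 1, "return;");
    insert_stmt(&fn, 1, "x = x + 1;");   /* splice into the middle */
    pretty_print(&fn);
    return 0;
}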
mallard wrote: "Shared libraries are misguided and stupid." Really? If nothing else, the ability to easily fix security issues in all affected programs is a massively useful thing in a modern environment. "very few libraries are actually shared" is utterly false. Sure, some are used more than others and there are often application-specific "libraries", but it doesn't take much exploring with tools like "ldd" or "Dependency Walker" to see how crucial shared libraries are to modern systems.

The ability for the same security vulnerability to affect a large number of different processes isn't a good thing. Things like DLL injection and cache timing attacks (those that rely on the same physical pages being used for the shared library) are not a good thing either. Working software breaking because you installed a new application (which happened to come with a slightly newer version of the library) is also not a good thing.
To save memory, you need two or more different executables using the same parts of the same shared library at the same time; and even then you still might not save memory, as large amounts of bloat may have been removable through link-time optimisation (see the sketch below).

In the rare cases where something actually does reduce memory usage in practice, it still doesn't justify the disadvantages, and probably only indicates that the language's standard library is "overly minimal" (e.g. doesn't cover important things most software needs).
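To make the link-time-optimisation point concrete, here's a minimal sketch (the file names and functions are hypothetical; -flto, -ffunction-sections and --gc-sections are real GCC/binutils options): statically linked, the linker can throw away library code the program never calls, while a shared library has to keep every exported symbol because it can't know what some future executable might need.

Code:
/* mylib.c - a hypothetical "library" with one used and one unused function. */
int used(int x)   { return x + 1; }
int unused(int x) { return x * 42; }   /* never called by main.c */

/* main.c */
int used(int x);
int main(void) { return used(1); }

/* Statically linked with link-time optimisation and section garbage
   collection, unused() can be discarded entirely:
       gcc -O2 -flto -ffunction-sections -Wl,--gc-sections main.c mylib.c
   Built as a shared library instead:
       gcc -shared -fPIC mylib.c -o libmylib.so
   both functions must stay, whether anything ever calls them or not. */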
mallard wrote: Executables as bytecode is actually one of his better "ideas". Of course, with Java, .Net and LLVM, it's been done for over a decade. Whether the compiler is "built into the OS" or not is fairly irrelevant. Something that complex and security-sensitive should never run in kernelspace, so it's just a matter of packaging.

That idea is almost 50 years old (BCPL's O-code), if not older.
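For anyone who hasn't seen one: the core of an "executables as bytecode" design is tiny. Here's a minimal interpreter sketch (the instruction set is invented for illustration; real systems like the JVM or .NET compile the bytecode ahead of time or just in time rather than interpreting it like this):

Code:
#include <stdio.h>

/* A toy bytecode machine: one accumulator, four opcodes. */
enum { OP_HALT, OP_LOAD_IMM, OP_ADD_IMM, OP_PRINT };

static void run(const unsigned char *code)
{
    int acc = 0;
    for (int pc = 0; ; ) {
        switch (code[pc++]) {
        case OP_HALT:     return;
        case OP_LOAD_IMM: acc  = code[pc++]; break;  /* load immediate */
        case OP_ADD_IMM:  acc += code[pc++]; break;  /* add immediate  */
        case OP_PRINT:    printf("%d\n", acc); break;
        }
    }
}

int main(void)
{
    /* The "executable": load 40, add 2, print, halt. */
    const unsigned char program[] = { OP_LOAD_IMM, 40, OP_ADD_IMM, 2, OP_PRINT, OP_HALT };
    run(program);
    return 0;
}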
mallard wrote: While the "C" ecosystem is far from perfect (I agree with quite a few of Brendan's points, particularly on the obsolete, insecure functions in the standard library), some of these "imperfections" are exactly what's made it popular (simplicity and portability, enough left "undefined" to allow performant code on virtually any platform). If C had never been invented, we'd almost certainly be using something pretty similar.

C definitely deserves a prominent place in history.
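As one concrete example of the obsolete, insecure standard-library functions mentioned above: gets() has no way to be told the size of its buffer, and was eventually removed from the language in C11; fgets() is the bounded replacement. A minimal sketch:

Code:
#include <stdio.h>

int main(void)
{
    char buf[16];

    /* gets(buf);  -- no way to pass the buffer size, so any input line
       longer than 15 characters overflows buf; removed in C11. */

    /* Bounded replacement: reads at most sizeof buf - 1 characters. */
    if (fgets(buf, sizeof buf, stdin) != NULL)
        printf("read: %s", buf);
    return 0;
}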
mallard wrote: I'm increasingly convinced that if Brendan's super-OS ever leaves the imaginary phase ("planning"), it'll be the 20xx's equivalent of MULTICS...

I hope so - if people see my OS and those concepts make their way into later OSs (just like MULTICS concepts made their way into Unix), I'd consider that a very significant victory.
Cheers,
Brendan