MenuetOS 1.26.10
Re: MenuetOS 1.26.10
Talking about a language in general terms is too broad; you need to specify an application. But let's just say that no matter what language the code is written in, if an application is clearly being updated without problems, has a good user base, and really solves a problem as a tool, then we can say that it is well written, whether it's written in x86 assembly, a purely interpreted language, or C.
But it's when we start dealing with proprietary, closed-source compilers and languages that we start limiting the portability of the program, and it would probably be better rewritten for more generic and open tools, languages, libraries and systems.
That would be the best metric, that it becomes a popular, useful and frequently updated program or library (until it really no longer has much more to add to its specific task).
Technology is good when you develop it yourself; otherwise we don't really know what's inside, and I see it as something that brings only partial benefit and can even be negative if you don't know what is happening. I doubt that optimizations can be implemented effectively if you don't know how to write highly portable code. That isn't the task of the language; the developer is the one who is supposed to know the structure of the machines the program will run on.
The assembly programming style can be applied to any language; the actual instructions can be mirrored, reimplemented or accessed through library functions.
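As a minimal sketch of what "mirroring" an instruction could look like (the function name and the choice of ROL are purely illustrative, not taken from any particular project), a single x86 instruction can be wrapped in a small portable C function:

Code: Select all
#include <stdint.h>

/* Illustrative only: a portable C "mirror" of the x86 ROL instruction.
   On x86 most compilers recognize this pattern and emit a single rol;
   on other targets it falls back to plain shifts and ors. */
static inline uint32_t rol32(uint32_t value, unsigned count)
{
    count &= 31;  /* keep the shift amount in range, avoiding undefined behaviour */
    return (value << count) | (value >> ((32 - count) & 31));
}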
YouTube:
http://youtube.com/@AltComp126
My x86 emulator/kernel project and software tools/documentation:
http://master.dl.sourceforge.net/projec ... 7z?viasf=1
- onlyonemac
Re: MenuetOS 1.26.10
~ wrote:The assembly programming style can be applied to any language; the actual instructions can be mirrored, reimplemented or accessed through library functions.
By the time you've reimplemented everything in libraries, you might as well use a compiler for a high-level language. It probably does a better job than you.
When you start writing an OS you do the minimum possible to get the x86 processor in a usable state, then you try to get as far away from it as possible.
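For illustration, here is a minimal, hypothetical sketch of that approach (the names, the 0xB8000 VGA text buffer and the assumption of 32-bit protected mode are just for the example): a handful of assembly instructions elsewhere set up a stack and jump to kernel_main, and everything from that point on is ordinary freestanding C.

Code: Select all
#include <stdint.h>

/* Hypothetical C entry point reached from a tiny assembly boot stub.
   Assumes 32-bit protected mode with the VGA text buffer at 0xB8000. */
static volatile uint16_t *const vga_text = (uint16_t *)0xB8000;

void kernel_main(void)
{
    const char *msg = "Now in C";

    /* Each VGA cell is attribute byte (0x0F = white on black) plus character. */
    for (int i = 0; msg[i] != '\0'; i++)
        vga_text[i] = (uint16_t)(0x0F00 | (uint8_t)msg[i]);

    for (;;)
        ;  /* nothing left to do in this sketch, so spin forever */
}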
Syntax checkup:
Wrong: OS's, IRQ's, zero'ing
Right: OSes, IRQs, zeroing
Re: MenuetOS 1.26.10
~ wrote:The assembly programming style can be applied to any language; the actual instructions can be mirrored, reimplemented or accessed through library functions.
onlyonemac wrote:By the time you've reimplemented everything in libraries, you might as well use a compiler for a high-level language. It probably does a better job than you.
A CPU is probably a more complex task than a regular application, and it reaches a point where it can be regularly improved, so a well written program would be feasible.
I think that if those libraries already existed, many small and normal OSes would be finished by now and there would be more understanding of how specialized things work. There would be modern applications universally available even on the oldest systems (486, 386...), which would allow investigating how to produce extremely fast and memory-compact code, using those machines as a test case, since the code would be just one step away from being used by the machine.
But I see that if it's to be done, we can't stay just discussing while we see how our concentration degrades and we lose time. We have to work at it.
Arguing with people who don't like the pure-assembly idea is probably a mistake. It's not practical, and the discussions stay on non-applied topics and points of view, so the ones who are doing it are simply forced to dismiss those discussions and keep producing actual code and techniques, so that assembly can become as readable, portable and maintainable as C, with all the library code and techniques a developer could need to build any application, and just develop that over time while other programmers use whatever they see fit, in the same way.
Of course, when one tries to discuss or document something everyone will give their opinion, so the best thing to do is to let empty-goal messages fade and let real development become available as actual downloads of good forum messages and project files.
YouTube:
http://youtube.com/@AltComp126
My x86 emulator/kernel project and software tools/documentation:
http://master.dl.sourceforge.net/projec ... 7z?viasf=1
Re: MenuetOS 1.26.10
~ wrote:I can't think of any programming task in existence that cannot be modeled, designed, patterned in assembly in a way that is maintainable.
I'd suggest that accessing a relational database is probably better done using SQL than assembly. You'd be surprised to discover how common a programming task it is in the real world.
- Schol-R-LEA
Re: MenuetOS 1.26.10
EDIT: sorry for the repeated edits, but I am trying to avoid sequential posts as new things come to mind.
bzt wrote:In that case a manually optimized assembly will perform better than any machine optimized code (basically because compilers can do only micro-optimizations, while human can do large scale algorithmic optimizations as well).
Permit me to blow your mind: STALIN Scheme. It is widely regarded as the most aggressively optimizing compiler for any language, ever (the official motto is "STALIN Brutally Optimizes"). Notably, it is a global optimizer, something which is generally unheard of elsewhere; it can re-arrange steps, inline function calls, merge memory objects whose scopes don't overlap, and do several other things that most compiler writers simply don't try to do, all applied to the entire program. While it can't replace an algorithm, in some cases it can perform successive transforms which end up with a significantly different method for accomplishing the same thing.
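A toy illustration of whole-program optimization (in C rather than Scheme, and only loosely analogous to what STALIN does): when the compiler can see the entire program, it can inline the helper, propagate the constants and fold the call chain away, so that nothing of the original structure survives at run time.

Code: Select all
/* With whole-program visibility, an optimizing compiler (e.g. gcc or
   clang at -O2) inlines area() into main(), folds 10 * 20 at compile
   time and reduces main() to a single "return 200;". */
static int area(int w, int h)
{
    return w * h;
}

int main(void)
{
    return area(10, 20);
}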
There is a follow-up called Stalingrad which implements a Scheme dialect called VLAD. The former was meant mainly as a proof of concept, and both are intended to be used as an 'ultra-optimal release' tool rather than for general development - the idea is that once you had the program working and debugged on a faster, but less aggressively optimizing, compiler, you would then use the Stalin (or Stalin∇) compiler to squeeze performance out of the released version.
In any case, the algorithm issue is a ludicrous argument in favor of assembly, as algorithms aren't language specific - if you are changing an algorithm, why change the language? Your basic argument is, as Love4Boobies points out, advocating a Turing tarpit - while you can express any algorithm in assembly, it is generally harder for even an experienced assembly programmer to implement one than it is for that same programmer to implement it in a HLL. The ability to express more complex abstractions than would be feasible in assembly is indeed an advantage of HLLs - not because it isn't possible to write them in assembly, but because writing them in assembly would be unreasonably difficult compared to writing them in a more abstract notation.
Expressiveness is not the same as computation strength; some things are just easier to express in one language than in another, even though both are equally capable of expressing it. Expressiveness is in the interaction between the developer and the language, not just in the language's Turing Equivalence, and unlike Turing Equivalence, it is subjective - indeed, you can't really discuss expressiveness in general without considering the programmer using the language, though some aspects of it could be considered 'semi-objective' (while some exceptions such as Topmind or Spectate Swamp exist, few programmers would say that tables are more expressive than objects, or gotos more expressive than structured iteration statements and functional decomposition).
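A small, purely illustrative C example of that last contrast: both functions below compute the same sum, so they are equal in computational power, but most programmers would call the structured version more expressive.

Code: Select all
/* The same summation written "assembly style" with gotos and with a
   structured for loop; both are equally computable, the second is
   simply easier to read and maintain. */
int sum_goto(const int *a, int n)
{
    int i = 0, total = 0;
loop:
    if (i >= n)
        goto done;
    total += a[i];
    i++;
    goto loop;
done:
    return total;
}

int sum_structured(const int *a, int n)
{
    int total = 0;
    for (int i = 0; i < n; i++)
        total += a[i];
    return total;
}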
This Alan Perlis quote may be apt here: "A language that doesn't affect the way you think about programming, is not worth knowing." While learning assembly does affect your ideas of how you can program - and usually for the better, as it gives a lot of insights into the actual workings of the systems - it is by no means the only meaningful view of programming.
Understanding the relevance of this to my 'Thelema Notes' thread is left as an exercise for the reader.
Love4Boobies wrote:As for platforms for which there are no compilers, I don't think that argument holds ground because the cost of writing a C compiler is extremely low. One can slap together a half-assed C89 compiler in no more than a couple of days (and if you use something like LLVM, it will be a whole lot less half-assed than one might expect for such a short period of time). Or, better yet, they can port an existing one. It's better to invest a little bit of effort in that than have to live with a large-scale assembly program for years. The "no tools" argument is artificial and would've made it in the 80s and maybe early 90s but not today.
Actually, even then it would have been nonsensical, as cross-compilers were an established technology in the mid-1960s. The last system of any significance which didn't have at least an in-house assembler and either a compiler or an interpreter for it before release was probably the PDP-1 - DEC was quite fast in adopting cross-development tools, as was IBM, and everyone else copied whatever one of those two companies did.
For example, it is my understanding that Gary Kildall had the first PL/M compiler ready and had the generated code running on a simulator before the design of the 8008 was completed, and had to make changes to it to match the actual hardware once the masks were set. Later, when the 8080 was developed, he was already working on CP/M - which, contrary to what most people assume, was originally mostly written in PL/M, though the 8086 and 68000 versions were first written in Pascal, then re-written in C - when the first chips were being made.
Or consider the product that put Microsoft on the map, Altair BASIC. It was written, in the span of a few weeks, on a timeshare account which Paul Allen had for his classwork at Harvard (I don't know the OS; it was on a PDP-10, so probably either TOPS-10 or Tenex, as I doubt Harvard would have used ITS or WAITS), and tested using an emulator Allen and Gates wrote (based on the one they wrote for the 8008 the previous year; Traf-O-Data's hardware used the 8008 as a microcontroller, which was their big innovation at that point). At that time, they had never seen a working Altair (in fact, the only prototype had gone missing in transit when it was sent for the Popular Electronics photoshoot due to a strike at the shipping company - the one on the magazine cover was a mockup made for the shoot when they couldn't track down the actual machine - and they were still working on replacing it when Gates contacted Ed Roberts to offer the interpreter). When they went to demonstrate it at MITS' offices in Albuquerque on an Altair loaded with 4KiB of RAM (which was the sole completed prototype with that much, and required multiple memory boards), they were able to load it from a paper tape made using that timeshare account, and it ran on the first go (with some minor issues regarding the tape reader at first). It was the first time any non-trivial program had run on the system at all.
So, then why are there so many stories of early microcomputer software being written in assembly, or even machine code? There were a number of reasons. First, a lot of that was conflation between how you programmed the systems and how you used them - especially for the Altair, which used a passive backplane and initially didn't ship with a BIOS ROM, so starting it required use of the toggle switches. Second, while there were usually cross-development tools, there generally weren't a lot of tools that ran natively, especially in the very early days when the systems often didn't have enough memory to run a 4KiB BASIC or a PL/M compiler. Third, most of the first-generation micros didn't bundle any of the tools, and something like Altair BASIC wasn't cheap - the original sale price was $200, which was considered so out of line even then that it spurred hobbyists to develop several inexpensive or free Tiny BASICs as alternatives (and they also usually used less memory). Fourth, just because the tools existed didn't mean they were very good; compiler optimization was still a black art at the time, so a lot of the time a hand-optimized program could run better (and the limited memory meant that compilers couldn't be very elaborate, so hand-coding something wasn't as unreasonable a proposition). Finally, the 'real programmer' ethic was still lingering, so a lot of programmers did it that way out of sheer machismo.
Rev. First Speaker Schol-R-LEA;2 LCF ELF JAM POEE KoR KCO PPWMTF
Ordo OS Project
Lisp programmers tend to seem very odd to outsiders, just like anyone else who has had a religious experience they can't quite explain to others.
Re: MenuetOS 1.26.10
There's truth in your words, but I still recommend watching that documentary, especially the part where the guy talks about how and why their tool differs from Photoshop/3DSMax/iMovie etc. I believe STALIN is a strong optimizer, but I still don't think it can eliminate a big texture bitmap by substituting it with a small procedure that generates the same texture (which, by the way, outperforms ANY compression invented so far). And that's only one example. Again, I'm talking about macro, algorithmic optimizations and not micro optimizations.
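To make that concrete with a deliberately simple stand-in (the pattern and sizes are just illustrative, not taken from the documentary): a classic XOR texture replaces a 64 KiB bitmap with a few bytes of code.

Code: Select all
#include <stdint.h>

/* Generate a 256x256 8-bit XOR texture procedurally instead of storing
   it as a bitmap: a handful of instructions stand in for 64 KiB of data. */
void generate_xor_texture(uint8_t *dst, int width, int height)
{
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            dst[y * width + x] = (uint8_t)(x ^ y);
}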
I know I may sound uppish, but the truth is, I often find my manually written assembly code smaller and faster than any compiler-optimized version. For an example, here's my ksend routine. I was unable to get such small and effective code out of gcc, no matter what optimizations I specified. All gcc-generated versions had more jumps and clobbered more registers than my version (it's important that this routine is only allowed to use general purpose registers). And I could optimize it even further if I wanted to, which I don't, as it's good enough for now.
I would not suggest that ANY assembly programmer can write better code than ANY compiler-optimized version. What I meant is, there are programmers capable of writing better code than the compiler-optimized versions in certain cases. I'm pretty sure Fabrice Bellard is one of them, and I believe Ville Turjanmaa could also be (but I can't be sure as I haven't seen MenuetOS64's source).
- MajickTek
Re: MenuetOS 1.26.10
onlyonemac wrote:Why isn't the 64-bit version open-source?
There was a fork of MenuetOS called "Kolibri", which grew out of control (it is an awesome project though). Other people made forks without giving credit (saying the code was theirs), and so he closed the source.
TL;DR:
No forks wanted for fear of no credit.
Everyone should know how to program a computer, because it teaches you how to think! -Steve Jobs
Code: Select all
while ( ! ( succeed = try() ) );
Re: MenuetOS 1.26.10
onlyonemac wrote:Why isn't the 64-bit version open-source?
MajickTek wrote:There was a fork of MenuetOS called "Kolibri", which grew out of control (it is an awesome project though). Other people made forks without giving credit (saying the code was theirs), and so he closed the source.
The original issue was that Ville wanted his copyright at the top of every existing KolibriOS source file, including in new files without even a line of code written by Ville. He claimed that all the KolibriOS files are or will be "derivative works" of his original OS and therefore even the new files should include his copyright - an incorrect interpretation of the GNU GPLv2 license. Hypocritically, Ville also borrowed elements of KolibriOS without giving credit to the Kolibri team, and when the Kolibri people registered at the MenuetOS forum to say it's wrong, he simply banned them.
Sadly, it could have been a big misunderstanding between the Menuet/Kolibri developers because of the language barrier (the same reason why you have probably heard only one side of that story).
Re: MenuetOS 1.26.10
Funny thing is that (except for the "64-bit feature") Kolibri already delivers significantly more features than Menuet, and you could easily verify my words by getting both the Kolibri and Menuet floppies and comparing them side by side. It came to a point where I no longer need a MenuetOS floppy because Kolibri does everything and much more.
Also, these OSes have not been compatible for a long time now, which makes it difficult to borrow from one to the other: even if Menuet released their 64-bit code, Kolibri would be unable to take it without a very serious rewrite that would be equal to writing from scratch.