
Re: Waji's Standards

Posted: Thu Oct 16, 2014 11:56 pm
by Wajideu
Brendan wrote:
Wajideu wrote:That isn't a solution. You're just not compiling the language period, someone else is.
So what? Does your OS have a single application that does everything from linking to spreadsheets to 3D games (and is there anything terribly wrong with "do one thing and do it well")?
It's not "so what?". Even if you ignore how a C compiler works, someone else out there can't. You complain about there not being any standards or there being too many standards, yet hypocritically, you have no intention of adhering to them.
Brendan wrote:
  • I don't care about half-baked build systems
  • I don't care about half-baked file systems like FAT
Comments like these are why I'm starting to show less and less respect for anything you say.
Brendan wrote:Most people familiar with compiler development know what AST and ASG are. Anyone that doesn't can google it and find an adequate description on Wikipedia. Childish buzzwords like "expounder" or "delegator" will not be familiar to anyone and can not be googled.
If the words "expound" and "delegate" are childish, you must be a toddler for not knowing what they meant 2 posts ago; and our government and educational systems must be run by a bunch of children.
Brendan wrote:Words like compiler and parser have become standard (in the context of compilers, not politics or whatever) because many people have used them for many years to mean the same thing. How many people (other than you) have ever used "expounder" or "delegator" within a compiler development course (or tutorial or book or...), or in a compiler project, or in a research paper about some aspect of compilers, or anywhere else?
Yep. Wanting to debate whether it's sprinkling or raining. You apparently can't read either, but that's normal for toddlers.
Brendan wrote:Essentially, the fact that these words do have a well known meaning in plain English does not mean these words have a specific meaning in the context of compilers; and without a specific meaning in the context of compilers they are useless.
Ugh, no **** sherlock? I'm using them to describe things that don't have a name yet. They won't have a specific meaning in the context of compilers until they are given one; just like the term "compile" itself; which originally only meant "produce (something, especially a list, report, or book) by assembling information collected from other sources.". Are you going to argue with the dictionary too about how this definition is "yorky-borky-dorky" because a compiler isn't supposed to assemble code?
Brendan wrote:
Wajideu wrote:The fact that you have a poor vocabulary, evidenced by your retarded use of the word "cluster-bork", and "yorky-borky-dorky" to describe the work of anyone else aside from yourself, isn't mine or anyone else's problem.
My vocabulary is fine. I use "cluster-bork" when I really mean "cluster-f*ck" when something (e.g. auto-conf) is so disgusting that it deserves this description. The "yorky-borky-dorky" I made up in an attempt to find something that sounds as silly as the words you've been misappropriating.
Yep. You definitely have a poor vocabulary. And you're a jackass.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 3:10 am
by Brendan
Hi,
Wajideu wrote:
Brendan wrote:
Wajideu wrote:That isn't a solution. You're just not compiling the language period, someone else is.
So what? Does your OS have a single application that does everything from linking to spreadsheets to 3D games (and is there anything terribly wrong with "do one thing and do it well")?
It's not "so what?". Even if you ignore how a C compiler works, someone else out there can't. You complain about there not being any standards or there being too many standards, yet hypocritically, you have no intention of adhering to them.
Except that the pre-processor in C compilers traditionally was a separate process, and that there's no user-visible difference between the compiler doing everything itself and the compiler relying on external processes for the pre-processor, assembler and linker.
Wajideu wrote:
Brendan wrote:
  • I don't care about half-baked build systems
  • I don't care about half-baked file systems like FAT
Comments like these are why I'm starting to show less and less respect for anything you say.
If I created a file system with "1 character only" file names, would you suddenly decide that no file name on any file system should ever have more than 1 character?

If you think it's a good idea to limit yourself to the most restrictive limitations you can find; then you should also limit your entire tool-chain to "one single-threaded process at a time", and make sure that no file is too large to fit on an ancient 180 KiB floppy disk, and make sure that nothing needs more than 16 KiB of RAM. Obviously anyone pretending to be slightly intelligent would not do any of this; and would choose their minimum requirements in a more sensible manner, and would exclude irrelevant limitations from obsolete file systems (especially where those limitations were "worked around" in a standard/established way by their inventor 2 whole decades ago).
Wajideu wrote:
Brendan wrote:Most people familiar with compiler development know what AST and ASG are. Anyone that doesn't can google it and find an adequate description on Wikipedia. Childish buzzwords like "expounder" or "delegator" will not be familiar to anyone and can not be googled.
If the words "expound" and "delegate" are childish, you must be a toddler for not knowing what they meant 2 posts ago; and our government and educational systems must be run by a bunch of children.
I see you're still having difficulty with rational thought. I knew what those words meant in plain English before you used them (and probably knew what they meant in plain English before you were born). I do not know their specific meaning in the context of compilers, because nobody does and nobody ever will.
Wajideu wrote:
Brendan wrote:Essentially, the fact that these words do have a well known meaning in plain English does not mean these words have a specific meaning in the context of compilers; and without a specific meaning in the context of compilers they are useless.
Ugh, no **** sherlock? I'm using them to describe things that don't have a name yet. They won't have a specific meaning in the context of compilers until they are given one; just like the term "compile" itself; which originally only meant "produce (something, especially a list, report, or book) by assembling information collected from other sources.". Are you going to argue with the dictionary too about how this definition is "yorky-borky-dorky" because a compiler isn't supposed to assemble code?
For the entire IT industry there are specific words that mean specific things (far more specific than the same words in English). A tree is not flora, it's a type of data structure. A page is not a piece of paper, it's a piece of memory. A thread is not made of cotton, it's something a scheduler schedules. A pipe is not something for moving liquids and gasses, it's a form of inter-process communication. In all cases their very specific meaning in IT has little to do with their meaning in plain English. None of this should surprise you.

Now tell me what is more likely:
  • a) You are the very first person to ever implement a compiler; or
  • b) There have been many thousands of people, all more intelligent and more experienced than you; who have implemented compilers, researched various parts of compilers, written books about compiler internals, taught compiler development courses at Universities, etc; and every single one of these people did not feel the need to define special words to confuse people instead of saying (e.g.) "AST to ASG converter"

Cheers,

Brendan

Re: Waji's Standards

Posted: Fri Oct 17, 2014 6:06 am
by Wajideu
Brendan wrote:Except that the pre-processor in C compilers traditionally was a separate process, and that there's no user-visible difference between the compiler doing everything itself and the compiler relying on external processes for the pre-processor, assembler and linker.
It doesn't matter. Pre-processing is part of the C specification. You can't ignore it just because you don't like it.
Wajideu wrote:It doesn't matter if it's a separate process, it's still part of compiling the language. You can't just skip it because you don't like it.
Brendan wrote:If I created a file system with "1 character only" file names, would you suddenly decide that no file name on any file system should ever have more than 1 character?
That has nothing to do with why those statements made me angry. The problem is that you continuously bash and degrade the work of others, labeling it as either half-assed, disgusting, silly, a "cluster-bork/cluster-f*%# of fail", "yorky-borky-dorky", etc. Instead of wasting your time complaining and pointing fingers, either focus on finding a solution or spend time researching it so you develop an appreciation for it. Preferably both.

Note:
The reason I brought up FAT16 was for the 8.3 file-naming scheme that is very common on 16-bit machines (especially older platforms). By using a single word, you can make it easier to identify the file. "EXPOUN~1" would be a lot more recognizable than "ASG TO~1". It's not about limiting the length of the file-name, it's about making it easier to work with.
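For reference, short names like "EXPOUN~1" come from the familiar 8.3 name-mangling scheme. A simplified sketch of how such names are formed (the real VFAT algorithm also handles collisions with ~2, ~3, ... and substitutes invalid characters rather than just dropping them; this only shows the shape):

```python
# Simplified sketch of 8.3 short-name generation ("EXPOUN~1" style).
# Not the exact VFAT algorithm - no collision counters, no character
# substitution - just enough to show where the "~1" names come from.

def to_8_3(long_name: str) -> str:
    base, dot, ext = long_name.rpartition(".")
    if not dot:                            # name has no extension at all
        base, ext = long_name, ""
    base = "".join(c for c in base.upper() if c.isalnum())
    ext = ext.upper()[:3]                  # extension truncated to 3 chars
    if len(base) > 8:                      # too long: keep 6 chars, add "~1"
        base = base[:6] + "~1"
    return base + ("." + ext if ext else "")

print(to_8_3("expounder.c"))               # -> EXPOUN~1.C
print(to_8_3("README"))                    # short enough: unchanged
```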

Contrary to what you may believe, I'm not a dumbass. I've worked with, and written utilities for, file systems that didn't have file names.

Brendan wrote:I see you're still having difficulty with rational thought. I knew what those words meant in plain English before you used them (and probably knew what they meant in plain English before you were born). I do not know their specific meaning in the context of compilers, because nobody does and nobody ever will.
lol, oh really? I could've sworn you just said this earlier
Brendan wrote:I see "expounder" or "delegator" and don't know/remember what it actually means in practical terms (what is it delegating? What does it delegate to?)
Brendan wrote:For the entire IT industry there are specific words that mean specific things (far more specific than the same words in English). A tree is not flora, it's a type of data structure. A page is not a piece of paper, it's a piece of memory. A thread is not made of cotton, it's something a scheduler schedules. A pipe is not something for moving liquids and gasses, it's a form of inter-process communication. In all cases their very specific meaning in IT has little to do with their meaning in plain English. None of this should surprise you.
The thing you appear to not be capable of wrapping your mind around is that we dictate the meanings of the words we use, not vice-versa. You know the word "lexer"? Not an actual word. It came from the Unix command "lex", short for "lexical analyzer"; the word "lexical" stemming from the Greek word "lexikos" meaning "of words"; split between the prefix "lexis" meaning "word", and the suffix "al".

The prefix "lex", unlike "lexis/lexic", actually means "law". This means that the word "lexer" makes no sense whatsoever. How do you "law" something? Because it's used, it developed meaning. Another example of word adoption is "grepping", which came from the acronym "grep". I'm not sure about you, but I've spent an immense amount of time researching both the definition and etymology of technological terms.
Brendan wrote:Now tell me what is more likely:
  • a) You are the very first person to ever implement a compiler; or
  • b) There have been many thousands of people, all more intelligent and more experienced than you; who have implemented compilers, researched various parts of compilers, written books about compiler internals, taught compiler development courses at Universities, etc; and every single one of these people did not feel the need to define special words to confuse people instead of saying (e.g.) "AST to ASG converter"
How about option c) I never claimed to be the first to implement a compiler, and have gotten pissed off at you several times now for bashing the work of thousands of people far more intelligent and experienced than you who have implemented compilers, researched various parts of compilers, written books about compiler internals, taught compiler development courses at Universities, etc.

Backtrack for a bit. I stated before that I wasn't changing anything about compiler development, I'm breaking it down. The fact that you think I shouldn't do something just because it hasn't been done before just fucking boggles my mind. It's called innovation.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 8:53 am
by Brendan
Hi,
Wajideu wrote:
Brendan wrote:Except that the pre-processor in C compilers traditionally was a separate process, and that there's no user-visible difference between the compiler doing everything itself and the compiler relying on external processes for the pre-processor, assembler and linker.
It doesn't matter. Pre-processing is part of the C specification. You can't ignore it just because you don't like it.
Delegating the work to a separate process is not ignoring it; it's just better engineering (separation of concerns, better flexibility for the end user, better modularity, etc.).
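The delegation described here is how a traditional cc driver behaves: one user-visible command that hands the translation unit through separate pre-processor, compiler-proper, assembler and linker stages. A minimal sketch of that pipeline shape (the stage functions are stand-ins that just tag the data, not a real compiler):

```python
# Sketch of a compiler driver that delegates each stage to a separate
# component, the way a traditional cc delegates to cpp, cc1, as and ld.
# Each stage function is a stand-in that tags the data it receives.

def preprocess(source: str) -> str:
    return "[preprocessed] " + source      # stand-in for cpp

def compile_stage(unit: str) -> str:
    return "[asm] " + unit                 # stand-in for cc1 (C -> assembly)

def assemble(asm: str) -> str:
    return "[obj] " + asm                  # stand-in for as

def link(*objects: str) -> str:
    return "[exe] " + " + ".join(objects)  # stand-in for ld

def cc(source: str) -> str:
    """One user-visible command; four delegated stages."""
    return link(assemble(compile_stage(preprocess(source))))

print(cc("int main(void) { return 0; }"))
```

Whether the stages are separate processes (as in a traditional toolchain) or calls inside one binary, the result the user sees from `cc` is the same, which is the point being argued.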
Wajideu wrote:
Brendan wrote:If I created a file system with "1 character only" file names, would you suddenly decide that no file name on any file system should ever have more than 1 character?
That has nothing to do with why those statements made me angry. The problem is that you continuously bash and degrade the work of others, labeling it as either half-assed, disgusting, silly, a "cluster-bork/cluster-f*%# of fail", "yorky-borky-dorky", etc. Instead of wasting your time complaining and pointing fingers, either focus on finding a solution or spend time researching it so you develop an appreciation for it. Preferably both.
I have. My solution involves an IDE that works on tokens (no "plain text", and no tokeniser needed in the compiler), where the source code for an executable is a single file containing "units of tokens". There are 2 compilers. The first is used by programmers to convert the source file ("units of tokens") into AST (and do sanity checking and perform as many optimisations as possible), where the (serialised) AST is used as "portable executable" file format. The second compiler runs on the end-user's machine, and converts the "portable executable" into native code optimised specifically for the end user's computer.
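The two-compiler scheme described above can be sketched for a toy expression language: a front end that serialises an AST as the "portable executable", and a back end that turns the portable form into target code. This is an illustrative sketch of the idea only, not the actual design; the stack-machine "native code" is made up for the example:

```python
import json

# Front end (developer side): toy left-to-right expression source -> AST,
# serialised as JSON to stand in for the "portable executable" format.
def front_end(src: str) -> str:
    tokens = src.split()                   # e.g. "1 + 2 - 3"
    ast = int(tokens[0])
    for i in range(1, len(tokens), 2):
        ast = {"op": tokens[i], "lhs": ast, "rhs": int(tokens[i + 1])}
    return json.dumps(ast)                 # serialised AST = portable form

# Back end (end-user side): portable AST -> "native" code for a target,
# here a made-up stack machine standing in for real machine code.
def back_end(portable: str) -> list:
    def gen(node):
        if isinstance(node, int):
            return [("push", node)]
        op = "add" if node["op"] == "+" else "sub"
        return gen(node["lhs"]) + gen(node["rhs"]) + [(op,)]
    return gen(json.loads(portable))

print(back_end(front_end("1 + 2 - 3")))
```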

For the build system; the OS's virtual file system has "file format converter" plug-ins. For example, if software asks to open a picture file as text, then the VFS might find a "picture file to text converter" (e.g. OCR) and transparently convert the file (and the software that opened the file needn't know or care). Both the first and second compilers are file format converters - e.g. a "source code to portable executable" file format converter, and a "portable executable to native executable" converter. This means that the kernel can ask to open a source code file as a "native executable optimised for Pentium III model 6" and the VFS will take care of the conversions. Of course the VFS also caches the results of conversions, so this only happens once (unless a file is deleted or modified and it needs to be compiled again).
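The converter plug-in idea can be sketched by treating formats as graph nodes and converters as edges, with the VFS searching for a conversion chain and caching the result. All format names and converters below are hypothetical stand-ins for the plug-ins described above:

```python
from collections import deque

# Hypothetical sketch of "file format converter" plug-ins: formats are
# graph nodes, converters are edges, and opening a file "as" another
# format means finding a chain of converters. Names are made up.
converters = {
    ("source", "portable"):    lambda d: d + " ->[AST]",
    ("portable", "native-p3"): lambda d: d + " ->[native P3]",
    ("picture", "text"):       lambda d: d + " ->[OCR]",
}
cache = {}

def open_as(data, have, want):
    if (data, want) in cache:              # each conversion happens only once
        return cache[(data, want)]
    queue, seen = deque([(have, data)]), {have}
    while queue:                           # breadth-first search for a chain
        fmt, cur = queue.popleft()
        if fmt == want:
            cache[(data, want)] = cur
            return cur
        for (src, dst), conv in converters.items():
            if src == fmt and dst not in seen:
                seen.add(dst)
                queue.append((dst, conv(cur)))
    raise IOError(f"no converter chain from {have!r} to {want!r}")

# e.g. the kernel asking for source "as" a native executable:
print(open_as("main.src", "source", "native-p3"))
```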

Basically; there is no configure, no make, no pre-processor and no linker; whole program optimisation is mandatory; and (as far as the user can tell) source code is executable "as is".

Of course all of this is designed specifically for my OS design (which itself is quite different to most OS designs in many ways). It will not work for a typical OS (and is not intended for a typical OS).

I've spent over 10 years trying to find ways to improve every part of the OS's design and every part of the native tool-chain.

You come here and claim you're developing a "new" build system; but it has all the same problems as auto-tools and make, and I wonder why you don't just use auto-tools and make if you're not going to attempt to change/improve anything. Next you claim you're developing a "new" compiler, but it has all the same problems as existing compilers, and I wonder why you don't just use GCC or clang/LLVM if you're not going to attempt to change/improve anything. Then, after showing no evidence that you're planning on improving anything or doing anything different whatsoever, you ask me why I don't try to find solutions?
Wajideu wrote:Note:
The reason I brought up FAT16 was for the 8.3 file-naming scheme that is very common on 16-bit machines (especially older platforms). By using a single word, you can make it easier to identify the file. "EXPOUN~1" would be a lot more recognizable than "ASG TO~1". It's not about limiting the length of the file-name, it's about making it easier to work with.
So, just in case you stumble into a time machine and end up in 1980 (back when people still used 16-bit CPUs to compile things because more powerful machines and cross-compilers were harder to find), just use the name "AST2ASG".
Wajideu wrote:Contrary to what you may believe, I'm not a dumbass. I've worked with, and written utilities for, file systems that didn't have file names.
That makes it simple then - you'll have to use the name "" in all your documentation, because there's a file system somewhere that doesn't have any file names(!).
Wajideu wrote:
Brendan wrote:I see you're still having difficulty with rational thought. I knew what those words meant in plain English before you used them (and probably knew what they meant in plain English before you were born). I do not know their specific meaning in the context of compilers, because nobody does and nobody ever will.
lol, oh really? I could've sworn you just said this earlier
Brendan wrote:I see "expounder" or "delegator" and don't know/remember what it actually means in practical terms (what is it delegating? What does it delegate to?)
Exactly - no programmer (including compiler developers, and including me) knows what "expounder" or "delegator" actually means in practical terms (i.e. in the specific context of compilers, and not for plain English).
Wajideu wrote:The thing you appear to not be capable of wrapping your mind around is that we dictate the meanings of the words we use, not vice-versa. You know the word "lexer"? Not an actual word. It came from the Unix command "lex", short for "lexical analyzer"; the word "lexical" stemming from the Greek word "lexikos" meaning "of words"; split between the prefix "lexis" meaning "word", and the suffix "al".
The reason language evolves slowly is that "we" is plural - communication fails unless all people involved use the same meaning (and even in the IT industry it can take years to reach consensus). You can not make up your own definitions for words and then expect others to be familiar with them.
Wajideu wrote:
Brendan wrote:Now tell me what is more likely:
  • a) You are the very first person to ever implement a compiler; or
  • b) There have been many thousands of people, all more intelligent and more experienced than you; who have implemented compilers, researched various parts of compilers, written books about compiler internals, taught compiler development courses at Universities, etc; and every single one of these people did not feel the need to define special words to confuse people instead of saying (e.g.) "AST to ASG converter"
How about option c) I never claimed to be the first to implement a compiler, and have gotten pissed off at you several times now for bashing the work of thousands of people far more intelligent and experienced than you who have implemented compilers, researched various parts of compilers, written books about compiler internals, taught compiler development courses at Universities, etc.
Some people's hands are tied. They are not free to (e.g.) radically redesign anything because they have to respect standards, compatibility and interoperability. Other people are free. These are the people that invent things like Java, and Python, and Rust, and Go. They break compatibility by necessity.

We are not (e.g.) the C++ standards committee, we are not (e.g.) Linux distro maintainers, and we are not clang developers. We are an OS development forum. We have no reason to respect existing standards intended for other OSs, no requirement to care about compatibility, and no real need for interoperability with anything except our own work. Our hands are not tied. We are free to innovate.

Perhaps you do not belong here.


Cheers,

Brendan

Re: Waji's Standards

Posted: Fri Oct 17, 2014 10:04 am
by Wajideu
Brendan wrote:I have. My solution involves an IDE that works on tokens (no "plain text", and no tokeniser needed in the compiler), where the source code for an executable is a single file containing "units of tokens". There are 2 compilers. The first is used by programmers to convert the source file ("units of tokens") into AST (and do sanity checking and perform as many optimisations as possible), where the (serialised) AST is used as "portable executable" file format. The second compiler runs on the end-user's machine, and converts the "portable executable" into native code optimised specifically for the end user's computer.
This isn't a better solution, you're just skipping the first step when compiling a language and placing a ton of restrictions on the user. Unless you're meaning that you're separating the analyzing from the compiling, like what was shown in my computation model.
Brendan wrote:For the build system; the OS's virtual file system has "file format converter" plug-ins. For example, if software asks to open a picture file as text, then the VFS might find a "picture file to text converter" (e.g. OCR) and transparently convert the file (and the software that opened the file needn't know or care). Both the first and second compilers are file format converters - e.g. a "source code to portable executable" file format converter, and a "portable executable to native executable" converter.
So... in other words, just a more complicated version of what make does? And yet again, placing a lot more restrictions on the users.
Brendan wrote:This means that the kernel can ask to open a source code file as a "native executable optimised for Pentium III model 6" and the VFS will take care of the conversions. Of course the VFS also caches the results of conversions, so this only happens once (unless a file is deleted or modified and it needs to be compiled again).

Basically; there is no configure, no make, no pre-processor and no linker; whole program optimisation is mandatory; and (as far as the user can tell) source code is executable "as is".
I get the feeling from this and your past comments that you've never used inline assembly before or had to write wrapper libraries to make code capable of being compiled for multiple platforms. Optimizing for a platform is a little more complicated than just passing a "-O*" option to the compiler.

I once made a graphical shell in which I had to write a wrapper around the OpenGL, PSPGL, and GX APIs in order to target PC, Linux, PSP, and Wii/GC. The PSP build also had to use special math functions for matrices and vectors, otherwise there'd be a considerable difference in speed. Simply ignoring a problem doesn't make it go away.
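The wrapper-library pattern being described (one public API, with per-platform implementations chosen at build or load time) can be sketched as follows; the "PSP" platform key and its fast path are hypothetical stand-ins, not the actual OpenGL/PSPGL/GX wrapper:

```python
import platform

# Sketch of the wrapper-library pattern: one public matmul() with
# per-platform implementations selected at import time. On real PSP
# hardware the fast path would use the VFPU; here it just defers.

def _matmul_generic(a, b):
    # Portable fallback: plain 2x2 matrix multiply.
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def _matmul_psp(a, b):
    # Stand-in for a platform-specific, inline-assembly-backed version.
    return _matmul_generic(a, b)

_IMPLS = {"PSP": _matmul_psp}              # hypothetical platform table
matmul = _IMPLS.get(platform.system(), _matmul_generic)

print(matmul([[1, 0], [0, 1]], [[5, 6], [7, 8]]))  # identity * b == b
```

Callers only ever see `matmul`; which implementation runs is the wrapper's concern, which is exactly why the selection can't be replaced by a compiler flag alone.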
Brendan wrote:Of course all of this is designed specifically for my OS design (which itself is quite different to most OS designs in many ways). It will not work for a typical OS (and is not intended for a typical OS).

I've spent over 10 years trying to find ways to improve every part of the OS's design and every part of the native tool-chain.
Keywords here: "specifically for my OS design". Believe me when I say, I respect the fact that you have the determination and focus to work on such an ambitious project for so many years and to experiment with new ideas. But this doesn't make you the authority on programming theory and everyone else a bunch of monkeys on typewriters.
Brendan wrote:You come here and claim you're developing a "new" build system; but it has all the same problems as auto-tools and make, and I wonder why you don't just use auto-tools and make if you're not going to attempt to change/improve anything. Next you claim you're developing a "new" compiler, but it has all the same problems as existing compilers, and I wonder why you don't just use GCC or clang/LLVM if you're not going to attempt to change/improve anything. Then, after showing no evidence that you're planning on improving anything or doing anything different whatsoever, you ask me why I don't try to find solutions?
No I didn't. In fact, I explicitly stated several times that I just wanted to make a tool that did the same job as autotools, but without the same problems and drawbacks of using it and in a way that played nicer with make. I also stated numerous times that I had no intention of fixing what isn't broken.

Brendan wrote:
Wajideu wrote:Note:
The reason I brought up FAT16 was for the 8.3 file-naming scheme that is very common on 16-bit machines (especially older platforms). By using a single word, you can make it easier to identify the file. "EXPOUN~1" would be a lot more recognizable than "ASG TO~1". It's not about limiting the length of the file-name, it's about making it easier to work with.
So, just in case you stumble into a time machine and end up in 1980 (back when people still used 16-bit CPUs to compile things because more powerful machines and cross-compilers were harder to find), just use the name "AST2ASG".
An ignorant statement (about needing a time machine, not the AST2ASG thing). FAT16 is still commonly used today. Ironically, in this very community for floppy disk images. There are also several other 8.3 file systems such as FAT12, the TI symbol table used in Texas Instruments calculators, CP/M, and the Atari's file system. Even if there weren't any modern applications, there are still hobbyists that enjoy tinkering with older hardware. Just because it's not relevant to you doesn't mean it's not relevant at all.
Brendan wrote:
Wajideu wrote:Contrary to what you may believe, I'm not a dumbass. I've worked with, and written utilities for, file systems that didn't have file names.
That makes it simple then - you'll have to use the name "" in all your documentation, because there's a file system somewhere that doesn't have any file names(!).
Still can't read, I see.
Wajideu wrote:It's not about limiting the length of the file-name, it's about making it easier to work with.
Extra info: on the file systems I worked with, there were only physical and logical addresses of information on the flash image. From those, I had to compare the physical and logical addresses to see if an entry had been removed. I then had to extract the block, test whether it was compressed (and if so decompress it), and perform numerous tests on the data to determine if it was one of 3 (out of about 10) very specific formats. Once I had 3 of these, I had to locate 3 special tables within the flash image, using smart-search techniques I came up with, in order to determine the indices of particular object types. Then I iterated through these lesser tables, inspecting the contents of each file for references to the logical addresses from the main file table. Using these indices, references, and addresses together, I would finally form file names like "000-000 # 80017000 - 8001A000.map".
Brendan wrote:Exactly - no programmer (including compiler developers, and including me) knows what "expounder" or "delegator" actually means in practical terms (i.e. in the specific context of compilers, and not for plain English).
^ has never used C#
Brendan wrote:The reason language evolves slowly is that "we" is plural - communication fails unless all people involved use the same meaning (and even in the IT industry it can take years to reach consensus). You can not make up your own definitions for words and then expect others to be familiar with them.
I'm sure that every English speaking person in the world just woke up one morning and simultaneously decided that "grepping" was a word. Surely, there's no way that a single person could have come up with that idea, because language doesn't evolve like that according to you.
Brendan wrote:
Wajideu wrote:
Brendan wrote:Now tell me what is more likely:
  • a) You are the very first person to ever implement a compiler; or
  • b) There have been many thousands of people, all more intelligent and more experienced than you; who have implemented compilers, researched various parts of compilers, written books about compiler internals, taught compiler development courses at Universities, etc; and every single one of these people did not feel the need to define special words to confuse people instead of saying (e.g.) "AST to ASG converter"
How about option c) I never claimed to be the first to implement a compiler, and have gotten pissed off at you several times now for bashing the work of thousands of people far more intelligent and experienced than you who have implemented compilers, researched various parts of compilers, written books about compiler internals, taught compiler development courses at Universities, etc.
Some people's hands are tied. They are not free to (e.g.) radically redesign anything because they have to respect standards, compatibility and interoperability. Other people are free. These are the people that invent things like Java, and Python, and Rust, and Go. They break compatibility by necessity.

We are not (e.g.) the C++ standards committee, we are not (e.g.) Linux distro maintainers, and we are not clang developers. We are an OS development forum. We have no reason to respect existing standards intended for other OSs, no requirement to care about compatibility, and no real need for interoperability with anything except our own work. Our hands are not tied. We are free to innovate.
You don't have to be part of the C++ standards committee, a Linux distro maintainer, or a Clang developer to respect and appreciate the effort and work of others. Just because you came up with an idea of your own doesn't make everyone else's ideas nothing but junk.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 11:39 am
by Brendan
Hi,
Wajideu wrote:
Brendan wrote:I have. My solution involves an IDE that works on tokens (no "plain text", and no tokeniser needed in the compiler), where the source code for an executable is a single file containing "units of tokens". There are 2 compilers. The first is used by programmers to convert the source file ("units of tokens") into AST (and do sanity checking and perform as many optimisations as possible), where the (serialised) AST is used as "portable executable" file format. The second compiler runs on the end-user's machine, and converts the "portable executable" into native code optimised specifically for the end user's computer.
This isn't a better solution, you're just skipping the first step when compiling a language and placing a ton of restrictions on the user. Unless you're meaning that you're separating the analyzing from the compiling, like what was shown in my computation model.
I'm not sure what you mean here. It's roughly comparable to the way Java's tool-chain works, where there's no need for pre-processing, one compiler converts to some sort of intermediate representation (Java byte-code for Java, AST in my case), and a second compiler compiles to native (a JIT compiler in Java's case, an "ahead of time" compiler in my case).
Wajideu wrote:
Brendan wrote:For the build system; the OS's virtual file system has "file format converter" plug-ins. For example, if software asks to open a picture file as text, then the VFS might find a "picture file to text converter" (e.g. OCR) and transparently convert the file (and the software that opened the file needn't know or care). Both the first and second compilers are file format converters - e.g. a "source code to portable executable" file format converter, and a "portable executable to native executable" converter.
So... in other words, just a more complicated version of what make does? And yet again, placing a lot more restrictions on the users.
The "file format converter" plug-ins idea was originally for things like graphics file formats, sound file formats, etc - so that applications would only have to deal with one standard file format for each purpose; rather than many (e.g. bmp, tiff, png, jpeg, pcx, gif, ...). Over time it evolved into converting between file formats designed for different purposes (e.g. text to speech/sound); and became a natural choice when I started thinking about compilers.

It removes a lot of pointless hassle; and it is "restrictive" in several ways. However, that's part of the project's goals (solving the proliferation of standards problem by defining one standard for each purpose and enforcing that one standard).
Wajideu wrote:
Brendan wrote:This means that the kernel can ask to open a source code file as a "native executable optimised for Pentium III model 6" and the VFS will take care of the conversions. Of course the VFS also caches the results of conversions, so this only happens once (unless a file is deleted or modified and it needs to be compiled again).

Basically; there is no configure, no make, no pre-processor and no linker; whole program optimisation is mandatory; and (as far as the user can tell) source code is executable "as is".
I get the feeling from this and your past comments that you've never used inline assembly before or had to write wrapper libraries to make code capable of being compiled for multiple platforms. Optimizing for a platform is a little more complicated than just passing a "-O*" option to the compiler.
I've been programming in assembly language (starting with 6502 assembly) for 30 years now. I only care about my OS, 80x86 and ARM.
Wajideu wrote:
Brendan wrote:
Wajideu wrote:Note:
The reason I brought up FAT16 was for the 8.3 file-naming scheme, which is very common on 16-bit machines (especially older platforms). By using a single word, you can make it easier to identify the file. "EXPOUN~1" would be a lot more recognizable than "ASG TO~1". It's not about limiting the length of the file-name, it's about making it easier to work with.
So, just in case you stumble into a time machine and end up in 1980 (back when people still used 16-bit CPUs to compile things because more powerful machines and cross-compilers were harder to find), just use the name "AST2ASG".
An ignorant statement (about needing a time machine, not the AST2ASG thing). FAT16 is still commonly used today; ironically, even in this very community, for floppy disk images. There are also several other 8.3 file systems, such as FAT12, the TI symbol table used in Texas Instruments calculators, CP/M's, and Atari's. Even if there weren't any modern applications, there are still hobbyists who enjoy tinkering with older hardware. Just because it's not relevant to you doesn't mean it's not relevant at all.
For all of these, you use a much more powerful machine and cross-compiler. You don't port your development environment to a TI calculator.
Wajideu wrote:
Brendan wrote:Exactly - no programmer (including compiler developers, and including me) know what "expounder" or "delegator" actually mean in practical terms (i.e. in the specific context of compilers, and not for plain English).
^ has never used C#
^ has never actually understood a point that anyone else has ever made (did the C# developers use the word "delegator" for a specific part of their compiler??).

I learnt C# about 3 years ago - it seemed nice (much cleaner than C++), but I haven't had a reason to use it since.
Wajideu wrote:
Brendan wrote:Some people's hands are tied. They are not free to (e.g.) radically redesign anything because they have to respect standards, compatibility and interoperability. Other people are free. These are the people that invent things like Java, and Python, and Rust, and Go. They break compatibility by necessity.

We are not (e.g.) the C++ standards committee, we are not (e.g.) Linux distro maintainers, and we are not clang developers. We are an OS development forum. We have no reason to respect existing standards intended for other OSs, no requirement to care about compatibility, and no real need for interoperability with anything except our own work. Our hands are not tied. We are free to innovate.
You don't have to be part of the C++ standards committee, a Linux distro maintainer, or a Clang developer to respect and appreciate the effort and work of others. Just because you came up with an idea of your own doesn't make everyone else's ideas nothing but junk.
If you ever do have an idea of your own (e.g. something that didn't come directly from GNU, Linux or Bell Labs), I'll try to judge it on its merits (but failing to solve a problem and only regurgitating existing work-around/s is something that I will never respect or appreciate).


Cheers,

Brendan

Re: Waji's Standards

Posted: Fri Oct 17, 2014 12:25 pm
by Wajideu
Brendan wrote:I'm not sure what you mean here. It's roughly comparable to the way Java's tool-chain works, where there's no need for pre-processing, one compiler converts to some sort of intermediate representation (Java byte-code for Java, AST in my case), and a second compiler compiles to native (a JIT compiler in Java's case, an "ahead of time" compiler in my case).
All you're doing is saving the code as tokens so you can lazily skip the tokenization process. Btw, if I had to compare this to anything, I'd say it's identical to the way that TI-Basic works. The only difference being that TI-Basic is usually interpreted instead of compiled (an exception being a utility which converts TI-Basic to Z80 assembly whose name escapes me).
Brendan wrote:The "file format converter" plug-ins idea was originally for things like graphics file formats, sound file formats, etc - so that applications would only have to deal with one standard file format for each purpose; rather than many (e.g. bmp, tiff, png, jpeg, pcx, gif, ...). Over time it evolved into converting between file formats designed for different purposes (e.g. text to speech/sound); and became a natural choice when I started thinking about compilers.

It removes a lot of pointless hassle; and it is "restrictive" in several ways. However, that's part of the project's goals (solving the proliferation of standards problem by defining one standard for each purpose and enforcing that one standard).
This is kind of how the shell I spoke about earlier worked. In my layout, I used a sort of interface/expansion system, where you would install expansions for things like pictures or documents. Each expansion would have modules for various formats like png, jpg, etc. The interface was very generalized, having only load and save functions. The expansions allowed a little finer control, like having readlines and preview functions for pictures. Then each module allowed format-specific tuning.

Unless you want to manually write and keep up with wrappers for all of these things (and write wrappers for those wrappers, should you later conclude that you need a better system), a smarter idea would be to just use a system like make, or a DNA/RNA-like structure for your binary formats, as Blender does for forward compatibility.

Brendan wrote:
Wajideu wrote:I get the feeling from this and your past comments that you've never used inline assembly before or had to write wrapper libraries to make code capable of being compiled for multiple platforms. Optimizing for a platform is a little more complicated than just passing a "-O*" option to the compiler.
I've been programming in assembly language (starting with 6502 assembly) for 30 years now. I only care about my OS, 80x86 and ARM.
You just verified what I was thinking. Programming in assembly is not the same thing as programming inline assembly.
Brendan wrote:
Wajideu wrote:
Brendan wrote:So, just in case you stumble into a time machine and end up in 1980 (back when people still used 16-bit CPUs to compile things because more powerful machines and cross-compilers were harder to find), just use the name "AST2ASG".
An ignorant statement (about needing a time machine, not the AST2ASG thing). FAT16 is still commonly used today. Ironically, in this very community for floppy disk images. There are also several other 8.3 file systems such as FAT12, the TI symbol table used in Texas Instruments calculators, CP/M, and the Atari's file system. Even if there weren't any modern applications, there are still hobbyists that enjoy tinkering with older hardware. Just because it's not relevant to you doesn't mean it's not relevant at all.
For all of these, you use a much more powerful machine and cross-compiler. You don't port your development environment to a TI calculator.
There are actually several on-system development environments for the TI-83+, such as AsmDream, Axe (a JIT crossbreed between Basic and ASM), and another really good app starting with the letter M that I used to use, whose name escapes me.
Brendan wrote:If you ever do have an idea of your own (e.g. something that didn't come directly from GNU, Linux or Bell Labs), I'll try to judge it on its merits (but failing to solve a problem and only regurgitating existing work-around/s is something that I will never respect or appreciate).
I tend to stick to the philosophy that it's better to improve than to replace. I have original ideas, I just don't share them because I either don't want them stolen or I don't want people to get all butthurt and ***** me out for recreating the wheel.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 1:07 pm
by Kazinsal
Wajideu wrote: You just verified what I was thinking. Programming in assembly is not the same thing as programming inline assembly.
That is ridiculously pedantic and does nothing to maintain a reasonable debate. Since we're in the realm of being ridiculously pedantic, "inline assembly" could be represented in any of several syntaxes. Which one are you referring to? I'm sure Brendan comprehends it.

I'm fairly certain at this point you're either arrogant enough to assume you can just come to these boards and boast that you know more than one of the most wise and savvy community members and get away with it, or you're a really awful troll. Do you have any credentials or certifications relevant to the discussion or are you just pulling things from various sources on the fly?

Re: Waji's Standards

Posted: Fri Oct 17, 2014 1:13 pm
by Wajideu
Kazinsal wrote:
Wajideu wrote: You just verified what I was thinking. Programming in assembly is not the same thing as programming inline assembly.
That is ridiculously pedantic and does nothing to maintain a reasonable debate. Since we're in the realm of being ridiculously pedantic, "inline assembly" could be represented in any of several syntaxes. Which one are you referring to? I'm sure Brendan comprehends it.
I'm fairly certain there is only one meaning for "inline assembly": "in-line (as in, within the lines of another language like C) assembly".
Kazinsal wrote:I'm fairly certain at this point you're either arrogant enough to assume you can just come to these boards and boast that you know more than one of the most wise and savvy community members and get away with it, or you're a really awful troll. Do you have any credentials or certifications relevant to the discussion or are you just pulling things from various sources on the fly?
I haven't boasted about anything. In fact, the majority of this entire topic has been nothing but debate between others and me, where I'm defending the work of people who have done far more than I have from baseless criticism. If that makes you feel inferior, then I'm sorry.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 1:23 pm
by Kazinsal
Wajideu wrote: I'm fairly certain there is only one meaning for "inline assembly": "in-line (as in, within the lines of another language like C) assembly".
Inline assembly varies from compiler to compiler. I am much more familiar with the intricacies of MSVC's inline assembly than I am with GCC's, and I don't know TCC's or BCC's at all.

What syntax of inline assembly are you smugly accusing Brendan of never having used?
Wajideu wrote: I haven't boasted about anything. In fact, the majority of this entire topic has been nothing but debate between others and me, where I'm defending the work of people who have done far more than I have from baseless criticism. If that makes you feel inferior, then I'm sorry.
On the contrary. I don't feel inferior at all. I'm observing from the position of someone who knows that Brendan is not in fact ignorant, a toddler, butthurt, or whatever you feel like calling him next. But I won't fight his battles for him; he's more than capable of doing it himself.

Your statements sure read like you feel smugly superior to everyone else here, though. You might want to tone that down a bit.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 1:29 pm
by Wajideu
Kazinsal wrote:
Wajideu wrote: I'm fairly certain there is only one meaning for "inline assembly": "in-line (as in, within the lines of another language like C) assembly".
Inline assembly varies from compiler to compiler. I am much more familiar with the intricacies of MSVC's inline assembly than I am with GCC's, and I don't know TCC's or BCC's at all.

What syntax of inline assembly are you smugly accusing Brendan of never having used?
Anything in general. The fact that he demonizes pre-processing, talks about the compiler handling all the optimization, and, when asked whether he's ever used inline assembly, responds that he knows assembly (which is irrelevant to the question) implies that he has never used it, period. Which also explains why we failed to come to an understanding on the value of autotools or the need for configuration.
Kazinsal wrote:
Wajideu wrote: I haven't boasted about anything. In fact, the majority of this entire topic has been nothing but debate between others and me, where I'm defending the work of people who have done far more than I have from baseless criticism. If that makes you feel inferior, then I'm sorry.
On the contrary. I don't feel inferior at all. I'm observing from the position of someone who knows that Brendan is not in fact ignorant, a toddler, butthurt, or whatever you feel like calling him next. But I won't fight his battles for him; he's more than capable of doing it himself.
I suggest you go back a couple of pages and start reading. I showed nothing but respect for poor little Brendan while his every response to me was arrogant and snide. If that's what he wants to dish out, that's what he's going to receive. I'm not going to respect a person who doesn't respect me.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 1:40 pm
by Kazinsal
Wajideu wrote: I showed nothing but respect for poor little Brendan
Cute.

I think Brendan's right. This perhaps isn't the community for you. We try to be inclusive, but won't strain too hard if someone's just going to be an arse.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 1:45 pm
by Wajideu
Kazinsal wrote:
Wajideu wrote: I showed nothing but respect for poor little Brendan
Cute.

I think Brendan's right. This perhaps isn't the community for you.
I think you should begin reading on page 2. Just for quick reference, here's Brendan's very first post on this topic:
Brendan wrote:So... You need 'make' because the tool-chain sucks and (e.g.) the compiler/linker won't simply skip unnecessary work by itself; and you need plan files (or auto-tools) because the tool-chain sucks and portable projects aren't possible. Soon, you're going to want something to manage/generate plan files; because the new layer of work-arounds (for the older layer of work-arounds, for the even older layers of work-arounds) needs more work-arounds.

At which point does it make sense to do root cause analysis, and fix the actual underlying problems with languages/compilers; such that this hideous parade of puke is no longer necessary?
First post. No provocation on my part. All of his posts had this tone. I interpreted this as disrespect. After a few posts like this, I started responding to him in the same tone, and it escalated to where we are now.

EDIT:
For the record, if you read the post in which I called him a toddler, it was in reply to him calling my use of "expounding" and "delegate" childish. And I called him a jackass because in every single post he has made, he's done nothing but un-constructively bash and make fun of not only my ideas, but also the work of others like the GNU foundation and the standards committees, posing as though everything else was worthless **** compared to his ideas.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 1:56 pm
by Kazinsal
I see nothing wrong there. The traditional Unix toolchain is a set of crap kludges and workarounds because the toolchain, at its core, sucks. Brendan's not going to tone down his feelings on it for you in a thread you started off by saying "here's my idea; critique it". You wanted criticism, that's his way of giving it.

At no point did you actually answer his question about root cause analysis and just fixing the toolchain from the inside out instead of layering a new shell of gunk to patch the holes in the previous shell of gunk.

Re: Waji's Standards

Posted: Fri Oct 17, 2014 2:02 pm
by Wajideu
Kazinsal wrote:At no point did you actually answer his question about root cause analysis and just fixing the toolchain from the inside out instead of layering a new shell of gunk to patch the holes in the previous shell of gunk.
I don't know how you can see that as a legitimate question rather than rhetorical sarcasm and belittling for wanting to make a tool like autotools.