Brendan wrote:I have. My solution involves an IDE that works on tokens (no "plain text", and no tokeniser needed in the compiler), where the source code for an executable is a single file containing "units of tokens". There are 2 compilers. The first is used by programmers to convert the source file ("units of tokens") into AST (and do sanity checking and perform as many optimisations as possible), where the (serialised) AST is used as "portable executable" file format. The second compiler runs on the end-user's machine, and converts the "portable executable" into native code optimised specifically for the end user's computer.
This isn't a better solution; you're just skipping the first step of compiling a language and placing a ton of restrictions on the user. Unless you mean that you're separating the analysis from the compilation, like what was shown in my computation model.
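For what it's worth, the two-compiler scheme described above is easy to sketch. The following toy Python (hypothetical names, a deliberately trivial one-operation "grammar", nothing like Brendan's actual design) shows a front end emitting a serialised AST as the "portable executable" and a back end specialising it for one target:

```python
import json

# Stage 1 ("programmer's machine"): turn tokens into an AST and
# serialise it; the serialised AST is the "portable executable".
def front_end(tokens):
    # Trivial grammar: ["add", 1, 2] -> {"op": "add", "args": [1, 2]}
    op, *args = tokens
    ast = {"op": op, "args": args}
    return json.dumps(ast)          # portable, machine-independent form

# Stage 2 ("end user's machine"): turn the portable AST into code
# specialised for the local target.
def back_end(portable, target):
    ast = json.loads(portable)
    if target == "x86" and ast["op"] == "add":
        return [f"mov eax, {ast['args'][0]}",
                f"add eax, {ast['args'][1]}"]
    raise NotImplementedError(target)

portable = front_end(["add", 1, 2])
native = back_end(portable, "x86")
```

The point of contention is exactly where that split sits: stage 1 here is the analysis, stage 2 the code generation.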
Brendan wrote:For the build system; the OS's virtual file system has "file format converter" plug-ins. For example, if software asks to open a picture file as text, then the VFS might find a "picture file to text converter" (e.g. OCR) and transparently convert the file (and the software that opened the file needn't know or care). Both the first and second compilers are file format converters - e.g. a "source code to portable executable" file format converter, and a "portable executable to native executable" converter.
So... in other words, just a more complicated version of what make does? And, yet again, placing a lot more restrictions on the users.
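To make the comparison concrete: the converter plug-ins behave much like make's implicit rules chained together. A minimal sketch (hypothetical registry and format names, not the actual VFS design) of how such a VFS might chain converters and cache the result:

```python
from collections import deque

# Hypothetical converter registry: (source format, target format) -> function.
converters = {
    ("source", "portable"): lambda data: f"AST({data})",
    ("portable", "native"): lambda data: f"CODE({data})",
}
cache = {}  # (path, format) -> converted data, like the VFS's result cache

def vfs_open(path, data, have, want):
    """Open `path` (stored as format `have`) as format `want`."""
    if (path, want) in cache:
        return cache[(path, want)]
    # Find a chain of converters from `have` to `want` (breadth-first),
    # the same way make chains implicit rules.
    queue = deque([(have, data)])
    seen = {have}
    while queue:
        fmt, cur = queue.popleft()
        if fmt == want:
            cache[(path, want)] = cur
            return cur
        for (src, dst), fn in converters.items():
            if src == fmt and dst not in seen:
                seen.add(dst)
                queue.append((dst, fn(cur)))
    raise IOError(f"no converter chain from {have} to {want}")
```

Asking for `vfs_open("hello.src", "x=1", "source", "native")` chains both converters, which is essentially a makefile's `%.portable: %.source` and `%.native: %.portable` rules with caching bolted on.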
Brendan wrote:This means that the kernel can ask to open a source code file as a "native executable optimised for Pentium III model 6" and the VFS will take care of the conversions. Of course the VFS also caches the results of conversions, so this only happens once (unless a file is deleted or modified and it needs to be compiled again).
Basically; there is no configure, no make, no pre-processor and no linker; whole program optimisation is mandatory; and (as far as the user can tell) source code is executable "as is".
I get the feeling from this and your past comments that you've never used inline assembly before or had to write wrapper libraries to make code capable of being compiled for multiple platforms. Optimizing for a platform is a little more complicated than just passing a "-O*" option to the compiler.
I once made a graphical shell in which I had to write a wrapper around the OpenGL, PSPGL, and GX APIs in order to target PC, Linux, PSP, and Wii/GC. The PSP build also had to use special math functions for matrices and vectors; otherwise there'd have been a considerable difference in speed. Simply ignoring a problem doesn't make it go away.
Brendan wrote:Of course all of this is designed specifically for my OS design (which itself is quite different to most OS designs in many ways). It will not work for a typical OS (and is not intended for a typical OS).
I've spent over 10 years trying to find ways to improve every part of the OS's design and every part of the native tool-chain.
Keywords here: "specifically for my OS design". Believe me when I say, I respect the fact that you have the determination and focus to work on such an ambitious project for so many years and to experiment with new ideas. But this doesn't make you the authority on programming theory and everyone else a bunch of monkeys on typewriters.
Brendan wrote:You come here and claim you're developing a "new" build system; but it has all the same problems as auto-tools and make, and I wonder why you don't just use auto-tools and make if you're not going to attempt to change/improve anything. Next you claim you're developing a "new" compiler, but it has all the same problems as existing compilers, and I wonder why you don't just use GCC or clang/LLVM if you're not going to attempt to change/improve anything. Then, after showing no evidence that you're planning on improving anything or doing anything different whatsoever, you ask me why I don't try to find solutions?
No, I didn't. In fact, I explicitly stated several times that I just wanted to make a tool that did the same job as autotools, but without the problems and drawbacks of using it, and in a way that played nicer with make. I also stated numerous times that I had no intention of fixing what isn't broken.
Brendan wrote:Wajideu wrote:Note:
The reason I brought up FAT16 was for the 8.3 file-naming scheme that is very common on 16-bit machines (especially older platforms). By using a single word, you can make it easier to identify the file. "EXPOUN~1" would be a lot more recognizable than "ASG TO~1". It's not about limiting the length of the file-name, it's about making it easier to work with.
So, just in case you stumble into a time machine and end up in 1980 (back when people still used 16-bit CPUs to compile things because more powerful machines and cross-compilers were harder to find), just use the name "AST2ASG".
An ignorant statement (about needing a time machine, not the AST2ASG thing). FAT16 is still commonly used today; ironically, in this very community, for floppy disk images. There are also several other 8.3 file systems, such as FAT12, the TI symbol table used in Texas Instruments calculators, CP/M, and the Atari file system. Even if there weren't any modern applications, there are still hobbyists who enjoy tinkering with older hardware. Just because it's not relevant to you doesn't mean it's not relevant at all.
Brendan wrote:Wajideu wrote:Contrary to what you may believe, I'm not a dumbass. I've worked with and wrote utilities for file systems before that didn't have file names.
That makes it simple then - you'll have to use the name "" in all your documentation, because there's a file system somewhere that doesn't have any file names(!).
Still can't read, I see.
Wajideu wrote:It's not about limiting the length of the file-name, it's about making it easier to work with.
Extra info: on the file systems I worked with, there were only physical and logical addresses of information on the flash image. From that, I had to compare the physical and logical addresses to see if an entry had been removed. I then had to extract the block, test whether it was compressed (and if so, decompress it), and perform numerous tests upon the data to determine if it was one of 3 (out of about 10) very specific formats. Once I had 3 of these, I had to locate 3 special tables within the flash image, using smart-search techniques I came up with, in order to determine the indices of particular object types. I then iterated through these lesser tables, inspecting the contents of each file for references to the logical addresses from the main file table. Finally, using these indices, references, and addresses together, I would form file names like "000-000 # 80017000 - 8001A000.map".
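Roughly, the recovery loop looked like this (an illustrative sketch with a made-up entry layout, a made-up deleted-entry marker, and a simplified naming scheme; not the actual tool):

```python
import zlib

def recover_entries(entries):
    """Each entry: (physical_addr, logical_addr, raw_bytes, compressed).

    Hypothetical layout for illustration; the real flash images differed.
    """
    recovered = []
    for phys, logical, raw, compressed in entries:
        if logical == 0xFFFFFFFF:      # illustrative: all-ones marks a removed entry
            continue
        data = zlib.decompress(raw) if compressed else raw
        # Name entries by index and logical address range, since the
        # file system itself stores no names.
        name = (f"{len(recovered):03d} # "
                f"{logical:08X} - {logical + len(data):08X}.map")
        recovered.append((name, data))
    return recovered
```

The naming at the end is the whole point of the anecdote: when a file system has no names at all, you synthesise them from whatever addressing information survives.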
Brendan wrote:Exactly - no programmer (including compiler developers, and including me) know what "expounder" or "delegator" actually mean in practical terms (i.e. in the specific context of compilers, and not for plain English).
^ has never used C#
Brendan wrote:The reason language evolves slowly is that "we" is plural - communication fails unless all people involved use the same meaning (and even in the IT industry it can take years to reach consensus). You can not make up your own definitions for words and then expect others to be familiar with them.
I'm sure that every English-speaking person in the world just woke up one morning and simultaneously decided that "grepping" was a word. Surely there's no way that a single person could have come up with that idea, because language doesn't evolve like that according to you.
Brendan wrote:Wajideu wrote:Brendan wrote:Now tell me what is more likely:
- a) You are the very first person to ever implement a compiler; or
b) There's been many thousands of people, all more intelligent and more experienced than you; who have implemented compilers, researched various parts of compilers, wrote books about compiler internals, taught compiler development courses at Universities, etc; and every single one of these people did not feel the need to define special words to confuse people instead of saying (e.g.) "AST to ASG converter"
How about option c): I never claimed to be the first to implement a compiler, and I have gotten pissed off at you several times now for bashing the work of thousands of people far more intelligent and experienced than you who have implemented compilers, researched various parts of compilers, written books about compiler internals, taught compiler development courses at universities, etc.
Some people's hands are tied. They are not free to (e.g.) radically redesign anything because they have to respect standards, compatibility and interoperability. Other people are free. These are the people that invent things like Java, and Python, and Rust, and Go. They break compatibility by necessity.
We are not (e.g.) the C++ standards committee, we are not (e.g.) Linux distro maintainers, and we are not clang developers. We are an OS development forum. We have no reason to respect existing standards intended for other OSs, no requirement to care about compatibility, and no real need for interoperability with anything except our own work. Our hands are not tied. We are free to innovate.
You don't have to be part of the C++ standards committee, a Linux distro maintainer, or a Clang developer to respect and appreciate the effort and work of others. Just because you came up with an idea of your own doesn't make everyone else's ideas nothing but junk.