OS integrated compiler

Love4Boobies
Member
Posts: 2111
Joined: Fri Mar 07, 2008 5:36 pm
Location: Bucharest, Romania

OS integrated compiler

Post by Love4Boobies »

As I've mentioned before on this forum, I've been working on a C++ compiler (which, right now, is nothing more than a crappy C compiler). The point is that I want to integrate it into the system in the following (open-source-ish) way, which I think might be similar in some ways to Gentoo (with which, btw, I'm not familiar):
  • The boot loader checks the CPU type, and if it's different from the one used last time, it re-compiles the source code of the kernel and other modules, optimized for the new target (a rough sketch of the check follows below). It's not as bad as it sounds: although this is slow, how often do people really change CPUs? In case the user doesn't want to wait, a generic image might be provided (although this isn't a design decision, I'm just brainstorming). If it's the first time the OS is started, this is how installation is performed.
  • The package manager, too, is basically supposed to be a database of source code on the file system that gets compiled (i.e., installed). After every CPU change, programs are automatically re-installed on demand, if necessary (last CPU type != current CPU type). I think this is what I have in common with Gentoo: the installation method. The CPU detection code, however, goes a step further in this direction.
  • The compiler will be built to use generic instructions (at least the copy embedded in the boot loader). I think...
Terribly slow, but my kernel will probably never be that big. :) Anyway, what do you guys think? Any improvements to this design? Any other ideas on how I could do something similar?
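
Here's a rough, untested sketch of the CPU-change check I have in mind (x86-only; the cached-signature storage and load_cached_signature() are hypothetical):

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical: reads the CPUID signature saved at install time. */
extern void load_cached_signature(uint32_t out[4]);

static void cpuid(uint32_t leaf, uint32_t out[4])
{
    __asm__ volatile("cpuid"
                     : "=a"(out[0]), "=b"(out[1]),
                       "=c"(out[2]), "=d"(out[3])
                     : "a"(leaf));
}

int cpu_changed(void)
{
    uint32_t cached[4], sig[4];

    load_cached_signature(cached);
    cpuid(1, sig);   /* leaf 1: family/model/stepping + feature bits */
    return memcmp(sig, cached, sizeof sig) != 0;
}
```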
Last edited by Love4Boobies on Wed May 27, 2009 3:03 am, edited 1 time in total.
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]
Troy Martin
Member
Posts: 1686
Joined: Fri Apr 18, 2008 4:40 pm
Location: Langley, Vancouver, BC, Canada

Re: OS integrated compiler

Post by Troy Martin »

I'd use it. Sounds like the best way to make a quick change to the kernel without having to build it and reinstall/copy it over to a disk in the host OS.
Solar wrote:It keeps stunning me how friendly we - as a community - are towards people who start programming "their first OS" who don't even have a solid understanding of pointers, their compiler, or how an OS is structured.
I wish I could add more tex
piranha
Member
Posts: 1391
Joined: Thu Dec 21, 2006 7:42 pm
Location: Unknown. Momentum is pretty certain, however.

Re: OS integrated compiler

Post by piranha »

I'd use it if it could recompile for optimizations based on more than just the CPU, and if it didn't slow down the boot-up very much (I like your 'generic image' idea).

-JL
SeaOS: Adding VT-x, networking, and ARM support
dbittman on IRC, @danielbittman on twitter
https://dbittman.github.io
NickJohnson
Member
Posts: 1249
Joined: Tue Mar 24, 2009 8:11 pm
Location: Sunnyvale, California

Re: OS integrated compiler

Post by NickJohnson »

Take a look at TCC. The project at least claims to be able to fully compile and boot a patched Linux 2.6 kernel in just a few seconds. It's LGPL'ed, so you could probably borrow some of its dynamic compiling routines. It's not much of an optimizing compiler, but I'm impressed with any toolchain that fits in 100K.
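
For instance, compiling to memory with libtcc takes only a few calls (a quick sketch; the exact signatures, e.g. tcc_relocate's, have varied between TCC releases):

```c
#include <libtcc.h>
#include <stdio.h>

int main(void)
{
    TCCState *s = tcc_new();

    tcc_set_output_type(s, TCC_OUTPUT_MEMORY);
    tcc_compile_string(s, "int add(int a, int b) { return a + b; }");
    tcc_relocate(s, TCC_RELOCATE_AUTO);       /* link into memory */

    int (*add)(int, int) =
        (int (*)(int, int))tcc_get_symbol(s, "add");
    printf("2 + 3 = %d\n", add(2, 3));

    tcc_delete(s);
    return 0;
}
```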

I think it would be an interesting idea, but you could also just compile things manually (which is what you do in Gentoo), and then you wouldn't have to worry about the compiler breaking things with optimizations and the like. The gains from architecture-specific optimizations are usually so overshadowed by disk I/O and memory allocation that they aren't worth caring about unless you have the time. The more important part of Gentoo is that you compile selected pieces of each package using USE flags, which takes less space and time.
NReed
Posts: 24
Joined: Wed May 28, 2008 10:56 pm

Re: OS integrated compiler

Post by NReed »

An idea you could use is a two-phase compile for the kernel. The first phase does a quick, tcc-style compile (or you can just use the generic image). The second phase runs as a background process inside the system, where it can take more time and produce the optimized image. Then either dynamically patch the kernel, or just save the image for use at the next boot.

For apps, you can just keep a generic executable to run. Then, in a background process, you can start creating an optimized executable for each app (probably in order of usage).
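
Something like this for the background pass (all paths and the compiler invocation are made up for illustration):

```c
/* Hypothetical second phase: recompile a package at full optimization
 * in the background, then atomically swap the result in so the next
 * exec (or next boot, for the kernel) picks it up. */
#include <stdio.h>
#include <stdlib.h>

static int reoptimize(const char *pkg)
{
    char cmd[512], tmp[256], final[256];

    snprintf(tmp, sizeof tmp, "/pkg/%s/bin.opt.tmp", pkg);
    snprintf(final, sizeof final, "/pkg/%s/bin.opt", pkg);
    snprintf(cmd, sizeof cmd,
             "cc -O2 -march=native -o %s /pkg/%s/src/*.c", tmp, pkg);

    if (system(cmd) != 0)
        return -1;              /* keep running the generic binary */
    return rename(tmp, final);  /* atomic replace */
}
```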

I think it's a really cool idea, I just wonder how useful it is. :P
Love4Boobies
Member
Posts: 2111
Joined: Fri Mar 07, 2008 5:36 pm
Location: Bucharest, Romania

Re: OS integrated compiler

Post by Love4Boobies »

Troy Martin wrote:I'd use it. Sounds like the best way to make a quick change to the kernel without having to build it and reinstall/copy it over to a disk in the host OS.
That's why I said "open-source-ish". However, only the admin would be allowed access to the source directory tree.
piranha wrote:I'd use it if it could recompile for optimizations based on more than just the CPU, and if it didn't slow down the boot-up very much (I like your 'generic image' idea).
I'm not sure what you mean here. Devices use drivers that are optimized the same way the kernel is. If you have some other idea in mind, I'd love to hear about it :) As for the boot times, I bet it wouldn't slow them down, as I've said before... How often does one change his CPU?
NReed wrote:An idea you could use is a two-phase compile for the kernel. The first phase does a quick, tcc-style compile (or you can just use the generic image). The second phase runs as a background process inside the system, where it can take more time and produce the optimized image. Then either dynamically patch the kernel, or just save the image for use at the next boot.

For apps, you can just keep a generic executable to run. Then, in a background process, you can start creating an optimized executable for each app (probably in order of usage).
I actually did think of this. I didn't mention it because I didn't see it as the primary design characteristic.
I think it's a really cool idea, I just wonder how useful it is. :P
Probably not at all; I just want to experiment with a design that I find interesting. I think we can all agree that most of our projects here are not very useful (except for a select few, perhaps, like 01000101, who wants to earn something from his project).
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]
NReed
Posts: 24
Joined: Wed May 28, 2008 10:56 pm

Re: OS integrated compiler

Post by NReed »

Love4Boobies wrote:
I think it's a really cool idea, I just wonder how useful it is. :P
Probably not at all; I just want to experiment with a design that I find interesting. I think we can all agree that most of our projects here are not very useful (except for a select few, perhaps, like 01000101, who wants to earn something from his project).
Well, I agree that most projects here are not going to be useful, but most people have a driving ideal behind their project, and I agree that this idea has some potential. I should have put it in question form: what is the goal of the project? I just want to hear you justify why this is a good idea.
earlz
Member
Posts: 1546
Joined: Thu Jul 07, 2005 11:00 pm

Re: OS integrated compiler

Post by earlz »

Getting the advantages of JIT compiling without the slowness, maybe?
Love4Boobies
Member
Posts: 2111
Joined: Fri Mar 07, 2008 5:36 pm
Location: Bucharest, Romania

Re: OS integrated compiler

Post by Love4Boobies »

earlz wrote:Getting the advantages of JIT compiling without the slowness, maybe?
JIT is good when you want to be able to verify code at run time. I didn't say my code was safe... just normal (C/C++ source) code.
NReed wrote:...this idea has some potential. I should have put it in question form: what is the goal of the project? I just want to hear you justify why this is a good idea.
  • The system can be modified by the admin however he pleases (given that he has systems programming knowledge).
  • Applications are more portable across CPU architectures than ever before: the package management system just has to compile and install an application for a PPC the same way it would for an x86. They will also perform a bit better than if they were compiled for a generic i386 or i686, since they can take advantage of CPU-specific features. The compiler would also have to have knowledge of CPU errata, so bugs in CPUs wouldn't bring down the whole system (like the F00F bug); see the sketch after this list. Applications cannot be executed if they are not properly installed.
  • If you're crazy about open source, you could say that it discourages proprietary software (though this wasn't my primary concern).
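
To make the optimization point concrete, here's roughly how the package manager could map the detected CPU to codegen flags at install time (the table entries are invented examples, not a real errata database):

```c
#include <string.h>

/* Hypothetical: pick per-CPU compiler flags before building a package. */
const char *flags_for_cpu(const char *arch, unsigned family, unsigned model)
{
    if (strcmp(arch, "ppc") == 0)
        return "-mcpu=powerpc";
    if (family == 6 && model >= 23)
        return "-march=core2";     /* newer Intel: SSSE3 and friends */
    if (family == 5)
        return "-march=pentium";   /* plus any errata workarounds */
    return "-march=i686";          /* safe generic fallback */
}
```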
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]
Colonel Kernel
Member
Posts: 1437
Joined: Tue Oct 17, 2006 6:06 pm
Location: Vancouver, BC, Canada

Re: OS integrated compiler

Post by Colonel Kernel »

Have you looked into Universal Binaries in Mac OS X? I think they would achieve your goals, although by trading off disk space to make things much faster. In OS X, the same set of binaries can run on PPC or Intel, 32- or 64-bit, just by including compiled code for each combination (the loader picks the right one to run). I think even the kernel is a Universal Binary.
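
For reference, the container behind a Universal Binary is just a small big-endian header plus one record per architecture (paraphrased from memory of Apple's <mach-o/fat.h>):

```c
#include <stdint.h>

#define FAT_MAGIC 0xcafebabe   /* all fields are stored big-endian */

struct fat_header {
    uint32_t magic;            /* FAT_MAGIC */
    uint32_t nfat_arch;        /* number of fat_arch records below */
};

struct fat_arch {
    uint32_t cputype;          /* e.g. PowerPC, i386, x86_64 */
    uint32_t cpusubtype;
    uint32_t offset;           /* file offset of this arch's image */
    uint32_t size;             /* its size in bytes */
    uint32_t align;            /* alignment, as a power of two */
};
```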
Top three reasons why my OS project died:
  1. Too much overtime at work
  2. Got married
  3. My brain got stuck in an infinite loop while trying to design the memory manager
Don't let this happen to you!
earlz
Member
Posts: 1546
Joined: Thu Jul 07, 2005 11:00 pm

Re: OS integrated compiler

Post by earlz »

I think it comes down to this:
Which would a user rather do when they want to use Firefox?
1. Download the binary for their OS and architecture, which is usually about 20 MB. This takes less than 5 minutes, even on slow connections.
2. Download the Firefox source (I believe this is bigger than the binary), then download the source for all its dependencies, and then compile it all (such a CPU-intensive task that the computer becomes noticeably slower). That took a good 7 hours on my computer, and it was doing nothing but compiling (completely idle except for that one process; if you're trying to do other things while compiling, it could take even longer).

People compile and distribute binaries for a reason...
Love4Boobies
Member
Posts: 2111
Joined: Fri Mar 07, 2008 5:36 pm
Location: Bucharest, Romania

Re: OS integrated compiler

Post by Love4Boobies »

As I was saying, I don't expect to have many users. My goal isn't to have a commercial OS, just one I can play around with.

As for the binaries in Mac OS, no, I had no clue. Thanks for the tip. I would have expected some kind of byte code that gets translated before being executed; it would waste less memory.
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]
whowhatwhere
Member
Posts: 199
Joined: Sat Jun 28, 2008 6:44 pm

Re: OS integrated compiler

Post by whowhatwhere »

Colonel Kernel wrote:Have you looked into Universal Binaries in Mac OS X? I think they would achieve your goals, although by trading off disk space to make things much faster. In OS X, the same set of binaries can run on PPC or Intel, 32- or 64-bit, just by including compiled code for each combination (the loader picks the right one to run). I think even the kernel is a Universal Binary.
I have been working towards forward-porting Apple's assembler and dynamic linker to extend binutils for some time now. It's not exactly trivial, though...
NReed
Posts: 24
Joined: Wed May 28, 2008 10:56 pm

Re: OS integrated compiler

Post by NReed »

earlz wrote: 2. Download the Firefox source (I believe this is bigger than the binary), then download the source for all its dependencies, and then compile it all (such a CPU-intensive task that the computer becomes noticeably slower). That took a good 7 hours on my computer, and it was doing nothing but compiling (completely idle except for that one process; if you're trying to do other things while compiling, it could take even longer).
To be fair, Love4Boobies has an advantage here. The Firefox source includes files for many different platforms; his programs will either target only his operating system, or he can strip the unnecessary files before putting them in the repository. Also, he has not yet implemented the whole C++ spec, and he probably won't be able to anytime soon, so compilation will be quicker because it's a slimmed-down spec.

Also, assume all his apps are custom made for his platform. All the dependencies should already be there, so that should cut down compile time.

edit: Also, on the point of universal binaries: since everything is downloaded in the case of a hobby OS, a universal binary has no advantage over a separate binary for each architecture, and it has the large disadvantage of being bigger. Universal binaries are only useful, in my understanding, for running apps off USB drives or CDs/DVDs.
Colonel Kernel
Member
Posts: 1437
Joined: Tue Oct 17, 2006 6:06 pm
Location: Vancouver, BC, Canada

Re: OS integrated compiler

Post by Colonel Kernel »

Love4Boobies wrote:As for the binaries in Mac OS, no, I had no clue. Thanks for the tip. I would have expected some kind of byte code that gets translated before being executed; it would waste less memory.
Actually, no memory gets wasted -- only disk space. The loader only loads sections of the binary that are compiled for the current architecture.
NReed wrote:Universal binaries are only useful, in my understanding, for running apps off USB drives or CDs/DVDs.
It can really pay off to have universal binaries if your users want to upgrade to a totally different kind of machine. For example, to migrate installed apps from an old PPC Mac to a new Intel Mac, all you have to do is drag-and-drop the application bundles. Any apps with Universal Binaries run natively on either machine without the need for emulation.

About download sizes... I'm not convinced it would really be all that bad. For a big app like Firefox, how much of it is actually code, versus data and GUI resources (string tables, bitmaps, etc.)?
Top three reasons why my OS project died:
  1. Too much overtime at work
  2. Got married
  3. My brain got stuck in an infinite loop while trying to design the memory manager
Don't let this happen to you!