Micro History

OSMAN

Micro History

Post by OSMAN »

Hello.

I was thinking that MINIX must have taken very long to compile in the 1980s. Is that why Andy built it out of separate modules (object files) which could be compiled without depending on each other? (Of course they still depend on each other, but you get the idea!)

Nowadays I can simply make a single binary file of the whole kernel in two seconds.
Crazed123

Re:Micro History

Post by Crazed123 »

Nope. MINIX was written as a microkernel because the author felt that microkernels were the best design.
Solar
Member
Posts: 7615
Joined: Thu Nov 16, 2006 12:01 pm
Location: Germany

Re:Micro History

Post by Solar »

OSMAN wrote: Nowadays I can simply make a single binary file of the whole kernel in two seconds.
Because CPUs today are faster than those in the '80s by a factor of 1000 or more...
Every good solution is obvious once you've found it.
OSMAN

Re:Micro History

Post by OSMAN »

I didn't mean it that way... BTW, what did I actually mean? Of course he could have built a microkernel as a single binary file, but I think he split it into many object files so that he could recompile only what he had modified, not the whole code base, because the CPU was so slow. I'm just repeating my explanation... great.

(does anyone get what I mean?)
durand
Member
Posts: 193
Joined: Wed Dec 21, 2005 12:00 am
Location: South Africa

Re:Micro History

Post by durand »

Do you mean compiling each .c (or whatever) file into an object file (.o) and then linking them into a single binary later?

And that's as opposed to compiling all .c files directly into a binary?

Well... I guess that's kind of a preference. It does make more sense to compile each source file into a separate compilation unit and then link them later. It saves time and doesn't waste CPU cycles recompiling code that hasn't changed. Plus you have more control over how everything gets compiled, processed, and linked.
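
For instance, here's a minimal sketch of what I mean, using made-up file names (kernel.c, memory.c, proc.c), not the actual MINIX sources:

    cc -c kernel.c                              # compile each source file to its own object file
    cc -c memory.c
    cc -c proc.c
    cc -o kernel.bin kernel.o memory.o proc.o   # link the objects into one binary

    # if only memory.c changes, recompile just that file and relink:
    cc -c memory.c
    cc -o kernel.bin kernel.o memory.o proc.o

With a makefile the second step happens automatically: make compares timestamps and only recompiles the objects whose source files have changed, which matters a lot on a slow machine.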

Maybe speed and time were his reasons? It would make sense.
gaf
Member
Posts: 349
Joined: Thu Oct 21, 2004 11:00 pm
Location: Munich, Germany

Re:Micro History

Post by gaf »

Hello,
quite frankly I don't think that Tanenbaum spent a lot of time thinking about compilation time when designing MINIX. After all, the kernel only has to be rebuilt rather seldom, which makes that kind of optimization less important. Nevertheless, you're of course right that a modular code design helps to cut compilation time drastically, as it reduces cross-references and keeps the code local. It's just not the most important advantage.

regards,
gaf