I'm pretty new to this topic so please forgive my basic questions and possible rantings.
Started to look at toolchain porting, as the filesystem and related parts have become more stable now.
The longer-term goal is to have the 'usual' tools like gcc and gdb running, and maybe even able to build themselves. For now I'm trying to make my way through the wiki pages linked by https://wiki.osdev.org/Porting_GCC_to_your_OS
A big challenge I've noticed is the versioning of tools, esp. the minor dependencies like the auto* family, and the mismatched requirements.
For example, this page (https://wiki.osdev.org/Porting_Newlib) mentioned that newlib needs automake v1.11, and the version of newlib I got also needed autoconf 2.68 (and a certain version of m4).
But the binutils 2.36.1 I downloaded wants autoconf 2.69 instead (ld wants it), and also takes a dependency (unsure whether it comes from binutils or from autoconf 2.69) on automake 1.15+.
From what I understand, binutils itself is just a "minor" dependency when compared with heavyweights like gdb and gcc. There are also articles encouraging a practice of "combined source tree", but it seems like asking for trouble if I simply throw together source trees that each depend on different versions of other tools?
My questions:
1. Is there a recommended set of stable (as in they work with each other, not necessarily stable functionally) toolchain versions that are suitable for porting to a hobby OS? Obviously there's no need for any "latest and greatest" features; I'd be more than happy if it could build a hello world.
Understood that there could be bugs that are fixed in current versions. But it's not like the current versions don't have any bugs of their own that will be fixed tomorrow, together with more dependencies/tool version requirements. It certainly doesn't seem worth the time to keep resolving the whole 'dependency graph' again and again.
I know that LFS publishes a "tested and working" set, but even LFS is quite up to date these days, and their recommended version of binutils also needs autoconf 2.69. So I'm not sure it is suitable for our purposes.
2. Am I approaching this the wrong way by dreaming that one set of dependencies/tool versions would work? Like, is it a prerequisite to install X different versions of auto* and other tools and artfully manage the PATH (or spin up many different VMs)?
3. Could you share how you are managing this, i.e. your toolchain versions, or good practices wrt toolchain versions?
Re: toolchain versions or good practices wrt toolchain versions
I'm pretty sure Newlib is an exception. In most cases, you'll be able to get away with a single version of each dependency. (I can't promise it will be easy though!)
Re: toolchain versions or good practices wrt toolchain versions
Octocontrabass wrote: I'm pretty sure Newlib is an exception. In most cases, you'll be able to get away with a single version of each dependency. (I can't promise it will be easy though!)
Thanks! I created a different user that has the old auto* in its ~ for newlib, and another one to run the newer versions of the tools for Binutils and GCC.
Things in this post https://wiki.osdev.org/Hosted_GCC_Cross-Compiler build now, and a hello world linked with newlib runs fine once I realized that their reent structure and impure pointer need to be initialized.
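For reference, this is roughly the kind of setup I mean; a minimal sketch assuming a classic single-threaded, non-TLS newlib build (crt_init is just a placeholder name for wherever your startup code runs before main):
Code: Select all
/* Minimal sketch: point newlib's _impure_ptr at an initialized
   struct _reent before any libc call is made. Assumes a classic
   non-TLS newlib configuration; _REENT_INIT is newlib's own macro. */
#include <reent.h>

/* One statically initialized reentrancy structure for the whole
   (single-threaded) program. */
static struct _reent global_reent = _REENT_INIT(global_reent);

/* Hypothetical startup hook -- call from your crt0 before main(). */
void crt_init(void)
{
    _impure_ptr = &global_reent;  /* stdio, errno, etc. go through this */
}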
Are there some useful but not too complex projects to port first?
It seems that GCC and Binutils are both dynamically linked by default, and I'm still a long way from supporting that.
I also read somewhere that even newlib has a self-test, but I'm not sure how it works, as it is just a library?
Re: toolchain versions or good practices wrt toolchain versions
xeyes wrote: I'm pretty new to this topic so please forgive my basic questions and possible rantings.
Asking questions is good. Rantings are usually understandable in both senses of the word.
Dependency version incompatibilities like this aren't very common, but they happen often enough that shared libraries are installed with a versioning system. That's what all the symlinks are. Outside of shared libraries, or when multiple versions of whole packages are required, there are two solutions: either one version gets installed to a different place, just as you have done, or the files get different names, or a mix of both. I've seen this occasionally in several of the Linux installations I've had over the years, not counting python2/python3, which was like this constantly... oh, and python1/python2 before that.
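To illustrate the shared-library case (a sketch, not my exact setup; it assumes a glibc-style system where libm.so.6 is an installed soname symlink): programs record the versioned soname at link time, and the dynamic loader resolves that name through the symlink chain at runtime, so several major versions can coexist side by side.
Code: Select all
/* Sketch: loading a library by its versioned soname (glibc assumed).
   libm.so.6 is typically a symlink to the real file, e.g.
   libm-2.31.so; the unversioned libm.so symlink only matters to the
   linker at build time. Build with -ldl on older glibc. */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    void *handle = dlopen("libm.so.6", RTLD_NOW); /* resolve soname */
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }
    puts("loaded libm through its versioned soname symlink");
    dlclose(handle);
    return 0;
}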
xeyes wrote: From what I understand, binutils itself is just a "minor" dependency when compared with heavyweights like gdb and gcc.
Uh, no, gcc without binutils is like an engine without a gearbox, driveshafts, or wheels.
xeyes wrote: There are also articles encouraging a practice of "combined source tree", but it seems like asking for trouble if I simply throw together source trees that each depend on different versions of other tools?
Absolutely yes. Dependencies can be awful. In fact, dependency trouble is the reason I ditched POSIX (my desktops run Windows now), one of the reasons I'm looking at languages "off the beaten path", and even the reason I'm developing an OS at all. I nearly switched to developing a game-like virtual world last week (not for the first time), but visions of dependency hell sent me scurrying back! I have really had enough of it! Part of the reason is my health problems; other people are often better able to keep up, but I still suspect there are smart people trapped in a cycle of keeping up with dependencies who would have time to code amazing things if they weren't! Anyway, if I understand what a "combined source tree" is, I think it could be good if and only if you're very careful about the dependencies of its various components.
Kaph — a modular OS intended to be easy and fun to administer and code for.
"May wisdom, fun, and the greater good shine forth in all your work." — Leo Brodie
"May wisdom, fun, and the greater good shine forth in all your work." — Leo Brodie
Re: toolchain versions or good practices wrt toolchain versions
eekee wrote: Anyway, if I understand what a "combined source tree" is, I think it could be good if and only if you're very careful about the dependencies of its various components.
Yeah... no, not a good idea. The idea was that you could combine the sources of glibc, gcc, and binutils into one source directory (basically extracting the sources of two of those projects into the root of the third, then renaming the directories so that the version numbers are taken off), and then the build system would take care of the weird extra steps that are part of building a compiler suite. Unfortunately, this never really worked: the whole thing only works with select version combinations, and if you are disinclined towards glibc, you are only saving the tiny little bit of work that is compiling binutils yourself. At least if you do that, you know exactly what version of binutils you are using.
But to support this feature that hardly works, we have no syntax checking of configure flags. Each of the configure scripts accepts unknown flags because they might be supported by one of the other two projects. And if you mistype an option, you will only find out when the build has substantially completed and you see your option not being taken into account.
Oh, and you can unpack GCC's dependencies into its source directory as well in the same manner. I don't really see why a compiler needs multi-precision integer and floating-point support, but then I am not a compiler author.
Carpe diem!
Re: toolchain versions or good practices wrt toolchain versions
Ah! Well, that's educated me on that point. Thanks nullplan. Sorry to hear about the option syntax issues.
MP support was added for choosing optimizations, when the weighting values given to different choices were found to add up (or multiply?) to more than normal integer ranges. (As far as I remember, as always.) Floating point was added after I stopped paying attention, I guess.
Kaph — a modular OS intended to be easy and fun to administer and code for.
"May wisdom, fun, and the greater good shine forth in all your work." — Leo Brodie
"May wisdom, fun, and the greater good shine forth in all your work." — Leo Brodie
Re: toolchain versions or good practices wrt toolchain versions
eekee wrote: Asking questions is good. Rantings are usually understandable in both senses of the word.
Haha, thanks!
eekee wrote: Dependency version incompatibilities like this aren't very common, but they happen often enough that shared libraries are installed with a versioning system. That's what all the symlinks are.
I'm curious: how does the symlink solution work?
I noticed some symlinks that have the name of the tool and point to an executable which has a version suffix, like autoX -> autoX-x-y-z. But once the link is in the PATH, all builds would be forced to use version x-y-z, right?
Managing folders of different versions is one thing, but once the complex build flow kicks off, it becomes a mystery which ones will be used. I guess I just don't understand their make flow well enough.
eekee wrote: Uh, no, gcc without binutils is like an engine without a gearbox, driveshafts, or wheels.
I didn't mean that it is not useful, but binutils builds so fast that I figured there must be far fewer chances for things to go wrong during the build, compared to gcc and especially gdb.
eekee wrote: Dependencies can be awful. In fact, dependency trouble is the reason I ditched POSIX (my desktops run Windows now), one of the reasons I'm looking at languages "off the beaten path", and even the reason I'm developing an OS at all.
Wow, that's the most ambitious goal I've read here. If you ditch POSIX, wouldn't that mean most of the user space has to be written or ported by yourself (or your team), with a huge compatibility layer which looks like POSIX from above? From web browsers to games (and the engines for both), from spreadsheets to video players, and even the GUI and GUI toolkits below all these.
You are right for sure: health comes first, and IMO it's not worth it to be fighting with dependencies in an endless cycle.