Brendan wrote: You're saying "Make has 1 job, to use recipes to create targets based on a set of prerequisites (and this job has 2 purposes, telling the compiler/linker how and optimising the build process)".
It has nothing to do with the compiler/linker specifically; and it doesn't optimize the build process, it just rebuilds when the prerequisites have changed.
Brendan wrote: Wajideu wrote: Nope. Autotools has 8 jobs:
- Detecting the platform of the host and target systems; including the architecture, machine, and subsystem.
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where differences between platforms/targets are not abstracted adequately via things like the C/C++ standard library).
No, it's necessary so the code can be configured specifically for the platform. e.g. if your target architecture is "i586", you can use MMX extensions to boost the speed of matrix math; or if you're writing an emulator and you detect that both the host and target architectures are the same, you can skip most of the dynamic recompiling stage. A sketch of the first case is below.
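For instance (a minimal sketch, assuming a configure test that defines a hypothetical HAVE_MMX macro after probing the target; building the MMX path with GCC also needs -mmmx):
Code:
#include <stdint.h>
#ifdef HAVE_MMX            /* hypothetical macro set by a configure probe */
#include <mmintrin.h>
#endif

/* Add two arrays of 16-bit ints; n must be a multiple of 4. */
void add_i16 (int16_t *dst, const int16_t *a, const int16_t *b, int n)
{
#ifdef HAVE_MMX
    for (int i = 0; i < n; i += 4) {
        /* process 4 elements per instruction (alignment assumed) */
        __m64 r = _mm_add_pi16 (*(const __m64 *)(a + i),
                                *(const __m64 *)(b + i));
        *(__m64 *)(dst + i) = r;
    }
    _mm_empty ();          /* leave the FPU in a usable state */
#else
    for (int i = 0; i < n; i++)
        dst[i] = a[i] + b[i];
#endif
}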
Brendan wrote: Wajideu wrote: - Detecting the programs available on the host (e.g. detecting if it should use gcc or lcc as the C compiler) and ensuring that each of these is functioning properly.
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you can't just have a standard environment variable saying the name of the C compiler or even expect the C compiler to function properly).
Nope. It's necessary because not everyone uses the same build environment.
Brendan wrote: Wajideu wrote: - Detecting the headers and libraries available on the system
- Detecting the functions and type definitions listed within headers and libraries to determine if the ones that the host has installed are capable of being used
Which are necessary because either:
- You failed to use something to handle non-standard dependencies (e.g. package manager; which is a new but unrelated "lack of standard package format" cluster-bork all of its own); or
- The "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you can't assume the language's "standard library" is standard).
Nope. Package managers have nothing to do with it; there are often differences between versions of libraries or software that make them incompatible. For example, several programs (including parts of the autotools project itself) have changed so much since their initial versions that, in order to provide backwards compatibility, several different versions of the same program are shipped alongside a wrapper around them.

There are also differences between language standards. C11 introduces new features like generic selection (_Generic) and atomic types, and Embedded C (ISO/IEC TR 18037) introduces fractional and accumulative fixed-point types (_Fract and _Accum) and standardized hardware I/O. None of these are available in C89 or pre-standard C. Additionally, not all target platforms support those features: GCC supports the Embedded C fixed-point types, but if you try to use them when compiling an x86 program it'll fail, because the architecture has no native fixed-point support and GCC doesn't implement it there. This is exactly the kind of thing the header/function checks catch; a sketch of how such a check works is below.
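To make that concrete: configure answers a question like "checking for snprintf..." by compiling and linking a throwaway test program, roughly like the one below (autoconf's generated conftest.c differs in detail); if it builds, config.h gets "#define HAVE_SNPRINTF 1".
Code:
/* Sketch of a configure-style function check ("conftest.c"). */
#include <stdio.h>

int main (void)
{
    char buf[4];
    return snprintf (buf, sizeof buf, "%d", 123) < 0;
}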
Brendan wrote: Wajideu wrote: - Providing an interface to configure the project with specific options and features; e.g. "--enable-language=c,c++", "--with-headers=", etc.
This is mixing 2 things. For configuring the project because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards, see previous comments.
For configuring the project because the project itself is a massive cluster-bork of fail (e.g. compile time options where run-time options were needed or the project failed to define a standard for the final executable's behaviour) you're right, in that (something like) auto-conf is needed to allow incompetent fools to spread the disease further.
No, it's you trying to mix 2 things. This is why it's separate from make; configuring and making are 2 completely different steps. A sketch of how a configure option reaches the code is below.
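As an illustration (a sketch; the --enable-debug switch and the ENABLE_DEBUG macro are hypothetical names, though this is the usual pattern): a configure option ends up as a macro in the generated config.h, which the code then tests.
Code:
#include <stdio.h>
#include "config.h"        /* generated by configure */

/* Hypothetical: "./configure --enable-debug" would put
   "#define ENABLE_DEBUG 1" into config.h. */
void log_debug (const char *msg)
{
#ifdef ENABLE_DEBUG
    fprintf (stderr, "debug: %s\n", msg);
#else
    (void) msg;            /* debug logging compiled out */
#endif
}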
Brendan wrote: Wajideu wrote: - Generating a header containing important package information and specific information detected by the configure process, to be used by the pre-processor to allow platform-specific configuration and optimization
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you actually need to abuse the pre-processor to work-around the complete and utter failure of the tools to abstract differences between platforms).
Again, no. It's necessary in order to provide machine-specific optimization and to allow the inclusion of meta-information in the binary. e.g.:
Code:
printf ("This version of " PACKAGE_NAME " is " PACKAGE_VERSION "\n");
# if TARGET_ARCH == i386
printf ("this is optimized for i386 processors\n");
# else
printf ("this is unoptimized\n");
# endif
~$ ./test
This version of MyPackage is 1.0.0
this is optimized for i386 processors
Brendan wrote: Wajideu wrote: - To provide an easy system of building and distributing packages. (i.e. "./configure; make all distcheck" would configure and build the package, then bundle it into a tarball for distribution)
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where there is no standard way to generate a package).
Agreed. But now we're talking about a problem with Unix/Linux, and no one wants to fix it. When someone else brings up the idea of fixing it, they get bashed and talked down to, much like what happened about 3 or so pages ago in this very topic.
Brendan wrote: Wajideu wrote: - To provide a standardized way of managing open source projects. Each project usually has a specific set of files such as ChangeLog, AUTHORS, INSTALL, BUGS, README, and LICENSE files which respectively 1) keep track of changes since the last distributions, so if the current distribution has broken something you can roll back to an older distribution 2) keep track of the contributors to the project 3) provide installation instructions for the package 4) keep track of bugs that need to be fixed, 5) provide basic information about the project, and 6) provide licensing information essential to developers who may wish to fork or distribute the project
You mean there's 6 files that get included in a tarball that are ignored by every other part of the build process? Oh my - we're going to need a team of 12 "over-engineers" working around the clock for the next six years to handle a massively complex requirement like that (but don't worry, I'm sure we can recover the research and development expenses when we get our "method for including a thing in another thing" patent)!
Because they're not part of the build process, they're information for the developers and users. And as I stated before, not everyone is happy about them being there.
There are many reasons why we need autotools, but also many problems with using it. That's why I want to write a utility that fixes those quirks.