Brendan wrote:
For someone compiling for themselves (where the target architecture is the architecture the compiler is running on), the compiler can auto-detect the target. For compiling for other people (e.g. for a pre-compiled, binary-only package) you want "all targets", but this isn't very practical, so you end up selecting a few common subsets (e.g. 32-bit 80x86 without any features, 32-bit 80x86 with MMX/SSE, and 64-bit 80x86) and most people end up running code that isn't optimised for their system.

Wajideu wrote:
No, it's necessary so the code can be configured specifically for the platform. I.e. if your target architecture is "i586", you can use MMX extensions to boost the speed of matrix math. Or, if you're writing an emulator and you detect that both the host and target architectures are the same, you can skip most of the dynamic recompiling stage.

The C standard doesn't define any pre-processor macros specifying the target machine. If you're using any in your code right now, know that they are extensions, and your code will not build properly in all build environments.
In both of these cases, detecting the target in make solves nothing.
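For what it's worth, the compiler itself already knows the answer at compile time. Something like this minimal sketch (assuming GCC/Clang's __MMX__/__SSE2__ feature macros, set by flags like -march=pentium-mmx or -msse2, or MSVC's _M_X64; none of these are standard C):

Code: Select all

#include <stdio.h>

/* Report which vector extension the compiler was told to target.
 * The choice is baked in when the compiler runs; make never sees it. */
static const char *vector_target(void)
{
#if defined (__SSE2__) || defined (_M_X64)  /* MSVC x64 always implies SSE2 */
    return "SSE2";
#elif defined (__MMX__)
    return "MMX";
#else
    return "plain C fallback";
#endif
}

int main(void)
{
    printf("matrix math would use: %s\n", vector_target());
    return 0;
}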
That lack of standard target macros is why, for the longest time, people often used lines like:
Code: Select all
# if defined (_WIN32) || defined (__WIN32__)
Code: Select all
# if defined (__x86_64__) || defined (_M_AMD64)
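In a real project those one-liners snowball into whole ladders. A rough sketch of the usual shape (the exact macro spellings vary per compiler, which is exactly the problem):

Code: Select all

/* Hand-rolled target detection: every compiler/OS pair spells
 * its macros differently, so this ladder only ever grows. */
#if defined (_WIN32) || defined (__WIN32__)
# define TARGET_OS "windows"
#elif defined (__linux__)
# define TARGET_OS "linux"
#elif defined (__APPLE__)
# define TARGET_OS "darwin"
#else
# define TARGET_OS "unknown"
#endif

#if defined (__x86_64__) || defined (_M_AMD64)
# define TARGET_ARCH "x86_64"
#elif defined (__i386__) || defined (_M_IX86)
# define TARGET_ARCH "i386"
#elif defined (__aarch64__) || defined (_M_ARM64)
# define TARGET_ARCH "aarch64"
#else
# define TARGET_ARCH "unknown"
#endif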
A ridiculous hack that can be avoided just by accepting the fact that you need a configure step.

Brendan wrote:
Note: A better approach is to use 2 compilers; such that the first compiler does as much sanity checking and optimisation as possible and generates some form of portable intermediate code (this compiler is used as part of the build process), and the second compiler compiles the portable intermediate code into native code optimised specifically for the end user's specific computer when the package is installed by the end user. This solves all of the problems, including "people end up running code that isn't optimised for their system", and means that the build process simply doesn't care what the target architecture is. Also note that this is a large part of why something like Java (where there is a JIT compiler optimising for the specific target machine) can be as efficient as C++ (where you can't optimise for the end user's specific machine) despite the additional overhead of "JIT-ting". Sadly, traditional tools are traditionally bad.
Brendan wrote:
Your words say "Nope", but the meaning behind those words says "Yes, I agree completely, there is no standard build environment and therefore not everyone uses a compatible build environment".

Nope. (nōp) Definition: "informal variant of no".
Standards have nothing to do with it. autotools exists specifically so that older development systems that cannot be altered will still function properly alongside standardized ones. A prime example: the SGI Indigo has a proprietary software development kit called MIPSpro and a fork of GCC 1.1. The current GCC is incapable of targeting the Indigo afaik. Even if it were, you would either (a) need a configuration step, or (b) need to compile a newly ported GCC on a secondary machine and transfer it to the Indigo. Expecting people to do (b) is just ludicrous. It's already difficult enough to add a new architecture to GCC without having to go back and forth between 2 separate machines just to check it. And before you even mention it, there is no emulator for the Indigo either; pretty much everything about the computer was proprietary.
Brendan wrote:
You're agreeing with me again (saying there's either no standards or multiple standards for libraries, languages, etc).

No, I'm saying there are different revisions of the same standard. C89, C99, and C11 are all well-established C standards that most compilers attempt to target; but that doesn't mean that C11 code will compile under C99, that C99 or C11 code will compile under C89, or, surprisingly, that C89 code will even compile under C11 (because C11 strictly prohibits some things which would otherwise have no problem compiling under C89). Embedded C is a whole different beast atm. It's an absolute must for embedded systems, but it cannot be incorporated into the normal C standard because most systems don't have native support for fixed-point types.
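To make the C89-under-C11 point concrete, here's a small example of code that compiles as C89 but is rejected (or at best warned about, depending on the compiler) under newer revisions:

Code: Select all

#include <stdio.h>

/* Legal C89, broken under later revisions:
 *  - `main()` with no return type relies on implicit int,
 *    which was removed from the language in C99;
 *  - gets() was removed from the standard library in C11,
 *    so under -std=c11 the call has no declaration at all. */
main()
{
    char buf[32];
    gets(buf);
    puts(buf);
    return 0;
}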
Brendan wrote:
Can you provide an example of where an interface to configure the project is necessary; which is not a case of bad build environment and not a case of bad project? Please note that it would be in your interest to use some foresight here - e.g. for any potential example, attempt to predict my response; attempt to avoid providing an example where it's easy to find a way to improve the language/tools/build environment instead, and attempt to avoid providing an example where it should have been a run-time option.

I've already provided like 20 of them. Go back and read. I gave at least 1 or 2 examples to back every reason I've listed for why a configure step is needed. By "give me an example", you surely mean "I don't care how many examples you give me, I'm right and you're wrong."
Brendan wrote:
Where do PACKAGE_NAME and PACKAGE_VERSION come from? More specifically, how does either autoconf or make automatically guess both the package name and package version correctly and prevent the need for programmers to explicitly set them somewhere (e.g. in a header file)?

They're inside of a header that's created by autotools. I.e., with a configure.ac like:
Code: Select all
AC_PREREQ([2.69])

# package name, version, and bug-report address
AC_INIT([mypackage], [1.0.0], [[email protected]])
AM_INIT_AUTOMAKE([-Wall -Werror foreign])
AC_PROG_CC

# header file that the configuration is output into
AC_CONFIG_HEADERS([config.h])

# list of makefiles to generate
AC_CONFIG_FILES([
    Makefile
    src/Makefile
])
AC_OUTPUT
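For completeness: after autoreconf --install && ./configure, the generated config.h carries the AC_INIT values, and the program just includes it. An abridged sketch (the real header contains many more entries):

Code: Select all

/* config.h (abridged) -- written out by ./configure, not by hand */
#define PACKAGE_NAME    "mypackage"
#define PACKAGE_VERSION "1.0.0"

/* main.c -- the program simply uses the macros */
#include "config.h"
#include <stdio.h>

int main(void)
{
    printf("%s version %s\n", PACKAGE_NAME, PACKAGE_VERSION);
    return 0;
}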
Brendan wrote:
Are you saying that the C/C++ language specifications have both failed to include a standard "__TARGET_ARCH__" pre-processor macro?

Yep. And even if they introduced it now, it wouldn't work for older development kits (see the prior Indigo example for why this makes a configure step absolutely necessary).
Wajideu wrote:
There are many reasons why we need autotools, but many problems faced with using it. That's why I want to write a utility that fixes those quirks.

Brendan wrote:
Please repeat this daily: "Brendan is right; the languages, tools and build environment we're using are all a complete cluster-bork of fail; however (for whatever reason) I don't want to fix any of the many problems created by poor languages, poor tools and poor build environments; and I only want a 'slightly less bad' work-around that fails to fix any of the many problems."

Brendan is wrong; he obviously has absolutely no experience whatsoever using autotools, and he is more focused on pointing fingers and asserting the existence of problems which cannot be fixed than on what can be. He's a bigot who believes that entire associations, and thousands of developers with far more experience than he could ever obtain as a single person, who have collectively put far more time and effort into the standards and tools we use today, are all a bunch of idiots who would create nothing but a "cluster-bork of fail".
@b.zaar: You didn't provide an example of anything. We're on the topic of makefiles here, not the idealism of what you wish existed.