Re: Waji's Standards

Posted: Tue Oct 14, 2014 6:16 pm
by Wajideu
Brendan wrote:
Wajideu wrote:No, it's necessary so the code can be configured specifically for the platform; i.e., if your target architecture is "i586", you can use MMX extensions to boost the speed of matrix math. Or if you're writing an emulator and you detect that both the host and target architectures are the same, you can skip most of the dynamic recompiling stage.
For someone compiling for themselves (where the target architecture is the architecture the compiler is running on), the compiler can auto-detect the target. For compiling for other people (e.g. for a pre-compiled binary-only package) you want "all targets", but this isn't very practical, so you end up selecting a few common subsets (e.g. 32-bit 80x86 without any features, 32-bit 80x86 with MMX/SSE, and 64-bit 80x86) and most people end up running code that isn't optimised for their system.

In both of these cases, detecting the target in make solves nothing.
The C standard doesn't define any pre-processor macros specifying the target machine. If you're using any in your code right now, know that they are extensions and your code will not build properly in all build environments.

This is why for the longest time, people often used lines like:

Code: Select all

# if defined (_WIN32) || defined (__WIN32__)
Even now, if you wanted to ensure that your code compiles with both GCC and the Microsoft compiler, to target x86_64 you'd need to use

Code: Select all

# if defined (__x86_64__) || defined (_M_AMD64)
And even here, you've only added support for 2 build environments. Changing the way the compiler/linker works isn't an option. It'll break the code for anyone using older systems.
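In practice, covering more environments snowballs into a whole normalisation header. A minimal sketch (assuming the usual GCC/Clang and MSVC macro spellings):

Code: Select all

/* arch.h - normalise compiler-specific target macros (illustrative sketch) */
# if defined (__x86_64__) || defined (_M_X64)
#  define TARGET_ARCH_X86_64 1
# elif defined (__i386__) || defined (_M_IX86)
#  define TARGET_ARCH_X86 1
# elif defined (__aarch64__) || defined (_M_ARM64)
#  define TARGET_ARCH_ARM64 1
# else
#  error "unknown target architecture"
# endif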
Brendan wrote:Note: A better approach is to use 2 compilers; such that the first compiler does as much as sanity checking and optimisation as possible and generates some form of portable intermediate code (where this compiler is used as part of the build process), and the second compiler compiles the portable intermediate code into native code optimised specifically for the end user's specific computer when the package is installed by the end user. This solves all of the problems, including the "people end up running code that isn't optimised for their system", and means that the build process simply doesn't care what the target architecture is. Also note that this is a large part of why something like Java (where there is a JIT compiler optimising for the specific target machine) can be as efficient as C++ (where you can't optimise for the end-user's specific machine) despite the additional overhead of "JIT-ting". Sadly, traditional tools are traditionally bad.
A ridiculous hack that can be avoided just by accepting the fact that you need a configure step.
Brendan wrote:Your words say "Nope", but the meaning behind those words says "Yes, I agree completely, there is no standard build environment and therefore not everyone uses a compatible build environment".
Nope. (nōp) Definition: "informal variant of no".

Standards have nothing to do with it. autotools exists specifically so that older development systems that cannot be altered will still work as well as the standardized ones. Primary example: the SGI Indigo has a proprietary software development kit called MIPSpro and a fork of GCC 1.1. The current GCC is incapable of targeting the Indigo afaik. Even if it were, you would either (a) need a configuration step, or (b) need to compile a newly ported GCC on a secondary machine and transfer it to the Indigo. Expecting people to do (b) is just ludicrous. It's already difficult enough to add a new architecture to GCC without having to go back and forth between 2 separate machines just to check it. Before you even mention it, there is no emulator for the Indigo either. Pretty much everything about the computer was proprietary.

Brendan wrote:You're agreeing with me again (saying there's either no standards or multiple standards for libraries, languages, etc).
No, I'm saying there are different revisions of the same standard. C89, C99, and C11 are all well-fleshed-out C standards that most compilers attempt to target, but that doesn't mean that C11 code will compile under C99, that C99 or C11 code will compile under C89, or, surprisingly, that C89 code will even compile under C11 (C11 strictly prohibits some things that compiled without issue under C89). Embedded C is a whole different beast atm. It's an absolute must for embedded systems, but it cannot be incorporated into the normal C standard because most dedicated systems don't have native support for fixed-point types.
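For example, a translation unit like this compiles fine under C89, but a conforming C99/C11 compiler must at least diagnose it (implicit int and implicit function declarations were removed from the language):

Code: Select all

/* Legal C89; rejected (or at least diagnosed) by C99/C11. */
main()          /* implicit int return type: gone since C99 */
{
    foo();      /* implicit function declaration: gone since C99 */
    return 0;
}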
Brendan wrote:Can you provide an example of where an interface to configure the project is necessary; which is not a case of bad build environment and not a case of bad project? Please note that it would be in your interest to use some foresight here - e.g. for any potential example, attempt to predict my response; and attempt to avoid providing an example where it's easy to find a way to improve the language/tools/build environment instead, and attempt to avoid providing an example where it shouldn't have been run-time option.
I've already provided like 20 of them. Go back and read. I provide at least 1 or 2 examples to back every reason I've given as to why a configure step is needed. By "give me an example", you surely mean "I don't care how many examples you give me, I'm right and you're wrong."
Brendan wrote:Where does PACKAGE_NAME and PACKAGE_VERSION come from? More specifically, how does either auto-conf or make automatically guess both the package name and package version correctly and prevent the need for programmers to explicitly set them somewhere (e.g. in a header file)?
They're inside a header that's created by autotools, e.g.:

Code: Select all

AC_PREREQ([2.69])
AC_INIT([mypackage], [1.0.0], [bug-report@example.com]) # package info here

AM_INIT_AUTOMAKE([-Wall -Werror foreign])

AC_PROG_CC

AC_CONFIG_HEADERS([config.h]) # header file that the configuration is output into
AC_CONFIG_FILES([  # list of makefiles to generate
  Makefile
  src/Makefile
])

AC_OUTPUT
Including that "config.h" header in your source code will give you access to the information obtained from the configure step.
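e.g., a minimal sketch (PACKAGE_NAME and PACKAGE_VERSION are among the macros autoconf generates in config.h from the AC_INIT arguments):

Code: Select all

#include "config.h"   /* generated by the configure step */
#include <stdio.h>

int main (void)
{
    /* both macros were filled in from AC_INIT([mypackage], [1.0.0], ...) */
    printf ("%s %s\n", PACKAGE_NAME, PACKAGE_VERSION);
    return 0;
}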

Brendan wrote:Are you saying that the C/C++ language specifications have both failed to include a standard "__TARGET_ARCH__" pre-processor macro?
Yep. Even if they introduced it now, it wouldn't work for older development kits (see the prior Indigo example for why this alone makes a configure step absolutely necessary).
Brendan wrote:
Wajideu wrote:There are many reasons why we need autotools, but many problems faced with using it. That's why I want to write a utility that fixes those quirks.
Please repeat this daily: "Brendan is right; the languages, tools and build environment we're using are all a complete cluster-bork of fail; however (for whatever reason) I don't want to fix any of the many problems created by poor languages, poor tools and poor build environments; and I only want a "slightly less bad" work-around that fails to fix any of the many problems."
Brendan is wrong; he obviously has absolutely no experience whatsoever using autotools, and is more focused on pointing fingers and asserting the existence of problems which cannot be fixed than on focussing on what can. He's a bigot who believes that entire associations, and thousands of developers with far more experience than he could ever obtain as a single person, who have collectively put far more time and effort into the standards and tools we use today, are all a bunch of idiots who would create nothing but a "cluster-bork of fail".


@b.zaar. You didn't provide an example of anything. We're on the topic of makefiles here, not the idealism of what you wish existed.

Re: Waji's Standards

Posted: Tue Oct 14, 2014 6:40 pm
by b.zaar
Wajideu wrote:@b.zaar. You didn't provide an example of anything.
You haven't even downloaded it to check...

Does the config step in the make++ example produce a config.mk file to include in the master makefile and a config.h file to include in all the source files?

Where else should I write the automatically detected information to?

Read the configure script created by autotools. It's long and complicated, but it's just a shell script, which can be produced by any language that can write to a file, including make. In the example, make skips this step and writes the detected settings straight to the include files it needs.
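For a concrete picture, the two generated files might look something like this (hypothetical contents):

Code: Select all

# config.mk - included by the master makefile
SYSTEM := linux
BUILD  := x86_64-pc-linux-gnu
ARCH   := x86_64

Code: Select all

/* config.h - included by the source files */
#define SYSTEM "linux"
#define BUILD  "x86_64-pc-linux-gnu"
#define ARCH   "x86_64"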

Would you like make to produce a configure script that can be run so it produces a makefile? I'm sure I can add another layer if it needs it.
Wajideu wrote:We're on the topic of makefiles here, not the idealism of what you wish existed.
you mean like planfiles?

Re: Waji's Standards

Posted: Tue Oct 14, 2014 6:53 pm
by Wajideu
@b.zaar, I didn't ask you to provide a make++ example, and this has nothing to do with planfiles. You asked me why you couldn't detect things like the host/target platform in make (not make++), and I asked you to provide me an example of how you can. No matter how you look at it, some other tool is needed.

By showing me a make++ example, you are only verifying what I said.

Re: Waji's Standards

Posted: Tue Oct 14, 2014 6:57 pm
by b.zaar
Wajideu wrote:You asked me why you couldn't detect things like the host/target platform in make (not make++), and I asked you to provide me an example of how you can. No matter how you look at it, some other tool is needed.

Code: Select all

#! /bin/sh
# make++ shell script
# Run a make++ makefile
#

# Initialize a new make++ project
init() {
	echo "Initializing new make++ project"
	cp .mkinclude/Makefile.m++ Makefile.m++
	cp .mkinclude/configure.mk configure.mk
}

if [ $# -eq 0 ]; then
	make -f Makefile.m++
elif [ "$1" = "init" ]; then
	init
else
	make -f Makefile.m++ "$@"
fi
That looks like make is still doing all the work to me...

I guess I was just trying to be fancy.

Re: Waji's Standards

Posted: Tue Oct 14, 2014 7:03 pm
by Wajideu
@b.zaar, I quote myself again
Wajideu wrote:@b.zaar, I didn't ask you to provide a make++ example, and this has nothing to do with planfiles. You asked me why you couldn't detect things like the host/target platform in make (not make++), and I asked you to provide me an example of how you can. No matter how you look at it, some other tool is needed.
I want a make example. Not a make++/shellscript example. Using tools aside from make just validates my point, in which we are debating about ___?

Re: Waji's Standards

Posted: Tue Oct 14, 2014 7:15 pm
by b.zaar
Wajideu wrote:I want a make example. Not a make++/shellscript example. Using tools aside from make just validates my point, in which we are debating about ___?
Ok so wait, you'll need to set all the ground rules first.

Autotools is allowed to use shell scripting to detect settings but make is not?

I don't see how to build using make if I must ignore shell commands.

After ignoring my silliness of requiring

Code: Select all

make++ init
as a shell script: simply renaming Makefile.m++ to Makefile, then running plain old make, will still produce SYSTEM, BUILD, and ARCH variables in both makefile format and C pre-processor format.

Re: Waji's Standards

Posted: Tue Oct 14, 2014 7:36 pm
by Wajideu
b.zaar wrote:
Wajideu wrote:I want a make example. Not a make++/shellscript example. Using tools aside from make just validates my point, in which we are debating about ___?
Ok so wait, you'll need to set all the ground rules first.

Autotools is allowed to use shell scripting to detect settings but make is not?
make "technically" isn't supposed to make any assertions about it's environment. hence why when running from the command line, you have to use 'del' instead of 'rm'; but I'm overlooking that for the moment to focus on the main issue. That being that your example appears to be using make++ files, not makefiles. If you're replacing make just to add extra functionality, then you are verifying my statement that another tool is needed to do so. At which point, we're just walking in circles around each other defending the same side of an argument.

Re: Waji's Standards

Posted: Tue Oct 14, 2014 7:49 pm
by b.zaar
Wajideu wrote:make "technically" isn't supposed to make any assertions about its environment
My code doesn't. It checks for a /bin/sh-type shell or a COMSPEC environment variable.
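Something along these lines (a sketch, assuming GNU make):

Code: Select all

# Detect the host without asserting anything about it up front:
# Windows sets COMSPEC; everywhere else, ask the shell.
ifdef COMSPEC
 SYSTEM := windows
else
 SYSTEM := $(shell uname -s | tr 'A-Z' 'a-z')
endif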
Wajideu wrote:That being that your example appears to be using make++ files, not makefiles.
make++ does not exist... This version produces the EXACT same output using make.

Re: Waji's Standards

Posted: Tue Oct 14, 2014 8:01 pm
by Wajideu
@b.zaar, I'm failing to see how this example shows detection of the host/target platforms, or configuration of make in such a way as to selectively build only specific files and pass information about the host/target environments to the source code.

It just looks like an alternative way of recursively calling make.

Re: Waji's Standards

Posted: Tue Oct 14, 2014 8:08 pm
by b.zaar
Wajideu wrote:I'm failing to see...
I know

Re: Waji's Standards

Posted: Tue Oct 14, 2014 8:11 pm
by Wajideu
b.zaar wrote:
Wajideu wrote:I'm failing to see...
I know
Since I'm apparently blind, show me where it's at.

Makefile

Code: Select all

config_h := config.h
config_mk := config.mk

all:

include configure.mk

all: $(config_h) $(config_mk)
	@echo Done

clean:
	rm $(config_h)
	rm $(config_mk)

configure.mk

Code: Select all

config: autoconfig

# Check config.h file name
ifndef config_h
 config_h := config.h
endif


# Check config.mk file name
ifndef config_mk
 config_mk := config.mk
endif


# Generate configuration files
ifeq ($(MAKECMDGOALS),config)
 include .mkinclude/autoconfig.mk
else
# Not defined
 ifneq ($(MAKECMDGOALS),clean)
  ifneq ($(MAKECMDGOALS),help)
   -include $(config_mk)
  endif
 endif
endif

$(config_mk) $(config_h):
	$(MAKE) config
Maybe there's some invisible ink or something here.


EDIT: btw, the configure.mk could be simplified to

Code: Select all

config: autoconfig

config_h ?= config.h
config_mk ?= config.mk

# Generate configuration files
ifeq ($(MAKECMDGOALS),config)
 include .mkinclude/autoconfig.mk
else
# Not defined
 ifneq ($(MAKECMDGOALS),clean)
  ifneq ($(MAKECMDGOALS),help)
   -include $(config_mk)
  endif
 endif
endif

$(config_mk) $(config_h):
	$(MAKE) config
The ':=' is unnecessary because you're not evaluating an expression; either '=' or '?=' (define only if not already defined) would have sufficed.
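For reference, the three assignment flavours in GNU make:

Code: Select all

FOO := bar   # simple: right-hand side expanded immediately, once
FOO  = bar   # recursive: re-expanded every time $(FOO) is used
FOO ?= bar   # conditional: assigned only if FOO isn't already defined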

Re: Waji's Standards

Posted: Tue Oct 14, 2014 8:24 pm
by b.zaar
Wajideu wrote:

Code: Select all

 include .mkinclude/autoconfig.mk
Maybe there's some invisible ink or something here.
Yes

Re: Waji's Standards

Posted: Tue Oct 14, 2014 8:35 pm
by Wajideu
You know, I honestly don't know why I'm even trying to explain this to you people. :|

As someone who shows a general distaste towards Linux and mostly uses Windows, and who was criticized simply for joking about stereotypical Linux users, I find it hilarious that I seem to have a better understanding of some of your tools than you do, and that I'm actually wasting a lot of my time debating with you about the importance of these tools.

I'm done arguing about this. Hopefully, I've piqued your curiosity enough that you might take a glance into the documentation on autotools for future reference.

Re: Waji's Standards

Posted: Tue Oct 14, 2014 8:54 pm
by Brendan
Hi,
Wajideu wrote:
Brendan wrote:
Wajideu wrote:No, it's necessary so the code can be configured specifically for the platform; i.e., if your target architecture is "i586", you can use MMX extensions to boost the speed of matrix math. Or if you're writing an emulator and you detect that both the host and target architectures are the same, you can skip most of the dynamic recompiling stage.
For someone compiling for themselves (where the target architecture is the architecture the compiler is running on), the compiler can auto-detect the target. For compiling for other people (e.g. for a pre-compiled binary-only package) you want "all targets", but this isn't very practical, so you end up selecting a few common subsets (e.g. 32-bit 80x86 without any features, 32-bit 80x86 with MMX/SSE, and 64-bit 80x86) and most people end up running code that isn't optimised for their system.

In both of these cases, detecting the target in make solves nothing.
The C standard doesn't define any pre-processor macros specifying the target machine. If you're using any in your code right now, know that they are extensions and your code will not build properly in all build environments.
I was thinking of something like the GCC "-march=native" command line option; where the compiler automatically detects what it can use and automatically optimises the code to suit the auto detected machine.
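e.g. (one plausible invocation):

Code: Select all

# GCC probes the host CPU and optimises for exactly that machine:
gcc -O2 -march=native -o demo demo.c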

The lack of a standard pre-processor macro has nothing to do with choosing which architecture, and is a separate issue (providing a way for source code to know which architecture was chosen).
Wajideu wrote:This is why for the longest time, people often used lines like:

Code: Select all

# if defined (_WIN32) || defined (__WIN32__)
Even now, if you wanted to ensure that your code compiles with both GCC and the Microsoft compiler, to target x86_64 you'd need to use
You mean: even now, if you want to ensure that your code compiles with both GCC and the Microsoft compiler, then (because the entire "build environment" is a massive cluster-bork of fail where virtually everything has either no standard or multiple standards) you have no choice but to use hideously stupid work-arounds.
Wajideu wrote:Changing the way the compiler/linker works isn't an option. It'll break the code for anyone using older systems.
Changing the way the languages and tools work is the only option that solves the problems. Do you think (e.g.) the people that designed Java, or C#, or D, or Python, or Go, or Rust, or.... said to themselves "C/C++ is an incredibly bad piece of crap, but we won't bother to ever try to do anything better because ancient COBOL code won't compile on it"?
Wajideu wrote:
Brendan wrote:Note: A better approach is to use 2 compilers; such that the first compiler does as much as sanity checking and optimisation as possible and generates some form of portable intermediate code (where this compiler is used as part of the build process), and the second compiler compiles the portable intermediate code into native code optimised specifically for the end user's specific computer when the package is installed by the end user. This solves all of the problems, including the "people end up running code that isn't optimised for their system", and means that the build process simply doesn't care what the target architecture is. Also note that this is a large part of why something like Java (where there is a JIT compiler optimising for the specific target machine) can be as efficient as C++ (where you can't optimise for the end-user's specific machine) despite the additional overhead of "JIT-ting". Sadly, traditional tools are traditionally bad.
A ridiculous hack that can be avoided just by accepting the fact that you need a configure step.
A configure step is not needed. For the majority of programming languages (mostly everything except C/C++), most projects do not have any configure step. Repeating an unproven and incorrect assertion will not make that assertion magically become a fact.
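For what it's worth, the two-compiler scheme can be approximated today with the LLVM toolchain (an illustration of the idea, not a claim about how any existing package works):

Code: Select all

# Build time (developer's machine): sanity-check, optimise, and emit
# portable intermediate code (LLVM bitcode); no target CPU chosen yet.
clang -O2 -emit-llvm -c app.c -o app.bc

# Install time (end user's machine): translate the bitcode into
# native code tuned for that specific CPU, then assemble and link.
llc -O2 -mcpu=native app.bc -o app.s
cc app.s -o app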
Wajideu wrote:
Brendan wrote:Your words say "Nope", but the meaning behind those words says "Yes, I agree completely, there is no standard build environment and therefore not everyone uses a compatible build environment".
Nope. (nōp) Definition: "informal variant of no".
Words. Definition: The plural of "word", implying more than one word.

It was foolish of me to assume that someone who thinks a configuration step is necessary actually has the intelligence to understand simple English; especially given that there is no evidence so far that you've actually managed to understand a single point I've made (and insist on merely repeating incorrect assertions instead).
Wajideu wrote:Standards have nothing to do with it. autotools exists specifically so that older development systems that cannot be altered will still work as well as the standardized ones. Primary example: the SGI Indigo has a proprietary software development kit called MIPSpro and a fork of GCC 1.1. The current GCC is incapable of targeting the Indigo afaik. Even if it were, you would either (a) need a configuration step, or (b) need to compile a newly ported GCC on a secondary machine and transfer it to the Indigo. Expecting people to do (b) is just ludicrous. It's already difficult enough to add a new architecture to GCC without having to go back and forth between 2 separate machines just to check it. Before you even mention it, there is no emulator for the Indigo either. Pretty much everything about the computer was proprietary.
Here we go again. "Standards have nothing to do with it" followed by "Indigo suffered greatly because there is no standard". It's like I'm talking to a schizophrenic that disagrees with me and proves my point at the same time.
Wajideu wrote:
Brendan wrote:You're agreeing with me again (saying there's either no standards or multiple standards for libraries, languages, etc).
No, I'm saying there are different revisions of the same standard. C89, C99, and C11 are all well-fleshed-out C standards that most compilers attempt to target, but that doesn't mean that C11 code will compile under C99, that C99 or C11 code will compile under C89, or, surprisingly, that C89 code will even compile under C11 (C11 strictly prohibits some things that compiled without issue under C89). Embedded C is a whole different beast atm. It's an absolute must for embedded systems, but it cannot be incorporated into the normal C standard because most dedicated systems don't have native support for fixed-point types.
I see - you're saying there are multiple standards (or multiple incompatible versions of the standards), which is effectively the same as no standard; and this lack of a "standard standard" is the problem?
Wajideu wrote:
Brendan wrote:Can you provide an example of where an interface to configure the project is necessary; which is not a case of bad build environment and not a case of bad project? Please note that it would be in your interest to use some foresight here - e.g. for any potential example, attempt to predict my response; and attempt to avoid providing an example where it's easy to find a way to improve the language/tools/build environment instead, and attempt to avoid providing an example where it shouldn't have been run-time option.
I've already provided like 20 of them. Go back and read. I provide at least 1 or 2 examples to back every reason I've given as to why a configure step is needed. By "give me an example", you surely mean "I don't care how many examples you give me, I'm right and you're wrong."
You've completely failed to provide any example where a configuration step is necessary because it's impossible to either improve the language/tools/build environment or use a run-time option. All the examples you have provided were already shown to be inadequate before I asked for an example that wasn't mere wishful thinking.
Wajideu wrote:
Brendan wrote:Where does PACKAGE_NAME and PACKAGE_VERSION come from? More specifically, how does either auto-conf or make automatically guess both the package name and package version correctly and prevent the need for programmers to explicitly set them somewhere (e.g. in a header file)?
They're inside a header that's created by autotools, e.g.:

Code: Select all

AC_PREREQ([2.69])
AC_INIT([mypackage], [1.0.0], [bug-report@example.com]) # package info here

AM_INIT_AUTOMAKE([-Wall -Werror foreign])

AC_PROG_CC

AC_CONFIG_HEADERS([config.h]) # header file that the configuration is output into
AC_CONFIG_FILES([  # list of makefiles to generate
  Makefile
  src/Makefile
])

AC_OUTPUT
I asked how either auto-conf or make automatically guesses both the package name and package version correctly and prevents the need for programmers to explicitly set them somewhere. You gave me an example of auto-conf failing to do this and requiring programmers to explicitly set the package name and package version somewhere (in the auto-conf script).
Wajideu wrote:
Brendan wrote:Are you saying that the C/C++ language specifications have both failed to include a standard "__TARGET_ARCH__" pre-processor macro?
Yep. Even if they introduced it now, it wouldn't work for older development kits (see the prior Indigo example for why this alone makes a configure step absolutely necessary).
If they introduced it now, all future projects would be able to use it.

Do you realise that your "work around the problems to avoid solving them" tool will not work with older projects that are designed to use auto-tools?
Wajideu wrote:
Brendan wrote:
Wajideu wrote:There are many reasons why we need autotools, but many problems faced with using it. That's why I want to write a utility that fixes those quirks.
Please repeat this daily: "Brendan is right; the languages, tools and build environment we're using are all a complete cluster-bork of fail; however (for whatever reason) I don't want to fix any of the many problems created by poor languages, poor tools and poor build environments; and I only want a "slightly less bad" work-around that fails to fix any of the many problems."
Brendan is wrong; he obviously has absolutely no experience whatsoever using autotools, and is more focused on pointing fingers and asserting the existence of problems which cannot be fixed than on focussing on what can. He's a bigot who believes that entire associations, and thousands of developers with far more experience than he could ever obtain as a single person, who have collectively put far more time and effort into the standards and tools we use today, are all a bunch of idiots who would create nothing but a "cluster-bork of fail".
I am prepared to do the work needed to (attempt to) solve the problems. You are not. You're not even capable of finding them. This alone guarantees that you will fail to produce anything better than auto-tools.


Cheers,

Brendan

Re: Waji's Standards

Posted: Tue Oct 14, 2014 9:24 pm
by Wajideu
@Brendan, I have a lot I could say in response to your argument, but as I stated, I'm done discussing it. I'm fairly certain from what's been said so far that there is absolutely nothing I could say, nor any example I could give, that would alter your opinion.