Making make make faster

User avatar
B.E
Member
Member
Posts: 275
Joined: Sat Oct 21, 2006 5:29 pm
Location: Brisbane Australia
Contact:

Making make make faster

Post by B.E »

I found a way to make compilation faster. The method uses the -j option (see the man page for more information), for example:

make -j 4 <target>

The -j option uses the specified number of jobs (processes) to build the target (i.e. four different files can be compiled at once in the example above).

I've tested it on BSD, Linux and Cygwin. On an SCO system (Edit: I've only tested it on version 6.1), use the -P option (see the man page for more information).
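A runnable sketch of the idea (the two-target project below is invented just to have something to build in parallel; `nproc` is GNU coreutils, so a fallback is included for systems without it):

```shell
#!/bin/sh
# Hypothetical two-target project, created only to demonstrate -j.
mkdir -p /tmp/jdemo && cd /tmp/jdemo
printf 'all: a.txt b.txt\n' > Makefile
printf 'a.txt:\n\techo a > a.txt\n' >> Makefile
printf 'b.txt:\n\techo b > b.txt\n' >> Makefile
# One job per CPU core is a common choice; nproc is GNU coreutils,
# so fall back to 2 jobs where it is unavailable.
JOBS=$(nproc 2>/dev/null || echo 2)
make -j"$JOBS"
```

With two independent targets, -j2 can run both recipes at once; on a real project the speedup depends on how many independent compilation units the dependency graph exposes.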
Microsoft: "let everyone run after us. We'll just INNOV~1"
User avatar
B.E
Member
Member
Posts: 275
Joined: Sat Oct 21, 2006 5:29 pm
Location: Brisbane Australia
Contact:

Post by B.E »

bcat wrote: (On a 1 CPU/1 core/no HT setup, it'll probably just make make make slower, though. :?)
No. Because the compilation process is not entirely CPU-bound, it still makes make make faster.
Microsoft: "let everyone run after us. We'll just INNOV~1"
User avatar
Candy
Member
Member
Posts: 3882
Joined: Tue Oct 17, 2006 11:33 pm
Location: Eindhoven

Post by Candy »

bcat wrote:Yeah, this helps a lot on multicore/multiprocessor machines. It should also help a bit on hyperthreaded single processor boxes.

(On a 1 CPU/1 core/no HT setup, it'll probably just make make make slower, though. :?)
Could you quit the confusing sentences on make making?

You can also get make to build faster with non-recursive (i.e., full) scripts. That got my OS compile time down from a minute to 3.5 seconds. Since I have a dual-core, -j 2 got that down to 2.0 seconds (but the difference is trivial, really).
TheQuux
Member
Member
Posts: 73
Joined: Sun Oct 22, 2006 6:49 pm

Post by TheQuux »

Or, use makepp. The speed loss from its being written in Perl (not that I notice anything, but somebody will...) is made up for by things like smart updating: something is only rebuilt if its changes are significant, so running indent on your source will not trigger a rebuild, but adding "-O3 -march=super-duper-athlon-256 -funroll-loops -fplacebo-optimization" will :- )

In addition, it makes recursive make unnecessary with its support for recursive makefile searching: you can depend on, say, hardware/built-in.o, and makepp searches hardware/Makeppfile for a way to build built-in.o.

Or, you can depend on *.o, and you get all object files that can be built, which is REALLY nice.

Oh, and it is a (mostly) drop-in replacement for make. Plus it works anywhere perl does, which is, well, everywhere.

Note: I have no part in the development of makepp, I just love using it.
My project: Xenon
Legend
Member
Member
Posts: 195
Joined: Tue Nov 02, 2004 12:00 am
Contact:

Post by Legend »

Make sucks anyway. :P

And good that I have a dual core cpu! ;)
User avatar
Solar
Member
Member
Posts: 7615
Joined: Thu Nov 16, 2006 12:01 pm
Location: Germany
Contact:

Post by Solar »

TheQuux wrote:Or, you can depend on *.o, and you get all object files that can be built, which is REALLY nice.
For blowing make times completely out of proportion? I bet. :D

Last time I looked, the idea of make was to not do more than necessary...

The best way to make it faster is writing correct Makefiles, which I agree is much harder than it should be. Not using recursive make is the first step down a long road, and reading the manual helps. (It also covers the -j option, and many other things, like why to use := instead of =...)
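The := versus = point deserves a concrete illustration. A minimal sketch (the Makefile and /tmp path are invented): with =, a variable is re-expanded every time it is referenced, so it sees assignments made after it; with :=, it is expanded once, at assignment time.

```shell
#!/bin/sh
# Hypothetical Makefile showing the two variable flavours:
# X (=) is re-expanded on every use, so it sees Y's later value;
# A (:=) was expanded at assignment time, when B was still empty.
mkdir -p /tmp/vardemo && cd /tmp/vardemo
{
  printf 'X = $(Y)\n'
  printf 'Y = hello\n'
  printf 'A := $(B)\n'
  printf 'B := hello\n'
  printf 'all:\n'
  printf '\t@echo "X=[$(X)] A=[$(A)]"\n'
} > Makefile
make -s
```

This prints X=[hello] A=[], and it is also the speed angle: with something like SRCFILES = $(shell find ...), the find re-runs on every reference to SRCFILES, while := runs it exactly once.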
Every good solution is obvious once you've found it.
TheQuux
Member
Member
Posts: 73
Joined: Sun Oct 22, 2006 6:49 pm

Post by TheQuux »

Solar wrote:
TheQuux wrote:Or, you can depend on *.o, and you get all object files that can be built, which is REALLY nice.
For blowing make times completely out of proportion? I bet. :D

Last time I looked, the idea of make was to not do more than necessary...
... but no less, either.

And, yes, you can get globbing to work as in classic make, or not use it.

Still, the advantages of making everything possible are that you probably only have to make something once, and that you don't end up with a failed build because you forgot to add a dependency.

Plus, it makes writing makefiles a lot faster.

And finally, makepp has a server mode that caches the expanded makefiles and dependency tree, which improves performance significantly.

And, I'll shut up now, because I don't want to end up in a flame war.
My project: Xenon
User avatar
Solar
Member
Member
Posts: 7615
Joined: Thu Nov 16, 2006 12:01 pm
Location: Germany
Contact:

Post by Solar »

TheQuux wrote:...you don't end up with a failed build because you forgot to add a dependency.
Excerpt from the PDCLib Makefile:

Code: Select all

SRCFILES := $(shell find $(PROJDIRS) -mindepth 1 -maxdepth 3 -name "*.c")
DEPFILES := $(patsubst %.c,%.d,$(SRCFILES))

-include $(DEPFILES)

%.o: %.c Makefile
 	@echo " CC	$(patsubst functions/%,%,$@)"
 	@$(CC) $(CFLAGS) -MMD -MP -MT "$*.d $*.t" -c $< -o $@
No more "forgotten dependencies", as GCC creates them automatically...
Every good solution is obvious once you've found it.
Legend
Member
Member
Posts: 195
Joined: Tue Nov 02, 2004 12:00 am
Contact:

Post by Legend »

*takes a look at the code*

...

Wtf? :shock:
User avatar
Solar
Member
Member
Posts: 7615
Joined: Thu Nov 16, 2006 12:01 pm
Location: Germany
Contact:

Post by Solar »

:D

OK, there are still some project-specific things in there. I'll make a cleaner example:

Code: Select all

SRCFILES := $(shell find . -mindepth 1 -name "*.c") 
DEPFILES := $(patsubst %.c,%.d,$(SRCFILES)) 

-include $(DEPFILES) 

%.o: %.c Makefile 
    $(CC) $(CFLAGS) -MMD -MP -MT "$*.d" -c $< -o $@ 
The Makefile is in the root directory, and the only (!!!) Makefile of your project.

Only a single project-wide Makefile ensures make knows the full ruleset and dependency tree. Recursive make means your "submakes" don't know everything, and they might make wrong decisions as a result. If you are lucky, it's just slower than it needs to be. If you are unlucky, it produces wrong results.

Let's go through it line by line.

Code: Select all

SRCFILES := $(shell find . -mindepth 1 -name "*.c") 
SRCFILES is a list of all .c files contained in subdirectories of the current (root) directory. (This excludes any "test.c" or something you might write ad-hoc in the rootdir.)

My original example above explicitly states PROJDIRS (to exclude some subdirs containing reference code I do not want built), and only searches to a maximum depth of 3 dirlevels (to make the search quicker in the face of deeply nested subdirs I keep for different purposes).

Code: Select all

DEPFILES := $(patsubst %.c,%.d,$(SRCFILES)) 
DEPFILES is the same as SRCFILES, but with the .c endings replaced with .d (for "dependency"). Those files do not exist in the beginning; see below.

Code: Select all

-include $(DEPFILES) 
This includes all DEPFILES into the Makefile, insofar as they do exist. The leading "-" suppresses error messages for missing files. This means, in the first (virgin) build, nothing is included here.

Code: Select all

%.o: %.c Makefile 
Any .o file depends on its .c counterpart and the Makefile. (After all, if you changed e.g. CFLAGS, you'd want your sources recompiled, wouldn't you?)

Code: Select all

    $(CC) $(CFLAGS) -MMD -MP -MT "$*.d" -c $< -o $@ 
$(CC) $(CFLAGS) -c $< -o $@ compiles the .c file into a .o file.

-MMD -MP -MT "$*.d" is the real magic here. You can look up the exact meaning of the individual options in the gcc manual, but what it does is this:

When compiling a .c file, gcc (more precisely, the preprocessor) does create an additional .d file, in which it writes make dependency rules listing any include files.

This might look like this, for functions/example.c:

Code: Select all

functions/example.d functions/example.o: functions/example.c ./includes/foo.h ./includes/bar.h

./includes/foo.h:

./includes/bar.h:
On any subsequent builds, these dependencies will be included by make, so make "knows" that example.o must be updated whenever the source or any included header files are touched. Moreover, it knows that the dependency file must likewise be updated.

The "dummy" dependencies, where the header files themselves depend on nothing, are a workaround for some strange make behaviour should the headers be removed; it's been some time, so I can't really remember the details anymore.

The original example also included .t files as targets, since every one of my object files can, by defining -DTEST and compiling to an executable instead of an object file, be turned into a test driver for itself - but that's a different story.

Bottom line, gcc gives you the real, current dependencies of your sources automatically, as a byproduct of your "normal" build routine. You just have to know this is possible. ;)

All in all, it's a good example of one of my personal "golden rules": before you start looking for a solution (additional tools or libraries), see if you really have a problem (that the tools / libraries you're already using cannot solve).
Every good solution is obvious once you've found it.
User avatar
Candy
Member
Member
Posts: 3882
Joined: Tue Oct 17, 2006 11:33 pm
Location: Eindhoven

Post by Candy »

Solar wrote: The Makefile is in the root directory, and the only (!!!) Makefile of your project.
*COUGH*

That phrase was meant to say "the only makefile used as a make-starting makefile". My project (atlantisos) distributes the knowledge over subdirectories that are recursively included in the top makefile, so that the parts are easier to maintain. I'm just missing at least one thing in make itself - relative includes (all file names and references seen as relative to the directory of the makefile in which they were named).

Code: Select all

-include $(DEPFILES) 
This includes all DEPFILES into the Makefile, insofar as they do exist. The leading "-" suppresses error messages for missing files. This means, in the first (virgin) build, nothing is included here.
How do you still get make to build the dependency files?
Bottom line, gcc gives you the real, current dependencies of your sources automatically, as a byproduct of your "normal" build routine. You just have to know this is possible. ;)
Aah... by not making them explicit targets. That helps with a few other issues I've been having too; I'd forgotten -MP in the process...
User avatar
Solar
Member
Member
Posts: 7615
Joined: Thu Nov 16, 2006 12:01 pm
Location: Germany
Contact:

Post by Solar »

Candy wrote:That phrase meant to say "the only makefile used as make-starting makefile".
If your top-level Makefile is the only one ever getting passed to make, and includes files from lower subdirectories, those other files aren't "Makefiles" but includes, right?
I'm just missing at least one thing in make itself - include relative (see all file names and references as relative to the directory of the makefile in which they were named).
As far as I can see, GNU make doesn't offer this. Since the included file is not a "Makefile" in itself, it doesn't get parsed separately, so the information on its path is lost.

However, it should be possible to do some trickery that stores the path of the make-include in some variable, which you could then use to prepend to your file references... I don't really know; I've settled on root-relative references myself.
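For what it's worth, GNU make's MAKEFILE_LIST makes exactly that trickery possible: while an included file is being read, its own path is the list's last word, provided it is captured with := before any further includes. A sketch with invented file names:

```shell
#!/bin/sh
# Invented layout: a top Makefile including sub/rules.mk, where the
# include records its own directory and uses it to qualify file names.
mkdir -p /tmp/incdemo/sub && cd /tmp/incdemo
{
  # Must be := so it is captured while this file is the last one read.
  printf 'HERE := $(dir $(lastword $(MAKEFILE_LIST)))\n'
  printf 'SRCS += $(HERE)part.c\n'
} > sub/rules.mk
{
  printf 'include sub/rules.mk\n'
  printf 'all:\n'
  printf '\t@echo $(SRCS)\n'
} > Makefile
make -s
```

This prints sub/part.c: the include contributed a path relative to its own directory, even though make parsed it from the project root.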
How do you still get make to build the dependency files?
Make doesn't build them; GCC does, when first compiling the sources. Conceivably, you could confuse make by building the .o, deleting the .d and then modifying some of the headers (which make wouldn't catch, because the information that the header is used by the source was lost), but that would be "breaking it on purpose".
Every good solution is obvious once you've found it.