Making make make faster
I found a way to make compiling faster. The method uses the -j option (see the man pages for more information), for example:
make -j 4 <target>
The -j option uses the specified number of jobs (processes) to build the target (i.e. four different files can be compiled at once in the example).
I've tested it on BSD, Linux and Cygwin. On a SCO system (edit: I've only tested it on version 6.1), use the -P option (see the man page for more information).
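If you don't want to hard-code the number of jobs, on Linux the coreutils nproc command reports the CPU count (just a convenience sketch, adjust for your platform):
Code: Select all
make -j "$(nproc)" <target>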
Microsoft: "let everyone run after us. We'll just INNOV~1"
bcat wrote:Yeah, this helps a lot on multicore/multiprocessor machines. It should also help a bit on hyperthreaded single-processor boxes. (On a 1 CPU/1 core/no HT setup, it'll probably just make make make slower, though.)
Could you quit the confusing sentences on make making?
You can also get make to build faster with non-recursive (i.e., full) scripts. Got my OS compile time down from a minute to 3.5 seconds. Since I have a dual-core, -j 2 got that down to 2.0 seconds (but the difference is trivial, really).
Or, use makepp. The speed loss from being written in perl (not that I notice anything, but somebody will...) is made up for with things like smart updating (only rebuild something if its changes are significant, so running indent on your source will not trigger a rebuild, but adding "-O3 -march=super-duper-athlon-256 -funroll-loops -fplacebo-optimization" will :- )
In addition it makes recursive make unnecessary, with its incredible support for recursive makefile searching: you can depend on, say, hardware/built-in.o and makepp searches hardware/Makeppfile for a way to build built-in.o.
Or, you can depend on *.o, and you get all object files that can be built, which is REALLY nice.
Oh, and it is a (mostly) drop-in replacement for make. Plus it works anywhere perl does, which is, well, everywhere.
Note: I have no part in the development of makepp, I just love using it.
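To make the built-in.o example a bit more concrete, here is a rough sketch of such a layout, assuming makepp's (mostly) make-compatible rule syntax; all file and variable names are invented:
Code: Select all
# Makeppfile in the project root (hypothetical)
kernel.bin: main.o hardware/built-in.o
	$(LD) -o $@ $^

# hardware/Makeppfile (hypothetical) - makepp finds and reads this
# on its own when something depends on hardware/built-in.o
built-in.o: irq.o timer.o
	$(LD) -r -o $@ $^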
My project: Xenon
TheQuux wrote:Or, you can depend on *.o, and you get all object files that can be built, which is REALLY nice.
For blowing make times completely out of proportion? I bet. Last time I looked, the idea of make was to not do more than necessary...
The best way to make it faster is writing correct Makefiles, which I agree is much harder than it should be. Not using recursive make is the first step down a long road, and reading the manual helps. (It also lists the -j option, and many other things like why to use := instead of =...)
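To illustrate the := point with a generic example (nothing project-specific): = defines a recursively expanded variable whose right-hand side is re-evaluated every time the variable is used, while := expands it once, when the assignment is read. With something expensive like $(shell find ...) on the right-hand side, the difference is very noticeable.
Code: Select all
# re-runs find every single time SRCFILES is referenced
SRCFILES  = $(shell find . -name "*.c")

# runs find exactly once, while the Makefile is being read
SRCFILES := $(shell find . -name "*.c")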
Every good solution is obvious once you've found it.
Solar wrote:
TheQuux wrote:Or, you can depend on *.o, and you get all object files that can be built, which is REALLY nice.
For blowing make times completely out of proportion? I bet. Last time I looked, the idea of make was to not do more than necessary...
... but no less, either.
And, yes, you can get globbing to work as in classic make, or not use it.
Still, the advantages of building everything that can be built are that you probably only have to build something once, and that you don't end up with a failed build because you forgot to add a dependency.
Plus, it makes writing makefiles a lot faster.
And finally, makepp has a server mode that caches the expanded makefiles and dependency tree, which improves performance significantly.
And, I'll shut up now, because I don't want to end up in a flame war.
My project: Xenon
TheQuux wrote:...you don't end up with a failed build because you forgot to add a dependency.
Excerpt from the PDCLib Makefile:
Code: Select all
SRCFILES := $(shell find $(PROJDIRS) -mindepth 1 -maxdepth 3 -name "*.c")
DEPFILES := $(patsubst %.c,%.d,$(SRCFILES))
-include $(DEPFILES)
%.o: %.c Makefile
	@echo " CC $(patsubst functions/%,%,$@)"
	@$(CC) $(CFLAGS) -MMD -MP -MT "$*.d $*.t" -c $< -o $@
Every good solution is obvious once you've found it.
OK, there are still some project-specific things in there. I'll make a cleaner example:
Code: Select all
SRCFILES := $(shell find . -mindepth 1 -name "*.c")
DEPFILES := $(patsubst %.c,%.d,$(SRCFILES))
-include $(DEPFILES)
%.o: %.c Makefile
	$(CC) $(CFLAGS) -MMD -MP -MT "$*.d" -c $< -o $@
Only a single project-wide Makefile ensures make gets to know the full ruleset and dependency tree. Recursive make means your "submakes" don't know everything, and might make wrong decisions as a result. If you are lucky, it's just slower than it needs to be. If you are unlucky, it will produce wrong results.
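For contrast, the kind of recursive setup being argued against typically looks something like this (directory names invented, purely for illustration); each sub-make only ever sees its own subtree:
Code: Select all
SUBDIRS := lib drivers kernel

all:
	for dir in $(SUBDIRS); do $(MAKE) -C $$dir all || exit 1; done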
Let's go through the cleaner example line by line.
Code: Select all
SRCFILES := $(shell find . -mindepth 1 -name "*.c")
My original example above explicitly states PROJDIRS (to exclude some subdirs containing reference code I do not want to be built), and only searches to a maximum depth of 3 directory levels (to keep the search quick despite the deeply nested subdirs I keep around for other purposes).
Code: Select all
DEPFILES := $(patsubst %.c,%.d,$(SRCFILES))
This takes the list of source files and derives, for each of them, the name of the corresponding .d dependency file.
Code: Select all
-include $(DEPFILES)
This includes all DEPFILES into the Makefile, insofar as they do exist. The leading "-" suppresses error messages for missing files. This means, in the first (virgin) build, nothing is included here.
Code: Select all
%.o: %.c Makefile
A pattern rule: every object file depends on its source file and on the Makefile itself, so changing the Makefile triggers a rebuild of everything.
Code: Select all
$(CC) $(CFLAGS) -MMD -MP -MT "$*.d" -c $< -o $@
-MMD -MP -MT "$*.d" is the real magic here. You can look up the exact meaning of the individual options in the gcc manual, but what it does is this:
When compiling a .c file, gcc (more precisely, the preprocessor) creates an additional .d file, in which it writes make dependency rules listing any included header files.
For functions/example.c, this might look like this:
Code: Select all
functions/example.d functions/example.o: functions/example.c ./includes/foo.h ./includes/bar.h
./includes/foo.h:
./includes/bar.h:
The "dummy" dependencies where the header files themselves depend on nothing are a workaround on some strange make behaviour should the headers be removed; it's been some time since, I can't really remember the details anymore.
The original example also included .t files as targets, since every one of my object files can be turned into a test driver for itself by defining -DTEST and compiling to an executable instead of an object file - but that's a different story.
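Just to sketch what such a rule might look like (a guess from the description above, not the actual PDCLib rule, and it glosses over whatever the test driver would need to link against):
Code: Select all
%.t: %.c Makefile
	$(CC) $(CFLAGS) -DTEST $< -o $@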
Bottom line, gcc gives you the real, current dependencies of your sources automatically, as a byproduct of your "normal" build routine. You just have to know this is possible.
All in all, it's a good example of one of my personal "golden rules": Before you start looking for a solution (additional tools or libraries), see if you really got a problem (that the tools / libraries you're already using cannot solve).
Every good solution is obvious once you've found it.
Solar wrote:The Makefile is in the root directory, and the only (!!!) Makefile of your project.
*COUGH*
That phrase meant to say "the only makefile used as make-starting makefile". My project (atlantisos) distributes the knowledge over subdirectories that are recursively included in the top makefile, so that the parts are easier to maintain. I'm just missing at least one thing in make itself: a relative include (all file names and references being treated as relative to the directory of the makefile in which they were named).
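A rough sketch of that kind of layout (all names invented): the top Makefile pulls in a fragment from each subdirectory, and every fragment has to spell out its paths relative to the project root - which is exactly the annoyance described above.
Code: Select all
# Makefile (project root)
include kernel/files.mk
include drivers/files.mk

OBJFILES := $(patsubst %.c,%.o,$(SRCFILES))

# kernel/files.mk - paths must be given relative to the root, not to
# kernel/, because make doesn't remember where an included file came from
SRCFILES += kernel/main.c kernel/panic.c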
Solar wrote:
Code: Select all
-include $(DEPFILES)
This includes all DEPFILES into the Makefile, insofar as they do exist. The leading "-" suppresses error messages for missing files. This means, in the first (virgin) build, nothing is included here.
How do you still get make to build the dependency files?
Solar wrote:Bottom line, gcc gives you the real, current dependencies of your sources automatically, as a byproduct of your "normal" build routine. You just have to know this is possible.
Aah... by not making them explicit targets. Helps with a few other issues I've been having too; I had forgotten -MP in the process...
Candy wrote:That phrase meant to say "the only makefile used as make-starting makefile".
If your top-level Makefile is the only one ever getting passed to make, and includes files from lower subdirectories, those other files aren't "Makefiles" but includes, right?
Candy wrote:I'm just missing at least one thing in make itself - include relative (see all file names and references as relative to the directory of the makefile in which they were named).
As far as I can see, GNU make doesn't offer this. Since the included file is not a "Makefile" in itself, it doesn't get parsed separately, thus the information on its path is lost.
However, it should be possible to do some trickery that stores the path of the make-include in some variable, which you could then use to prepend to your file references... I don't really know; I've settled for root-relative references myself.
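For what it's worth, GNU make does record the files it has read in the MAKEFILE_LIST variable, so the trickery could look roughly like this inside an included fragment (a sketch only; it needs a reasonably recent GNU make, and the file names are invented):
Code: Select all
# kernel/files.mk (hypothetical fragment)
THISDIR  := $(dir $(lastword $(MAKEFILE_LIST)))
SRCFILES += $(THISDIR)main.c $(THISDIR)panic.c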
How do you still get make to build the dependency files?
Make doesn't build them, GCC does when first compiling the sources. Conceivably, you could confuse make by building the .o, deleting the .d, and then modifying some of the headers (which make wouldn't catch, because the information that the header is used by the source was lost), but that would be "breaking on purpose".
Every good solution is obvious once you've found it.