hey! include files!

Adek336

hey! include files!

Post by Adek336 »

hi! I have #includes for all my files, like
#include <malloc.cpp>. The malloc func is coded entirely in the file, but I heard that someone only included a header file with prototypes, so the file with the real func was recompiled only after changes. Hm, it would be much better! Do you do such a thing in your projects? How? :)

I use g++ 3.2.

Cheers,
Adrian.
Curufir

Re:hey! include files!

Post by Curufir »

Hmm, well I'm not great at C/C++ but I'll give it a go explaining. C code:


/* my_func.c */

void my_func(void){
  /*Do Something*/
}

---------------------------

/* my_func.h */
%ifndef FUNCTION1
%define FUNCTION1

extern void my_func(void);

%endif
---------------------------

/* main.c */

#include "my_func.h"

int main(void){
  my_func();
  return 0;
}
Ok, nothing amazing codewise I know.

I'll go through in steps.

First you compile the two .c files into .o (Object format) files. This gives you my_func.o and main.o.

Now the final step in creating an executable is to link together the object files using the linker.

The important thing here is that because my_func.h has been included into main.c, and especially because my_func.h declares that there is a function called my_func() that is external to (i.e. not defined in) main.c, the file main.o will contain instructions telling the linker that main.o references an external function called my_func().
The linker is then smart enough to figure out that my_func.o contains the function my_func(), and while creating the executable it makes sure that any references to it in main.o use the function in my_func.o.

Ok, not very interesting (And a seriously simplified view of linking) so far.

The useful part comes when you realise that you don't have to recompile my_func.c into my_func.o anymore unless you change the my_func.c source file.

So long as my_func.c remains unchanged, and you still have my_func.o, it doesn't matter how much you alter main.c, because as long as it still includes my_func.h the linker will still have enough information to create the executable. I.e. you only have to recompile the file you change and then link everything together again.

Now to take this to a larger example. Let's say you're working on a project with a few hundred source files, which comprise a few megabytes of code. Recompiling everything for every minor change would become tedious in the extreme. Now what something like make does is check to see which source files have changed since the last time an executable was created. It then compiles these changed source files into object format and links all the object files together (The old ones from files that are unchanged + the new ones from files that have changed) into the new executable. This is far more efficient.

Hope that helps, if there's any errors hopefully one of the C/C++ gurus will point 'em out.

**

Note: The define is a "header guard"; it stops the declarations being compiled twice should the header end up included more than once into the same source file (e.g. indirectly via another header).

**

Almost forgot to mention the other important way these header files are used...libraries.

Let's say, for instance, you include math.h into a standard C file because you want to use the sin() function. Now if you take a look at math.h all you'll find are a lot of definitions of constants and a lot of function prototypes. The compiler uses those prototypes to check your calls, and at link time the linker resolves references to sin() within your object file to point at the entry to the sin() function within the standard C library. Same principle as before, only the standard C library has taken the place of the my_func.o object file.
Adek336

Re:hey! include files!

Post by Adek336 »

Thank you!
I will try doing that, so building my code is more efficient. One more question: what do I need to do so that make compiles only the changed files? I will try to google for that, anyways.

Thank you very much and Cheers,
Adrian.
Tux

Re:hey! include files!

Post by Tux »

[me=Tux]pokes topic[/me]
Oh, it's not IRC. Forgot.

I don't want to be a flame thrower :)
But I got to keep my dignity. ;D

YOUR KERNEL BECOMES LARGER DUE TO IT.

Yes, it's more efficient to change modules, but here is a chart:

nano: include .h
micro: include .h
anything else: link

Both approaches have their trade-offs, but the smaller your kernel's set of tasks, the more it makes sense to include.
User avatar
Solar
Member
Posts: 7615
Joined: Thu Nov 16, 2006 12:01 pm
Location: Germany
Contact:

Re:hey! include files!

Post by Solar »

I'm not great at C/C++...
Your example and explanation were fine; just exchange the "%" in my_func.h with "#"... ;-)

I'll add an equivalent example for C++, with some "details" added.


/* MyClass.hpp */

#if !defined(_MYCLASS_H_)
#define _MYCLASS_H_ _MYCLASS_H_

namespace my_namespace
{

class MyClass
{
    public:
        MyClass();
        virtual ~MyClass();

        void doSomething();

    private:
        MyClass(MyClass&);
        MyClass& operator=(MyClass const &);

        int mSomeData;
};

} // namespace my_namespace

#endif // _MYCLASS_H_
As Curufir explained, the if-define-endif thingy is a "header guard", to avoid having a function declared more than once. That's standard technique in both C and C++.

For a single symbol, ifndef SYMBOL and if !defined(SYMBOL) do exactly the same test. I prefer if !defined(SYMBOL) because defined() can be combined with other conditions in the same #if. Now that's high mojo, but I prefer having all options open.

If you wonder "why is the destructor declared virtual?", that's because a non-virtual destructor invites memory / resource leaks if you ever inherit from MyClass and delete an object through a base-class pointer.

If you wonder "why are copy constructor and assignment operator declared private?", that's because this way the compiler will tell you when you use them (by compiler error), instead of quietly creating their implementation for you.

If either of the last two paragraphs didn't make sense to you, I seriously suggest reading Scott Meyers' "Effective C++". ;-)


/* MyClass.cpp */
#include "MyClass.hpp"

namespace my_namespace
{

MyClass::MyClass()
{
    // ...
}

MyClass::~MyClass() // 'virtual' is only written in the class declaration
{
    // ...
}

void MyClass::doSomething()
{
    // ...
}

}
Same as in the C example, basically. If you wonder where the definition for copy constructor and assignment operator went... Scott Meyers will tell you. ;-)


/* main.cpp */

#include "MyClass.hpp"

int main(int argc, char* argv[])
{
    my_namespace::MyClass tmp;
    tmp.doSomething();
    return 0;
}
Different language, same concept.

(The parameters to main are the standard ones. You can always define int main() and ignore the parameters - as long as you remember main() returns int. ;-) )
Every good solution is obvious once you've found it.
Curufir

Re:hey! include files!

Post by Curufir »

Tux wrote: YOUR KERNEL BECOMES LARGER DUE TO IT.
Well I don't mean to flame either, but if size is your primary concern then why the heck would you be using C++ in the first place?
Tux

Re:hey! include files!

Post by Tux »

It happens in C too. The HALfix kernel went from 18 KB to 9 KB by removing linking. Anyway, back to coding my classic games kernel :)
Therx

Re:hey! include files!

Post by Therx »

What he means is that if size is your primary concern, you'd use assembly. And the extra size from linking is not always a doubling; it's usually about 10 KB, and even on old 486 PCs with 4 or 8 MB of RAM that is not critical. The speed is the same with or without linking.

Pete
Tux

Re:hey! include files!

Post by Tux »

Let's all come to the conclusion, hopefully, that it is how a person wants to use it. I like including, some people like linking. So next time someone links/includes, don't force them to do it your way. Keep it an opinion :)
Curufir

Re:hey! include files!

Post by Curufir »

He asked for an explanation of this mechanism dude, I didn't force my opinion on anybody.

I am curious though, that seems a very large increase simply due to linking. Is that binary stripped?
nullify

Re:hey! include files!

Post by nullify »

Tux wrote: So next time someone links/includes, don't force em to do the it your way. Keep it an opinion :)
Well, in my opinion, doing it the "include" way isn't all that practical when you're working with a big codebase. I have not seen one major project not use the "link" method. Imagine being a Linux kernel developer or Qt developer and having to rebuild the *entire* source tree after one minor change. :)

Sure, you may be able to get away without linking separate modules for smaller projects, but such a method doesn't seem too scalable.

In addition, one of the primary purposes of using makefiles is to automatically determine which modules have been altered and only rebuild them (and their dependents). If we didn't care about linking together separate modules, we'd all be using simple shell scripts that rebuild everything all the time :)

P.S. Yes, the size increase after linking does seem a bit bigger than I'd expect. I'd try and verify it but I don't have access to my GNU development toolchain atm. :(
Adek336

Re:hey! include files!

Post by Adek336 »

wohoo! I now have the headers in *.hpp files. Only a pity there are almost twice as many files, but now it compiles & links faster.

One more question: for what reason do we divide our source files into directories? If all the source files compile into one binary, they should be in one dir, yes?

Cheers,
Adrian.
RuneOfFire

Re:hey! include files!

Post by RuneOfFire »

The size increase when you link is probably due to aligning the segments on 4KB boundaries. You CAN turn this off (if it is, in fact, the problem).
Schol-R-LEA

Re:hey! include files!

Post by Schol-R-LEA »

Adek336 wrote:One more question: for what reason we divide our source files in directories?
To organize them, primarily. On large projects, keeping track of source files can become a serious task unto itself, and it is best to follow Good Design Practices from the start rather than have to reorganize things later (though chances are you'll have to do that sooner or later, anyway, if the project goes on long enough). It's entirely up to you how you keep track of your source files, but keep in mind that other people probably will need to read them too - even if you don't think the project is anything more than a quick one-off.
Solar

Re:hey! include files!

Post by Solar »

Adek336 wrote: If all the source files compile into one binary, it should be in one dir yes?
When you write a (simple) application, you will be using standard includes (e.g. <stdlib.h>), system includes (i.e., the stuff provided by the OS you are using), includes of third party libraries (e.g. Boost libraries), library stuff of your own you wrote earlier (e.g. "MyLib"), and the code of the application you are working on.

Quite naturally, you would want to organize them accordingly instead of putting everything in one directory.

And usually, to figure out how e.g. "MyClass" works, all you need is the header file with the declarations (and, hopefully, code comments), not the implementation file. Thus it is common practice to keep headers and implementation files apart: Headers are interesting for everyone, implementation files only for those actually working on the class in question.
Every good solution is obvious once you've found it.