Question

I've been wondering about the best ways to compile projects that are > 1000 lines of code. I just have some basic questions concerning the best way to have a project compiled. I'm using GCC, if it makes a difference. My questions are:

  • Does including a library in multiple source files cause the functions to be copied twice?

By this I mean, If I have two files like this:

Source.c

#include <stdio.h>
...
static void test_func(void) { printf("Hey!"); }

Source2.c

#include <stdio.h>
...
static void test_func(void) { printf("Hey!"); }

Does the code for printf get copied into the executable twice? I'd like to avoid this, because I will often have multiple source files that share the same header files, and I'm wondering whether that is bad practice or not.

One time, I just included other source files by using this:

Source.c

#include <stdio.h>
#include "Source2.c"

But I wasn't sure if that, as well, is a bad practice or not. Which leads me to another question:

  • Is including source files a bad practice?

Note: By bad practice, I mean if something is either against convention or causes some sort of inefficiency.

EDIT: I've just read that library code is shared among files, so I guess the answer to my first question is no. I am still curious, however, whether including source files with the preprocessor is common practice or not.

Solution

Note that library header files like stdio.h only contain declarations for functions like printf; they do not contain the actual code for those functions. The actual code for functions like printf is added at link time, when all the object files and relevant libraries are combined to form the final executable.

Including source files as in your example is generally considered bad practice, although there may be specific use cases for it; I just can't think of any good ones offhand. You run a greater risk of duplicate-definition errors, and you wind up rebuilding code unnecessarily. The compiler may also have a limit on how much code it can digest in one sitting; including source files that include source files that include source files can result in extremely long build times or worse. It's been a while, but I've seen compilers choke on very large files, especially when optimizing the output.

The beauty of splitting code up into multiple source files is that if I only change something in one file, I (usually) don't have to rebuild the entire project (of course, this depends on what was changed); I only have to recompile that one file and relink.

OTHER TIPS

Library functions are included only once during a static linkage. If the library is dynamic, no copies are added at all, only dynamic link information.

Including .c files is weird, unless you have a specific reason to do so. One such reason is for an external module-test program, where you don't want a main function in the module at all, but at the same time in order to test it properly, the main needs access to all file-static data which could not be accessed by a different file.

No, including a library header in multiple files will not cause the library to be included multiple times in the final executable.

The linker is responsible for resolving function references; when it finds that a function has already been included, it will not include it again during static linking. The linker also pulls in only the parts of a static library that contain the definitions actually needed.

In dynamic linking, the executable instead records which shared libraries it needs; the dynamic loader maps those libraries into memory at run time.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow