Question

So I have been coming across many comments/posts/etc. about writing makefiles directly, and how doing so is a silly thing to do in 2015. I am aware of tools such as CMake, and I actually use CMake quite often. The thing is, CMake is just creating the Makefile for you and helping to remove the tedium of doing it yourself. Of course it adds a lot of other great features... but it's still a Makefile in the end.

So my question is, is the 'obsolete' talk regarding make referring to the entire Make utility, or just the idea of manually writing your own Makefiles? I do not use an IDE for C/C++ development at all (just emacs), so I have always written Makefiles.

If Make is considered outdated, what should a C/C++ dev be using to build small, personal projects?

Solution

The big difference is that CMake is a cross-platform meta-build system. A single CMake project can produce the usual Unix/Linux makefile, a Visual Studio project for Windows, an XCode project for Mac, and almost any other non-meta build system you might want to use or support.

I wouldn't say using make directly, or even manually editing makefiles, is "obsolete", but those are things you probably shouldn't do unless you are working on Unix/Linux code that will never need porting to Windows, much as you shouldn't edit Visual Studio project files directly if you ever want to port them to not-Windows. If you have any interest in portability, it's worth learning a meta-build system like SCons or CMake.
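To make the "meta-build" idea concrete, here is a minimal sketch of a CMake project file (the project name and source file are hypothetical placeholders, not from the original answer); the same file can drive makefiles, Visual Studio solutions, or Xcode projects:

```cmake
# Minimal cross-platform project definition; "hello" and main.cpp are examples.
cmake_minimum_required(VERSION 3.10)
project(hello CXX)
add_executable(hello main.cpp)
```

From this single file you might run `cmake -G "Unix Makefiles" .` on Linux, `cmake -G Xcode .` on a Mac, or pick a Visual Studio generator on Windows.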

OTHER TIPS

Is make really outdated?

I don't think so. In the end, make is still powerful enough to provide all the functionality you need, such as recompiling only the sources that have changed. Saying make is outdated would be like saying writing custom linker scripts is outdated.
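As a minimal sketch of that conditional recompilation (file names here are hypothetical), make compares timestamps and rebuilds a target only when one of its prerequisites is newer:

```make
# Only .o files older than their .c sources are recompiled on each run.
CC     = cc
CFLAGS = -Wall -O2
OBJS   = main.o util.o

app: $(OBJS)
	$(CC) -o $@ $(OBJS)

%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@
```

Touch only util.c and a second `make` run recompiles util.o and relinks, leaving main.o alone.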

What raw make doesn't provide are extended conveniences and stock libraries, such as parametric header generation, an integrated test framework, or even just conditional library handling in general.

On the other hand, CMake is tailored directly towards generating generic ELF libraries and executables. Whenever you depart from its predefined procedures, you have to start hacking CMake just as you used to hack your own Makefiles.

Considering that you can create far more in C++ than just your average application for a system with an ELF loader, CMake certainly isn't suited for every scenario. However, if you are working in such a standardized environment, CMake or any other of these modern build-script generator frameworks is certainly your best friend.

It always depends on how specialized your application is, and how well the structure of the CMake framework fits your work flow.

Once upon a time high level languages were just an idea. People tried to implement compilers. Back then there were severe hardware limitations - there were no graphical tools so "plain text" ended up being used for the input file format; computers typically had an extremely tiny amount of RAM so the source code had to be broken into pieces and the compiler got split into separate utilities (pre-processor, compiler, assembler, linker); CPUs were slow and expensive so you couldn't do expensive optimisations, etc.

Of course with many source files and many utilities being used to create many object files, it's a mess to build anything manually. As a work-around for the design flaws in the tools (caused by severely limited hardware) people naturally started writing scripts to remove some of the hassle.

Sadly, scripts were awkward and messy. To work around the problems with scripts (which were a work-around for the design flaws in the tools caused by limited hardware) eventually people invented utilities to make things easier, like make.

However, makefiles are awkward and messy. To work around the problems with makefiles (which were a work-around for a work-around for the design flaws in the tools caused by limited hardware) people started experimenting with auto-generated makefiles: starting with things like getting the compiler to generate dependencies, and leading up to tools like autoconf and CMake.

This is where we are now: a work-around, for a work-around, for a work-around, for the design flaws in the tools caused by limited hardware.

It's my expectation that by the end of the century there'll be a few more layers of "work-arounds for work-arounds" on top of the existing pile. Ironically, the severe hardware limitations that caused all of this disappeared many decades ago and the only reason we're still using such an archaic mess now is "this is how it's always been done".

make (the tool or direct use of it via a Makefile) is not outdated, particularly for "small, personal projects" as you use it for.

Of course, you can also use it for larger projects, including those targeted for multiple platforms. With target-specific variables you can easily customize how you build for different platforms. Nowadays, Linux distributions come with cross-compiling toolchains (e.g. mingw-w64) so you can build a complete Windows software package (with installer if you like) from Linux, all driven from your Makefile.
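As a sketch of how target-specific variables can drive such a cross build (the target name and the mingw-w64 invocation are illustrative assumptions, not from the original answer):

```make
# A plain "make app" uses the host compiler; "make windows" swaps in mingw-w64.
CXX    = g++
EXEEXT =

windows: CXX    = x86_64-w64-mingw32-g++
windows: EXEEXT = .exe
windows: app

app:
	$(CXX) -o app$(EXEEXT) src/main.cpp

.PHONY: windows
```

In GNU make, target-specific variables also apply to a target's prerequisites, which is why the `windows` settings reach the `app` rule here.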

Tools like cmake and qmake can be useful, but they are not without their problems. They are usually fine for building an application itself along with any libraries (though I always had problems with qmake doing proper dependency checking between libraries and programs that use them), but I always struggle with the constraints of those tools when doing the rest of the job (creating installers, generating/installing documentation or translation files, doing unusual stuff like turning an archive into a shared library, etc.). All of this can be done in make, though things like dependency tracking can require a little effort.
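A hedged sketch of what those "rest of the job" targets can look like in plain make (the tool names, script names, and paths below — doxygen, an NSIS installer script, the install prefix — are illustrative assumptions):

```make
# Illustrative targets beyond plain compilation; commands and paths are examples.
PREFIX ?= /usr/local

docs:
	doxygen Doxyfile                 # generate API documentation

install: app
	install -m 755 app $(PREFIX)/bin/app

installer: app
	makensis installer.nsi           # build a Windows installer via NSIS

.PHONY: docs install installer
```

The point is that make imposes no constraints on what a rule's recipe does, so packaging and documentation steps live alongside compilation in one place.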

IDEs like Qt Creator and Eclipse can also import Makefile-based projects, so you can share with IDE-using developers. I think the Qt Creator IDE is excellent as a C++ IDE, but after spending some time to become more effective with Emacs, and since I'm doing more Ada development, I'm finding myself preferring Emacs for everything. To relate back to the question about make, as a post-link step in my Makefile, I update my TAGS file (for symbol navigation in Emacs), loaded with M-x visit-tags-table:

find $(SRC_DIR) $(TEST_DIR) -regex ".*\.[ch]\(pp\)?" -print | etags -

Or for Ada development:

find $(SRC_DIR) $(TEST_DIR) -name "*.ad?" -print | etags -

This answer complements @lxrec's answer.

Makefiles can be used for many things, not just creating a program/library from source code. Build systems such as CMake or autotools are designed to take code, and build it in such a way as to fit into the user's platform (i.e. find libraries or specify correct compile options). You could for example have a makefile which helps automate some release tasks, such as: build a zip file containing code with the version derived from git; run tests against your code; upload said zip file to some hosting service. Some build systems (such as automake) might provide an easy way to do this, others may not.
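A sketch of such a release-helper Makefile (the project name, the test script, and the choice of `git archive` are hypothetical illustrations; the upload step is left out):

```make
# Version string derived from git; "release" packages a zip after tests pass.
VERSION := $(shell git describe --tags --always)

test:
	./run_tests.sh

release: test
	git archive --format=zip -o myproject-$(VERSION).zip HEAD

.PHONY: test release
```

Running `make release` then fails fast if the tests fail, and otherwise produces a versioned archive ready to upload.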

That's not to say you need to use makefiles for this, you could use a scripting language (shell, python etc.) for such tasks.

I don't think that hand-written Makefiles are obsolete, especially when:

  1. using POSIX make, which gives you a portable Makefile
  2. or using GNU make 4, which gives you many very interesting features, in particular GUILE scriptability, which lets you efficiently implement the fancy features provided by Makefile generators (I believe the features of autotools or CMake could easily be written as Guile customizations of GNU make). Of course, the price to pay is requiring GNU make 4 (not a big deal, IMHO).
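As a toy illustration of that scriptability (this assumes a GNU make 4 binary actually compiled with Guile support, which not every distribution ships), the $(guile ...) function embeds Scheme directly in a Makefile:

```make
# Requires GNU make 4 built with Guile support; computes squares in Scheme.
SQUARES := $(guile (map (lambda (n) (* n n)) '(1 2 3 4)))

all:
	@echo $(SQUARES)
```

Guile's result is converted back into a make string, so arbitrary computation can feed variable values and prerequisites.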

Makefiles are not obsolete, in the same way that text files are not obsolete. Storing all data in plain text is not always the right way of doing things, but if all you want is a Todo List then a plain text file is fine. For something more complicated you might want a more complicated format like Markdown or XML or a custom binary format or anything in between, but for simple cases plain text works fine.

Similarly, if all you want is a way to avoid writing out g++ -g src/*.c -o blah -W -Wall -Werror && ./blah all the time, a hand-written Makefile is perfect!
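For instance, a hand-written Makefile roughly equivalent to that one-liner (the target name and source directory are taken from the command above) could be:

```make
# Rebuild "blah" only when something under src/ changes, then run it.
SRCS := $(wildcard src/*.c)

blah: $(SRCS)
	g++ -g $(SRCS) -o blah -W -Wall -Werror

run: blah
	./blah

.PHONY: run
```

Now `make run` replaces retyping the whole command, and skips the compile entirely when nothing has changed.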

If you want to build something that is highly portable, where you don't have to manage that portability yourself, you probably want something like Autotools to generate the right Makefile for you. Autotools will detect the features supported by various platforms. A program written in standard C89 built with Autotools will compile virtually everywhere.

If your project is simple and contains very few files, then no Makefile is needed.

However, when the project is complex, uses numerous memory areas, and has many files, then each memory area must be placed at the right spot in the addressable memory, and it is highly desirable not to recompile every file every time a trivial change is made to one of them.

If you want to erase old files/compile/link/install with the minimum amount of hassle and chance of keypress mistakes, the makefile is a real boon.

When you get into the 'real world', projects rarely consist of just one or two files, but rather hundreds. A makefile will always perform the right actions and no unnecessary ones (once it is debugged), so you type one simple command and the makefile does all the work.

Licensed under: CC-BY-SA with attribution