I'm not very proficient in, nor a big fan of, the autotools, but my understanding is this:
The purpose of the autotools was to improve portability, for example to systems that didn't have GNU Make available, such as HP-UX, Solaris, IRIX and the BSD family. With the autotools, users could build programs with an always-identical sequence of commands:
    ./configure
    make
Here, `./configure` is a `/bin/sh` script that is generated on the developer's system and is portable, so it can run on a wide range of systems. The generated `Makefile` is platform-specific and includes handling for platform-specific requirements and quirks. `./configure` also generates a `config.h` file, which allows the C code to react to the availability or absence of certain libraries and to the build options passed to `./configure`.
The Wikipedia page on the GNU build system includes a nice diagram of the files involved in the whole process: everything up to `configure` and `Makefile.in` is pre-generated on the developer's system; `config.status` and everything below it is generated on the build (user's) system.
What is the relevance of the autotools today? Obviously, they are still used in many projects. However, the importance of non-Linux UNIX operating systems has dropped tremendously, and those that do still exist all have GNU Make available. As you said, GNU Make has an astonishing number of features and makes it very easy to write `Makefile`s that automatically track dependencies and handle available or missing libraries.
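As a sketch of the kind of plain GNU Makefile meant here (file and program names are illustrative, not from the original), automatic header-dependency tracking falls out of the compiler's `-MMD -MP` flags plus GNU Make's `-include`:

```make
# Illustrative GNU Makefile: the compiler writes a .d file per
# object (-MMD -MP); make re-reads those files on later runs, so
# editing a header rebuilds exactly the objects that include it.
CC      := gcc
CFLAGS  := -O2 -Wall -MMD -MP

SRCS := main.c util.c
OBJS := $(SRCS:.c=.o)
DEPS := $(OBJS:.o=.d)

prog: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $^

# Pull in the auto-generated dependencies; the leading '-' keeps
# the first build (no .d files yet) from failing.
-include $(DEPS)

.PHONY: clean
clean:
	$(RM) prog $(OBJS) $(DEPS)
```

This relies only on GNU Make's built-in `%.o: %.c` rule and the GCC/Clang dependency-output options, with no generated `configure` step at all.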
Thus, I personally base all my `make` systems on GNU Make, and I'm happy that I haven't had to touch the autotools in years.