Question

Those who know C++ may know what I mean by 'unity build':

  1. all the *.cpp files of a project are effectively #include-ed into a single supermassive source file, following the #include directives in the *.cpp and *.h files
  2. this source file is fed into the compiler
  3. finish! You get the output binary!

Doing things this way means there are fewer intermediate files (*.o), fewer file reads, less disk I/O overhead, and fewer invocations of the compiler, all of which leads to better build performance.

My question is: is this possible for LaTeX at all? I want it because there is a slow post-processing pass that I would like to run over the .tex files before building my final .pdf with pdflatex. Currently it takes around 7 seconds to process my growing list of .tex files, and I believe running the pass over a single merged file would be significantly faster. This motivates my question!

To summarize, I want to:

  1. 'merge' all the .tex files into a supermassive .tex source file by following the \input{} and \include{} macros in each .tex file
  2. feed the supermassive .tex source file into the slow post-processing pass (actually the Ott tex-filter, fyi)
  3. pipe the output straight into pdflatex
  4. finish! I get the output PDF file!

The first step is the problem here; a rough sketch of what I mean by it follows below. Any ideas are welcome, but it's best if I don't need to write my own script to do this step!
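
To make step 1 concrete, here is a minimal sketch of the kind of traversal I mean, in Python (the name flatten.py and everything about it are only illustrative; it skips directives commented out with %, but ignores verbatim environments, \includeonly, and the \clearpage that \include normally implies):

    import re
    import sys
    from pathlib import Path

    # Matches \input{...} or \include{...}. A deliberate simplification: it does
    # not understand verbatim environments, \includeonly, or escaped braces.
    INPUT_RE = re.compile(r'\\(?:input|include)\{([^}]+)\}')

    def flatten(tex_path, out):
        r"""Recursively splice the contents of \input/\include targets into out."""
        text = Path(tex_path).read_text()
        pos = 0
        for m in INPUT_RE.finditer(text):
            # Crude comment handling: skip directives preceded by '%' on their line.
            line_start = text.rfind('\n', 0, m.start()) + 1
            if '%' in text[line_start:m.start()]:
                continue
            out.write(text[pos:m.start()])
            target = Path(m.group(1))
            if target.suffix != '.tex':
                target = target.with_suffix('.tex')  # LaTeX appends .tex by default
            flatten(target, out)                     # recurse into the included file
            pos = m.end()
        out.write(text[pos:])

    if __name__ == '__main__':
        # Usage (hypothetical): python flatten.py main.tex > merged.tex
        flatten(sys.argv[1], sys.stdout)

Again, I'd rather not write and maintain something like this myself, hence the question.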

Many thanks!


Solution

A good tool that can handle this is rubber, with the help of its combine module. It will gather all dependencies, and produce a single file ready for consumption.
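
Modules are normally selected on rubber's command line, so the invocation should look something like rubber --module combine main.tex; I am quoting that flag from memory, so check rubber's documentation for the exact syntax and for where the combined file ends up.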
