I'm going to take a fairly broad view of the question in this answer, mostly because of the comments under WillNess's answer.
In a blog post, Max Rabkin introduced some work on folding combinators. Conal Elliott picked up this idea and published a few more blog posts as well as the ZipFold package on hackage. I would highly recommend reading this material; it's short and pretty accessible. The ZipFold package is probably quite usable, although it hasn't been updated for some time.
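The core of the "beautiful folding" idea can be sketched in a few lines. The names below are mine, not the actual ZipFold API; this is just a minimal reconstruction of the technique:

```haskell
{-# LANGUAGE ExistentialQuantification #-}

-- A minimal reconstruction of the "beautiful folding" idea: package
-- a left fold as a step function, a seed, and a finalizer, hiding
-- the accumulator type existentially.
data Fold b c = forall a. Fold (a -> b -> a) a (a -> c)

runFold :: Fold b c -> [b] -> c
runFold (Fold step z done) = done . foldl step z

-- Fuse two folds so they share a single traversal of the input.
bothFold :: Fold b c -> Fold b c' -> Fold b (c, c')
bothFold (Fold f x out) (Fold g y out') =
  Fold (\(a, a') b -> (f a b, g a' b)) (x, y)
       (\(a, a') -> (out a, out' a'))

sumFold :: Num b => Fold b b
sumFold = Fold (+) 0 id

lengthFold :: Fold b Int
lengthFold = Fold (\n _ -> n + 1) 0 id
```

With this, `runFold (bothFold sumFold lengthFold) [1..10]` computes the sum and the length in one pass over the list; the real libraries build strictness and more combinators on top of this skeleton.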
Edward Kmett's recent tour-de-force, lens, also includes some folding combinators. I'm not sure I'd want to depend on it just for that, but if you're using lens anyway it's probably worth looking into.
An alternative approach is to use parallelism. If you write
    import Control.Parallel

    minimax2 dt = let a = minimum dt
                      b = maximum dt
                  in  a `par` b `pseq` [a, b]
and link with -threaded, then it's possible for minimax2 to run in something close to constant space, depending on the vagaries of the scheduler, the phase of the moon, etc. (mostly the scheduler and the allocation patterns of the functions, IIRC). Of course this doesn't provide reliable guarantees, but it can work well in practice. Generalizing this approach to dpMap should be straightforward; you'd probably want to use Control.Parallel.Strategies or similar rather than the lower-level par directly.
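As a sketch of what the Strategies version might look like (my own adaptation using the parallel package's Eval monad, not code from the libraries above):

```haskell
import Control.Parallel.Strategies (runEval, rpar, rseq)

-- The same two-fold example expressed with Strategies: spark one
-- traversal to run in parallel, evaluate the other on this thread,
-- then collect both results.
minimax2 :: Ord a => [a] -> [a]
minimax2 dt = runEval $ do
  a <- rpar (minimum dt)  -- may run on a spare capability
  b <- rseq (maximum dt)  -- evaluated here
  return [a, b]
```

The same caveat applies: whether the two traversals actually overlap (and thus whether the list is consumed in one effective pass) is up to the runtime.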
Finally, most of the iteratee-derived libraries are quite good at handling this sort of task. In general they provide explicit control over when input streams are produced and consumed. In iteratee I provide sequence_, which does almost the same as dpMap; I will be adding sequence, which will do exactly the same thing as dpMap; and there are a number of zips, all of which run in constant space (provided the consuming functions are themselves constant-space). I wouldn't be surprised if most of the other packages provided similar operations.
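To make the shape of that concrete, here is a toy model of the idea. This is not the real iteratee API; the types and names are stand-ins I'm inventing for illustration. A consumer is either finished or waiting for the next element, and two consumers can be zipped so the stream is traversed only once:

```haskell
-- A toy stand-in for the iteratee idea (not the real library API):
-- a Sink is either finished or waiting for the next input, where
-- Nothing signals end-of-stream.
data Sink b a = Done a | Cont (Maybe b -> Sink b a)

-- Drive a sink with a list, then signal end-of-stream.
feed :: Sink b a -> [b] -> a
feed (Done a) _      = a
feed (Cont k) (x:xs) = feed (k (Just x)) xs
feed (Cont k) []     = feed (k Nothing) []

-- Zip two sinks: both see every input, so one pass serves both.
zipSink :: Sink b a -> Sink b c -> Sink b (a, c)
zipSink (Done a) (Done c) = Done (a, c)
zipSink s t = Cont (\mb -> zipSink (step s mb) (step t mb))
  where step (Done a) _  = Done a
        step (Cont k) mb = k mb

-- A strict left fold as a sink.
foldSink :: (a -> b -> a) -> a -> Sink b a
foldSink f z = z `seq` Cont go
  where go Nothing  = Done z
        go (Just b) = foldSink f (f z b)
```

Here `feed (zipSink s1 s2) xs` consumes `xs` exactly once and runs in constant space as long as each sink does; the zips in the real libraries play this role, with chunked input and monadic effects layered on top.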