Question

I've been trying to understand continuations / CPS, and from what I can gather it builds up a delayed computation; once we get to the end of the list, we invoke the final computation.

What I don't understand is why CPS prevents stack overflow when it seems analogous to building up a nested function as per the naive approach in Example 1. Sorry for the long post, but I've tried to show the idea (and possibly where it goes wrong) from basics:

So:

let list1 = [1;2;3]

Example 1: "Naive approach"

let rec sumList = function
    |[] -> 0
    |h::t -> h + sumList t

So when this runs, iteratively it results in:

  1. 1 + sumList [2;3]
  2. 1 + (2 + sumList [3])
  3. 1 + (2 + (3 + 0))

So the nesting (and overflow issues) can be overcome by tail recursion, i.e. by running an accumulator:

"Example 2: Tail Recursion"

let sumListACC lst =
    let rec loop l acc =
        match l with
        |[] -> acc
        |h::t -> loop t (h + acc)
    loop lst 0

i.e.,

  1. loop [2;3] (1+0)
  2. loop [3] (2+1)
  3. loop [] (3+3)

So because the accumulator is evaluated at each step, there is no nesting and we avoid bursting the stack. Clear!
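
As an aside, my mental model is that because the recursive call is the last thing loop does, the compiler can reuse the current stack frame, so the recursion behaves like a plain loop. Roughly equivalent to (my illustration, not actual compiler output):

let sumListLoop lst =
    let mutable acc = 0
    let mutable rest = lst
    while not (List.isEmpty rest) do
        acc <- List.head rest + acc   // same (h + acc) step as in loop
        rest <- List.tail rest        // move on to the tail; no new stack frame
    acc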

Next comes CPS. I understand this is needed when we already have an accumulator but the function is still not tail recursive, e.g. with foldBack. Although not required in the above example, applying CPS to this problem gives:

"Example 3: CPS"

let sumListCPS lst =
    let rec loop l cont =
        match l with
        |[] -> cont 0
        |h::t -> loop t (fun x -> cont( h + x))
    loop lst (fun x -> x)
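
As an aside on the foldBack point above, I believe the same shape gives a right fold in CPS, where the continuation really is needed because the natural f h (foldBack f t acc) call is not in tail position (a sketch of my own, not how List.foldBack is actually implemented):

let foldBackCPS f lst acc =
    let rec loop l cont =
        match l with
        |[] -> cont acc
        |h::t -> loop t (fun x -> cont (f h x))
    loop lst (fun x -> x)

// foldBackCPS (+) [1;2;3] 0 = 6, the same result as List.foldBack (+) [1;2;3] 0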

Coming back to sumListCPS: to my understanding, iteratively it could be written as:

  1. loop [2;3] (fun x -> cont (1+x))
  2. loop [3] (fun x -> cont (1+x) -> cont (2+x))
  3. loop [] (fun x -> cont (1+x) -> cont (2+x) -> cont (3+x))

which then reduces sequentially from the right with the final x = 0, i.e.:

  • cont (1+x) -> cont (2+x) -> cont (3+0)
  • cont (1+x) -> cont (2+x) -> 3
  • cont (1+x) -> cont (2+3)
  • ...
  • cont (1+5) -> 6

which I suppose is analogous to:

cont(1+cont(2+cont(3+0)))    
(1+(2+(3+0)))
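
Writing the continuations out as explicit nested lambdas (my own expansion of steps 1-3 above), the function handed to loop [] is roughly:

let finalCont : int -> int =
    fun x -> (fun x -> (fun x -> (fun x -> x) (1 + x)) (2 + x)) (3 + x)

// finalCont 0 computes 3 + 0 first, then 2 + 3, then 1 + 5 = 6,
// i.e. the additions happen innermost-first, "from the right" of the list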

Correction to original post: I realised that it is evaluated from the right; for example, replacing cont (h + x) with cont (h + 2*x) yields 17 for the above example, consistent with (1 + 2*(2 + 2*(3 + 2*0))).
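
For reference, the modified version I used for that check (my own variant of Example 3, not one of the original examples):

let sumListCPS2 lst =
    let rec loop l cont =
        match l with
        |[] -> cont 0
        |h::t -> loop t (fun x -> cont (h + 2*x))
    loop lst (fun x -> x)

// sumListCPS2 [1;2;3] = 17 = 1 + 2*(2 + 2*(3 + 2*0))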

i.e. exactly where we started in Example 1. Based on this, since we still need to keep track of where we came from, why does using CPS prevent the overflow issue that Example 1 suffers from?

Since I know it doesn't actually overflow, where have I gone wrong?

I've read the following posts (multiple times) but the above confusion remains.

http://www.markhneedham.com/blog/2009/06/22/f-continuation-passing-style/

http://codebetter.com/matthewpodwysocki/2008/08/13/recursing-on-recursion-continuation-passing/

http://lorgonblog.wordpress.com/2008/04/05/catamorphisms-part-one/


Solution

What happens is quite simple.

.NET (and other platforms, but we're discussing F# right now) stores information in two locations: the stack (for value types, for pointers to objects, and for keeping track of function calls) and the heap (for objects).

In regular non-tail recursion, you keep track of your progress on the stack (quite obviously). In CPS, you keep track of your progress in lambda functions (which live on the heap!), and tail call optimization makes sure that the stack stays clear of any tracking.

As the heap is significantly larger than the stack, it is (in some cases) better to move the tracking from the stack to the heap - via CPS.
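
A rough way to see this with the examples from the question (an illustrative sketch of my own; the exact list length at which the naive version dies depends on the stack size, and the CPS version relies on the compiler actually emitting tail calls, which F# normally does in Release builds):

let big = List.init 1000000 (fun _ -> 1)

// sumList big    // each pending "h + ..." is a stack frame; expected to overflow the stack
sumListCPS big     // progress lives in closures on the heap; the calls to loop and to the
                   // continuations are tail calls, so the stack stays flat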
