Asked By : Cris Stringfellow
Answered By : cody
data Tree = Leaf | Node Tree Tree
Now the full tree of order $n$ is:
full :: Int -> Tree
full n | n == 0 = Leaf
full n = Node (full (n-1)) (full (n-1))
And the depth of a tree is computed by
depth :: Tree -> Int
depth Leaf = 0
depth (Node t1 t2) = 1 + max (depth t1) (depth t2)
Now you can see that any computation of $\mathrm{depth}\,(\mathrm{full}\ n)$ will first construct the full tree of order $n$ using $\mathrm{full}$ and then deconstruct that tree using $\mathrm{depth}$. Deforestation relies on the observation that such a pattern (construction followed by deconstruction) can often be short-circuited: we can replace any call to $\mathrm{depth}\,(\mathrm{full}\ n)$ by a single call to $\mathrm{full\_depth}$:
full_depth :: Int -> Int
full_depth n | n == 0 = 0
full_depth n = 1 + max (full_depth (n-1)) (full_depth (n-1))
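As a sanity check on the transformation, here is a minimal, self-contained sketch (repeating the definitions above) that compares the fused `full_depth` against the original `depth . full` composition for small orders:

```haskell
-- Definitions from the answer, gathered into one runnable module.
data Tree = Leaf | Node Tree Tree

full :: Int -> Tree
full n | n == 0 = Leaf
full n = Node (full (n-1)) (full (n-1))

depth :: Tree -> Int
depth Leaf = 0
depth (Node t1 t2) = 1 + max (depth t1) (depth t2)

full_depth :: Int -> Int
full_depth n | n == 0 = 0
full_depth n = 1 + max (full_depth (n-1)) (full_depth (n-1))

-- The fused function must agree with the compose-then-deconstruct version.
main :: IO ()
main = print (and [ depth (full n) == full_depth n | n <- [0..6] ])
```

Both computations are still exponential here; the point is only that the fusion preserves the result while avoiding the intermediate tree allocation.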
This avoids memory allocation of the full tree, and the need to perform pattern matching, which greatly improves performance. In addition, if you add the optimization
max t t --> t
then you have turned an exponential-time procedure into a linear-time one… It would be cool if there were an additional optimization that recognized that $\mathrm{full\_depth}$ is the identity on non-negative integers, but I am not sure that any such optimization is used in practice. The only mainstream compiler that performs automatic deforestation is GHC, and if I recall correctly, this is performed only when composing built-in functions (for technical reasons).
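To make the effect of the `max t t --> t` rewrite concrete: the two identical recursive calls in `full_depth` collapse into one, yielding a linear-time function. A minimal sketch (the name `full_depth'` is mine, not from the original answer):

```haskell
-- full_depth after rewriting `max t t` to `t`: the duplicated
-- recursive call disappears, so each input is visited once.
full_depth' :: Int -> Int
full_depth' n | n == 0 = 0
full_depth' n = 1 + full_depth' (n - 1)

-- On non-negative inputs this is just the identity function.
main :: IO ()
main = print (map full_depth' [0..5])
```

This is exactly the function a hypothetical identity-recognizing optimization would have to spot.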
Best Answer from Stack Exchange
Question Source : http://cs.stackexchange.com/questions/10129