Question

Working with immutable data and single assignment presumably requires more memory, because you're constantly creating new values (though under the covers, compilers do pointer tricks to make this less of an issue).
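For example, here's a minimal Haskell sketch of the kind of "pointer trick" I assume is meant, namely structural sharing, where a "new" value reuses the old one instead of copying it:

    -- Structural sharing: prepending to an immutable list allocates
    -- one new cons cell whose tail *is* the old list; nothing is copied.
    main :: IO ()
    main = do
      let xs = [2, 3, 4 :: Int]
          ys = 1 : xs   -- ys points into xs; xs is unchanged and shared
      print xs          -- [2,3,4]
      print ys          -- [1,2,3,4]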

But I've heard a few times now that the performance losses there are outweighed by gains in the way the CPU (its memory controller, specifically) can take advantage of the fact that the memory is not mutated (as much).

I was hoping someone could shed some light on how this is true (or whether it's not).

In a comment on another post, it was mentioned that Abstract Data Types (ADTs) have something to do with this, which made me further curious: how do ADTs specifically affect the way the CPU deals with memory? This is an aside, though; mostly I'm just interested in how the purity of a language necessarily affects the performance of the CPU, its caches, etc.
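To make the aside concrete: my understanding (which may be wrong) is that the ADTs meant here are algebraic data types, and that because values of such types are immutable, an "update" can share most of the old structure. A minimal Haskell sketch of what I mean:

    -- A persistent (immutable) binary search tree: insert allocates only
    -- the nodes along the path to the new element; every untouched
    -- subtree is shared with the original tree, not copied.
    data Tree a = Leaf | Node (Tree a) a (Tree a) deriving Show

    insert :: Ord a => a -> Tree a -> Tree a
    insert x Leaf = Node Leaf x Leaf
    insert x t@(Node l v r)
      | x < v     = Node (insert x l) v r  -- rebuild spine; r shared as-is
      | x > v     = Node l v (insert x r)  -- rebuild spine; l shared as-is
      | otherwise = t                      -- already present: share everything

    main :: IO ()
    main = do
      let t1 = foldr insert Leaf [5, 3, 8 :: Int]
          t2 = insert 1 t1  -- t2 reuses t1's entire right subtree
      print t1
      print t2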
