Question

Imagine a huge web application built with a Single Page Application framework such as AngularJS. Each of its routes downloads a couple of HTML template files, and each of these template files references JS files unique to it. So whenever one of these template files is downloaded, it also pulls in and downloads its JS files.

This is just a hypothetical scenario to understand how browsers would behave in such a situation.

What will happen in terms of memory and performance when a user quickly hops across all the pages (of which there can be hundreds or thousands)? Will all the JS variables consume memory and unnecessarily clog it, or will the JS garbage collector come to the rescue?


Solution

It depends on your JavaScript code and your DOM (and, in principle, on the JavaScript implementation in the browser; invite your users to use recent browser versions). You need to read more about garbage collection techniques (e.g. the GC handbook); see also Mozilla SpiderMonkey's GC page.

A value which is still reachable, directly or indirectly, from global or local variables won't be garbage collected. The GC will follow references in aggregate values (including closures, arrays, objects, ...).
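As a hedged illustration (the names here are invented), a closure installed as a long-lived callback keeps every value it captures reachable, so the GC cannot reclaim it:

```javascript
// Hypothetical example: the big array stays reachable (and uncollected)
// for as long as the interval callback that closes over it is registered.
function watchRoute() {
  const hugeTemplateCache = new Array(1e6).fill('<div></div>'); // large allocation

  const timerId = setInterval(function poll() {
    // The closure references hugeTemplateCache, so the GC must keep it alive.
    console.log('cached templates:', hugeTemplateCache.length);
  }, 1000);

  // Only once the callback itself becomes unreachable can the array be freed:
  return function stopWatching() {
    clearInterval(timerId);
  };
}
```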

You might want to carefully and cleverly use weak references, e.g. JavaScript WeakMaps.
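A minimal sketch (the element and metadata are hypothetical): a WeakMap holds its keys weakly, so caching data against an object does not, by itself, keep that object alive:

```javascript
// Per-element metadata stored in a WeakMap does not prevent the element
// from being garbage collected.
const metadata = new WeakMap();

let element = document.createElement('div');
metadata.set(element, { renderedAt: Date.now() });

console.log(metadata.get(element)); // { renderedAt: ... }

// Once the last strong reference to the element is dropped, the element
// *and* its WeakMap entry become eligible for garbage collection.
element = null;
```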

Conceptually, the GC computes the transitive closure of the reference graph, starting from global and local variables (on the call stack). These values are alive; the dead values are eventually destroyed by the GC since they are garbage. A value which is only reachable through weak references won't be kept.
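As a small, hedged illustration of that traversal (the names are made up), dropping the last root reference makes an entire object graph, cycles included, collectible:

```javascript
// Reachability is transitive: as long as `routeState` is referenced,
// everything reachable from it (controller, templates, ...) is live.
let routeState = {
  controller: { name: 'ProductsCtrl' },
  templates: ['list.html', 'detail.html'],
};
routeState.controller.parent = routeState; // cycles are fine for a tracing GC

// Dropping the root reference makes the whole object graph unreachable,
// cycle included, so it becomes garbage and can be reclaimed.
routeState = null;
```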

Even with a GC, a program can have a memory leak, in particular if it keeps in some variable a value which will never be useful in the future. Detecting this through static code analysis is an undecidable problem, provably equivalent to the halting problem.
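A hedged sketch of such a leak in the SPA scenario above (the cache and functions are invented for illustration): values kept reachable in an ever-growing cache are never collected, even if they will never be used again:

```javascript
// A module-level cache that is only ever added to: every entry stays
// reachable, so the GC can never reclaim it, even for routes the user
// will never revisit.
const templateCache = {};

function loadRoute(route, html) {
  templateCache[route] = html; // grows forever across hundreds of routes
  return templateCache[route];
}

// A bounded alternative: evict old entries so unused templates become garbage.
function loadRouteBounded(route, html, maxEntries = 50) {
  const keys = Object.keys(templateCache);
  if (keys.length >= maxEntries) {
    delete templateCache[keys[0]]; // drop the oldest entry (simplistic eviction)
  }
  templateCache[route] = html;
  return templateCache[route];
}
```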

Licensed under: CC-BY-SA with attribution