Question

If data structures such as trees and certain sorts (quick sort, merge sort) work using recursive algorithms, and recursive algorithms take up a lot of memory space on the stack and can only work on limited data sets (otherwise they lead to stack overflow), then how come trees and the aforementioned sorts are used in practice to handle large data sets? Wouldn't it lead to stack overflow?


Solution

Your question is too broad and too general. Regarding data structures and algorithms, you really should read a book on them, such as Introduction to Algorithms (a.k.a. CLRS).

recursive algorithms take up a lot of memory space in the stack

This is not always true (and is usually false).

First, the call stack is not that small on current computers (laptops, desktops, servers) running common operating systems (Windows, Linux, Android, macOS) in 2018. As a rule of thumb, it is one or a few megabytes, which is not tiny: a typical pocket book holds less than a million characters and would almost fit in it; the Holy Bible holds about four million characters and is a long read; even very long novels like War and Peace are not much bigger, and are probably smaller.
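If you want to see the limit on your own machine, here is a minimal sketch (not part of the original answer) for a POSIX system such as Linux or macOS; the default varies by OS and shell configuration (compare `ulimit -s`), with 8 MiB being a common Linux default.

```cpp
// Sketch: query the call-stack size limit on a POSIX system.
#include <sys/resource.h>
#include <cstdio>

int main() {
    struct rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) == 0) {
        if (rl.rlim_cur == RLIM_INFINITY)
            std::printf("stack limit: unlimited\n");
        else
            std::printf("stack limit: %llu KiB\n",
                        (unsigned long long)(rl.rlim_cur / 1024));
    }
    return 0;
}
```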

On cheap microcontrollers like those inside your mouse, keyboard, or appliances, or on an Arduino, the call stack is smaller: kilobytes, or even a few hundred bytes; but the RAM is also smaller and the clock rate is slower (megahertz rather than gigahertz).

Second, and most importantly, good recursive algorithms and data structures (think of trees, notably self-balancing binary trees) use a recursion depth that grows only logarithmically with the input size. So, for example, doubling the size of the data structure increments the recursion depth by a small, fixed amount.
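As an illustration (my own sketch, not from the original answer), here is merge sort instrumented to report how deep the recursion actually goes. The total work is O(n log n), but the recursion depth is only about log2(n): roughly 10 frames for a thousand elements and roughly 20 for a million.

```cpp
// Sketch: merge sort's recursion depth grows like log2(n);
// doubling the input adds only one more level of recursion.
#include <algorithm>
#include <cstdio>
#include <vector>

// Returns the deepest recursion level reached while sorting a[lo, hi).
int merge_sort(std::vector<int>& a, int lo, int hi, int depth = 0) {
    if (hi - lo <= 1) return depth;                 // one element: nothing to sort
    int mid   = lo + (hi - lo) / 2;
    int left  = merge_sort(a, lo, mid, depth + 1);  // sort left half
    int right = merge_sort(a, mid, hi, depth + 1);  // sort right half
    std::inplace_merge(a.begin() + lo, a.begin() + mid, a.begin() + hi);
    return std::max(left, right);
}

int main() {
    for (int n : {1000, 1000000}) {
        std::vector<int> a(n);
        for (int i = 0; i < n; ++i) a[i] = n - i;   // reverse-sorted input
        std::printf("n = %7d  max recursion depth = %d\n", n, merge_sort(a, 0, n));
    }
}
```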

Of course, all of the above is approximate and blurry (as is your question). Some recursive algorithms have linear recursion depth, but then you usually don't use them (at least not on large data sets). Also be aware of tail recursion.
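Here is a small sketch of what tail recursion means (again my own example): when the recursive call is the very last action, nothing in the current stack frame is needed afterwards, so many compilers (for instance gcc or clang with -O2) can reuse the frame and effectively turn the recursion into a loop. The C++ standard does not guarantee this, so for huge inputs the explicit loop is the safe form.

```cpp
// Sketch: a tail-recursive sum and its iterative equivalent.
#include <cstdint>

std::uint64_t sum_tail(std::uint64_t n, std::uint64_t acc = 0) {
    if (n == 0) return acc;          // base case
    return sum_tail(n - 1, acc + n); // tail call: the result is returned directly
}

std::uint64_t sum_loop(std::uint64_t n) {
    std::uint64_t acc = 0;
    for (; n > 0; --n) acc += n;     // what a tail-call optimizer effectively produces
    return acc;
}
```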

Of course, read more about time complexity.

Wouldn't it lead to stack overflow?

In practice, no, if the growth is logarithmic. Remember that 2^64 is a really huge number (about 1.8 × 10^19), but 64 is not. Be aware of orders of magnitude.
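To make the orders of magnitude concrete, here is a tiny sketch (my own, for illustration) showing how few recursion levels a logarithmic-depth algorithm needs even for absurdly large inputs:

```cpp
// Sketch: a balanced recursive structure over n items needs ~log2(n) levels,
// so even n near 2^64 needs only about 64 stack frames.
#include <cmath>
#include <cstdio>

int main() {
    double sizes[] = {1e3, 1e6, 1e9, 1.8446744e19 /* ~2^64 */};
    for (double n : sizes)
        std::printf("n = %.3g  ->  about %.0f levels\n", n, std::ceil(std::log2(n)));
}
```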

Keep in mind that a good programmer writes only a few tens of thousands of lines of code each year, that is, only a megabyte or two of source code per year. A typical computer can hold thousands of times more bytes in its RAM (most computers have more than a gigabyte of RAM in 2018).


I don't have the space or time to explain more, but I do want to give the impression that programming and computer science are difficult, and that you need years to learn them. Getting a degree in CS takes time, but it is worth the effort.

Licensed under: CC-BY-SA with attribution