Question

I am compiling the following code with VS2012 (32-bit). I know that max_size() returns the

"maximum potential size the container can reach", which in my case is 1,073,741,823 (yay).

So how can I know how many objects my container can really store? (I have 64 GB of RAM.)

unsigned int operationUnit = 100000000;
std::vector<int*> data;
std::cout << "max cap: " << data.max_size() << std::endl;
for (unsigned int i = 0; i < operationUnit; i++)
    data.push_back(new int());

This will end up in a bad_alloc. However, since I am targeting x64 this problem doesn't occur, as the max cap is much higher, but I still cannot figure out the exact number of elements, for when I would like to clamp user input.

Thanks!


Solution

Well, it is OS-dependent of course, but the results should be similar for everyone. For example, when run as a 32-bit executable, a build with VS2012 consistently stops at 26,906,977 elements in a vector of int*, without posing any threat to your memory (not even close).

Now it gets interesting when you build a 64-bit version, in which case a bad_alloc is thrown only when (almost) all your memory is drained. At that point, neither C++ nor any other language can protect you.

In the screenshot that follows I'm posting an example of this happening: by the time bad_alloc gets thrown, the program is in no position to catch it or do anything with it. The OS steps in and kills the process, and the memory is deallocated all at once (see graph). In the corresponding 32-bit version the exception was caught normally, and deallocation took about 10 minutes.

Now, this is a very simplistic way of looking at it; I'm sure OS gurus could supply more insights, but feel free to try this at home (and burn through some memory; I can't shake the feeling that I can smell something burnt after this).

(screenshot: exceeding 16 GB of memory)

The code in the screenshot:

#include <iostream>
#include <vector>
#include <exception>

using namespace std;

int main()
{
    vector<int*> maxV;

    try
    {
        // Keep allocating until the allocator gives up.
        while (true) maxV.push_back(new int);
    }
    catch (const bad_alloc &)
    {
        cout << "I caught bad_alloc at element no " << maxV.size() << "\n";
        for (auto i : maxV) delete i;
    }
    catch (const exception &)
    {
        cout << "Some other exception happened at element no " << maxV.size() << "\n";
        for (auto i : maxV) delete i;
    }

    return 0;
}

OTHER TIPS

You can't. The OS could simply run out of memory. You may find that, for huge amounts of data, a deque can grow larger before failing than a vector, because its storage is not contiguous and so it is less affected by memory fragmentation, which becomes significant once you end up allocating more than half of your entire memory.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow