Question

So, I'm finding that the Eigen package crashes when I try to declare a matrix larger than about 10000x10000. I need to reliably declare a matrix of about 13000x13000 elements. I ran a test like this:

#include <cstdio>
#include <Eigen/Dense>

int main()
{
  for( int tortureEigen = 1 ; tortureEigen < 50000 ; tortureEigen++ )
  {
    printf( "Torturing Eigen with %dx%d..\n", tortureEigen, tortureEigen ) ;
    Eigen::MatrixXd m( tortureEigen, tortureEigen ) ;
  }
}

It crashes on my machine (6 GB RAM) at 14008x14008.

I'm kind of disappointed! I thought Eigen was like MATLAB or Octave and shouldn't crash on larger arrays, even if it has to hit the disk or something...

What's more, when I run this test with Task Manager open, the process creating these matrices doesn't even use that much memory. Task Manager reports under 2K of usage.

Using Eigen 2.0.15 stable release


Solution 2

All the answers here are helpful!

It turns out that when compiled as a 32-bit app, Eigen will crash if you try to declare a dense MatrixXd, as I was, larger than about 14000x14000. The crash happens when _aligned_malloc returns 0 in Eigen's allocation code (MatrixXd::resize()), meaning the ~1.5 GB of contiguous, aligned RAM couldn't be allocated under 32-bit. That makes sense, since 1.5 GB is getting close to half the addressable memory of a 32-bit process; finding more than 1.5 GB contiguous out of 4.0 GB becomes really unlikely, I suppose! Upgrading to Eigen 3.0 unfortunately does not solve the problem.

Solution #1

Ok then, so I compiled in 64-bit, and on my 6GB machine, the program runs successfully, with the dense MatrixXd allocation and solution working just fine.

Solution #2

Another solution is to use a DynamicSparseMatrix<double>. The sparse type does not crash on a huge-size allocation, even as a 32-bit app, but API support for solving is another story (the API seems to want to convert to the dense MatrixXd type in order to solve, which leaves us with the same original problem).

OTHER TIPS

Eigen developer here. You'd be far better off asking Eigen questions on our support channels, e.g. the forum... ;-)

Short answer: are you using fixed or dynamic-size matrices?

  • if fixed-size, switch to dynamic-size (for such huge sizes, it's a no-brainer anyway)

  • if you're getting the bug with dynamic-size matrices, I'm surprised, but at the same time I can see where the value 10000 comes from. In any case, if you upgrade to eigen3 (the development branch), your problem will be gone.

From the Eigen docs:

Dense versus sparse: This Matrix class handles dense, not sparse matrices and vectors. For sparse matrices and vectors, see the Sparse module.
Dense matrices and vectors are plain usual arrays of coefficients. All the coefficients are stored, in an ordinary contiguous array. This is unlike Sparse matrices and vectors where the coefficients are stored as a list of nonzero coefficients.

Let's see: 10000x10000x8 bytes (a double matrix) is about 0.75 GB, and 14000x14000x8 is about 1.5 GB. That's about the maximum size of a contiguous heap block one would expect under a 32-bit OS. Try sparse matrices.

If you really need such large dense matrices, then you have quite a few other problems: will the calculation finish before the next power outage?

Given the specs of your hardware, I can only assume you are running a 64-bit OS.

You can still crash even if memory gets paged out to the page file. It might mean memory is fragmented, or that your page file is still too small. If so, you should bump up your page file to something pretty big, like 8 or 12 GB.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow