Question

Is it possible, with some sort of algorithm or technique, to speed up a cellular automaton? I'm using an implementation of Conway's Game of Life made in XNA and it works perfectly, but the problem is that when I use a grid larger than 128x128 cells it becomes awfully slow.

I don't think it has to do with the code or how XNA handles textures and drawing; rather, updating so many cells (i.e., evaluating each cell's neighbors and deriving its new state from that) is simply a lot of computation.
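For reference, the update step I mean is roughly the following (a minimal sketch, not my actual code; names are illustrative, and cells outside the grid count as dead):

    static class NaiveLife
    {
        // Computes one generation on a bounded grid.
        public static bool[,] Step(bool[,] grid)
        {
            int w = grid.GetLength(0), h = grid.GetLength(1);
            var next = new bool[w, h];
            for (int x = 0; x < w; x++)
                for (int y = 0; y < h; y++)
                {
                    int neighbors = 0;
                    for (int dx = -1; dx <= 1; dx++)
                        for (int dy = -1; dy <= 1; dy++)
                        {
                            if (dx == 0 && dy == 0) continue;
                            int nx = x + dx, ny = y + dy;
                            if (nx >= 0 && nx < w && ny >= 0 && ny < h && grid[nx, ny])
                                neighbors++;
                        }
                    // Birth on exactly 3 neighbors; survival on 2 or 3 (B3/S23).
                    next[x, y] = neighbors == 3 || (grid[x, y] && neighbors == 2);
                }
            return next;
        }
    }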

Of course, an ideal cellular automaton would be infinitely large, but in reality that's impossible. Still, 128x128 is just too small to actually see how the system behaves, in my opinion.

Any help would be greatly appreciated!


Solution

If you pause the program under a debugger a few times and look at where it stops, you'll see where the time goes.

One shouldn't guess, but my guess is essentially all the time goes into rendering. The evaluation of neighbors may look like a lot of code, but chances are it's extremely simple. If you have a way to avoid re-rendering cells that haven't changed, that might save a lot.
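For example (a sketch, assuming the grid is drawn from a Texture2D filled from a persistent Color buffer that was initialized from the starting grid; all names here are illustrative):

    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    static class DirtyUpdate
    {
        // Rewrites only the pixels of cells that changed, and uploads the
        // texture only on frames where at least one cell changed.
        public static void Upload(bool[,] current, bool[,] next,
                                  Color[] pixels, Texture2D cellTexture)
        {
            int w = current.GetLength(0), h = current.GetLength(1);
            bool dirty = false;
            for (int x = 0; x < w; x++)
                for (int y = 0; y < h; y++)
                    if (next[x, y] != current[x, y])
                    {
                        pixels[y * w + x] = next[x, y] ? Color.White : Color.Black;
                        dirty = true;
                    }
            if (dirty)
                cellTexture.SetData(pixels);   // at most one GPU upload per frame
        }
    }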

OTHER TIPS

The Hashlife algorithm uses quadtrees, hashing, and memoization to compress the CA in both time and space, for a massive increase in performance. Check out Golly for an example implementation.

I'm still trying to figure it out myself and looking for good libraries.

There's a good explanation here (with example code): http://www.drdobbs.com/jvm/an-algorithm-for-compressing-space-and-t/184406478.
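To make the core idea concrete, here is a heavily simplified sketch (the names are mine, not Golly's) of the two tricks Hashlife combines: hash-consing, so each distinct sub-pattern exists exactly once in memory, and memoization, so the future of a sub-pattern is computed once no matter how often it recurs on the grid. The recursion that actually fills in a node's future, and the time-skipping that gives Hashlife its power, are covered in the article above and are omitted here:

    using System.Collections.Generic;

    sealed class Macrocell
    {
        public readonly int Level;                 // square side = 2^Level cells
        public readonly Macrocell NW, NE, SW, SE;  // quadrants (null at level 0)
        public readonly bool Alive;                // meaningful at level 0 only
        public Macrocell CenterNext;               // memoized center, one step ahead

        Macrocell(bool alive) { Level = 0; Alive = alive; }
        Macrocell(Macrocell nw, Macrocell ne, Macrocell sw, Macrocell se)
        { Level = nw.Level + 1; NW = nw; NE = ne; SW = sw; SE = se; }

        public static readonly Macrocell Dead = new Macrocell(false);
        public static readonly Macrocell Live = new Macrocell(true);

        // Hash-consing table: identical quadrant tuples map to the same node,
        // so a huge grid full of repeated patterns stays tiny in memory.
        static readonly Dictionary<(Macrocell, Macrocell, Macrocell, Macrocell), Macrocell>
            cache = new Dictionary<(Macrocell, Macrocell, Macrocell, Macrocell), Macrocell>();

        public static Macrocell Make(Macrocell nw, Macrocell ne, Macrocell sw, Macrocell se)
        {
            var key = (nw, ne, sw, se);
            if (!cache.TryGetValue(key, out var node))
                cache[key] = node = new Macrocell(nw, ne, sw, se);
            return node;   // reference equality now means pattern equality
        }
    }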

I would recommend using OpenGL and GLSL. Running the update itself on the GPU eliminates the per-frame data transfer from CPU to GPU, and you can gain a nice speedup of 10x or more.
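Since you're already in XNA, the same idea can be expressed there with render targets and an HLSL effect rather than raw OpenGL. A rough sketch of the C# side (setup is omitted, and LifeStep.fx is a hypothetical pixel shader applying the Life rule per texel; it is not shown here):

    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    class GpuLife
    {
        GraphicsDevice device;
        SpriteBatch spriteBatch;
        Effect lifeEffect;                 // compiled from a hypothetical LifeStep.fx
        RenderTarget2D current, next;      // the grid lives in two textures

        public void Step()
        {
            device.SetRenderTarget(next);  // render the new generation into `next`
            spriteBatch.Begin(SpriteSortMode.Immediate, BlendState.Opaque,
                              SamplerState.PointClamp, null, null, lifeEffect);
            spriteBatch.Draw(current, Vector2.Zero, Color.White);
            spriteBatch.End();
            device.SetRenderTarget(null);

            // Ping-pong: swap the targets so `next` becomes the input next frame.
            var tmp = current; current = next; next = tmp;
        }
    }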

Licensed under: CC-BY-SA with attribution