Question

Has anyone here ever had success exporting a big.matrix to a snow cluster in R? The commented lines in the examples for big.matrix and attach.resource suggest that it is possible, but I haven't been able to get it to work.

library(bigmemory)
library(snow)
z <- big.matrix(3, 3, type = 'integer', init = 3)       # big.matrix on the master process
cl <- makeCluster(8, type = "SOCK")
clusterEvalQ(cl, {library(bigmemory)})                   # load bigmemory on every worker
zdescription <- describe(z)                              # the descriptor is an ordinary R object
clusterExport(cl, "zdescription")
clusterEvalQ(cl, {y <- attach.resource(zdescription)})   # attach.big.matrix also crashes

It also crashes even if I use a file-backed big.matrix, which is strange since that doesn't even use shared memory (a sketch of the file-backed attempt is included after the traceback below).

[[1]]
Warning: This is not advised.  Here is the head of the matrix:

 *** caught segfault ***
address 0x10, cause 'memory not mapped'

Traceback:
 1: .Call("CGetNrow", x@address)
 2: nrow(x)
 3: nrow(x)
 4: .local(x, ...)
 5: head(x)
 6: head(x)

Possible actions:
1: abort (with core dump, if enabled)
2: normal R exit
3: exit R without saving workspace
4: exit R saving workspace
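
For reference, the file-backed attempt looks roughly like this and crashes the same way (the backing and descriptor file names are just the ones I happened to use):

library(bigmemory)
library(snow)
zf <- filebacked.big.matrix(3, 3, type = 'integer', init = 3,
                            backingfile = "z.bin",
                            descriptorfile = "z.desc",
                            backingpath = getwd())
zfdescription <- describe(zf)
cl <- makeCluster(8, type = "SOCK")
clusterEvalQ(cl, {library(bigmemory)})
clusterExport(cl, "zfdescription")
clusterEvalQ(cl, {y <- attach.resource(zfdescription)})   # same segfault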

Solution

I finally discovered the problem. It turns out the crash happens while printing the results:

If obj is a big.matrix, then clusterEvalQ(cl, {obj}) will give an error: the big.matrix returned by each worker gets printed on the master, where its external pointer is no longer valid.

So one way to resolve the problem is simply to add a constant after the attach.resource call, so that the big.matrix itself is never returned:

clusterEvalQ(cl, {y <- attach.resource(zdescription); 1})   # the expression now returns 1, not the big.matrix
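
For completeness, here is a minimal end-to-end sketch built on the same idea. It assumes the SOCK workers run on the same machine as the master (a shared-memory descriptor is only meaningful locally); the worker count and the column sum at the end are just illustrative:

library(bigmemory)
library(snow)

z <- big.matrix(3, 3, type = 'integer', init = 3)
zdescription <- describe(z)

cl <- makeCluster(2, type = "SOCK")
clusterEvalQ(cl, {library(bigmemory)})
clusterExport(cl, "zdescription")

# Attach on every worker; return NULL so no big.matrix travels back to the master
clusterEvalQ(cl, {
  y <- attach.big.matrix(zdescription)
  NULL
})

# Work with the attached matrix on the workers, returning only ordinary R values
clusterEvalQ(cl, {sum(y[, 1])})

stopCluster(cl)

The key point is the same either way: never let a big.matrix be the value that clusterEvalQ hands back to the master.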
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow