Question

I am posting a similar question a second time because I believe I now have a far more precise view of the problem.

Environment: Hadoop 2.2.0 running as a single-node cluster on an Ubuntu 14.04 laptop; RStudio 0.98.507, R 3.0.2 (2013-09-25), Java 1.7.0_55.

Any R (or Python) program works perfectly with the Hadoop Streaming utility located at /usr/local/hadoop220/share/hadoop/tools/lib/hadoop-streaming-2.2.0.jar

Errors appear when I use the "rmr2" package (part of RHadoop) and call mapreduce() from an R program run in RStudio.

To keep this post simple, I am showing a minimal program that fails (other, bigger programs fail with identical error messages):

Sys.setenv(HADOOP_CMD="/usr/local/hadoop220/bin/hadoop")
Sys.setenv(HADOOP_STREAMING="/usr/local/hadoop220/share/hadoop/tools/lib/hadoop-streaming-2.2.0.jar")
library('rhdfs')
library('rmr2')
hdfs.init()
hdfs.ls("/user/hduser")
small.ints = to.dfs(1:1000)
mapreduce(
  input = small.ints, 
  map = function(k, v) cbind(v, v^2))

The errors that show up on the RStudio console are:

> Sys.setenv(HADOOP_CMD="/usr/local/hadoop220/bin/hadoop")
> Sys.setenv(HADOOP_STREAMING="/usr/local/hadoop220/share/hadoop/tools/lib/hadoop-streaming-2.2.0.jar")
> library('rhdfs')
Loading required package: rJava

HADOOP_CMD=/usr/local/hadoop220/bin/hadoop

Be sure to run hdfs.init()
> library('rmr2')
Loading required package: Rcpp
Loading required package: RJSONIO
Loading required package: bitops
Loading required package: digest
Loading required package: functional
Loading required package: reshape2
Loading required package: stringr
Loading required package: plyr
Loading required package: caTools
> hdfs.init()
14/05/10 14:20:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> hdfs.ls("/user/hduser")
  permission  owner      group size          modtime                 file
1 drwxr-xr-x hduser supergroup    0 2014-05-07 17:44      /user/hduser/BT
2 drwxr-xr-x hduser supergroup    0 2014-05-09 07:14  /user/hduser/BT-out
3 drwxr-xr-x hduser supergroup    0 2014-05-09 20:30 /user/hduser/BTR-out
4 drwxr-xr-x hduser supergroup    0 2014-05-07 17:44  /user/hduser/BTj-in
> small.ints = to.dfs(1:1000)
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /usr/local/hadoop220/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/05/10 14:20:50 WARN util.NativeCodeLoader: ... using builtin-java classes where applicable

[ these two messages repeat multiple times ]

> mapreduce(
+   input = small.ints, 
+   map = function(k, v) cbind(v, v^2))

Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /usr/local/hadoop220/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/05/10 14:21:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

14/05/10 14:21:20 WARN streaming.StreamJob: -file option is deprecated, please use generic option -files instead.

packageJobJar: [/tmp/RtmpYCerEW/rmr-local-env282d4c7a3b53, /tmp/RtmpYCerEW/rmr-global-env282d77c9da92, /tmp/RtmpYCerEW/rmr-streaming-map282d4225651a, /tmp/hadoop-hduser/hadoop-unjar678942474363050554/] [] /tmp/streamjob8073315154972274831.jar tmpDir=null
14/05/10 14:21:21 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/05/10 14:21:21 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/05/10 14:21:22 INFO mapred.FileInputFormat: Total input paths to process : 1
14/05/10 14:21:22 INFO mapreduce.JobSubmitter: number of splits:2
14/05/10 14:21:22 INFO Configuration.deprecation: user.name is deprecated. Instead, use mapreduce.job.user.name
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.cache.files.filesizes is deprecated. Instead, use mapreduce.job.cache.files.filesizes
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
14/05/10 14:21:23 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1399709731242_0003
14/05/10 14:21:23 INFO impl.YarnClientImpl: Submitted application application_1399709731242_0003 to ResourceManager at /0.0.0.0:8032
14/05/10 14:21:23 INFO mapreduce.Job: The url to track the job: http://yantrajaal:8088/proxy/application_1399709731242_0003/
14/05/10 14:21:23 INFO mapreduce.Job: Running job: job_1399709731242_0003
14/05/10 14:21:30 INFO mapreduce.Job: Job job_1399709731242_0003 running in uber mode : false
14/05/10 14:21:30 INFO mapreduce.Job:  map 0% reduce 0%
14/05/10 14:21:43 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

14/05/10 14:21:44 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000001_0, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

14/05/10 14:22:04 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000000_1, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143

14/05/10 14:22:04 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000001_1, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

14/05/10 14:22:17 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000001_2, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

14/05/10 14:22:17 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000000_2, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

14/05/10 14:22:26 INFO mapreduce.Job:  map 100% reduce 0%
14/05/10 14:22:26 INFO mapreduce.Job: Job job_1399709731242_0003 failed with state FAILED due to: Task failed task_1399709731242_0003_m_000001
Job failed as tasks failed. failedMaps:1 failedReduces:0

14/05/10 14:22:26 INFO mapreduce.Job: Counters: 10
    Job Counters 
        Failed map tasks=7
        Killed map tasks=1
        Launched map tasks=8
        Other local map tasks=6
        Data-local map tasks=2
        Total time spent by all maps in occupied slots (ms)=91997
        Total time spent by all reduces in occupied slots (ms)=0
    Map-Reduce Framework
        CPU time spent (ms)=0
        Physical memory (bytes) snapshot=0
        Virtual memory (bytes) snapshot=0
14/05/10 14:22:26 ERROR streaming.StreamJob: Job not Successful!
Streaming Command Failed!
Error in mr(map = map, reduce = reduce, combine = combine, vectorized.reduce,  : 
  hadoop streaming failed with error code 1
> 

I have Googled the two irritating warnings: (a) the "disabled stack guard" message, which according to this link is nothing to worry about, just a warning; and (b) "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable", which as per this link is also only a warning and nothing to worry about.

After discounting these two warnings as not being the cause, the main error that I find is this:

14/05/10 14:21:43 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

I have reinstalled the RHadoop packages rmr2 and rhdfs, and have also reinstalled rJava. In the past I have tried with Hadoop 1.3 as well, but the errors are the same.
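(As a side note, a quick way to check which library trees an R session actually searches, and where rmr2 and rhdfs resolved to, is sketched below; this is only an illustrative diagnostic, not output from the failing run.)

.libPaths()            # library trees searched, in order; per-user libraries come first
find.package("rmr2")   # directory the package would be loaded from
find.package("rhdfs")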

I would be really grateful if someone could suggest a way forward on this.


Solution

I resolved the problem by changing the directory into which the rmr2, rhdfs, ... packages are installed. Basically, you need to install all the packages in a system folder instead of a custom (per-user) folder; there seems to be a problem with the installation location. Initially I had installed the packages in a custom folder:

/home/user/R/3.1

By re-installing the packages at:

/usr/lib/R/library

I got the code working.
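For anyone reproducing the fix, the reinstall was roughly along these lines (run R with sudo so /usr/lib/R/library is writable; the tarball names and version numbers below are only illustrative, use whichever versions you downloaded from the RHadoop releases page):

# install the dependencies of rmr2/rhdfs straight into the system library
install.packages(c("rJava", "Rcpp", "RJSONIO", "bitops", "digest",
                   "functional", "reshape2", "stringr", "plyr", "caTools"),
                 lib = "/usr/lib/R/library")

# rmr2 and rhdfs are not on CRAN; install them from the downloaded tarballs
install.packages("rmr2_3.1.0.tar.gz", repos = NULL, type = "source",
                 lib = "/usr/lib/R/library")
install.packages("rhdfs_1.0.8.tar.gz", repos = NULL, type = "source",
                 lib = "/usr/lib/R/library")

# confirm both packages now resolve to the system library
find.package(c("rmr2", "rhdfs"))

The likely reason this matters is that the R worker processes started by Hadoop Streaming do not necessarily see a per-user library under /home/..., so they exit with code 1 when they cannot load rmr2; installing into a library that every R session searches avoids that.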

I hope this will be of some help.
