Question

I get the following error while executing a MapReduce program. I have placed all the jars in the hadoop/lib directory and have also specified them via -libjars.

This is the command I am executing:

$HADOOP_HOME/bin/hadoop --config $HADOOP_HOME/conf jar /home/shash/distinct.jar  HwordCount -libjars $LIB_JARS WordCount HWordCount2

java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.HCatOutputFormat
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:996)
    at org.apache.hadoop.mapreduce.JobContext.getOutputFormatClass(JobContext.java:248)
    at org.apache.hadoop.mapred.Task.initialize(Task.java:501)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:306)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.hadoop.mapred.Child.main(Child.java:264)
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.HCatOutputFormat
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:943)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:994)
    ... 8 more

Solution

Make sure LIB_JARS is a comma-separated list (as opposed to colon-separated, like CLASSPATH).
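For example, if you already have a colon-separated CLASSPATH-style list, you can convert it with tr before passing it to -libjars (the jar paths below are hypothetical; substitute the jars your job actually needs):

```shell
# Hypothetical jar paths -- replace with the jars your job actually needs.
HCAT_JARS="/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar:/usr/lib/hive/lib/hive-exec.jar"

# -libjars expects commas, so translate the colons.
LIB_JARS=$(echo "$HCAT_JARS" | tr ':' ',')
echo "$LIB_JARS"
```

The resulting value can then be used as shown in the question: `hadoop jar distinct.jar HwordCount -libjars $LIB_JARS ...`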

Other tips

Applies to: CDH 5.0.x, CDH 5.1.x, CDH 5.2.x, CDH 5.3.x, Sqoop

Cause: Sqoop cannot pick up the HCatalog libraries because Cloudera Manager does not set the HIVE_HOME environment variable; it must be set manually.

This problem is tracked in the following JIRA: https://issues.apache.org/jira/browse/SQOOP-2145

The fix for this issue has been included in CDH since version 5.4.0.

Workaround (applicable to CDH versions lower than 5.4.0):

Execute the following command in the shell before calling Sqoop, or add it to /etc/sqoop/conf/sqoop-env.sh (create the file if it does not already exist):

export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive   # for a parcel installation
export HIVE_HOME=/usr/lib/hive                        # for a package installation
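If the same script has to work on both parcel and package installations, a small guard can pick whichever path exists. This is a sketch assuming the two standard CDH layouts above:

```shell
# Prefer the parcel location; fall back to the package location.
# Paths are the standard CDH layouts from the workaround above.
if [ -d /opt/cloudera/parcels/CDH/lib/hive ]; then
    export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive
else
    export HIVE_HOME=/usr/lib/hive
fi
echo "HIVE_HOME=$HIVE_HOME"
```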
Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow