Question

When I try to connect to HBase via Oozie, I get the following exception:

```
Failing Oozie Launcher, Main class [com.sample.util.HBaseBulkLoad], main() threw exception, org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@6f36e7f3 closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@6f36e7f3 closed
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:794)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:782)
    at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:249)
    at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:213)
    at com.sample.dao.HBaseDao.bulkLoadUsingMapper(HBaseDao.java:191)
    at com.sample.util.HBaseBulkLoad.main(HBaseBulkLoad.java:32)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:454)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:393)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
    at org.apache.hadoop.mapred.Child.main(Child.java:264)
```

My workflow.xml looks like this:

```xml
<workflow-app xmlns="uri:oozie:workflow:0.2" name="sample-wf">
    <start to="java-node"/>
    <action name="java-node">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <main-class>com.sample.util.HBaseBulkLoad</main-class>
            <arg>TEST</arg>
            <arg>hdfs://centOS/user/sample/input/sample5.txt</arg>
            <arg>hdfs://centOS/user/sample/HFile</arg>
        </java>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

I am able to run the jar file with the hadoop jar command, but not through Oozie. I set all the required configurations in Java, yet I still get the exception.

HMaster and HRegionServer are running. If anyone knows what is wrong, please help.


Solution

Ensure that the box the Oozie launcher (the Java action) runs on has access to your ZooKeeper server(s); otherwise the HBase client won't be able to discover your HRegionServers.
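
To confirm this from the node where the launcher task actually ran, a small standalone check can be executed there. This is a minimal sketch, assuming the 0.94-era HBase client API that appears in the stack trace; the class name, ZooKeeper host names, and port are placeholders, not values from the original post:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class HBaseConnectivityCheck {
    public static void main(String[] args) throws Exception {
        // Start from the default HBase configuration (hbase-site.xml on the
        // classpath, if present) and point the client at ZooKeeper explicitly.
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "zk1.example.com,zk2.example.com"); // placeholder hosts
        conf.set("hbase.zookeeper.property.clientPort", "2181");               // placeholder port

        // Throws MasterNotRunningException or ZooKeeperConnectionException if
        // this node cannot reach ZooKeeper or the HBase master.
        HBaseAdmin.checkHBaseAvailable(conf);
        System.out.println("ZooKeeper and the HBase master are reachable from this node.");
    }
}
```

If this check fails on the node that ran the Oozie launcher but succeeds on the machine where `hadoop jar` worked, the problem is connectivity or configuration on that node: either open access to the ZooKeeper quorum from all task nodes, or make sure the cluster's `hbase-site.xml` (with the correct `hbase.zookeeper.quorum`) is visible to the Java action so that `HBaseConfiguration.create()` picks it up.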
