Question

I installed hadoop-2.0.5-alpha, hbase-0.95.1-hadoop2, and zookeeper-3.4.5. Hadoop and ZooKeeper are running fine, and HDFS and MR2 work great, but HBase will not start. Has anyone seen this error before? I'll post my config and logs below. Thanks in advance for your help.

hbase-site.xml :

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>master</value>
    <description>The directory shared by RegionServers.
    </description>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>master</value>
    <description>Property from ZooKeeper's config zoo.cfg.
    The directory where the snapshot is stored.
    </description>
  </property>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://master:8020/hbase</value>
    <description>The directory shared by RegionServers.
    </description>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
    <description>The mode the cluster will be in. Possible values are
      false: standalone and pseudo-distributed setups with managed Zookeeper
      true: fully-distributed with unmanaged Zookeeper Quorum (see hbase-env.sh)
    </description>
  </property>
</configuration>

hbase-xxxx-master-master.log :

2013-07-02 14:33:14,791 FATAL [master:master:60000] master.HMaster: Unhandled exception. Starting shutdown.
java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "master/192.168.255.130"; destination host is: "master":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:760)
    at org.apache.hadoop.ipc.Client.call(Client.java:1168)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
    at com.sun.proxy.$Proxy10.setSafeMode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
    at com.sun.proxy.$Proxy10.setSafeMode(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:514)
    at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:1896)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:660)
    at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:421)
    at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:828)
    at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:464)
    at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:153)
    at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:137)
    at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:728)
    at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:546)
    at java.lang.Thread.run(Thread.java:662)

Solution

Make sure you have built HBase properly, keeping all of the hadoop-2.0.5 dependencies in mind. Verify that the Hadoop jars in HBase's lib directory are the same version as the jars in your Hadoop installation. Check the Hadoop version declared in HBase's pom.xml and build HBase against the version you are actually running.
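
For example, assuming Hadoop is installed under /usr/local/hadoop and HBase under /usr/local/hbase (adjust the paths for your setup), you can compare the bundled and installed Hadoop jars like this:

# Hadoop jars that ship inside the HBase distribution
ls /usr/local/hbase/lib/hadoop-*.jar

# Hadoop jars your cluster actually runs
ls /usr/local/hadoop/share/hadoop/common/hadoop-common-*.jar
ls /usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-*.jar

# The version suffixes should match (2.0.5-alpha here); if hbase/lib still
# carries jars for a different Hadoop version, HBase was not built against
# the Hadoop you are running.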

If you still face any issues, you can apply the patch from HBASE-7904 and rebuild your HBase.
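
A rebuild against Hadoop 2 usually looks something like the following sketch. The source path is an assumption, and the hadoop.profile / hadoop-two.version property names can differ between HBase releases, so confirm them in the pom.xml of your checkout before relying on them:

# from the HBase source checkout (path is an example)
cd /path/to/hbase-0.95.1-src

# select the Hadoop 2 build profile and pin the Hadoop version you run
mvn clean install -DskipTests -Dhadoop.profile=2.0 -Dhadoop-two.version=2.0.5-alpha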

HTH

OTHER TIPS

There may be a compatibility issue when running HBase against Hadoop 2.x; please check that your HBase release was built for the Hadoop version you are using.
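
A quick sanity check is to compare what each side reports; both commands print the version and the build they came from:

hadoop version
hbase version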
