Question

I'm trying to set up HBase to authenticate against secure HDFS and ZooKeeper. The HMaster authenticates with ZooKeeper successfully, but it keeps attempting SIMPLE authentication with HDFS. I don't know what is missing in my configuration. Here is the relevant part of my hbase-site.xml:

  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://dev1.example.com:8020/hbase</value>
  </property>
  <property>
    <name>hbase.tmp.dir</name>
    <value>/mnt/hadoop/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>

  <!-- authentication -->
  <property>
     <name>hbase.security.authentication</name>
     <value>kerberos</value>
  </property>
  <property>
     <name>hbase.rpc.engine</name>
     <value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
  </property>
  <property>
    <name>hbase.regionserver.kerberos.principal</name>
    <value>hbase/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>hbase.regionserver.keytab.file</name>
    <value>/etc/security/keytab/hbase.keytab</value>
  </property>
  <property>
    <name>hbase.master.kerberos.principal</name>
    <value>hbase/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>hbase.master.keytab.file</name>
    <value>/etc/security/keytab/hbase.keytab</value>
  </property>

Below is the log from the HMaster:

2014-01-24 17:14:59,278 DEBUG [master:dev1:60000] ipc.Client: Connecting to dev1.example.com/192.168.11.101:8020
2014-01-24 17:14:59,457 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase: starting, having connections 1
2014-01-24 17:14:59,465 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase sending #0
2014-01-24 17:14:59,491 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase got value #-1
2014-01-24 17:14:59,499 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: closing ipc connection to dev1.example.com/192.168.11.101:8020: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1042)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
2014-01-24 17:14:59,504 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase: closed
2014-01-24 17:14:59,504 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase: stopped, remaining connections 0
2014-01-24 17:14:59,514 DEBUG [master:dev1:60000] ipc.Client: The ping interval is 60000 ms.
2014-01-24 17:14:59,514 DEBUG [master:dev1:60000] ipc.Client: Connecting to dev1.example.com/192.168.11.101:8020
2014-01-24 17:14:59,531 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase sending #1
2014-01-24 17:14:59,531 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase: starting, having connections 1
2014-01-24 17:14:59,532 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase got value #-1
2014-01-24 17:14:59,532 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: closing ipc connection to dev1.example.com/192.168.11.101:8020: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1042)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
2014-01-24 17:14:59,536 FATAL [master:dev1:60000] master.HMaster: Unhandled exception. Starting shutdown.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client.call(Client.java:1347)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:561)
    at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2146)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:983)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:967)
    at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:432)
    at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:851)
    at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:435)
    at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:146)
    at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:127)
    at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:789)
    at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:606)
    at java.lang.Thread.run(Thread.java:744)
2014-01-24 17:14:59,537 INFO  [master:dev1:60000] master.HMaster: Aborting
2014-01-24 17:14:59,537 DEBUG [master:dev1:60000] master.HMaster: Stopping service threads
2014-01-24 17:14:59,538 INFO  [master:dev1:60000] ipc.RpcServer: Stopping server on 60000

I've been searching for the cause but still have no luck.


Solution 3

One of my teammates was able to solve this problem with either of the methods below:

  • copy core-site.xml and hdfs-site.xml to hbase_dir/conf

  • add hadoop_dir/etc/hadoop to $HBASE_CLASSPATH

Either change must be made on both the master and the region server machines; a sketch of both approaches follows.
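For example, assuming Hadoop is installed under /opt/hadoop and HBase under /opt/hbase (illustrative paths; adjust to your installation), either of these would work:

# Option 1: copy the Hadoop client configuration into HBase's conf directory
cp /opt/hadoop/etc/hadoop/core-site.xml /opt/hbase/conf/
cp /opt/hadoop/etc/hadoop/hdfs-site.xml /opt/hbase/conf/

# Option 2: put the Hadoop conf directory on HBase's classpath, e.g. in hbase-env.sh
export HBASE_CLASSPATH=/opt/hadoop/etc/hadoop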

We were deploying the HBase master and the HDFS NameNode on the same box; the HBase region servers and HDFS DataNodes were also co-located.

Other tips

I recently tried to set up a Kerberos-secured Hadoop cluster and ran into the same problem. The cause was that my hbase user (the account used to launch the HBase service) was missing the necessary Hadoop environment settings, such as HADOOP_CONF_DIR. Verify that your user account has those settings.
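For example, a quick way to check that account's login environment (the user name and the path shown are only examples):

# Inspect the Hadoop-related variables seen by the hbase user's login shell
sudo -i -u hbase env | grep -i hadoop

# You would expect to see at least something like:
# HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop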

In recent releases, Hadoop changed the default HDFS ports, so check which port your NameNode is listening on. In my case, with Hadoop 3.2.1, I solved the issue just by adding this property to hbase-site.xml:

  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
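You can confirm the NameNode address your Hadoop configuration points at (assuming the hdfs and ss command-line tools are available; the port numbers below are only examples):

# Print the NameNode URI from the client configuration
hdfs getconf -confKey fs.defaultFS

# Check which port the NameNode process is actually listening on
ss -tlnp | grep -E ':(8020|9000)'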

I was receiving this error on the HBase Master:

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]

The problem was, as Virya said above, missing Hadoop environment variables for the HBase user.

Editing hbase-env.sh and adding the Hadoop environment variables should correct the problem:

export HADOOP_HOME="/opt/hadoop"  # REPLACE WITH YOUR INSTALL DIRECTORY FOR HADOOP
export HADOOP_HDFS_HOME="${HADOOP_HOME}"
export HADOOP_MAPRED_HOME="${HADOOP_HOME}"
export HADOOP_YARN_HOME="${HADOOP_HOME}"
export HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"

Reason for this error

It seems HBase reads various settings from Hadoop's configuration files and installation directory. In particular, HBase reads hdfs-site.xml (and core-site.xml) to determine automatically whether the HDFS cluster is running in secure mode. If it can't find these files, it falls back to assuming HDFS is not secured, attempts SIMPLE authentication, and you receive the error above.
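For reference, these are the kinds of Hadoop-side properties HBase needs to be able to read (a minimal sketch; the principal value is illustrative):

  <!-- core-site.xml: tells Hadoop clients, including HBase, to use Kerberos -->
  <property>
    <name>hadoop.security.authentication</name>
    <value>kerberos</value>
  </property>

  <!-- hdfs-site.xml: the NameNode principal the client must authenticate against -->
  <property>
    <name>dfs.namenode.kerberos.principal</name>
    <value>hdfs/_HOST@EXAMPLE.COM</value>
  </property>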
