On Ubuntu, running Couchbase 2.5.1, Cloudera CDH4, the Couchbase Hadoop plugin, and Oracle JDK 6. Everything installed fine (it seems), and I can use Hadoop and Couchbase independently without any problems, but when I try to use the plugin as follows

sqoop import --connect http://127.0.0.1:8091/ --table DUMP

I get the following error:

Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/04/11 11:44:08 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.6.0
14/04/11 11:44:08 INFO tool.CodeGenTool: Beginning code generation
14/04/11 11:44:08 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-0.20-mapreduce
Note: /tmp/sqoop-vagrant/compile/30e6774902d338663db059706cde5b12/DUMP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/04/11 11:44:09 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-vagrant/compile/30e6774902d338663db059706cde5b12/DUMP.jar
14/04/11 11:44:09 INFO mapreduce.ImportJobBase: Beginning import of DUMP
14/04/11 11:44:09 WARN util.Jars: No such class couchbase doesn't use a jdbc driver available.
14/04/11 11:44:11 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/04/11 11:44:12 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/04/11 11:44:13 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
.

Any ideas where I'm going wrong, or what I can do to find out?


Solution

It turns out I was using the wrong syntax. Assuming we want to import the beer-sample bucket from Couchbase into HDFS, the correct syntax is as follows, where the bucket name is actually passed as the username:

sqoop import --connect http://localhost:8091/pools --password password --username beer-sample --table DUMP
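
If the import goes through, Sqoop writes the records into an HDFS directory named after the "table". As a quick sanity check (not part of the original answer, and assuming Sqoop's default target directory under your HDFS home), something like the following should show the imported key/value files:

# List the output directory Sqoop created for the DUMP "table"
hadoop fs -ls DUMP

# Peek at a few imported records (pass --target-dir to sqoop import if you want a different location)
hadoop fs -cat 'DUMP/part-m-*' | head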

Additional tip

I don't think you will be able to connect to a password-protected Couchbase bucket with the Couchbase Hadoop plugin. I kept getting an authentication exception and was never able to resolve it. After editing the plugin's source code I was able to get it working.
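
If you want to check whether a bucket is actually configured with SASL authentication before attempting the import, the Couchbase REST API on port 8091 exposes the bucket settings. This is only a hedged diagnostic sketch (Administrator/password are placeholder admin credentials, and the exact JSON fields may vary between Couchbase versions):

# Dump the bucket definitions and look for the name and auth-related fields
curl -s -u Administrator:password http://localhost:8091/pools/default/buckets | python -m json.tool | grep -E '"name"|"authType"'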
