Question

I want to run Spark on a local machine using pyspark. From here I use the commands:

$ sbt/sbt assembly
$ ./bin/pyspark

The build completes, but pyspark fails to start, producing the following error (in full):

138:spark-0.9.1 comp_name$ ./bin/pyspark
Python 2.7.6 |Anaconda 1.9.2 (x86_64)| (default, Jan 10 2014, 11:23:15) 
[GCC 4.0.1 (Apple Inc. build 5493)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/shell.py", line 32, in <module>
    sc = SparkContext(os.environ.get("MASTER", "local"), "PySparkShell", pyFiles=add_files)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/context.py", line 123, in __init__
    self._jsc = self._jvm.JavaSparkContext(self._conf._jconf)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/java_gateway.py", line 669, in __call__
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.UnknownHostException: 138.7.100.10.in-addr.arpa: 138.7.100.10.in-addr.arpa: nodename nor servname provided, or not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1466)
    at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:355)
    at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:347)
    at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:347)
    at org.apache.spark.util.Utils$.localIpAddressHostname$lzycompute(Utils.scala:348)
    at org.apache.spark.util.Utils$.localIpAddressHostname(Utils.scala:348)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:395)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:124)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:724)
Caused by: java.net.UnknownHostException: 138.7.100.10.in-addr.arpa: nodename nor servname provided, or not known
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:894)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1286)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1462)
    ... 22 more

Any idea what I am doing wrong? I don't know where the IP address 138.7.100.10 comes from. I get this error whether or not I use MAMP to create a localhost. Thanks in advance!
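For reference, a rough Python analogue of the lookup that InetAddress.getLocalHost() performs, which can be used to check whether the machine's hostname resolves at all (the names printed are simply whatever the system reports):

import socket

# Resolve this machine's own hostname, roughly what Spark's
# Utils.findLocalIpAddress does via InetAddress.getLocalHost().
hostname = socket.gethostname()
print("hostname:", hostname)
try:
    print("resolves to:", socket.gethostbyname(hostname))
except socket.gaierror as e:
    # A failure here corresponds to the UnknownHostException above.
    print("lookup failed:", e)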


Solution 3

It turns out the Java version I was using was 1.7. I'm using a MacBook Air running OS X 10.9.2.

$ java -version

gave me:

java version "1.7.0_25"
Java(TM) SE Runtime Environment (build 1.7.0_25-b15)
Java HotSpot(TM) 64-Bit Server VM (build 23.25-b01, mixed mode)

To downgrade to 1.6:

$ cd /Library/Java/JavaVirtualMachines
$ ls

returned:

jdk1.7.0_25.jdk

To delete that directory (downgrading Java to 1.6 and fixing my issue):

$ sudo rm -rf jdk1.7.0_25.jdk

Then running:

$ java -version

gave the output:

java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-462-11M4609)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-462, mixed mode)

And finally, I am able to run Spark:

$ ./bin/pyspark

And all is happy:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 0.9.1
      /_/

OTHER TIPS

The right solution is to set the SPARK_LOCAL_IP environment variable to localhost (or whatever your hostname is).
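You can export the variable in your shell before running ./bin/pyspark, or set it from Python before the context is created. A minimal sketch of the latter, assuming pyspark is importable (the shell scripts set this up):

import os

# Tell Spark which address to bind to before the JVM-side SparkContext
# is created, so it never attempts the failing reverse-DNS lookup.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"  # or your machine's hostname/IP

from pyspark import SparkContext

sc = SparkContext("local", "PySparkShell")
print(sc.parallelize(range(10)).sum())  # quick sanity check: prints 45
sc.stop()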

I had the same problem with Spark; it is related to your laptop's IP address.

My solution: edit /etc/hosts (for example with sudo nano /etc/hosts). Below the line

127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4

add

127.0.0.1 LAPTOPNAME

Your LAPTOPNAME can be found in your Terminal prompt, which looks like root@LAPTOPNAME (whatever hostname you set up during installation).

With this change, Spark runs fine with Java 1.7.
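Before relaunching pyspark, you can confirm the new hosts entry is picked up with a quick check (assuming the hostname you added as LAPTOPNAME matches what the system reports):

import socket

# Should print 127.0.0.1 once /etc/hosts maps your hostname to it; a
# socket.gaierror here means the entry is not taking effect.
print(socket.gethostbyname(socket.gethostname()))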
