It turns out the Java version I was using was 1.7. I'm on a MacBook Air running OS X 10.9.2.
$ java -version
gave me:
java version "1.7.0_25"
Java(TM) SE Runtime Environment (build 1.7.0_25-b15)
Java HotSpot(TM) 64-Bit Server VM (build 23.25-b01, mixed mode)
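For anyone scripting this check, the quoted version string can be pulled out of `java -version` (which writes to stderr, so it needs redirecting). A minimal sketch:

```shell
# Minimal sketch: extract the quoted version string from `java -version`
# (it prints to stderr, hence 2>&1) and branch on the major version.
out=$(java -version 2>&1)
version=$(printf '%s\n' "$out" | awk -F '"' '/version/ {print $2; exit}')
case "$version" in
  1.6.*) echo "Java 6: $version" ;;
  1.7.*) echo "Java 7: $version" ;;
  *)     echo "unrecognized: $version" ;;
esac
```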
To downgrade to 1.6:
$ cd /Library/Java/JavaVirtualMachines
$ ls
returned:
jdk1.7.0_25.jdk
Deleting that directory downgrades Java (and fixed my issue):
$ sudo rm -rf jdk1.7.0_25.jdk
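Since rm -rf is irreversible, a gentler variant of the same step is to move the JDK bundle aside so it can be restored later. A sketch, using a throwaway directory in place of /Library/Java/JavaVirtualMachines so it's safe to run anywhere:

```shell
# Reversible variant of the downgrade: move the JDK bundle aside instead of
# deleting it. A temp directory stands in for /Library/Java/JavaVirtualMachines.
jvm_dir=$(mktemp -d)
mkdir "$jvm_dir/jdk1.7.0_25.jdk"

# Disable the 1.7 JDK; restoring it later is just the reverse mv.
mv "$jvm_dir/jdk1.7.0_25.jdk" "$jvm_dir/jdk1.7.0_25.jdk.disabled"

ls "$jvm_dir"
rm -rf "$jvm_dir"
```

On the real path you'd run the mv with sudo, exactly like the rm above.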
Then running:
$ java -version
gave:
java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-462-11M4609)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-462, mixed mode)
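Deleting the JDK is one option; on OS X the /usr/libexec/java_home helper can instead point JAVA_HOME at the 1.6 runtime per shell, leaving 1.7 installed. A sketch, guarded so it only acts where that helper exists:

```shell
# Alternative: select the 1.6 runtime via JAVA_HOME instead of deleting 1.7.
# /usr/libexec/java_home is OS X-specific, so check for it first.
if [ -x /usr/libexec/java_home ]; then
  JAVA_HOME=$(/usr/libexec/java_home -v 1.6)
  export JAVA_HOME
  java -version
else
  echo "no /usr/libexec/java_home here (not OS X)"
fi
```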
And finally, I was able to run Spark:
$ ./bin/pyspark
And all is happy:
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__ / .__/\_,_/_/ /_/\_\   version 0.9.1
      /_/