I built Spark 1.6 SNAPSHOT from source with no problems:

$ mvn3 clean package -DskipTests

I'm running: OS X 10.10.5, Java 1.8, Maven 3.3.3, Spark 1.6 SNAPSHOT, Scala 2.11.7, Zinc 0.3.5.3, Hadoop 3.0 SNAPSHOT.

I added the following dependency to my pom.xml (trying to resolve a warning about native libraries):

<dependency>
  <groupId>com.googlecode.netlib-java</groupId>
  <artifactId>netlib</artifactId>
  <version>1.1</version>
</dependency>

Environment variables:

HADOOP_INSTALL=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT
HADOOP_CONF_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/etc/hadoop
HADOOP_OPTS=-Djava.library.path=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native
CLASSPATH=/users/davidlaxer/trunk/core/src/test/java/:/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-dist-3.0.0-SNAPSHOT.jar:/Users/davidlaxer/clojure/target:/Users/davidlaxer/hadoop/lib/native:
SPARK_LIBRARY_PATH=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native
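For reference, the same environment can be reproduced in a shell profile; a minimal sketch assuming the checkout paths listed above:

```shell
# Hadoop build output directory (same path as in the listing above)
export HADOOP_INSTALL=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT
export HADOOP_CONF_DIR="$HADOOP_INSTALL/etc/hadoop"
# Point the JVM at Hadoop's native libraries
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib/native"
export SPARK_LIBRARY_PATH="$HADOOP_INSTALL/lib/native"
```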

When I try to launch Spark with spark-shell, I get the following error:

./spark-shell
Exception in thread "main" java.lang.IllegalStateException: Library directory '/Users/davidlaxer/spark/lib_managed/jars' does not exist.
    at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:249)
    at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:227)
    at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:115)
    at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildSparkSubmitCommand(SparkSubmitCommandBuilder.java:196)
    at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:121)
    at org.apache.spark.launcher.Main.main(Main.java:86)
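Judging from the stack trace, the checkState call in buildClassPath fails only because the lib_managed/jars directory does not exist, so one workaround worth trying (I have not confirmed it fixes anything beyond this check) is to create the directory by hand:

```shell
# CommandBuilderUtils.checkState only verifies that the directory
# exists, so an empty one may be enough to get past this error.
mkdir -p /Users/davidlaxer/spark/lib_managed/jars
```

Alternatively, building with sbt instead of Maven (build/sbt package) reportedly populates lib_managed/jars, since Spark's sbt build retrieves managed dependencies into that directory.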

I reverted to Spark 1.5 and had no problems:

git clone git://github.com/apache/spark.git -b branch-1.5