Errors with the com.google.common package when building Spark 1.2 with Maven

CentOS 6.2
Hadoop 2.6.0
Scala 2.10.5
java version "1.7.0_75"
OpenJDK Runtime Environment (rhel-2.5.4.0.el6_6-x86_64 u75-b13)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
mvn version
Apache Maven 3.3.1 (cab6659f9874fa96462afef40fcf6bc033d58c1c; 2015-03-13T21:10:27+01:00)
Maven home: /opt/maven
Java version: 1.7.0_75, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-220.el6.x86_64", arch: "amd64", family: "unix"
Environment variables

export SCALA_HOME=/opt/scala
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64
export JRE_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre
export HADOOP_HOME=/home/tom/hadoop
export SPARK_HOME=/home/tom/spark
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/bin:$SPARK_HOME/bin:$MAVEN_HOME/bin:$SCALA_HOME/bin
export MAVEN_HOME=/opt/maven

export SPARK_EXAMPLES_JAR=$SPARK_HOME/spark-0.7.2/examples/target/scala-2.9.3/spark-examples_2.9.3-0.7.2.jar
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/"

build command
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-0.12.0 -Phive-thriftserver -DskipTests clean package
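
Since every compile error below is a missing Guava (com.google.common) class, a first check is which Guava version the flume module actually resolves. A diagnostic sketch (run from the Spark source root; external/flume is the module path in the 1.2 source tree):

# Print only the Guava entries of the flume module's dependency tree;
# -am also resolves the upstream sibling modules in the same reactor
mvn -pl external/flume -am dependency:tree -Dincludes=com.google.guava
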
Error Message

[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:22: object Throwables is not a member of package com.google.common.base
[ERROR] import com.google.common.base.Throwables
[ERROR]        ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:59: not found: value Throwables
[ERROR]           Throwables.getRootCause(e) match {
[ERROR]           ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:26: object util is not a member of package com.google.common
[ERROR] import com.google.common.util.concurrent.ThreadFactoryBuilder
[ERROR]                          ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:69: not found: type ThreadFactoryBuilder
[ERROR]     Executors.newCachedThreadPool(new ThreadFactoryBuilder().setDaemon(true).
[ERROR]                                       ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:76: not found: type ThreadFactoryBuilder
[ERROR]     new ThreadFactoryBuilder().setDaemon(true).setNameFormat("Flume Receiver Thread - %d").build())
[ERROR]         ^
[ERROR] 5 errors found


[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [ 10.121 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 14.957 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 10.858 s]
[INFO] Spark Project Core ................................. SUCCESS [07:33 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 52.312 s]
[INFO] Spark Project GraphX ............................... SUCCESS [02:19 min]
[INFO] Spark Project Streaming ............................ SUCCESS [03:28 min]
[INFO] Spark Project Catalyst ............................. SUCCESS [03:18 min]
[INFO] Spark Project SQL .................................. SUCCESS [03:48 min]
[INFO] Spark Project ML Library ........................... SUCCESS [03:40 min]
[INFO] Spark Project Tools ................................ SUCCESS [ 29.380 s]
[INFO] Spark Project Hive ................................. SUCCESS [02:53 min]
[INFO] Spark Project REPL ................................. SUCCESS [01:32 min]
[INFO] Spark Project YARN Parent POM ...................... SUCCESS [  5.124 s]
[INFO] Spark Project YARN Stable API ...................... SUCCESS [01:34 min]
[INFO] Spark Project Hive Thrift Server ................... SUCCESS [ 56.404 s]
[INFO] Spark Project Assembly ............................. SUCCESS [01:11 min]
[INFO] Spark Project External Twitter ..................... SUCCESS [ 36.661 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [ 50.006 s]
[INFO] Spark Project External Flume ....................... FAILURE [ 14.287 s]
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 36:02 min
[INFO] Finished at: 2015-04-04T03:58:19+02:00
[INFO] Final Memory: 60M/330M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-streaming-flume_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:

I suspect this is some kind of dependency problem, but I cannot figure it out. Can anyone help me?

3 Answers

  • 0

    I ran into the same problem with Apache Spark 1.2.1 when building with the following command -

    mvn -e -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests clean package
    

    The version of Apache Maven seems to play a role here. In the failing case, the Maven version was -

    ./mvn -version

    Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06; 2015-04-22T07:57:37-04:00)
    Maven home: /opt/Spark/amit/apache-maven/apache-maven-3.3.3
    Java version: 1.8.0, vendor: IBM Corporation
    Java home: /opt/Spark/amit/ibmjava8sdk/sdk/jre
    Default locale: en_US, platform encoding: UTF-8
    OS name: "linux", version: "3.14.8-200.fc20.x86_64", arch: "amd64", family: "unix"
    

    When I tried an older Maven, the build succeeded. Using Apache Maven 3.2.x seems to work around the problem. I used -

    mvn -version

    Apache Maven 3.2.5 (12a6b3acb947671f09b81f49094c53f426d8cea1; 2014-12-14T12:29:23-05:00)
    Maven home: /opt/Spark/amit/apache-maven/apache-maven-3.2.5
    Java version: 1.8.0, vendor: IBM Corporation
    Java home: /opt/Spark/amit/ibmjava8sdk/sdk/jre
    Default locale: en_US, platform encoding: UTF-8
    OS name: "linux", version: "3.14.8-200.fc20.x86_64", arch: "amd64", family: "unix"
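
    A sketch of the switch, assuming the 3.2.5 binary tarball from the Apache release archive (the standard URL pattern for archived Maven releases):

    # Download and unpack Maven 3.2.5, then build with it instead of 3.3.x
    wget https://archive.apache.org/dist/maven/maven-3/3.2.5/binaries/apache-maven-3.2.5-bin.tar.gz
    tar -xzf apache-maven-3.2.5-bin.tar.gz
    ./apache-maven-3.2.5/bin/mvn -e -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests clean package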
    

    Hope this helps.

    Thanks, Amit

  • 1

    If you can cheat a little, you can simply skip the modules that fail to compile, i.e.

    spark-streaming-flume_2.10 and spark-streaming-kafka_2.10

    The following command was used to compile a Spark package with Hive support against CDH 5.3.3 and Spark 1.2.0; the same exclusion, adapted to the question's setup, is sketched below.

    mvn -Pyarn -Dhadoop.version=2.5.0-cdh5.3.3 -DskipTests -Phive -Phive-thriftserver -pl '!org.apache.spark:spark-streaming-flume_2.10,!org.apache.spark:spark-streaming-kafka_2.10' package
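
    For the Hadoop 2.6.0 build from the question, the same exclusion would look something like this (a sketch; the "!" prefix in -pl requires Maven 3.2.1+, and the module ids are groupId:artifactId):

    # Build everything except the two failing streaming modules
    mvn -Pyarn -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests \
        -pl '!org.apache.spark:spark-streaming-flume_2.10,!org.apache.spark:spark-streaming-kafka_2.10' \
        clean package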

  • 0

    I ran into a similar problem today. The Spark Project External Flume ....................... FAILURE line in the log bothered me, but I think it is a helpful pointer. If that is not enough, try git clean -Xdf and then run mvn ... again, as sketched below. Good luck!
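
    The full sequence would look something like this (a sketch, reusing the build flags from the question):

    # Remove ignored files such as target/ directories, then rebuild from scratch
    git clean -Xdf
    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-0.12.0 -Phive-thriftserver -DskipTests clean package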
