I am trying to run Spark Streaming with Kafka. The project is built with Scala 2.11.8 and Spark 2.1.0. I know this error usually means a Scala version mismatch, but all the dependencies are declared with the correct versions (listed below), and I still get this error.

Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at kafka.utils.Pool.<init>(Unknown Source)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(Unknown Source)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<clinit>(Unknown Source)
    at kafka.consumer.SimpleConsumer.<init>(Unknown Source)
    at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:59)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:364)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:361)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
    at org.apache.spark.streaming.kafka.KafkaCluster.org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers(KafkaCluster.scala:361)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:132)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:119)
    at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:211)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:607)
    at com.forrester.streaming.kafka.App$.main(App.scala:19)
    at com.forrester.streaming.kafka.App.main(App.scala)
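
The failure comes out of KafkaUtils.createDirectStream at App.scala:19, so the driver is presumably something like the minimal sketch below (the actual App.scala is not shown; the broker address, topic name, batch interval, and app name are assumptions):

    package com.forrester.streaming.kafka

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.kafka.KafkaUtils
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object App {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("KafkaStreamingApp")
        val ssc  = new StreamingContext(conf, Seconds(5))

        // createDirectStream talks to the brokers through kafka.consumer.SimpleConsumer,
        // which is the frame in the stack trace where the missing Scala class is first needed.
        val kafkaParams = Map("metadata.broker.list" -> "localhost:9092") // assumed broker address
        val topics      = Set("test-topic")                               // assumed topic name
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, topics)

        stream.map(_._2).print()

        ssc.start()
        ssc.awaitTermination()
      }
    }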

Dependencies

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.8</version>
        <scope>provided</scope>
    </dependency>

    <dependency>
        <groupId>com.koverse</groupId>
        <artifactId>koverse-shaded-deps</artifactId>
        <version>${koverse.version}</version>
        <scope>provided</scope>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.1.0</version>
        <exclusions>
            <exclusion>
                <groupId>*</groupId>
                <artifactId>*</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

    <dependency>
        <groupId>org.scalanlp</groupId>
        <artifactId>breeze_2.11</artifactId>
        <version>0.11.2</version>
    </dependency>

    <dependency>
        <groupId>org.xerial.snappy</groupId>
        <artifactId>snappy-java</artifactId>
        <version>1.0.5</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>2.1.0</version>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8-assembly_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
</dependencies>
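
Because scala-library is declared as provided, the Scala runtime that actually ends up on the classpath comes from the Spark installation, not from the pom. A small, hypothetical diagnostic like the one below (not part of the original project) can be submitted with the same setup as the streaming job to confirm which Scala version is loaded and whether the class named in the error is actually present:

    object ScalaVersionCheck {
      def main(args: Array[String]): Unit = {
        // The scala-library actually loaded at run time, regardless of what the pom declares.
        println(s"Scala runtime: ${scala.util.Properties.versionString}")

        // Try to load the exact class named in the stack trace and report which jar serves it.
        try {
          val cls = Class.forName("scala.collection.GenTraversableOnce$class")
          val src = Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation)
          println(s"GenTraversableOnce$$class loaded from: ${src.getOrElse("<unknown>")}")
        } catch {
          case _: ClassNotFoundException =>
            println("GenTraversableOnce$class is not on the runtime classpath")
        }
      }
    }

Running it through the same spark-submit setup as the failing job keeps the comparison meaningful, since that is where the provided scala-library comes from.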

I did some more analysis with different version combinations:

| Case | Spark build (on Scala) | Kafka jar | Result |
|------|------------------------|-----------|--------|
| 1 | 2.1.1 on 2.11.8 | spark-streaming-kafka-0-8-assembly_2.11-2.1.1.jar | Working |
| 2 | 2.1.1 on 2.11.8 | spark-streaming-kafka-0-8-assembly_2.10-2.1.1.jar | Error (expected) |
| 3 | 2.1.1 on 2.11.8 | spark-streaming-kafka-0-8-assembly_2.10-2.1.0.jar | Error (expected) |
| 4 | 2.1.1 on 2.11.8 | spark-streaming-kafka-0-8-assembly_2.10-2.1.0.jar | Error (expected) |
| 5 | 2.1.1 on 2.11.8 | spark-streaming-kafka-0-8-assembly_2.11-2.1.0.jar | Error (ideally should pass) |
| 6 | 2.1.1 on 2.11.8 | spark-streaming-kafka-0-8-assembly_2.11-2.1.1.jar | Error (expected) |
| 7 | 2.1.1 on 2.11.8 | spark-streaming-kafka-0-8-assembly_2.10-2.1.0.jar | Error (expected) |

Error message: ClassNotFoundException: scala.collection.GenTraversableOnce$class

Case 1 works, but case 5 fails even though it should not throw any error.