
Spark Kafka Streaming Issue

I am using Maven.

I added the following dependencies:

<dependency> <!-- Spark dependency -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency> <!-- Spark dependency -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka_2.10</artifactId>
  <version>1.1.0</version>
</dependency>

I also added the jar in my code:

SparkConf sparkConf = new SparkConf().setAppName("KafkaSparkTest");
JavaSparkContext sc = new JavaSparkContext(sparkConf);
sc.addJar("/home/test/.m2/repository/org/apache/spark/spark-streaming-kafka_2.10/1.0.2/spark-streaming-kafka_2.10-1.0.2.jar");
JavaStreamingContext jssc = new JavaStreamingContext(sc, new Duration(5000));

It builds fine without any errors, but when I run it with spark-submit I get the following error. Any help is greatly appreciated. Thanks for your time.

bin/spark-submit --class "KafkaSparkStreaming" --master local[4] try/simple-project/target/simple-project-1.0.jar

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils
    at KafkaSparkStreamingTest(KafkaSparkStreaming.java:40)
    at KafkaSparkStreaming.main(KafkaSparkStreaming.java:23)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
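
The failing line (KafkaSparkStreaming.java:40) is a KafkaUtils call, which is not shown above. A rough sketch of such a call with the Spark 1.1.0 Java API, continuing from the jssc created above with placeholder ZooKeeper quorum, group id and topic, would look something like:

import java.util.HashMap;
import java.util.Map;

import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.kafka.KafkaUtils;

// KafkaUtils lives in spark-streaming-kafka_2.10; the NoClassDefFoundError
// means this class is not on the classpath when the job actually runs.
Map<String, Integer> topics = new HashMap<String, Integer>();
topics.put("test-topic", 1); // placeholder topic name and consumer thread count

JavaPairReceiverInputDStream<String, String> messages =
    KafkaUtils.createStream(jssc, "localhost:2181", "test-group", topics);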

2 Answers

  • 9

    I ran into the same problem and solved it by building a jar with dependencies.

    • Remove "sc.addJar()" from the code.

    • Add the following to your pom.xml:

    <build>
        <sourceDirectory>src/main/java</sourceDirectory>
        <testSourceDirectory>src/test/java</testSourceDirectory>
        <plugins>
          <!--
            Bind the maven-assembly-plugin to the package phase;
            this will create a jar file that bundles the project's dependencies,
            suitable for submitting to a cluster with spark-submit.
           -->
          <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
              <descriptorRefs>
                <descriptorRef>jar-with-dependencies</descriptorRef>
              </descriptorRefs>
              <archive>
                <manifest>
                  <mainClass></mainClass>
                </manifest>
              </archive>
            </configuration>
            <executions>
              <execution>
                <id>make-assembly</id>
                <phase>package</phase>
                <goals>
                  <goal>single</goal>
                </goals>
              </execution>
            </executions>
          </plugin>
        </plugins>
    </build>
    
    • Run mvn package.

    • Submit "example-jar-with-dependencies.jar", for example:
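
    With the project layout from the question and the default assembly naming (artifactId-version-jar-with-dependencies.jar), the submit command would look roughly like:

    bin/spark-submit --class "KafkaSparkStreaming" --master local[4] try/simple-project/target/simple-project-1.0-jar-with-dependencies.jar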

  • 0

    For future reference: if you get a ClassNotFoundException, searching for the missing class ("org.apache.spark ...") will take you to its Maven page, which tells you which dependency is missing from your pom file. It also gives you the snippet to put into the pom.
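
    For instance, searching for the missing org.apache.spark.streaming.kafka.KafkaUtils class leads to the spark-streaming-kafka_2.10 artifact, and the page shows a dependency snippet like the one below (version picked here to match the question):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka_2.10</artifactId>
      <version>1.1.0</version>
    </dependency>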
