
java.lang.NoSuchMethodError in Spark

I am using Spark (Spark version 1.2.1, Scala version 2.10.4) with Cassandra (Spark Cassandra Connector 1.2.0-rc3), and I want to use the joinWithCassandraTable function. I tried it in the spark-shell, where it works perfectly:

val customersInteractions = customers.joinWithCassandraTable(cassandraKeyspace, table).on(SomeColumns("c1", "c2")).select("cl1", "cl2", "cl3", "cl4", "cl5")
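
For context, here is a minimal sketch of the setup assumed around that snippet; the case class and sample data are hypothetical, since the original post does not show how customers is built:

// Hypothetical surrounding setup (not shown in the original post): an RDD
// whose element fields line up with the join columns c1 and c2.
// sc is the SparkContext that spark-shell provides out of the box.
import com.datastax.spark.connector._

case class CustomerKey(c1: String, c2: String)
val customers = sc.parallelize(Seq(CustomerKey("a", "b"), CustomerKey("c", "d")))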

But now I want to use it in a Maven project. So, in IntelliJ, I am using these Maven dependencies:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <parent>
        <artifactId>aid-cim</artifactId>
        <groupId>fr.aid.cim</groupId>
        <version>0.9-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

<artifactId>spark-cassandra</artifactId>

<properties>
    <spark.version>1.2.1</spark.version>
    <scala.version>2.11.0</scala.version>
</properties>

<repositories>
    <repository>
        <id>spark-jobserver</id>
        <name>spark-jobserver</name>
        <url>https://dl.bintray.com/spark-jobserver/maven</url>
    </repository>
</repositories>

<pluginRepositories>
    <pluginRepository>
        <id>scala-tools.org</id>
        <name>Scala-tools Maven2 Repository</name>
        <url>http://scala-tools.org/repo-releases</url>
    </pluginRepository>
</pluginRepositories>

<build>
    <plugins>
        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-dependency-plugin</artifactId>
        </plugin>
    </plugins>
</build>

<dependencies>

    <dependency>
        <groupId>net.sf.jopt-simple</groupId>
        <artifactId>jopt-simple</artifactId>
        <version>4.8</version>
    </dependency>

    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
    </dependency>

    <!-- Scala -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
        <scope>compile</scope>
    </dependency>

    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-compiler</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- END Scala -->

    <dependency>
        <groupId>spark.jobserver</groupId>
        <artifactId>job-server-api</artifactId>
        <version>0.4.1</version>
        <scope>compile</scope>
    </dependency>

    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpclient</artifactId>
        <version>4.3.6</version>
    </dependency>


    <dependency>
        <groupId>com.googlecode.json-simple</groupId>
        <artifactId>json-simple</artifactId>
        <version>1.1</version>
    </dependency>

    <dependency>
        <groupId>joda-time</groupId>
        <artifactId>joda-time</artifactId>
        <version>2.6</version>
    </dependency>

    <!-- START Logger -->
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
    </dependency>

    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
    </dependency>
    <!-- END Logger -->

    <!-- Tests -->
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>org.hamcrest</groupId>
        <artifactId>hamcrest-core</artifactId>
    </dependency>

    <dependency>
        <groupId>info.cukes</groupId>
        <artifactId>cucumber-picocontainer</artifactId>
        <version>1.2.0</version>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>info.cukes</groupId>
        <artifactId>cucumber-core</artifactId>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>info.cukes</groupId>
        <artifactId>cucumber-junit</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.cassandraunit</groupId>
        <artifactId>cassandra-unit</artifactId>
        <version>2.1.3.1</version>
        <exclusions>
            <exclusion>
                <artifactId>slf4j-log4j12</artifactId>
                <groupId>org.slf4j</groupId>
            </exclusion>
        </exclusions>
        <scope>test</scope>
    </dependency>
    <!-- END Tests -->
</dependencies>
<profiles>
    <profile>
        <id>local</id>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
        <dependencies>
            <!-- Apache Spark -->
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.10</artifactId>
                <version>${spark.version}</version>
                <scope>compile</scope>
            </dependency>

            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming_2.10</artifactId>
                <version>${spark.version}</version>
                <scope>compile</scope>
            </dependency>
            <!-- END Apache Spark -->

            <!-- START Spark Cassandra Connector -->
            <dependency>
                <groupId>com.datastax.spark</groupId>
                <artifactId>spark-cassandra-connector_2.10</artifactId>
                <version>1.2.0-rc3</version>
                <!-- <version>${spark.version}</version> -->
                <exclusions>
                    <exclusion>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-streaming_2.10</artifactId>
                    </exclusion>
                </exclusions>
            </dependency>

            <dependency>
                <groupId>com.datastax.spark</groupId>
                <artifactId>spark-cassandra-connector-java_2.10</artifactId>
                <version>1.2.0-rc3</version>
                <!-- <version>${spark.version}</version> -->
                <exclusions>
                    <exclusion>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-streaming_2.10</artifactId>
                    </exclusion>
                </exclusions>
            </dependency>
        </dependencies>
    </profile>
    <profile>
        <id>cluster</id>
        <dependencies>
            <!-- Apache Spark -->
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.10</artifactId>
                <version>${spark.version}</version>
                <scope>provided</scope>
            </dependency>

            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming_2.10</artifactId>
                <version>${spark.version}</version>
                <scope>provided</scope>
            </dependency>
            <!-- END Apache Spark -->

            <!-- START Spark Cassandra Connector -->
            <dependency>
                <groupId>com.datastax.spark</groupId>
                <artifactId>spark-cassandra-connector_2.10</artifactId>
                <!-- <version>${spark.version}</version> -->
                <version>1.2.0-rc3</version>
                <exclusions>
                    <exclusion>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-streaming_2.10</artifactId>
                    </exclusion>
                </exclusions>
                <scope>provided</scope>
            </dependency>

            <dependency>
                <groupId>com.datastax.spark</groupId>
                <artifactId>spark-cassandra-connector-java_2.10</artifactId>
                <!-- <version>${spark.version}</version> -->
                <version>1.2.0-rc3</version>
                <exclusions>
                    <exclusion>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-streaming_2.10</artifactId>
                    </exclusion>
                </exclusions>
                <scope>provided</scope>
            </dependency>
        </dependencies>
    </profile>

</profiles>

But when I try to execute my program, I get this error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;

The failing line in my code is the joinWithCassandraTable call.

Did I get something wrong in my Maven dependencies? How can I fix this problem? Thanks in advance for your help.
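
This particular NoSuchMethodError on scala.reflect.api.JavaUniverse.runtimeMirror is a classic symptom of a Scala binary-compatibility mismatch: code compiled against one Scala major version running on another. As a quick diagnostic (a suggested check, not part of the original post), printing the Scala version the driver actually runs on makes the mismatch visible next to the _2.10 suffix of the Spark artifacts:

// Minimal diagnostic sketch (assumed, not from the original post): print
// the Scala version of the running JVM to compare against the _2.10
// artifact suffix used in the POM.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(scala.util.Properties.versionString) // e.g. "version 2.11.0"
  }
}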

2 Answers

  • 1

    Spark 1.2.1 depends on Scala version 2.10.4. You can check the dependency versions in the Maven repository: https://mvnrepository.com/artifact/org.apache.spark

    You have to change the dependency from

    <properties>
        <spark.version>1.2.1</spark.version>
        <scala.version>2.11.0</scala.version>
    </properties>

    to

    <properties>
        <spark.version>1.2.1</spark.version>
        <scala.version>2.10.4</scala.version>
    </properties>

    (A quick way to verify which Scala artifacts are actually resolved is mvn dependency:tree -Dincludes=org.scala-lang.)
    
  • 2

    To submit to a cluster, the Maven dependencies alone are not enough. You usually have to assemble a fat jar that bundles the dependencies and pass it to Spark, so that all of the code is available on the executors; a sketch of this follows below.
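
The maven-assembly-plugin configuration already present in the POM above builds such a jar-with-dependencies during mvn package. Below is a minimal sketch of pointing the driver at that jar; the jar path is inferred from the POM's artifactId and version, and the app name and master URL are hypothetical:

// Minimal sketch, assuming `mvn package` put the assembly jar at the
// default target/ location (name inferred from the POM above).
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("spark-cassandra-job")  // hypothetical app name
  .setMaster("spark://master:7077")   // hypothetical cluster master URL
  .setJars(Seq("target/spark-cassandra-0.9-SNAPSHOT-jar-with-dependencies.jar"))
val sc = new SparkContext(conf)

Alternatively, the same assembled jar can be given to spark-submit as the application jar, which ships it to the executors in the same way.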
