sbt error when running Spark hello world code?

I get the following error when running the Spark hello world program:

[info] Updating {file:/C:/Users/user1/IdeaProjects/sqlServer/}sqlserver...
[info] Resolving org.apache.spark#spark-core_2.12;2.1.1 ...
[warn]  module not found: org.apache.spark#spark-core_2.12;2.1.1
[warn] ==== local: tried
[warn]   C:\Users\user1\.ivy2\local\org.apache.spark\spark-core_2.12\2.1.1\ivys\ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.12/2.1.1/spark-core_2.12-2.1.1.pom
[warn] ==== local-preloaded-ivy: tried
[warn]   C:\Users\user1\.sbt\preloaded\org.apache.spark\spark-core_2.12\2.1.1\ivys\ivy.xml
[warn] ==== local-preloaded: tried
[warn]   file:/C:/Users/user1/.sbt/preloaded/org/apache/spark/spark-core_2.12/2.1.1/spark-core_2.12-2.1.1.pom
[info] Resolving jline#jline;2.14.3 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.12;2.1.1: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]          org.apache.spark:spark-core_2.12:2.1.1 (C:\Users\user1\IdeaProjects\sqlServer\build.sbt#L7-8)
[warn]            +- mpa:mpa_2.12:1.0
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.1.1: not found
[error] Total time: 1 s, completed May 9, 2017 11:05:44 AM

Here is my build.sbt:

name := "Mpa"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"

My Spark welcome message:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.

Update:

I then changed my build.sbt to:

name := "Mpa"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core_2.11" % "2.1.0"

but I still got:

[info] Updating {file:/C:/Users/user1/IdeaProjects/sqlServer/}sqlserver...
[info] Resolving org.apache.spark#spark-core_2.11_2.11;2.1.0 ...
[warn]  module not found: org.apache.spark#spark-core_2.11_2.11;2.1.0
[warn] ==== local: tried
[warn]   C:\Users\user1\.ivy2\local\org.apache.spark\spark-core_2.11_2.11\2.1.0\ivys\ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11_2.11/2.1.0/spark-core_2.11_2.11-2.1.0.pom
[warn] ==== local-preloaded-ivy: tried
[warn]   C:\Users\user1\.sbt\preloaded\org.apache.spark\spark-core_2.11_2.11\2.1.0\ivys\ivy.xml
[warn] ==== local-preloaded: tried
[warn]   file:/C:/Users/user1/.sbt/preloaded/org/apache/spark/spark-core_2.11_2.11/2.1.0/spark-core_2.11_2.11-2.1.0.pom
[info] Resolving jline#jline;2.12.1 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.11_2.11;2.1.0: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]          org.apache.spark:spark-core_2.11_2.11:2.1.0 (C:\Users\user1\IdeaProjects\sqlServer\build.sbt#L7-8)
[warn]            +- mpa:mpa_2.11:1.0
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11_2.11;2.1.0: not found
[error] Total time: 1 s, completed May 9, 2017 1:01:01 PM

4 Answers

  • 0

    You have an error in your build.sbt file; you must change %% to %:

    name := "Mpa"
    version := "1.0"
    scalaVersion := "2.11.8"
    libraryDependencies += "org.apache.spark" % "spark-core" % "2.1.1"
    

    %% asks sbt to append the current Scala version to the artifact name.

    Alternatively, you can use spark-core_2.11 with a plain % to resolve the issue:

    // https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
    

    Hope this helps!
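
    To make the difference concrete, here is a minimal sketch (assuming scalaVersion := "2.11.8"):

    // %% appends the Scala binary version, so this resolves spark-core_2.11:2.1.1
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
    // % uses the artifact name as-is, so spell out the suffix yourself
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1"
    // Mixing both duplicates the suffix (spark-core_2.11_2.11), which is exactly
    // the "not found" error shown in the question's update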

  • 11

    I got the same error.

    build.sbt

    name := "Simple Project"  
    version := "1.0"  
    scalaVersion := "2.12.3"  
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
    

    Just change the scalaVersion to 2.11.8 or lower. It works.
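
    For reference, the corrected build.sbt along those lines might look like this (a sketch, keeping the same Spark version):

    name := "Simple Project"
    version := "1.0"
    // Spark 2.2.0 artifacts are published for Scala 2.11, not 2.12
    scalaVersion := "2.11.8"
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"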

  • 1

    I got the same error and resolved it with the steps below. Basically, the jar file name did not match the sbt configuration (see the sketch after this list).

    • Check the file name of the spark-core jar in $SPARK_HOME/jars (it is spark-core_2.11-2.1.1.jar).
    • Install Scala 2.11.11.
    • Edit build.sbt to scalaVersion := "2.11.11".
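
    Putting those steps together, the build.sbt would look roughly like this (a sketch based on the jar name above):

    name := "Mpa"
    version := "1.0"
    // Match the Scala binary version of $SPARK_HOME/jars/spark-core_2.11-2.1.1.jar
    scalaVersion := "2.11.11"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
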
  • 3

    This worked for me. Sample build.sbt:

    name := "testproj"
    
    version := "0.1"
    
    scalaVersion := "2.11.9"
    
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
    
