
Why can't spark-submit use the jar specified with --jars, when the same code works in the Spark shell?


I created an sbt project with IntelliJ and copied the required JDBC jar, sqljdbc42.jar, into the project's lib folder. sbt package completed successfully. On Windows I started the Spark shell with spark-shell --driver-class-path C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar.
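
A minimal build.sbt consistent with this setup might look as follows; the name, version, and Scala version are implied by the artifact test_2.11-1.0.jar built below, while the Spark version is an assumption based on the stack trace:

name := "test"

version := "1.0"

scalaVersion := "2.11.8"

// Spark itself is "provided": sbt package compiles against it, while
// spark-shell / spark-submit put it on the classpath at run time.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.1" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.1.1" % "provided"
)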

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import java.sql._

object ConnTest extends App {
  val conf = new SparkConf()
  val sc = new SparkContext(conf.setAppName("Test").setMaster("local[*]"))

  // The following four statements work if running interactively in the Spark shell
  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  val jdbcSqlConn = "jdbc:sqlserver://...;databaseName=...;user=...;password=...;"
  val jdbcDf = sqlContext.read.format("jdbc").options(Map(
        "url" -> jdbcSqlConn,
        "dbtable" -> "testTable"
      )).load()
  jdbcDf.show(10)

  sc.stop()
}

However, both of the following spark-submit commands fail with the error below.

spark-submit.cmd --class ConnTest --master local[4] .\target\scala-2.11\test_2.11-1.0.jar
spark-submit.cmd --class ConnTest --master local[4] .\target\scala-2.11\test_2.11-1.0.jar --jars \sqljdbc_6.0\enu\jre8\sqljdbc42.jar
Exception in thread "main" java.sql.SQLException: No suitable driver
        at java.sql.DriverManager.getDriver(Unknown Source)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:83)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:34)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
        at ConnTest$.delayedEndpoint$ConnTest$1(main.scala:14)
        at ConnTest$delayedInit$body.apply(main.scala:6)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
        at scala.App$class.main(App.scala:76)
        at ConnTest$.main(main.scala:6)
        at ConnTest.main(main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Update: I can see the table contents if I run the statements directly in the Spark shell.

Update 2: spark-submit does print the following message while running:

17/05/15 16:12:30 INFO SparkContext: Added JAR file:/C:/sqljdbc_6.0/enu/jre8/sqljdbc42.jar at spark://10.8.159.130:7587/jars/sqljdbc42.jar with timestamp 1494879150052

2 Answers

  • 0

    Setting one more option solved the problem. Naming the driver class explicitly makes Spark register it through its own class loader, which does see jars added via --jars; java.sql.DriverManager's automatic discovery does not.

    "driver" -> "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    
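    Concretely, the read in the question would become the following; only the "driver" entry is new, everything else is unchanged:

    val jdbcDf = sqlContext.read.format("jdbc").options(Map(
          "url" -> jdbcSqlConn,
          "dbtable" -> "testTable",
          // name the driver class explicitly so Spark registers it itself,
          // instead of relying on java.sql.DriverManager to discover it
          "driver" -> "com.microsoft.sqlserver.jdbc.SQLServerDriver"
        )).load()
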
  • 2

    A few options to try:

    A. Edit spark-defaults.conf and set these fields:

    spark.driver.extraClassPath /path/to/jar/*

    spark.executor.extraClassPath /path/to/jar/*

    B. Set the path in code:

    val conf = new SparkConf()
    // Note: in client mode the driver JVM is already running by the time this
    // executes, so setting spark.driver.extraClassPath here has no effect on
    // the driver; set it in spark-defaults.conf or via --driver-class-path.
    conf.set("spark.driver.extraClassPath", "/path/to/jar/*")
    val sc = new SparkContext(conf)

    C. Pass the jar with --jars, optionally as a local: URI. Note that spark-submit options must appear before the application jar; anything placed after it is handed to the main class as program arguments rather than parsed by spark-submit. When running Spark on Windows, adjust the jar path accordingly:

    spark-submit.cmd --class ConnTest --master local[4] --jars=local:C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar .\target\scala-2.11\test_2.11-1.0.jar
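
    Alternatively, since the "No suitable driver" error is thrown on the driver side, the --driver-class-path flag that already made the Spark shell work can be passed to spark-submit as well (same paths as in the question):

    spark-submit.cmd --class ConnTest --master local[4] --driver-class-path C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar .\target\scala-2.11\test_2.11-1.0.jar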
