
Spark Structured Streaming Kafka dependency cannot be resolved


I tried:

./spark-2.3.1-bin-hadoop2.7/bin/spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1 test.py

On my own computer everything works fine. But after I tried it on my school's server, it produced the following messages and errors. I have searched Google for a long time and still have no idea. Can anyone help me?

Ivy Default Cache set to: /home/zqwang/.ivy2/cache
The jars for the packages stored in: /home/zqwang/.ivy2/jars
:: loading settings :: url = jar:file:/data/opt/tmp/zqwang/spark-2.3.1-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-sql-kafka-0-10_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-26b526c6-0535-4007-8428-e38188af5709;1.0
    confs: [default]
:: resolution report :: resolve 966ms :: artifacts dl 0ms
    :: modules in use:
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
    ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
    module not found: org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1

    ==== local-m2-cache: tried
      file:/home/zqwang/.m2/repository/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom
      -- artifact org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:
      file:/home/zqwang/.m2/repository/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

    ==== local-ivy-cache: tried
      /home/zqwang/.ivy2/local/org.apache.spark/spark-sql-kafka-0-10_2.11/2.3.1/ivys/ivy.xml
      -- artifact org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:
      /home/zqwang/.ivy2/local/org.apache.spark/spark-sql-kafka-0-10_2.11/2.3.1/jars/spark-sql-kafka-0-10_2.11.jar

    ==== central: tried
      https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom
      -- artifact org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:
      https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

    ==== spark-packages: tried
      http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom
      -- artifact org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:
      http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

    ::::::::::::::::::::::::::::::::::::::::::::::
    ::          UNRESOLVED DEPENDENCIES         ::
    ::::::::::::::::::::::::::::::::::::::::::::::
    :: org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1: not found
    ::::::::::::::::::::::::::::::::::::::::::::::

:::: ERRORS
    Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom (java.net.ConnectException: Connection refused)
    Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar (java.net.ConnectException: Connection refused)
    Server access error at url http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom (java.net.ConnectException: Connection refused)
    Server access error at url http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar (java.net.ConnectException: Connection refused)

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1: not found]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1303)
    at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:53)
    at org.apache.spark.deploy.SparkSubmit$.doPrepareSubmitEnvironment(SparkSubmit.scala:364)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:250)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:171)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

1 Answer

  • 1

    But after I tried it on my school's server, it produced the following messages and errors

    Your school has a firewall, so the remote packages cannot be downloaded.

    For example, this link works fine for me:

    Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom (java.net.ConnectException: Connection refused)
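
    A quick check (my own suggestion, not part of the original answer) is to request that same URL directly from the school server; if the firewall is the cause, it should fail in the same way:

    # Run on the school server; a blocked network typically shows
    # "Connection refused" or a timeout instead of an HTTP 200 response.
    curl -I https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom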

    You need to download the Kafka jar outside the school network, then carry it along with the --jars flag, as sketched below.
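
    A minimal sketch of that workflow (the download machine, server hostname, and target directory here are placeholders, not from the original post):

    # 1. On a machine outside the school firewall, download the jar from Maven Central:
    wget https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

    # 2. Copy it to the school server (hostname and path are placeholders):
    scp spark-sql-kafka-0-10_2.11-2.3.1.jar zqwang@school-server:/home/zqwang/jars/

    # 3. On the server, submit with --jars instead of --packages, so nothing is resolved over the network:
    ./spark-2.3.1-bin-hadoop2.7/bin/spark-submit --jars /home/zqwang/jars/spark-sql-kafka-0-10_2.11-2.3.1.jar test.py

    Note that --packages would also have pulled in transitive dependencies (notably org.apache.kafka:kafka-clients), so those jars may need to be downloaded the same way and added to the comma-separated --jars list as well.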
