
Spark shell dependency exception


My host OS is Windows 10 and I am running a Cloudera VM; my Spark version is 1.6. I am trying to launch spark-shell with the command below.

spark-shell --packages org.apache.spark:spark-streaming-twitter_2.10:1.6.0

But it is throwing the exception below:

:::: ERRORS
    Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-streaming-twitter_2.10/1.6.0/spark-streaming-twitter_2.10-1.6.0.pom (javax.net.ssl.SSLException: Received fatal alert: protocol_version)
    Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-streaming-twitter_2.10/1.6.0/spark-streaming-twitter_2.10-1.6.0.jar (javax.net.ssl.SSLException: Received fatal alert: protocol_version)

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS

Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.spark#spark-streaming-twitter_2.10;1.6.0: not found]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1067)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:287)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:154)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

2 Answers

  • 2

    As I noted in the comments, the key part of the error is: javax.net.ssl.SSLException: Received fatal alert: protocol_version

    So it appears to be related to Java's default TLS protocol version. An outdated Java version makes HTTPS requests with an outdated TLS version, and those requests get blocked. I ran into this problem while trying to install PySpark packages. Many servers now reject outdated TLS versions; for example, see: GitHub now blocks weak cryptography standards.

    The solution is to force TLS 1.2 via an environment variable:

    echo 'export JAVA_TOOL_OPTIONS="-Dhttps.protocols=TLSv1.2"' >> ~/.bashrc
    source ~/.bashrc
    

    When I re-ran the command to launch PySpark with my package:

    pyspark --packages com.databricks:spark-csv_2.10:1.5.0
    

    the TLS version I specified was picked up immediately. It literally gave me this output:

    Picked up JAVA_TOOL_OPTIONS: -Dhttps.protocols=TLSv1.2
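    As a quick sanity check (my own sketch, not part of the original answer), you can confirm the variable is exported in the current shell before relaunching PySpark; any JVM started afterwards in that shell will then print the "Picked up JAVA_TOOL_OPTIONS" line shown above:

    ```shell
    # Export the override for the current shell only (same effect as the
    # ~/.bashrc line above, but without making it permanent) and verify
    # that it is set to the exact value the JVM will pick up.
    export JAVA_TOOL_OPTIONS="-Dhttps.protocols=TLSv1.2"
    echo "JAVA_TOOL_OPTIONS is: $JAVA_TOOL_OPTIONS"
    # → JAVA_TOOL_OPTIONS is: -Dhttps.protocols=TLSv1.2
    ```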

  • 0

    It looks like a problem connecting to the Maven Central repository.

    Look into this: javax.net.ssl.SSLException: Received fatal alert: protocol_version

    Maven Central supports TLS 1.2, so check which TLS version your Java client is using.

    If it worked before, try opening the URL in a browser and see whether it is reachable.

    If it is, make the necessary changes on the Maven side (possibly proxy settings and credentials).
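    The "try the URL in a browser" step can also be done from the command line (a sketch of mine; curl's `--tlsv1.2` flag forces a TLS 1.2 handshake, mirroring what an up-to-date Java would negotiate):

    ```shell
    # Fetch only the HTTP status of the POM that Ivy failed to download.
    # A 200 means Maven Central is reachable over TLS 1.2 from this
    # machine, so the remaining problem is on the Java/proxy side.
    curl --tlsv1.2 -s -o /dev/null -w "%{http_code}\n" \
      "https://repo1.maven.org/maven2/org/apache/spark/spark-streaming-twitter_2.10/1.6.0/spark-streaming-twitter_2.10-1.6.0.pom"
    ```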
