Spark 2.1.1: Log4jLoggerFactory cannot be cast to LoggerContext

I am trying to use logback as the logger in Spark Streaming. When I submit the job via spark-submit, I get the following exception:

Exception in thread "main" java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerFactory cannot be cast to ch.qos.logback.classic.LoggerContext
    at consumer.spark.LogBackConfigLoader.<init>(LogBackConfigLoader.java:18)
    at consumer.spark.Sample.main(Sample.java:18)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
    at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:167)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

My pom.xml is:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <slf4j.version>1.6.1</slf4j.version>
    <logback.version>1.2.3</logback.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>${slf4j.version}</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-core</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
</dependencies>

My logback code is:

import org.slf4j.LoggerFactory;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.joran.JoranConfigurator;

LoggerContext lc = (LoggerContext) LoggerFactory.getILoggerFactory();
JoranConfigurator configurator = new JoranConfigurator();
configurator.setContext(lc);
configurator.doConfigure(externalConfigFileLocation);

My spark-submit command is:

~/spark-2.1.1-bin-hadoop2.6/bin/spark-submit --master yarn --deploy-mode client --driver-memory 4g --executor-memory 2g --executor-cores 4 --class consumer.spark.Sample ~/SparkStreamingJob/log_testing.jar ~/SparkStreamingJob/spark-jobs/config/conf/logback.xml

1 Answer

It seems there are two problems here:

SLF4J is a facade over logging implementations, which basically means you can switch between logging frameworks without changing your code. It also means you should not work with the core classes of the underlying implementation directly. SLF4J resolves the logging implementation itself, and the logger or factory objects it hands out are bound to that implementation (logback in your case). All of this means you cannot explicitly cast an SLF4J-provided logger or factory to a logback API type.
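As an illustration of that point, code written against the facade alone never references a logback type, so no cast is needed. This is only a minimal sketch with a hypothetical FacadeOnlyExample class, not code from the original post:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class FacadeOnlyExample {

    // Only SLF4J types appear here; whichever binding ends up on the
    // classpath (logback, log4j, ...) supplies the implementation at runtime.
    private static final Logger LOG = LoggerFactory.getLogger(FacadeOnlyExample.class);

    public static void main(String[] args) {
        LOG.info("Logging through the SLF4J facade, no implementation classes referenced");
    }
}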

In addition, it appears that SLF4J is resolving Log4jLoggerFactory rather than the logback factory. I believe the bridging between SLF4J and Logback was not successful.
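To confirm which binding SLF4J actually resolved on the driver, the configuration code could be guarded before the cast. This is only a sketch with a hypothetical LogbackBindingCheck class; it assumes the same JoranConfigurator setup as in the question:

import org.slf4j.ILoggerFactory;
import org.slf4j.LoggerFactory;

import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.joran.JoranConfigurator;
import ch.qos.logback.core.joran.spi.JoranException;

public class LogbackBindingCheck {

    public static void configure(String externalConfigFileLocation) throws JoranException {
        ILoggerFactory factory = LoggerFactory.getILoggerFactory();
        if (!(factory instanceof LoggerContext)) {
            // On Spark's classpath this is typically org.slf4j.impl.Log4jLoggerFactory,
            // which matches the ClassCastException in the question.
            throw new IllegalStateException(
                    "SLF4J is bound to " + factory.getClass().getName() + ", not to logback");
        }
        LoggerContext lc = (LoggerContext) factory;
        JoranConfigurator configurator = new JoranConfigurator();
        configurator.setContext(lc);
        configurator.doConfigure(externalConfigFileLocation);
    }
}

Until the logback binding actually wins over Spark's bundled log4j binding on the driver classpath, this fails fast with a clearer message instead of the ClassCastException.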
