
Spark 1.5.2 and SLF4J StaticLoggerBinder

While this isn't stopping my code from running, I would just like to understand why this warning appears. I'm using Scala 2.11.7, ScalaIDE, and SBT 0.13.9.

15/11/20 12:17:05 INFO akka.event.slf4j.Slf4jLogger: Slf4jLogger started
15/11/20 12:17:06 INFO Remoting: Starting remoting
15/11/20 12:17:06 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@0.0.0.0:36509]
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

[Stage 0:=======================================================> (31 + 1) / 32]
[Stage 0:=========================================================(32 + 0) / 32]

Now, I generally understand why this warning appears, but the issue is that I haven't touched Spark's logging at all. If I add slf4j-simple to my project, it complains about multiple SLF4J bindings instead, and this warning does not appear. I can't for the life of me find a way to make the two play nicely together. My code itself uses Log4j 2.4 for its own logging.

Things I have tried, to no avail:

  • Excluding Spark's logging and including my own.

  • Using Log4j 2 to route SLF4J calls to Log4j 2 and excluding Spark's SLF4J binding (see the build sketch after this list).

  • Including literally every SLF4J binding, trying to get one of them picked up.

  • Adding the SLF4J jars to my classpath, and to Spark's driver and executor classpaths.
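
For reference, here is a minimal build.sbt sketch of that second attempt (routing SLF4J to Log4j 2 while excluding Spark's own binding). The coordinates and versions are assumptions matched to the jars listed further down, not my actual build file:

    // build.sbt (sketch): exclude Spark's SLF4J-to-log4j binding and
    // route SLF4J calls to Log4j 2 through log4j-slf4j-impl instead.
    libraryDependencies ++= Seq(
      ("org.apache.spark" %% "spark-core" % "1.5.2")
        .exclude("org.slf4j", "slf4j-log4j12"),
      "org.apache.logging.log4j" % "log4j-api"        % "2.4.1",
      "org.apache.logging.log4j" % "log4j-core"       % "2.4.1",
      "org.apache.logging.log4j" % "log4j-slf4j-impl" % "2.4.1"
    )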

If I try excluding Spark's logging, I get ClassNotFound problems from Spark, and for the life of me I can't figure out what exactly is causing them.

One more detail: I'm using Spark, but excluding its Hadoop dependency and including my own version of Hadoop (2.7.1).

Here are what I believe to be the relevant jars, as reported by the system classloader (a sketch of how to list them follows the jar list).

~/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.7.10.jar
~/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.10.jar
~/.ivy2/cache/log4j/log4j/bundles/log4j-1.2.17.jar
~/.ivy2/cache/org.slf4j/jul-to-slf4j/jars/jul-to-slf4j-1.7.10.jar
~/.ivy2/cache/org.slf4j/jcl-over-slf4j/jars/jcl-over-slf4j-1.7.10.jar
~/.ivy2/cache/com.typesafe.akka/akka-slf4j_2.11/jars/akka-slf4j_2.11-2.3.11.jar
~/.ivy2/cache/org.apache.logging.log4j/log4j-api/jars/log4j-api-2.4.1.jar
~/.ivy2/cache/org.apache.logging.log4j/log4j-core/jars/log4j-core-2.4.1.jar
~/.ivy2/cache/com.typesafe.akka/akka-slf4j_2.11/jars/akka-slf4j_2.11-2.4.0.jar
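
A minimal sketch of how a list like the one above can be produced (an illustration, not the exact command I used), assuming Java 8, where the application classloader is a URLClassLoader:

    // Print the logging-related jars visible to the system classloader.
    import java.net.URLClassLoader

    object PrintLoggingJars {
      def main(args: Array[String]): Unit = {
        ClassLoader.getSystemClassLoader match {
          case ucl: URLClassLoader =>
            ucl.getURLs
              .filter(u => u.getPath.contains("slf4j") || u.getPath.contains("log4j"))
              .foreach(println)
          case _ =>
            // On Java 9+ the application classloader is no longer a
            // URLClassLoader; fall back to the java.class.path property.
            sys.props("java.class.path")
              .split(java.io.File.pathSeparator)
              .filter(p => p.contains("slf4j") || p.contains("log4j"))
              .foreach(println)
        }
      }
    }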

Does anyone have any insight into this? I'd appreciate it.

log4j: Trying to find [log4j.xml] using context classloader sun.misc.Launcher$AppClassLoader@42a57993.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader@42a57993 class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader sun.misc.Launcher$AppClassLoader@42a57993.
log4j: Using URL [file:/home/scarman/workspace-scala/Ingestions/ingestion/bin/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/home/scarman/workspace-scala/Ingestions/ingestion/bin/log4j.properties
log4j: Parsing for [root] with value=[INFO, console].
log4j: Level token is [INFO].
log4j: Category root set to INFO
log4j: Parsing appender named "console".
log4j: Parsing layout options for "console".
log4j: Setting property [conversionPattern] to [%d{yy/MM/dd HH:mm:ss} %p %c: %m%n].
log4j: End of parsing for "console".
log4j: Setting property [target] to [System.err].
log4j: Parsed "console" options.
log4j: Parsing for [org.spark-project.jetty] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.spark-project.jetty set to WARN
log4j: Handling log4j.additivity.org.spark-project.jetty=[null]
log4j: Parsing for [org.spark-project.jetty.util.component.AbstractLifeCycle] with value=[ERROR].
log4j: Level token is [ERROR].
log4j: Category org.spark-project.jetty.util.component.AbstractLifeCycle set to ERROR
log4j: Handling log4j.additivity.org.spark-project.jetty.util.component.AbstractLifeCycle=[null]
log4j: Parsing for [org.apache.spark] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.apache.spark set to WARN
log4j: Handling log4j.additivity.org.apache.spark=[null]
log4j: Parsing for [org.apache.hadoop.hive.metastore.RetryingHMSHandler] with value=[FATAL].
log4j: Level token is [FATAL].
log4j: Category org.apache.hadoop.hive.metastore.RetryingHMSHandler set to FATAL
log4j: Handling log4j.additivity.org.apache.hadoop.hive.metastore.RetryingHMSHandler=[null]
log4j: Parsing for [parquet] with value=[INFO].
log4j: Level token is [INFO].
log4j: Category parquet set to INFO
log4j: Handling log4j.additivity.parquet=[null]
log4j: Parsing for [org.apache.hadoop] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.apache.hadoop set to WARN
log4j: Handling log4j.additivity.org.apache.hadoop=[null]
log4j: Parsing for [org.apache.spark.repl.SparkILoop$SparkILoopInterpreter] with value=[INFO].
log4j: Level token is [INFO].
log4j: Category org.apache.spark.repl.SparkILoop$SparkILoopInterpreter set to INFO
log4j: Handling log4j.additivity.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=[null]
log4j: Parsing for [org.apache.spark.repl.SparkIMain$exprTyper] with value=[INFO].
log4j: Level token is [INFO].
log4j: Category org.apache.spark.repl.SparkIMain$exprTyper set to INFO
log4j: Handling log4j.additivity.org.apache.spark.repl.SparkIMain$exprTyper=[null]
log4j: Parsing for [org.apache.parquet] with value=[ERROR].
log4j: Level token is [ERROR].
log4j: Category org.apache.parquet set to ERROR
log4j: Handling log4j.additivity.org.apache.parquet=[null]
log4j: Parsing for [org.apache.hadoop.hive.ql.exec.FunctionRegistry] with value=[ERROR].
log4j: Level token is [ERROR].
log4j: Category org.apache.hadoop.hive.ql.exec.FunctionRegistry set to ERROR
log4j: Handling log4j.additivity.org.apache.hadoop.hive.ql.exec.FunctionRegistry=[null]
log4j: Finished configuring

Adding the class bindings that SLF4J located at load time...

jar:file:/home/scarman/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/Log4jLoggerFactory.class
org.slf4j.impl.Log4jLoggerFactory@7cef4e59
org.slf4j.impl.Log4jLoggerFactory
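
A sketch of how binding information like the above can be obtained, assuming only that slf4j-api is on the classpath; the object name is made up for illustration:

    import org.slf4j.LoggerFactory

    object WhichBinding {
      def main(args: Array[String]): Unit = {
        // Where on the classpath does the binder class actually live?
        println(getClass.getClassLoader
          .getResource("org/slf4j/impl/StaticLoggerBinder.class"))
        // Which logger factory did SLF4J bind to?
        val factory = LoggerFactory.getILoggerFactory
        println(factory)
        println(factory.getClass.getName)
      }
    }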

2 Answers

  • 6

    Update: this still applies on Spark 1.6.1.

    Just following up and answering my own question in case anyone else is wondering. I noticed that this warning only occurs when using Spark's Parquet interface. I tested this to confirm it, and found that someone had already written it up in SPARK-10057. The trouble with that issue was that other developers could not reproduce it, though in fairness the original reporter was fairly vague in describing the problem.

    Either way, I decided to track it down, for no reason other than satisfying my own compulsiveness about these problems.

    So I tested using files on both S3 and local disk. Text and JSON files did not trigger the warning, but Parquet usage did, whether the files were local or on S3, and for both reading and writing Parquet files. Looking at ParquetRelation.scala, we find its only reference to SLF4J:

    // Parquet initializes its own JUL logger in a static block which always prints to stdout.  Here
    // we redirect the JUL logger via SLF4J JUL bridge handler.
    val redirectParquetLogsViaSLF4J: Unit = {
      def redirect(logger: JLogger): Unit = {
        logger.getHandlers.foreach(logger.removeHandler)
        logger.setUseParentHandlers(false)
        logger.addHandler(new SLF4JBridgeHandler)
      }
      // ...
    }


    So it seems reasonable to assert that the bridge between Parquet's JUL logging and SLF4J is what causes this warning to appear. I assume it initializes the bridge and something fails to load the proper StaticLoggerBinder. I'd have to dig around in Spark's code some more and test it to find out for sure, but that is at least what's triggering it. I'll try to put a fix together if time allows.
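
    To illustrate the mechanism, here is a minimal, self-contained sketch (my own illustration, not Spark's code) that performs the same JUL-to-SLF4J redirection. Run it with slf4j-api and jul-to-slf4j on the classpath but no binding, and SLF4J falls back to the NOP logger with exactly the warning shown above:

    import java.util.logging.{Logger => JLogger}
    import org.slf4j.bridge.SLF4JBridgeHandler

    object JulBridgeDemo {
      def main(args: Array[String]): Unit = {
        val logger = JLogger.getLogger("demo")
        // The same steps as Spark's redirect(): drop JUL's own handlers
        // and hand every record to SLF4J instead.
        logger.getHandlers.foreach(logger.removeHandler)
        logger.setUseParentHandlers(false)
        logger.addHandler(new SLF4JBridgeHandler)
        // Publishing a record forces SLF4J initialization; with no binding
        // present, the StaticLoggerBinder warning prints and the record is
        // silently dropped by the NOP logger.
        logger.info("this message goes to the NOP logger")
      }
    }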

    Lastly, here is a code sample for reproducing the warning locally:

    scala> sc.setLogLevel("WARN")
    
    scala> val d = sc.parallelize(Array[Int](1,2,3,4,5))
    d: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at parallelize at <console>:21
    
    scala> val ddf = d.toDF()
    ddf: org.apache.spark.sql.DataFrame = [_1: int]
    
    scala> ddf.write.parquet("/home/scarman/data/test.parquet")
    SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
    SLF4J: Defaulting to no-operation (NOP) logger implementation
    SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
    
  • 0

    Most likely, you are missing org.slf4j:slf4j-simple among your project's dependencies.
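
    If that is the case, a single SBT line adds the binding; the version here is an assumption chosen to match the slf4j-api 1.7.10 already on the asker's classpath:

    libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.10"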
