
Spark + Kafka producer in production: Task not serializable


I ran into an exception:

ERROR yarn.ApplicationMaster: User class threw exception: org.apache.spark.SparkException: Task not serializable
org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2032)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:889)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:888)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
    at org.apache.spark.rdd.RDD.foreach(RDD.scala:888)
    at com.Boot$.test(Boot.scala:60)
    at com.Boot$.main(Boot.scala:36)
    at com.Boot.main(Boot.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:525)
Caused by: java.io.NotSerializableException: org.apache.kafka.clients.producer.KafkaProducer
Serialization stack:
    - object not serializable (class: org.apache.kafka.clients.producer.KafkaProducer, value: org.apache.kafka.clients.producer.KafkaProducer@77624599)
    - field (class: com.Boot$$anonfun$test$1, name: producer$1, type: class org.apache.kafka.clients.producer.KafkaProducer)
    - object (class com.Boot$$anonfun$test$1, <function1>)
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:84)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)

//    @transient
val sparkConf = new SparkConf()

sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

//    @transient
val sc = new SparkContext(sparkConf)

val requestSet: RDD[String] = sc.textFile(s"hdfs:/user/bigdata/ADVERTISE-IMPRESSION-STAT*/*")

//    @transient
val props = new HashMap[String, Object]()
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, NearLineConfig.kafka_brokers)
//    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.ByteArraySerializer");
//    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.ByteArraySerializer");
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
props.put("producer.type", "async")
props.put(ProducerConfig.BATCH_SIZE_CONFIG, "49152")

//    @transient
val producer: KafkaProducer[String, String] = new KafkaProducer[String, String](props)

requestSet.foreachPartition((partitions: Iterator[String]) => {
  partitions.foreach((line: String) => {
    try {
      producer.send(new ProducerRecord[String, String]("testtopic", line))
    } catch {
      case ex: Exception => {
        log.warn(ex.getMessage, ex)
      }
    }
  })
})

producer.close()

In this program, I try to read records from an HDFS path and save them into Kafka. The problem is that when I remove the code that sends records to Kafka, it runs fine. What am I missing?

1 Answer

  • 5

    KafkaProducer isn't serializable. You'll need to move the creation of the instance inside foreachPartition:

    requestSet.foreachPartition((partitions: Iterator[String]) => {
      val producer: KafkaProducer[String, String] = new KafkaProducer[String, String](props)
      partitions.foreach((line: String) => {
        try {
          producer.send(new ProducerRecord[String, String]("testtopic", line))
        } catch {
          case ex: Exception => {
            log.warn(ex.getMessage, ex)
          }
        }
      })
    })
    

    Note that KafkaProducer.send returns a Future[RecordMetadata], so the only exception that can propagate from the send call itself is a SerializationException, thrown when the key or value cannot be serialized.
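    The underlying mechanism can be seen without Spark or Kafka at all. Before shipping a task, Spark's ClosureCleaner Java-serializes the closure; any captured reference to a non-serializable object (such as a KafkaProducer) fails exactly as in the stack trace above. The sketch below uses a hypothetical `FakeProducer` class as a stand-in for KafkaProducer, and a simplified `ensureSerializable` that mimics what Spark does; it is an illustration of the principle, not Spark's actual code.

    ```scala
    import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

    // Stand-in for KafkaProducer: it holds live resources (sockets, buffers),
    // so, like the real class, it is deliberately NOT java.io.Serializable.
    class FakeProducer(brokers: String) {
      def send(record: String): Unit = ()
    }

    object ClosureDemo {
      // Roughly what Spark's ClosureCleaner does before shipping a task:
      // try to Java-serialize the closure.
      def ensureSerializable(f: AnyRef): Unit = {
        val out = new ObjectOutputStream(new ByteArrayOutputStream())
        out.writeObject(f)
        out.close()
      }

      def main(args: Array[String]): Unit = {
        val brokers = "broker1:9092"
        val producer = new FakeProducer(brokers) // created once, on the "driver"

        // Bad: the closure captures `producer`, so serialization fails,
        // just like the foreachPartition closure in the question.
        val bad = (lines: Iterator[String]) => lines.foreach(producer.send)
        try {
          ensureSerializable(bad)
        } catch {
          case _: NotSerializableException =>
            println("bad closure: Task not serializable")
        }

        // Good: only the serializable String `brokers` is captured; the
        // producer is constructed inside the closure, i.e. on the executor,
        // which is the fix shown in the answer above.
        val good = (lines: Iterator[String]) => {
          val p = new FakeProducer(brokers)
          lines.foreach(p.send)
        }
        ensureSerializable(good) // no exception
        println("good closure: serializable")
      }
    }
    ```

    The same reasoning explains why the fix works: after moving the constructor inside foreachPartition, the closure only captures the (serializable) `props` map, and each executor builds its own producer.
    
    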
