I've been intermittently hitting the following exception in a Spark job:

java.lang.UnsupportedOperationException: Schema for type scala.collection.Map[String,String] is not supported
	at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:780)
	at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:715)
	at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:56)
	at org.apache.spark.sql.catalyst.ScalaReflection$class.cleanUpReflectionObjects(ScalaReflection.scala:824)
	at org.apache.spark.sql.catalyst.ScalaReflection$.cleanUpReflectionObjects(ScalaReflection.scala:39)
	at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:714)
	at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1$$anonfun$apply$8.apply(ScalaReflection.scala:776)
	at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1$$anonfun$apply$8.apply(ScalaReflection.scala:775)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
	at scala.collection.immutable.List.map(List.scala:285)
	at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:775)
	at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:715)
	at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:56)
	at org.apache.spark.sql.catalyst.ScalaReflection$class.cleanUpReflectionObjects(ScalaReflection.scala:824)
	at org.apache.spark.sql.catalyst.ScalaReflection$.cleanUpReflectionObjects(ScalaReflection.scala:39)
	at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:714)
	at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:711)
	at org.apache.spark.sql.functions$.udf(functions.scala:3382)
	...

I then looked at the source in ScalaReflection.scala, and it seems Map[String,String] should always be a valid type in the schemaFor(..) function, matched by the case at https://github.com/apache/spark/blob/1a5e460762593c61b7ff2c5f3641d406706616ff/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala#L742:

case t if t <:< localTypeOf[Map[_, _]] =>
    val TypeRef(_, _, Seq(keyType, valueType)) = t
    val Schema(valueDataType, valueNullable) = schemaFor(valueType)
    Schema(MapType(schemaFor(keyType).dataType,
        valueDataType, valueContainsNull = valueNullable), nullable = true)

Does anyone know under what circumstances Map[String,String] could fail this check and fall through to the "not supported" default case?
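For what it's worth, the match guard in that case is just a runtime-reflection subtype test, and in isolation it always holds for Map[String, String]. A minimal sketch of the equivalent check using scala-reflect directly (localTypeOf is Spark-internal, so the plain typeOf stands in for it here):

```scala
import scala.reflect.runtime.universe._

object SubtypeCheckSketch {
  def main(args: Array[String]): Unit = {
    // The same kind of subtype test schemaFor's Map case performs.
    val t = typeOf[scala.collection.Map[String, String]]
    println(t <:< typeOf[scala.collection.Map[_, _]]) // true

    // Predef's immutable Map is also a scala.collection.Map, so it matches too.
    println(typeOf[Map[String, String]] <:< typeOf[scala.collection.Map[_, _]]) // true
  }
}
```

Run standalone, both checks print true every time, which is what makes the once-in-a-thousand failure so puzzling.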

The problem is hard to reproduce; it occurs randomly, roughly once in every thousand runs.