java.lang.NoSuchMethodError: parquet.Preconditions.checkState(ZLjava/lang/String;)V

I get the following error when running Spark 1.3.1 and 1.4.1:

java.lang.NoSuchMethodError: parquet.Preconditions.checkState(ZLjava/lang/String;)V
	at parquet.schema.Types$PrimitiveBuilder.build(Types.java:314)
	at parquet.schema.Types$PrimitiveBuilder.build(Types.java:232)
	at parquet.schema.Types$Builder.named(Types.java:210)
	at org.apache.spark.sql.parquet.ParquetTypesConverter$$anonfun$fromDataType$1.apply(ParquetTypes.scala:314)
	at org.apache.spark.sql.parquet.ParquetTypesConverter$$anonfun$fromDataType$1.apply(ParquetTypes.scala:305)
	at scala.Option.map(Option.scala:145)
	at org.apache.spark.sql.parquet.ParquetTypesConverter$.fromDataType(ParquetTypes.scala:305)
	at org.apache.spark.sql.parquet.ParquetTypesConverter$$anonfun$4.apply(ParquetTypes.scala:395)
	at org.apache.spark.sql.parquet.ParquetTypesConverter$$anonfun$4.apply(ParquetTypes.scala:394)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
	at org.apache.spark.sql.parquet.ParquetTypesConverter$.convertFromAttributes(ParquetTypes.scala:393)
	at org.apache.spark.sql.parquet.ParquetTypesConverter$.writeMetaData(ParquetTypes.scala:440)
	at org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache.prepareMetadata(newParquet.scala:260)
	at org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache$$anonfun$6.apply(newParquet.scala:276)
	at org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache$$anonfun$6.apply(newParquet.scala:269)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
	at org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache.refresh(newParquet.scala:269)
	at org.apache.spark.sql.parquet.ParquetRelation2.<init>(newParquet.scala:391)
	at org.apache.spark.sql.parquet.DefaultSource.createRelation(newParquet.scala:98)
	at org.apache.spark.sql.parquet.DefaultSource.createRelation(newParquet.scala:128)
	at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:240)
	at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:218)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
	at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1071)
	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1021)
	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1037)
	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1015)
	at com.xurmo.ai.spark.SparkClient.dataFrameToHiveTable(SparkClient.java:128)
	at com.xurmo.ai.xflow.operation.sink.DataFrameToPlatformSink.push(DataFrameToPlatformSink.java:79)
	at com.xurmo.ai.xflow.operation.sink.ASink.operate(ASink.java:24)
	at com.xurmo.ai.xflow.operation.AOperation.process(AOperation.java:121)
	at com.xurmo.ai.xflow.flow.executor.ExecutableOp.call(ExecutableOp.java:26)
	at com.xurmo.ai.xflow.flow.executor.ExecutableOp.call(ExecutableOp.java:15)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:722)

From the log it is clear that the class loader is loading a different version of this class. Can anyone tell me which other jars contain this class?
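One way to answer "which jar is this class actually loaded from?" is to ask the JVM directly via the class's `CodeSource`. A minimal sketch (the `parquet.Preconditions` lookup only works when run on a classpath that contains it; here `java.lang.String` is used as a stand-in so the snippet runs anywhere):

```java
import java.security.CodeSource;

public class WhichJar {
    // Prints the jar (or directory) a class was loaded from; classes loaded
    // by the bootstrap loader have a null CodeSource.
    static String locationOf(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap classpath" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // Substitute "parquet.Preconditions" when running inside the Spark driver.
        System.out.println(locationOf("java.lang.String"));
    }
}
```

Running `locationOf("parquet.Preconditions")` from the same JVM that throws the `NoSuchMethodError` shows which of the conflicting jars won the classpath race.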

1 Answer

3 years ago

I had the same problem. There were two jar files in the lib directory containing the Preconditions class:

1. parquet-hadoop-bundle-1.6.0rc3.jar
2. parequet-hadooop-bundle-***.jar (which does not contain the checkState method)

I simply removed the

parequet-hadooop-bundle-***.jar

file from the lib directory, and now it works for me.
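Before deleting anything, it can help to scan the lib directory and list every jar that bundles the conflicting class. A minimal sketch (the directory path and class name are assumptions; adjust for your deployment):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.jar.JarFile;

public class FindClassInJars {
    // Returns every jar in `dir` that contains the given class file entry,
    // e.g. "parquet/Preconditions.class".
    static List<File> jarsContaining(File dir, String classEntry) throws Exception {
        List<File> hits = new ArrayList<>();
        File[] jars = dir.listFiles((d, name) -> name.endsWith(".jar"));
        if (jars == null) return hits;                 // not a directory
        for (File f : jars) {
            try (JarFile jar = new JarFile(f)) {       // open and auto-close each jar
                if (jar.getEntry(classEntry) != null) hits.add(f);
            }
        }
        return hits;
    }

    public static void main(String[] args) throws Exception {
        File libDir = new File(args.length > 0 ? args[0] : "lib");  // assumed location
        for (File f : jarsContaining(libDir, "parquet/Preconditions.class")) {
            System.out.println(f);
        }
    }
}
```

If this prints more than one jar, the duplicates are the cause of the `NoSuchMethodError`; keep the bundle whose `Preconditions` class actually has the `checkState(boolean, String)` method and remove the rest.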