Can anyone help?

I ran into this problem when creating a DataFrame from an RDD:

    [ERROR] [2016-04-22 18:21:46] [HBaseOperator:load:140] failed: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 4 times, most recent failure: Lost task 0.3 in stage 2.0 (TID 5, host26): java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericRow cannot be cast to scala.collection.Iterator
        at org.apache.spark.sql.SQLContext$$anonfun$9.apply(SQLContext.scala:519)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at org.apache.spark.sql.execution.Aggregate$$anonfun$doExecute$1$$anonfun$6.apply(Aggregate.scala:130)
        at org.apache.spark.sql.execution.Aggregate$$anonfun$doExecute$1$$anonfun$6.apply(Aggregate.scala:126)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$17.apply(RDD.scala:686)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$17.apply(RDD.scala:686)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:70)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:70)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

The job works when I submit it with spark-submit in yarn-client mode, but it fails with the error above when I build the Spark context programmatically from my own code.
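Roughly, the two launch paths look like this (a minimal sketch; the class name, app name, and jar path are placeholders, not my actual setup):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.SQLContext;

    public class HBaseJob {
        public static void main(String[] args) {
            // Path 1 (works):   spark-submit --master yarn-client --class HBaseJob my-job.jar
            // Path 2 (fails):   set the master inside the code and launch the main class directly
            SparkConf conf = new SparkConf()
                    .setAppName("HBaseJob")      // placeholder app name
                    .setMaster("yarn-client");   // yarn-client mode set programmatically
            JavaSparkContext sc = new JavaSparkContext(conf);
            SQLContext sqlContext = new SQLContext(sc);
        }
    }

The code that builds the DataFrame, and fails, is: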

    StructType type = new StructType(fields);
    scala.collection.immutable.List<StructField> list = type.toList(); // not used afterwards
    // Log each field's name and data type for debugging
    for (StructField structField : fields) {
        LOGGER.info(structField.name() + " " + structField.dataType().json());
    }
    dataFrame = sqlContext.createDataFrame(rowRdd, type);
    // count() is the first action, so this is where the job actually runs and fails
    LOGGER.debug("dataframe num " + dataFrame.count());

The DataFrame is never created.

Can someone help me?