I have a large DataFrame df containing a date column in yyyymmdd format. How can I convert it to MM-dd-yyyy in PySpark?
from datetime import datetime
from pyspark.sql.functions import col, udf, date_format
from pyspark.sql.types import DateType

# Each row must be a tuple so Spark can infer the schema
rdd = sc.parallelize([('20161231',), ('20140102',), ('20151201',), ('20161124',)])
df1 = sqlContext.createDataFrame(rdd, ['old_col'])

# UDF to parse the yyyymmdd string into a date
# ('%Y%m%d' -- lowercase %m is month; %M would be minutes)
func = udf(lambda x: datetime.strptime(x, '%Y%m%d'), DateType())
df = df1.withColumn('new_col', date_format(func(col('old_col')), 'MM-dd-yyyy'))
df.show()
This also works:
from datetime import datetime
from pyspark.sql.functions import col, udf
from pyspark.sql.types import DateType

# Parse the yyyymmdd string ('%Y%m%d'; the pattern '%m%d%y' would not match this format)
func = udf(lambda x: datetime.strptime(str(x), '%Y%m%d'), DateType())
df2 = df.withColumn('date', func(col('InvcDate')))