
Transferring values to existing columns in a PySpark dataframe


I have a dataframe in PySpark that looks like this:

+---+-------------+----+
| id|       device| val|
+---+-------------+----+
|  3|      mac pro|   1|          
|  1|       iphone|   2|
|  1|android phone|   2|
|  1|   windows pc|   2|
|  1|   spy camera|   2|
|  2|   spy camera|   3|
|  2|       iphone|   3|
|  3|   spy camera|   1|
|  3|         cctv|   1|
+---+-------------+----+

I want to populate some columns based on the following lists:

phone_list = ['iphone', 'android phone', 'nokia']
pc_list = ['windows pc', 'mac pro']
security_list = ['spy camera']
ucg_list = ['ipad']

I did the following:

from pyspark.sql.functions import col, when, lit
from pyspark.sql.types import IntegerType

df1 = df.withColumn('phones', lit(None).cast(IntegerType())) \
    .withColumn('pc', lit(None).cast(IntegerType())) \
    .withColumn('security', lit(None).cast(IntegerType())) \
    .withColumn('null', lit(None).cast(IntegerType())) \
    .withColumn('ucg', lit(None).cast(IntegerType()))

import pyspark.sql.functions as F

df1.withColumn('cat', 
    F.when(df1.device.isin(phone_list), 'phones').otherwise(
    F.when(df1.device.isin(pc_list), 'pc').otherwise(
    F.when(df1.device.isin(security_list), 'security')))
).groupBy('id', 'phones', 'pc', 'security', 'null', 'ucg').pivot('cat').agg(F.count('cat')).show()

The output I am receiving:

+---+------+----+--------+----+----+----+---+------+--------+
| id|phones|  pc|security|null| ucg|null| pc|phones|security|
+---+------+----+--------+----+----+----+---+------+--------+
|  3|  null|null|    null|null|null|   0|  1|     0|       1|
|  2|  null|null|    null|null|null|   0|  0|     1|       1|
|  1|  null|null|    null|null|null|   0|  1|     2|       1|
+---+------+----+--------+----+----+----+---+------+--------+

What I want is to first create the columns from the list names and then populate the values.

Expected output:

+---+------+---+------+--------+----+
| id|   ucg| pc|phones|security|null|
+---+------+---+------+--------+----+
|  1|     0|  1|     2|       1|   0|
|  2|     0|  0|     1|       1|   0|
|  3|     0|  1|     0|       1|   1|
+---+------+---+------+--------+----+

How can I get what I want?

Edit

When I do the following:

from pyspark.sql import functions as f

df1 = df.withColumn('cat',
    f.when(df.device.isin(phone_list), 'phones').otherwise(
    f.when(df.device.isin(pc_list), 'pc').otherwise(
    f.when(df.device.isin(ucg_list), 'ucg').otherwise(
    f.when(df.device.isin(security_list), 'security')))))

the output is

+---+-------------+---+--------+
| id|       device|val|     cat|
+---+-------------+---+--------+
|  3|      mac pro|  1|      pc|
|  3|   spy camera|  1|security|
|  3|         cctv|  1|    null|
|  1|       iphone|  2|  phones|
|  1|android phone|  2|  phones|
|  1|   windows pc|  2|      pc|
|  1|   spy camera|  2|security|
|  2|   spy camera|  3|security|
|  2|       iphone|  3|  phones|
+---+-------------+---+--------+

In the output, you can see that id 3 has null in the cat column.

1 Answer


    Creating the 'phones', 'pc', 'ucg', 'security' and 'null' columns and filling them with None just for the groupBy makes no sense: grouping by id together with columns that are all null is the same as grouping by id alone.

    What you can do instead is find the difference between the columns the pivot actually produces and the expected columns, and then create the missing ones filled with 0.

    So the following should work for you:

    phone_list = ['iphone', 'android phone', 'nokia']
    pc_list = ['windows pc', 'mac pro']
    security_list = ['spy camera']
    ucg_list = ['ipad']
    
    from pyspark.sql import functions as f
    df = df.withColumn('cat',
                   f.when(df.device.isin(phone_list), 'phones').otherwise(
                     f.when(df.device.isin(pc_list), 'pc').otherwise(
                       f.when(df.device.isin(ucg_list), 'ucg').otherwise(
                         f.when(df.device.isin(security_list), 'security'))))
                   )\
        .groupBy('id').pivot('cat').agg(f.count('val'))\
        .na.fill(0)
    
    columnList = ['phones', 'pc', 'ucg', 'security', 'null']
    actualcolumnList = df.columns[1:]
    
    diffColumns = [x for x in columnList if x not in actualcolumnList]
    
    for y in diffColumns:
        df = df.withColumn(y, f.lit(0))
    
    df.show(truncate=False)
    

    which should give you

    +---+----+---+------+--------+---+
    |id |null|pc |phones|security|ucg|
    +---+----+---+------+--------+---+
    |3  |1   |1  |0     |1       |0  |
    |1  |0   |1  |2     |1       |0  |
    |2  |0   |0  |1     |1       |0  |
    +---+----+---+------+--------+---+
    

    I hope the answer is helpful.
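
    As a side note, a more compact alternative is possible because pivot accepts an explicit list of values: Spark then emits a column for every listed category even when no row falls into it, so the diff-and-fill loop is not needed. A minimal sketch, starting again from the original df (the names categories and df_alt are illustrative):

    from pyspark.sql import functions as f

    categories = ['phones', 'pc', 'ucg', 'security', 'null']

    # when() without otherwise() yields null for non-matching rows, so
    # coalesce() picks the first matching label and falls back to 'null'
    df_alt = df.withColumn('cat', f.coalesce(
                f.when(df.device.isin(phone_list), 'phones'),
                f.when(df.device.isin(pc_list), 'pc'),
                f.when(df.device.isin(ucg_list), 'ucg'),
                f.when(df.device.isin(security_list), 'security'),
                f.lit('null'))) \
        .groupBy('id') \
        .pivot('cat', categories) \
        .agg(f.count('val')) \
        .na.fill(0)

    df_alt.show()

    Passing the values list to pivot also spares Spark an extra pass over the data to discover the distinct categories.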
