RDD to JSON in Spark and Scala

I loaded a JSON file with Spark/Scala and saved it in an RDD.

val dataFile = "resources/tweet-json/hello.json"
lazy val rdd = SparkCommons.sqlContext.read.format("json").load(dataFile)

After querying the RDD, I want to produce a JSON output file again (which I will send in response to an HTTP GET request). How can I convert this RDD to JSON? The desired output looks like this:

[
{
    "label": [
        "fattacq_an_eser_facq",
        "eu_tot_doc",
        "fattacq_prot_facq",
        "id_sogg",
        "eu_tot_man"
    ],
    "values": [
        {
            "label": "Prima Fattura 2016",
            "values": [
                2016,
                956.48,
                691,
                44633,
                956.48
            ]
        },
        {
            "label": "Seconda Fattura 2016",
            "values": [
                2016,
                190,
                982,
                38127,
                190
            ]
        },
        {
            "label": "Terza Fattura 2016",
            "values": [
                2016,
                140.3,
                1088,
                59381,
                140.3
            ]
        },
        {
            "label": "Quarta Fattura 2016",
            "values": [
                2016,
                488,
                1091,
                59382,
                488
            ]
        },
        {
            "label": "Quinta Fattura 2016",
            "values": [
                2016,
                11365.95,
                1154,
                57526,
                11365.95
            ]
        },
        {
            "label": "Sesta Fattura 2016",
            "values": [
                2016,
                44440.01,
                1276,
                5555,
                44440.01
            ]
        }
    ]
  }
]

Answer (1)

3 years ago

You can simply write it out as JSON using the write function, for example:

dfTobeSaved.write.format("json").save("/root/data.json")

I think this should work fine!
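As a fuller sketch, note that `read.format("json").load(...)` actually returns a DataFrame rather than a plain RDD, which is convenient here: a DataFrame can either be written to disk as JSON files, or converted to JSON strings in the driver with `toJSON` to build an HTTP response body. The sample data, app name, and output path below are hypothetical stand-ins for the asker's tweet data:

```scala
import org.apache.spark.sql.SparkSession

object RddToJsonExample {
  def main(args: Array[String]): Unit = {
    // Local SparkSession for illustration; adjust master/appName as needed.
    val spark = SparkSession.builder()
      .appName("rdd-to-json")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical rows standing in for the loaded tweet JSON.
    val df = Seq(
      ("Prima Fattura 2016", 956.48),
      ("Seconda Fattura 2016", 190.0)
    ).toDF("label", "eu_tot_doc")

    // Option 1: write the DataFrame out as JSON files (one file per partition).
    df.write.format("json").save("/tmp/data-json")

    // Option 2: convert each row to a JSON string and collect to the driver,
    // then join them into a JSON array suitable for an HTTP response body.
    val jsonBody = df.toJSON.collect().mkString("[", ",", "]")
    println(jsonBody)

    spark.stop()
  }
}
```

Option 1 matches the answer above; Option 2 avoids a round trip through the filesystem when the JSON is only needed in memory for the HTTP reply.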