
Azure Stream Analytics to Cosmos DB


I'm having trouble saving telemetry from Azure IoT Hub to Cosmos DB. I have the following setup:

  • IoT Hub - for event aggregation

  • Azure Stream Analytics - for event stream processing

  • Cosmos DB with the Table API. I created one table there.

Sample message from IoT Hub:

    {"id":33,"deviceId":"test2","cloudTagId":"cloudTag1","value":24.79770721657087}

The query in Stream Analytics that processes the events:

    SELECT
        concat(deviceId, cloudtagId) as telemetryid, value as temperature, id, deviceId, 'asd' as '$pk', deviceId as PartitionKey
    INTO
        [TableApiCosmosDb]
    From
        [devicesMessages]

The problem occurs every time the job tries to save the output to Cosmos DB; I get the error:

    An error occurred while preparing data for DocumentDB. The output record does not contain the column '$pk' to use as the partition key property by DocumentDB
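To make the mapping concrete, here is a minimal Python sketch (not ASA itself; the dictionary names simply mirror the aliases in the query above) showing the record that projection produces from the sample message:

```python
# Sample IoT Hub message from the question.
message = {
    "id": 33,
    "deviceId": "test2",
    "cloudTagId": "cloudTag1",
    "value": 24.79770721657087,
}

# Mirrors the ASA projection:
#   SELECT concat(deviceId, cloudtagId) AS telemetryid,
#          value AS temperature, id, deviceId,
#          deviceId AS PartitionKey
output_record = {
    "telemetryid": message["deviceId"] + message["cloudTagId"],
    "temperature": message["value"],
    "id": message["id"],
    "deviceId": message["deviceId"],
    "PartitionKey": message["deviceId"],
}

print(output_record["telemetryid"])  # test2cloudTag1
```

The error then says that Cosmos DB expects a partition-key property named exactly '$pk' in this record, which the query's quoted alias did not produce.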

Note: I added the $pk column and PartitionKey while trying to fix the problem.

EDIT Here is the output configuration:

[screenshot: output configuration]

Does anyone know what I'm doing wrong?

3 Answers

  • 1

    Unfortunately, the Table API in Cosmos DB is not supported as an output sink for ASA.

    If you want to use a Table as the output, you can use one under a Storage account. Sorry for the inconvenience.

    We will add the Cosmos DB Table API in the future.

    Thanks! JS - Azure Stream Analytics team

  • 0

    I had this problem too. Although the UI doesn't make it clear yet, only the SQL API for Cosmos DB is currently supported. I switched to that and everything works beautifully.

  • 1

    Try with

    SELECT 
        concat(deviceId, cloudtagId) as telemetryid, value as temperature, id, deviceId, 'asd' as 'pk', deviceId as PartitionKey
    INTO
        [TableApiCosmosDb]
    From
        [devicesMessages]
    

    The special character was the problem.

    Also note: the output was created with 'id' as the partition key, while the insert query used 'deviceId' as PartitionKey, so the data was not partitioned correctly.

    Example:

    SELECT
        id as PartitionKey, SUM(CAST(temperature AS float)) AS temperaturesum, AVG(CAST(temperature AS float)) AS temperatureavg
    INTO streamout
    FROM
        Streaminput TIMESTAMP by Time
    GROUP BY
        id,
        TumblingWindow(second, 60)
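    The tumbling-window aggregation above can be mimicked locally. A rough Python sketch (the sample events are made up) that groups events per id into non-overlapping 60-second windows, like GROUP BY id, TumblingWindow(second, 60):

```python
from collections import defaultdict

# Hypothetical sample events: (id, epoch_seconds, temperature)
events = [
    ("dev1", 0, 20.0),
    ("dev1", 30, 22.0),
    ("dev1", 65, 24.0),
    ("dev2", 10, 18.0),
]

# A tumbling window assigns each event to exactly one fixed-size,
# non-overlapping window: window index = timestamp // window_size.
windows = defaultdict(list)
for dev_id, ts, temp in events:
    windows[(dev_id, ts // 60)].append(temp)

# Per (id, window): SUM and AVG, like temperaturesum / temperatureavg.
results = {
    key: {"temperaturesum": sum(v), "temperatureavg": sum(v) / len(v)}
    for key, v in windows.items()
}

print(results[("dev1", 0)])  # {'temperaturesum': 42.0, 'temperatureavg': 21.0}
```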
    
