I am currently trying to set up this pipeline on Azure Data Factory V2 (as shown in the attached image). In summary, the ERP system exports this report (a CSV file containing both actual and forecast data) every month and saves it in a blob container. Once the CSV file is saved, an event trigger should fire a stored procedure that deletes all actual values from the fact table in Azure SQL, since they are replaced each month. After the actuals are deleted, the pipeline runs a copy activity that copies the CSV report (actuals + forecast) into that same fact table in Azure SQL. Once the copy activity completes, an HTTP Logic App deletes the new CSV file from the blob container. This workflow will run repeatedly within a given month.
So far I have been able to run these three activities independently. However, when I chain them together in the same pipeline, I get some parameter errors when I try to "Publish All". So I am not sure whether I need to supply the same parameters to every activity in the pipeline?
The JSON code for my pipeline is below:
{
    "name": "TM1_pipeline",
    "properties": {
        "activities": [
            {
                "name": "Copy Data1",
                "type": "Copy",
                "dependsOn": [
                    {
                        "activity": "Stored Procedure1",
                        "dependencyConditions": [
                            "Succeeded"
                        ]
                    }
                ],
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false
                },
                "typeProperties": {
                    "source": {
                        "type": "BlobSource",
                        "recursive": false
                    },
                    "sink": {
                        "type": "SqlSink",
                        "writeBatchSize": 10000
                    },
                    "enableStaging": false,
                    "dataIntegrationUnits": 0
                },
                "inputs": [
                    {
                        "referenceName": "SourceDataset_e7y",
                        "type": "DatasetReference",
                        "parameters": {
                            "copyFolder": {
                                "value": "@pipeline().parameters.sourceFolder",
                                "type": "Expression"
                            },
                            "copyFile": {
                                "value": "@pipeline().parameters.sourceFile",
                                "type": "Expression"
                            }
                        }
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "DestinationDataset_e7y",
                        "type": "DatasetReference"
                    }
                ]
            },
            {
                "name": "Stored Procedure1",
                "type": "SqlServerStoredProcedure",
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "typeProperties": {
                    "storedProcedureName": "[dbo].[test_sp]"
                },
                "linkedServiceName": {
                    "referenceName": "AzureSqlDatabase",
                    "type": "LinkedServiceReference"
                }
            },
            {
                "name": "Web1",
                "type": "WebActivity",
                "dependsOn": [
                    {
                        "activity": "Copy Data1",
                        "dependencyConditions": [
                            "Succeeded"
                        ]
                    }
                ],
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "typeProperties": {
                    "url": "...",
                    "method": "POST",
                    "body": {
                        "value": "@pipeline().parameters.BlobName",
                        "type": "Expression"
                    }
                }
            }
        ],
        "parameters": {
            "sourceFolder": {
                "type": "String",
                "defaultValue": "@pipeline().parameters.sourceFolder"
            },
            "sourceFile": {
                "type": "String",
                "defaultValue": "@pipeline().parameters.sourceFile"
            },
            "BlobName": {
                "type": "String",
                "defaultValue": {
                    "blobname": "source-csv/test.csv"
                }
            }
        }
    },
    "type": "Microsoft.DataFactory/factories/pipelines"
}
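The "Publish All" parameter errors most likely come from the parameters block itself: a pipeline parameter's defaultValue cannot be an expression that references the pipeline's own parameters (that is self-referential), and BlobName is declared as a String but its default is an object. A sketch of a valid parameters block is below; the literal defaults are assumptions based on the "source-csv/test.csv" value in the original, and they would normally be overridden by the trigger at run time:

```json
"parameters": {
    "sourceFolder": {
        "type": "String",
        "defaultValue": "source-csv"
    },
    "sourceFile": {
        "type": "String",
        "defaultValue": "test.csv"
    },
    "BlobName": {
        "type": "String",
        "defaultValue": "source-csv/test.csv"
    }
}
```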
1 Answer
Please follow this doc to configure the blob event trigger and pass the correct values to the parameters.
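For reference, a blob event trigger that starts this pipeline and maps the created blob's path into the pipeline parameters might look like the sketch below. The trigger name, container path, and storage account placeholder are assumptions; `@triggerBody().folderPath` and `@triggerBody().fileName` are the values the Blob Events trigger exposes for the blob that fired it:

```json
{
    "name": "BlobEventTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "scope": "<storage-account-resource-id>",
            "blobPathBeginsWith": "/source-csv/blobs/",
            "blobPathEndsWith": ".csv",
            "events": [
                "Microsoft.Storage.BlobCreated"
            ]
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "TM1_pipeline",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "sourceFolder": "@triggerBody().folderPath",
                    "sourceFile": "@triggerBody().fileName",
                    "BlobName": "@{triggerBody().folderPath}/@{triggerBody().fileName}"
                }
            }
        ]
    }
}
```

With this mapping in place, the pipeline parameters no longer need meaningful defaults, and each activity simply reads them via `@pipeline().parameters.*` as the Copy and Web activities already do.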