I'm trying to publish a Data Factory solution containing this ADF DataLakeAnalyticsU-SQL pipeline activity, following the Azure step-by-step doc (https://docs.microsoft.com/en-us/azure/data-factory/data-factory-usql-activity).
{
    "type": "DataLakeAnalyticsU-SQL",
    "typeProperties": {
        "scriptPath": "\\scripts\\111_risk_index.usql",
        "scriptLinkedService": "PremiumAzureDataLakeStoreLinkedService",
        "degreeOfParallelism": 3,
        "priority": 100,
        "parameters": {
            "in": "/DF_INPUT/Consodata_Prelios_consegna_230617.txt",
            "out": "/DF_OUTPUT/111_Analytics.txt"
        }
    },
    "inputs": [
        {
            "name": "PremiumDataLakeStoreLocation"
        }
    ],
    "outputs": [
        {
            "name": "PremiumDataLakeStoreLocation"
        }
    ],
    "policy": {
        "timeout": "06:00:00",
        "concurrency": 1,
        "executionPriorityOrder": "NewestFirst",
        "retry": 1
    },
    "scheduler": {
        "frequency": "Minute",
        "interval": 15
    },
    "name": "ConsodataFilesProcessing",
    "linkedServiceName": "PremiumAzureDataLakeAnalyticsLinkedService"
}
During publishing I get this error:
25/07/2017 18:51:59- Publishing Project 'Premium.DataFactory'....
25/07/2017 18:51:59- Validating 6 json files
25/07/2017 18:52:15- Publishing Project 'Premium.DataFactory' to Data
Factory 'premium-df'
25/07/2017 18:52:15- Value cannot be null.
Parameter name: value
Trying to figure out what could be wrong with the project, the problem seems to be in the "typeProperties" section of the activity shown above, specifically the scriptPath and scriptLinkedService properties. The doc says:
scriptPath: Path to folder that contains the U-SQL script. Name of the file
is case-sensitive.
scriptLinkedService: Linked service that links the storage that contains the
script to the data factory
Publishing the project without them (with the script hard-coded) completes successfully. The problem is that I can't figure out what exactly is wrong. I've tried several combinations of paths. The only thing I know is that the script file must be referenced locally in the solution as a dependency.
Can anyone help me with this?
Thanks in advance.
1 Answer
The script linked service needs to be Blob Storage, not Data Lake Storage.
Ignore the publishing error; it's misleading.
Provide a linked service in your solution for an Azure Storage account and reference it in the 'scriptLinkedService' property. Then reference the blob container path in the 'scriptPath' property.
For example:
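A sketch of what the corrected activity properties might look like, assuming a hypothetical Azure Storage linked service named "StorageLinkedService" and a blob container named "adf-scripts" (both names are illustrative, not from the original answer):

```json
"typeProperties": {
    "scriptPath": "adf-scripts\\111_risk_index.usql",
    "scriptLinkedService": "StorageLinkedService",
    "degreeOfParallelism": 3,
    "priority": 100,
    "parameters": {
        "in": "/DF_INPUT/Consodata_Prelios_consegna_230617.txt",
        "out": "/DF_OUTPUT/111_Analytics.txt"
    }
}
```

The referenced linked service would then point at the storage account holding the script, along these lines (placeholders kept as-is):

```json
{
    "name": "StorageLinkedService",
    "properties": {
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<accountkey>"
        }
    }
}
```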
Hope this helps.
PS. Double-check the case sensitivity of the property names. That can also throw unhelpful errors.