I have a Spark 1.6 environment (I cannot upgrade). I set the hive.metastore.uris property in the SqlContext and can see all the tables (which are stored in S3). The problem appears when I try to fetch any data: I get a "File not found" exception. This is how I launch the shell:

spark-shell --deploy-mode client --master yarn --packages=org.apache.hadoop:hadoop-aws:2.6.5

Investigating further, I see the following HTTP request in the debug logs:

"GET /%2Fuser_tech%2Femail_testing%2Fsentdate%3D2017-12-15 HTTP/1.1[\r][\n]"
18/04/11 16:13:20 DEBUG http.wire:  >> "Date: Wed, 11 Apr 2018 16:13:20 GMT[\r][\n]"
18/04/11 16:13:20 DEBUG http.wire:  >> "Authorization: AWS:=[\r][\n]"
18/04/11 16:13:20 DEBUG http.wire:  >> "Host: uyt-data-prod-users.s3.amazonaws.com:443[\r][\n]"
18/04/11 16:13:20 DEBUG http.wire:  >> "Connection: Keep-Alive[\r][\n]"
18/04/11 16:13:20 DEBUG http.wire:  >> "User-Agent: JetS3t/0.9.3 (Linux/4.9.81-35.56.amzn1.x86_64; amd64; en; JVM 1.8.0_131)[\r][\n]"
18/04/11 16:13:20 DEBUG http.wire:  >> "[\r][\n]"
18/04/11 16:13:20 DEBUG http.headers: >> GET /%2Fuser_t

As you can see, the host is:

s3.amazonaws.com

which is the US East region endpoint, but my bucket is in the us-west-2 region.
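For reference, one thing I was considering (untested, and assuming the s3a connector can be used instead of the JetS3t-backed one shown in the logs) is pinning the endpoint at launch via `fs.s3a.endpoint`, which exists in hadoop-aws 2.6.x:

```shell
# Sketch only: force the s3a connector to the us-west-2 regional endpoint.
# Paths would then need to use the s3a:// scheme.
spark-shell --deploy-mode client --master yarn \
  --packages=org.apache.hadoop:hadoop-aws:2.6.5 \
  --conf spark.hadoop.fs.s3a.endpoint=s3-us-west-2.amazonaws.com
```

If I have to stay on the JetS3t-backed filesystem, I believe the equivalent would be a `jets3t.properties` file on the classpath setting `s3service.s3-endpoint=s3-us-west-2.amazonaws.com`, but I have not confirmed either approach works here.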

Any suggestions?