ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: Table or view not found: `DB_X`.`table_Y`

Spark session:

SparkSession
    .builder()
    .appName(appName)
    // same warehouse dir as configured in hive-site.xml
    .config("spark.sql.warehouse.dir", "/apps/hive/warehouse")
    .enableHiveSupport()
    .getOrCreate();
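For reference, a quick runtime check (a diagnostic sketch, assuming `spark` is the session built above; `spark.sql.catalogImplementation` is an internal Spark setting, but it is readable at runtime in Spark 2.x):

    // Should print "hive" rather than "in-memory" if Hive support took effect
    System.out.println(spark.conf().get("spark.sql.catalogImplementation"));
    System.out.println(spark.conf().get("spark.sql.warehouse.dir"));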

Hive warehouse directory in hive-site.xml: /apps/hive/warehouse/

hadoop fs -ls /apps/hive/warehouse/
drwxrwxrwx   - root hadoop          0 2018-09-03 11:22 /apps/hive/warehouse/DB_X.db


hadoop fs -ls /apps/hive/warehouse/DB_X.db
(no output; the database directory is empty)

The error is thrown here:

spark
   .read()
   .table("DB_X.table_Y");

From the Java application:

spark.sql("show databases").show()
default
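The catalog API shows the same view (a sketch, assuming `spark` is the session built above):

    // Lists the databases visible to the application's catalog;
    // if DB_X is absent here too, the app is not talking to the Hive metastore
    spark.catalog().listDatabases().show();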

In the interactive spark-shell:

spark.sql("show databases").show()
default
DB_X
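One way to see which metastore each environment is actually talking to (a hedged sketch; it assumes Spark merged hive-site.xml into the Hadoop configuration, which it does when the file is on the driver's classpath):

    // A null result would suggest hive-site.xml was never picked up
    System.out.println(
        spark.sparkContext().hadoopConfiguration().get("hive.metastore.uris"));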

show create table table_Y:

CREATE EXTERNAL TABLE `table_Y`(
...
PARTITIONED BY (
  `partition` string COMMENT '')
...
    location '/data/kafka-connect/topics/table_Y'

Files on HDFS:

hadoop fs -ls /data/kafka-connect/topics/table_Y
drwxr-xr-x   - kafka hdfs          0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0
drwxr-xr-x   - kafka hdfs          0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=1

hadoop fs -ls /data/kafka-connect/topics/table_Y/partition=0
-rw-r--r--   3 kafka hdfs     102388 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001823382+0001824381.avro
-rw-r--r--   3 kafka hdfs     102147 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001824382+0001825381.avro
...
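As a cross-check, the Avro files can be read directly, bypassing the metastore (a sketch; it assumes the com.databricks:spark-avro package is on the classpath, since Avro support is not built into Spark 2.2):

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;

    // Reading a partition directory straight from HDFS confirms whether
    // the data itself is reachable from the application
    Dataset<Row> direct = spark.read()
        .format("com.databricks.spark.avro")
        .load("/data/kafka-connect/topics/table_Y/partition=0");
    direct.show(5);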

Everything works fine from spark-shell or the Hive shell.

hive-site.xml from the Hive conf directory was copied into spark2/conf.
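Since the "ApplicationMaster" prefix in the error suggests yarn cluster mode (the driver runs on a cluster node, not the edge node), one variant worth noting is shipping hive-site.xml with the job explicitly (a sketch; the conf path is the usual HDP location, and the class/jar names are placeholders):

    # com.example.MyApp and my-app.jar are hypothetical placeholders
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --files /usr/hdp/current/spark2-client/conf/hive-site.xml \
      --class com.example.MyApp \
      my-app.jar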

Using HDP 2.6.4.0-91 with Spark 2.2.

Any help would be appreciated.