Here are our versions:
Spark 1.6.1, Hadoop 2.6.2, Hive 1.1.0
I have hive-site.xml in the $SPARK_HOME/conf directory, and the hive.metastore.uris property is configured correctly:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://host.domain.com:3306/metastore</value>
  <description>metadata is stored in a MySQL server</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>MySQL JDBC driver class</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>user name for connecting to MySQL server</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>*****</value>
  <description>password for connecting to MySQL server</description>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://host.domain.com:9083</value>
  <description>IP address (or fully-qualified domain name) and port of the metastore host</description>
</property>
Unfortunately, Spark is creating a local Derby database instead of connecting to the MySQL metastore.
I need Spark to connect to the MySQL metastore, since it is the central store for all our metadata. Please help.
Regards,
Bala
1 Answer
When running in cluster mode, can you try passing hive-site.xml to spark-submit with the
--files
option?
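A minimal sketch of that suggestion; the master URL, application JAR, and main class below are hypothetical placeholders, so adjust them for your cluster:

```shell
# Ship hive-site.xml to the driver and executors so Spark picks up the
# MySQL metastore settings instead of falling back to a local Derby DB.
# com.example.MyApp and myapp.jar are placeholders for your application.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files "$SPARK_HOME/conf/hive-site.xml" \
  --class com.example.MyApp \
  myapp.jar
```

The idea is that in cluster deploy mode the driver runs on a cluster node, so a hive-site.xml that only exists under $SPARK_HOME/conf on the submitting machine may not be visible to it; --files distributes the file into each container's working directory, where Spark can find it on the classpath.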