
Unable to run Hive after changing hive-site.xml to connect Spark's HiveContext


Below is my hive/conf/hive-site.xml:

<configuration>
   <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://127.0.0.1/metastore?createDatabaseIfNotExist=true</value>
      <description>metadata is stored in a MySQL server</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
      <description>MySQL JDBC driver class</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hiveuser</value>
      <description>user name for connecting to mysql server</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hivepassword</value>
      <description>password for connecting to mysql server</description>
   </property>
</configuration>

I want to use Spark's HiveContext to access my existing Hive databases and tables, so I added the hive.metastore.uris property to hive/conf/hive-site.xml, which now looks like this:

<configuration>
   <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://127.0.0.1/metastore?createDatabaseIfNotExist=true</value>
      <description>metadata is stored in a MySQL server</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
      <description>MySQL JDBC driver class</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hiveuser</value>
      <description>user name for connecting to mysql server</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hivepassword</value>
      <description>password for connecting to mysql server</description>
   </property>
   <property>
      <name>hive.metastore.uris</name>
      <value>thrift://127.0.0.1:9083</value>
   </property>
</configuration>
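One thing worth noting (my reading of the failure, not stated in the original post): once hive.metastore.uris is set, every client, including the Hive shell itself, connects to a standalone metastore service at that address instead of opening the MySQL backend directly, so that service must be running first. A minimal sketch of starting it:

```shell
# Start the standalone Hive metastore service in the background;
# it listens on port 9083 by default, matching hive.metastore.uris above
hive --service metastore &
```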

After editing hive-site.xml as shown above, the Hive shell no longer works. Please show me the correct way to update hive-site.xml, and help me access Hive tables from spark-shell using a HiveContext, like this:

val hc = new org.apache.spark.sql.hive.HiveContext(sc)
hc.setConf("hive.metastore.uris", "thrift://127.0.0.1:9083")
val a = hc.sql("show databases")
a.show() // should display all my Hive databases

Please help me resolve this.

1 Answer


@Chaithu You need to start your Hive metastore with `hive --service metastore`, and then create a SparkSession with Hive support enabled, like this:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession
      .builder()
      .master("local")
      .appName("HiveExample")
      .config("hive.metastore.uris", "thrift://hadoop-master:9083")
      .enableHiveSupport()
      .getOrCreate()
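    With the session above, existing Hive databases should be reachable through `spark.sql`. A hedged usage sketch (the table name `mydb.mytable` is a placeholder, not from the original post):

    ```scala
    // List the databases registered in the Hive metastore
    spark.sql("show databases").show()

    // Query an existing Hive table (mydb.mytable is a hypothetical name)
    spark.sql("select * from mydb.mytable limit 10").show()
    ```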
    
