
Installing Spark on an existing Hadoop cluster (issue with Hive)

I want to set up a Spark/Shark cluster, but I keep running into the same problem. I have followed the instructions at https://github.com/amplab/shark/wiki/Running-Shark-on-a-Cluster, including the steps for Hive.

The details are below; any help would be great.

I have installed the following packages:

Spark / Shark 1.0.0

Apache Hadoop 2.4.0

Apache Hive 0.13

Scala 2.9.3

Java 7

I configured ~/spark/conf/spark-env.sh as follows:

    export HADOOP_HOME=/path/to/hadoop/
    export HIVE_HOME=/path/to/hive/
    export MASTER=spark://xxx.xxx.xxx.xxx:7077
    export SPARK_HOME=/path/to/spark
    export SPARK_MEM=4g
    export HIVE_CONF_DIR=/path/to/hive/conf/
    source $SPARK_HOME/conf/spark-env.sh
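
As a quick sanity check (nothing here beyond the placeholder paths above), sourcing the file in a shell and echoing the variables confirms that they resolve and that hive-site.xml is where HIVE_CONF_DIR points:

    # sanity check using the placeholder paths from above
    source ~/spark/conf/spark-env.sh
    echo "HADOOP_HOME=$HADOOP_HOME  HIVE_HOME=$HIVE_HOME  HIVE_CONF_DIR=$HIVE_CONF_DIR"
    ls "$HIVE_CONF_DIR"/hive-site.xml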

When I start Spark with "./spark-withinfo", I get the following errors:

    -hiveconf hive.root.logger=INFO,console

    Starting the Shark Command Line Client

    14/07/07 16:26:57 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead

    14/07/07 16:26:57 [main]: WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead

    Logging initialized using configuration in jar:file:/path/to/hive/lib/hive-exec-0.13.0.jar!/hive-log4j.properties

    14/07/07 16:26:57 [main]: INFO SessionState:

    Logging initialized using configuration in jar:file:/path/to/hive/lib/hive-exec-0.13.0.jar!/hive-log4j.properties

    14/07/07 16:26:57 [main]: INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore

    Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

            at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:344)

            at shark.SharkCliDriver$.main(SharkCliDriver.scala:128)

            at shark.SharkCliDriver.main(SharkCliDriver.scala)

    Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

            at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1139)

            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)

            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)

            at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2444)

            at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2456)

            at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:338)

            ... 2 more

    Caused by: java.lang.reflect.InvocationTargetException

            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

            at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

            at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1137)

            ... 7 more

    Caused by: java.lang.NoSuchFieldError: METASTOREINTERVAL

            at org.apache.hadoop.hive.metastore.RetryingRawStore.init(RetryingRawStore.java:78)

            at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:60)

            at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:413)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:401)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:439)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:325)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:285)

            at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)

            at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)

            at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4102)

            at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)

            ... 12 more

My guess is that Spark cannot find some of Hive's libraries needed to connect to the metastore, but I have been stuck on this for several days and don't know how to solve it. By the way, I use MySQL for the Hive metadata, and everything works fine within Hive itself.
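
For what it's worth, here is a rough way to see which Hive and MySQL jars Spark/Shark actually picks up (the paths are just the placeholders from above, and I am assuming compute-classpath.sh prints the classpath it builds):

    # rough checks, assuming the placeholder paths above
    ls $HIVE_HOME/lib/mysql-connector-java-*.jar      # the JDBC driver Hive uses for the MySQL metastore
    ls $HIVE_HOME/lib/hive-metastore-*.jar            # the metastore classes themselves
    $SPARK_HOME/bin/compute-classpath.sh | tr ':' '\n' | grep -iE 'hive|mysql'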

Any help is appreciated. Thanks in advance.

1 Answer

You probably need to add the MySQL connector jar before starting Spark. In my case, I added it to $SPARK_HOME/bin/compute-classpath.sh as shown below:

    # in $SPARK_HOME/bin/compute-classpath.sh
    CLASSPATH=$CLASSPATH:/opt/big/hive/lib/mysql-connector-java-5.1.25-bin.jar
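
Alternatively (I have not tried this on your exact setup), exposing the jar through SPARK_CLASSPATH in spark-env.sh should have a similar effect; the jar path is just the one from above:

    # untested alternative: add the connector jar via spark-env.sh instead of compute-classpath.sh
    export SPARK_CLASSPATH=$SPARK_CLASSPATH:/opt/big/hive/lib/mysql-connector-java-5.1.25-bin.jar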
    
