I am unable to create a Hive table mapped to an HBase table using 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' through the Spark Thrift Server in a kerberized cluster. It fails with a Kerberos exception. I have provided the details below. The same query works fine through the Hive Thrift Server.

Hadoop Version: 2.7.3.2.6.2.0-205

Hive Version: 1.2.1000.2.6.2.0-205

Spark Version: 1.6.3

HBase Version: 1.1.2.2.6.2.0-205

Query (via Spark Thrift Server): CREATE TABLE IF NOT EXISTS pages (rowkey STRING, pageviews STRING, bytes STRING) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,pageDetails:pageViews,pageDetails:sizeInBytes') TBLPROPERTIES ('hbase.table.name' = 'pages');

Exception:

Error: java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions: Tue Dec 19 11:08:04 UTC 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68402: Could not set up IO Streams to hdp-dev43-worker03.dev/XX.XX.XX.31:16020 row 'pages,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hdp-dev43-worker03.dev,16020,1513225021878, seqNum=0

Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68402: Could not set up IO Streams to hdp-dev43-worker03.dev/XX.XX.XX.31:16020 row 'pages,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hdp-dev43-worker03.dev,16020,1513225021878, seqNum=0

Caused by: java.io.IOException: Could not set up IO Streams to hdp-dev43-worker03.dev/XX.XX.XX.31:16020

Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'

Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
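For context, the GSS/SASL failure above typically means the HBase client running inside the Spark Thrift Server process could not find a Kerberos ticket or the HBase security settings. A minimal sketch of the security-related entries an hbase-site.xml on the Spark Thrift Server classpath would need, assuming a standard secure HBase setup (the realm and principal values below are placeholders, not taken from this cluster):

```xml
<!-- hbase-site.xml fragment: Kerberos settings the HBase client reads.
     Principal values are illustrative placeholders. -->
<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hbase.master.kerberos.principal</name>
  <value>hbase/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>hbase/_HOST@EXAMPLE.COM</value>
</property>
```

If this file is missing from the Spark Thrift Server's classpath, the client falls back to simple authentication and the region server rejects the connection with exactly the "Failed to find any Kerberos tgt" error shown above.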