I have Cassandra 3.0.0 running on localhost (127.0.0.1:9042). It is reachable from cqlsh and I can create and query tables.
In my Spark project I added the Maven dependencies for the Cassandra connector, the Cassandra driver, and so on.
When I now try to read data from or insert data into Cassandra, the Cassandra connector connects to the Cassandra cluster, but I get the following error: Exception in thread "main" java.util.NoSuchElementException: key not found: 'text'
I have tried different tables, versions, encodings, etc. Nothing helped. I suspect the problem is a wrong or missing Maven dependency. Maybe you can help me. Here are my code and my dependencies:
Cassandra table: CREATE TABLE mykeyspace2.kv(key text PRIMARY KEY, value int);
Spark code:
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._
import org.apache.spark.sql.cassandra.CassandraSQLContext

val master = config.getString(Configuration.SPARK_MASTER)
logger.info(s"Starting, spark master: $master")
val sparkConf = new SparkConf()
.setAppName("test streaming")
.setMaster(master)
.set("spark.cassandra.connection.host", "127.0.0.1")
val sc = new SparkContext(sparkConf)
val cc = new CassandraSQLContext(sc)
val newRdd = sc.parallelize(Seq(("cat",40),("fox",50)))
newRdd.saveToCassandra("mykeyspace2","kv",SomeColumns("key", "value"))
val rdd = sc.cassandraTable("mykeyspace2", "users3")
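For reference, here is a self-contained variant of the above (imports included, reading back the same kv table instead of users3; "local[2]" is just a stand-in for my actual master setting):

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._   // brings in saveToCassandra, cassandraTable, SomeColumns

object KvRoundTrip {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf()
      .setAppName("kv round trip")
      .setMaster("local[2]")                                  // stand-in for the real master
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext(sparkConf)

    // write two rows into mykeyspace2.kv (key text, value int)
    sc.parallelize(Seq(("cat", 40), ("fox", 50)))
      .saveToCassandra("mykeyspace2", "kv", SomeColumns("key", "value"))

    // read them back and print them, which also exercises the schema/type mapping
    sc.cassandraTable[(String, Int)]("mykeyspace2", "kv")
      .select("key", "value")
      .collect()
      .foreach(println)

    sc.stop()
  }
}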
Maven dependencies:
<dependencies>
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.10</artifactId>
<version>1.5.0-M3</version>
</dependency>
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector-java_2.10</artifactId>
<version>1.5.0-M3</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.0.0-alpha4</version>
</dependency>
<!-- Spark dependencies -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka_2.10</artifactId>
<version>1.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.5.0</version>
</dependency>
</dependencies>
Error:
15/12/04 07:43:40 INFO Cluster: New Cassandra host /127.0.0.1:9042 added
15/12/04 07:43:40 INFO CassandraConnector: Connected to Cassandra cluster: Test Cluster
Exception in thread "main" java.util.NoSuchElementException: key not found: 'text'
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at com.datastax.spark.connector.types.ColumnType$.fromDriverType(ColumnType.scala:81)
    at com.datastax.spark.connector.cql.ColumnDef$.apply(Schema.scala:117)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchPartitionKey$1.apply(Schema.scala:199)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchPartitionKey$1.apply(Schema.scala:198)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchPartitionKey(Schema.scala:198)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchTables$1$2.apply(Schema.scala:239)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchTables$1$2.apply(Schema.scala:238)
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
    at scala.collection.immutable.Set$Set3.foreach(Set.scala:115)
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
    at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchTables$1(Schema.scala:238)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1$2.apply(Schema.scala:247)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1$2.apply(Schema.scala:246)
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
    at scala.collection.immutable.HashSet$HashSet1.foreach(HashSet.scala:153)
    at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:306)
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
    at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1(Schema.scala:246)
    at com.datastax.spark.connector.cql.Schema$$anonfun$fromCassandra$1.apply(Schema.scala:252)
    at com.datastax.spark.connector.cql.Schema$$anonfun$fromCassandra$1.apply(Schema.scala:249)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withClusterDo$1.apply(CassandraConnector.scala:121)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withClusterDo$1.apply(CassandraConnector.scala:120)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109)
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
    at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)
    at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:249)
    at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:263)
    at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
2 Answers
I tried several combinations of connector and driver versions, and eventually found one that works for me. In fact, the compatibility matrix on the Spark connector GitHub page (https://github.com/datastax/spark-cassandra-connector) did not seem to be correct: with the compatibility list given there, the components did not work together.
The working combination is as follows:
Cassandra 2.1.12
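Independent of the exact versions, note that the failing call in the stack trace (ColumnType.fromDriverType) is the connector's CQL type mapping, and this kind of "key not found" error is commonly a sign that the cassandra-driver-core on the classpath is a different (typically newer) version than the one the connector release was built against, e.g. when the driver is pinned explicitly in the pom as in the question. A quick way to sanity-check a combination before running a whole job is to open a session through the connector itself; this is only a sketch and assumes the same localhost setup as in the question:

import org.apache.spark.SparkConf
import com.datastax.spark.connector.cql.CassandraConnector

// Minimal connectivity check through the connector's own session handling.
// It only proves that the connector + driver can reach the node; the type-mapping
// error from the question would still surface once a table schema is read.
val conf = new SparkConf().set("spark.cassandra.connection.host", "127.0.0.1")

CassandraConnector(conf).withSessionDo { session =>
  val row = session.execute("SELECT release_version FROM system.local").one()
  println(s"Connected, Cassandra release_version = ${row.getString("release_version")}")
}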
I am using Cassandra 3.2.1 and the dependencies below work fine for me. A late answer, but it might help someone.
With these versions I was getting errors like insufficient heap memory, so I set the JVM parameters to -Xmx1024m -Xms512m.
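For a plain local run started from an IDE or from Maven, those -Xmx/-Xms flags have to go on that JVM itself (run configuration or MAVEN_OPTS), because the driver heap is already fixed by the time SparkConf is read. When the job is instead submitted with spark-submit, memory can also be carried in the application configuration; the snippet below is only a sketch and the 1g value is illustrative:

val sparkConf = new SparkConf()
  .setAppName("test streaming")
  .set("spark.cassandra.connection.host", "127.0.0.1")
  // Executor heap for a real cluster run (honoured by spark-submit).
  // Driver heap cannot be set here: use spark-submit --driver-memory,
  // or -Xmx/-Xms on the JVM for a local/IDE run as described above.
  .set("spark.executor.memory", "1g")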