private static void deserialize() {
    // Requires: org.apache.avro.io.DatumReader, org.apache.avro.specific.SpecificDatumReader,
    // org.apache.avro.file.DataFileReader, java.io.File, java.io.IOException
    // Deserialize Users from disk
    DatumReader<User> userDatumReader = new SpecificDatumReader<>(User.class);
    try (DataFileReader<User> dataFileReader =
             new DataFileReader<>(new File("users.avro"), userDatumReader)) {
        User user = null;
        while (dataFileReader.hasNext()) {
            // Reuse the user object by passing it to next(). This saves us from
            // allocating and garbage collecting many objects for files with
            // many items.
            user = dataFileReader.next(user);
            System.out.println("deserialized: " + user);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Here is my solution: this code prints the consumer messages in the Avro schema format.
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092"); // adjust to your brokers
props.put("group.id", "avro-consumer-group");     // adjust to your group
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
props.put("schema.registry.url", "http://kafka-.XXX");
// Deserialize into the generated specific class rather than GenericRecord
props.put("specific.avro.reader", "true");

KafkaConsumer<String, avro_schema> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Arrays.asList("topic"));
// Infinite poll loop
try {
    while (true) {
        ConsumerRecords<String, avro_schema> records = consumer.poll(200L);
        for (ConsumerRecord<String, avro_schema> record : records) {
            System.out.println(record.value());
        }
    }
} finally {
    consumer.close();
}
2 Answers
Follow these steps:
1. Generate Java objects from the Avro schema.
This will generate the appropriate source files, in a package based on the schema's namespace.
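Step 1 is typically done with the avro-maven-plugin (the standalone avro-tools jar can also compile schemas). A minimal pom.xml sketch, assuming the .avsc files live under src/main/avro (the version and directory paths here are illustrative and should be adjusted to your project):

```xml
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.11.1</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <!-- Where the .avsc schema files are read from -->
        <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
        <!-- Generated classes land here, in a package matching the schema's namespace -->
        <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn generate-sources` produces the specific record classes (such as the User class used above) that SpecificDatumReader and the KafkaAvroDeserializer can deserialize into.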