
Failed to send SSL Close message


I have a thread that periodically lists the topics on Message Hub. From time to time, though, I get: Failed to send SSL Close message.

Any ideas?

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(getConsumerConfiguration());
try {
    Map<String, List<PartitionInfo>> topics = consumer.listTopics();
    return new ArrayList<String>(topics.keySet());
} finally {
    if (consumer != null) {
        consumer.close();
    }
}
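Since KafkaConsumer implements Closeable, the explicit finally block above can be replaced with try-with-resources (Java 7+), which guarantees close() runs even when listTopics() throws. The sketch below demonstrates the pattern with a hypothetical stand-in class, since a real KafkaConsumer needs a broker:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in for KafkaConsumer: any AutoCloseable resource
// whose close() must run even if listing topics throws.
class FakeConsumer implements AutoCloseable {
    static boolean closed = false;
    List<String> listTopics() { return Arrays.asList("orders", "payments"); }
    @Override public void close() { closed = true; }
}

class TopicLister {
    static List<String> listChannelNames() {
        // try-with-resources calls close() automatically on exit,
        // replacing the explicit finally block from the question.
        try (FakeConsumer consumer = new FakeConsumer()) {
            return new ArrayList<>(consumer.listTopics());
        }
    }

    public static void main(String[] args) {
        List<String> names = listChannelNames();
        // prints: [orders, payments] closed=true
        System.out.println(names + " closed=" + FakeConsumer.closed);
    }
}
```

Note this only tidies the close path; the warning itself comes from inside close(), so the pattern does not make it go away.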

I get the warning on the consumer.close call.

Consumer configuration:

  • sasl.mechanism = PLAIN

  • security.protocol = SASL_SSL

  • group.id = consumer1

  • ssl.enabled.protocol = TLSv1.2

  • ssl.endpoint.identification.algorithm = HTTPS

  • ssl.protocol = TLSv1.2

  • sasl.jaas.config = org.apache.kafka.common.security.plain.PlainLoginModule required username = "USERNAME" password = "PASSWORD";
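For reference, the bullet list above maps onto a java.util.Properties object roughly as follows (a sketch; USERNAME/PASSWORD are the placeholders from the question, and note that the Kafka client's key is ssl.enabled.protocols, plural, so the singular spelling shown above is not a recognized client config):

```java
import java.util.Properties;

class ConsumerConfigBuilder {
    // Sketch of the getConsumerConfiguration() method referenced in the
    // question, built only from the settings listed above.
    static Properties getConsumerConfiguration() {
        Properties props = new Properties();
        props.put("sasl.mechanism", "PLAIN");
        props.put("security.protocol", "SASL_SSL");
        props.put("group.id", "consumer1");
        // The standard client key is "ssl.enabled.protocols" (plural);
        // the singular form in the question is not a known config.
        props.put("ssl.enabled.protocols", "TLSv1.2");
        props.put("ssl.endpoint.identification.algorithm", "HTTPS");
        props.put("ssl.protocol", "TLSv1.2");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"USERNAME\" password=\"PASSWORD\";");
        return props;
    }
}
```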

[WARN] 2018-01-25 20:12:23.204 [ClusterChannelMonitorTaskThread] org.apache.kafka.common.network.SslTransportLayer {} - Failed to send SSL Close message
java.io.IOException: Unexpected status returned by SSLEngine.wrap, expected CLOSED, received OK. Will not send close message to peer.
    at org.apache.kafka.common.network.SslTransportLayer.close(SslTransportLayer.java:158) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.common.utils.Utils.closeAll(Utils.java:663) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.common.network.KafkaChannel.close(KafkaChannel.java:59) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.common.network.Selector.doClose(Selector.java:582) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.common.network.Selector.close(Selector.java:573) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.common.network.Selector.close(Selector.java:539) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.common.network.Selector.close(Selector.java:250) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.clients.NetworkClient.close(NetworkClient.java:505) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.close(ConsumerNetworkClient.java:439) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.clients.ClientUtils.closeQuietly(ClientUtils.java:71) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.clients.consumer.KafkaConsumer.close(KafkaConsumer.java:1613) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.clients.consumer.KafkaConsumer.close(KafkaConsumer.java:1573) [kafka-clients-0.11.0.0.jar:?]
    at org.apache.kafka.clients.consumer.KafkaConsumer.close(KafkaConsumer.java:1549) [kafka-clients-0.11.0.0.jar:?]
    at com.ibm.saas.msg.kafka.KafkaMessageService.listChannelNames(KafkaMessageService.java:305) [saas-msg-kafka-TRUNK-SNAPSHOT.jar:TRUNK-SNAPSHOT]

2 Answers

  • 1

    Hit this exception today as well, with Kafka client 1.0.2 :/

  • 0

    Please make sure you set appropriate values in /etc/hosts; this is very important. In my case I set listeners=SASL_SSL://10.10.10.3:9093 in the server.properties file, so I needed the entry 10.10.10.3 SzymekKafka in /etc/hosts, where SzymekKafka is my hostname.
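Concretely, that answer amounts to keeping the broker's advertised listener host resolvable, i.e. the /etc/hosts entry must match the address used in server.properties:

```
# server.properties (broker side)
listeners=SASL_SSL://10.10.10.3:9093

# /etc/hosts (on the machine resolving the broker)
10.10.10.3   SzymekKafka
```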
