Flink consuming from Kafka: transaction close issue


宁吉浩
hi, all
I've recently been using Flink to read from and write to Kafka, and the job frequently logs producer close messages.
However, Flink checkpoints have not failed over the observation period, data is being written to Kafka, and the Kafka server.log shows no errors either.
I haven't yet verified whether any of the data written to Kafka was lost. Can anyone explain what is causing the logs below?

The following logs are output frequently:
2020-09-29 14:13:28,916 INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version: 2.2.0
2020-09-29 14:13:28,916 INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId: 05fcfde8f69b0349
2020-09-29 14:13:28,917 INFO org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer - Starting FlinkKafkaInternalProducer (1/1) to produce into default topic Dim_OLMS_MediaResources_collect
2020-09-29 14:13:28,918 INFO org.apache.kafka.clients.producer.internals.TransactionManager - [Producer clientId=producer-32, transactionalId=Source: readKafka_DWD2Dim_collect_DWD2Dim_collect -> flatMap_DWD2Dim_collect_DWD2Dim_collect -> Sink: DataStreamSink_DWD2Dim_collect_DWD2Dim_collect-1cf52c94a0b506fe08b6a6a2a44aa4ec-1] ProducerId set to -1 with epoch -1
2020-09-29 14:13:29,020 INFO org.apache.kafka.clients.Metadata - Cluster ID: kcv8nwDySzie-OBCzqs15w
2020-09-29 14:13:29,127 INFO org.apache.kafka.clients.producer.internals.TransactionManager - [Producer clientId=producer-32, transactionalId=Source: readKafka_DWD2Dim_collect_DWD2Dim_collect -> flatMap_DWD2Dim_collect_DWD2Dim_collect -> Sink: DataStreamSink_DWD2Dim_collect_DWD2Dim_collect-1cf52c94a0b506fe08b6a6a2a44aa4ec-1] ProducerId set to 2297 with epoch 2
2020-09-29 14:13:29,227 INFO org.apache.flink.streaming.api.functions.sink.TwoPhaseCommitSinkFunction - FlinkKafkaProducer 0/1 - checkpoint 6 complete, committing transaction TransactionHolder{handle=KafkaTransactionState [transactionalId=Source: readKafka_DWD2Dim_collect_DWD2Dim_collect -> flatMap_DWD2Dim_collect_DWD2Dim_collect -> Sink: DataStreamSink_DWD2Dim_collect_DWD2Dim_collect-1cf52c94a0b506fe08b6a6a2a44aa4ec-2, producerId=1321, epoch=2], transactionStartTime=1601359949123} from checkpoint 6
2020-09-29 14:13:29,231 INFO org.apache.kafka.clients.producer.KafkaProducer - [Producer clientId=producer-31, transactionalId=Source: readKafka_DWD2Dim_collect_DWD2Dim_collect -> flatMap_DWD2Dim_collect_DWD2Dim_collect -> Sink: DataStreamSink_DWD2Dim_collect_DWD2Dim_collect-1cf52c94a0b506fe08b6a6a2a44aa4ec-2] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
2020-09-29 14:14:28,914 INFO org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaInternalProducer - Flushing new partitions
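For context, this log pattern is what an EXACTLY_ONCE `FlinkKafkaProducer` normally produces: on each completed checkpoint the sink commits the transaction of the previous checkpoint's internal producer and then closes that producer, while a fresh transactional producer (new `clientId`, new `ProducerId`/epoch) is started for the next transaction. Below is a minimal, hedged sketch of such a sink configuration (topic name and properties are illustrative, assuming the Flink 1.9+ `FlinkKafkaProducer` constructor that takes a `KafkaSerializationSchema` and a `Semantic`):

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;

import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ExactlyOnceSinkSketch {

    // Hypothetical helper; topic and properties are placeholders.
    public static FlinkKafkaProducer<String> buildSink(String topic, Properties props) {
        // transaction.timeout.ms must not exceed the broker's
        // transaction.max.timeout.ms (15 minutes by default), otherwise
        // the producer fails at initialization.
        props.setProperty("transaction.timeout.ms", "900000");

        KafkaSerializationSchema<String> schema = (element, timestamp) ->
                new ProducerRecord<>(topic, element.getBytes(StandardCharsets.UTF_8));

        // With Semantic.EXACTLY_ONCE the sink keeps a pool of transactional
        // producers: each checkpoint writes into its own transaction, and once
        // that checkpoint completes, the transaction is committed and the
        // producer is closed -- which emits the INFO
        // "Closing the Kafka producer with timeoutMillis = ..." line seen above.
        return new FlinkKafkaProducer<>(
                topic, schema, props, FlinkKafkaProducer.Semantic.EXACTLY_ONCE);
    }
}
```

Under this reading, the repeated producer start/close INFO lines once per checkpoint interval are expected bookkeeping of the two-phase commit, not a sign of data loss, though that is an assumption worth confirming against the job's actual sink configuration.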