How to stop Kafka producer retry messages (Debezium + Azure Event Hub)

I have set up Debezium and Azure Event Hub as a CDC pipeline from PostgreSQL, exactly as in this tutorial: https://dev.to/azure/tutorial-set-up-a-change-data-capture-architecture-on-azure-using-debezium-postgres-and-kafka-49h6

Everything was working fine until I changed something (I don't know exactly what). Now my kafka-connect log is spammed with the WARN entries below and CDC has stopped working:

[2022-03-03 08:31:28,694] WARN [dbz-ewldb-connector|task-0] [Producer clientId=connector-producer-dbz-ewldb-connector-0] Got error produce response with correlation id 2027 on topic-partition ewldb-0, retrying (2147481625 attempts left). Error: REQUEST_TIMED_OUT (org.apache.kafka.clients.producer.internals.Sender:616)
[2022-03-03 08:31:28,775] WARN [dbz-cmddb-connector|task-0] [Producer clientId=connector-producer-dbz-cmddb-connector-0] Got error produce response with correlation id 1958 on topic-partition cmddb-0, retrying (2147481694 attempts left). Error: REQUEST_TIMED_OUT (org.apache.kafka.clients.producer.internals.Sender:616)
[2022-03-03 08:31:28,800] WARN [dbz-ewldb-connector|task-0] [Producer clientId=connector-producer-dbz-ewldb-connector-0] Got error produce response with correlation id 2028 on topic-partition ewldb-0, retrying (2147481624 attempts left). Error: REQUEST_TIMED_OUT (org.apache.kafka.clients.producer.internals.Sender:616)
[2022-03-03 08:31:28,880] WARN [dbz-cmddb-connector|task-0] [Producer clientId=connector-producer-dbz-cmddb-connector-0] Got error produce response with correlation id 1959 on topic-partition cmddb-0, retrying (2147481693 attempts left). Error: REQUEST_TIMED_OUT (org.apache.kafka.clients.producer.internals.Sender:616)

These messages keep appearing even after I delete the Kafka connectors. Restarting Kafka and Kafka Connect does not help. How can I stop these retries?



Solution 1: [1]

The only workaround that helps is the following (example commands below):

  1. Delete the connector via the Kafka Connect (Debezium) REST API
  2. Stop Kafka Connect
  3. Delete the Event Hub
  4. Start Kafka Connect
  5. Re-add the connector via the Kafka Connect (Debezium) REST API
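
As an illustration of steps 1, 3 and 5 (a rough sketch, not part of the original answer; the worker URL, the Azure resource names and the connector JSON file are placeholders you would replace with your own):

  # 1. Delete the connector through the Kafka Connect REST API
  curl -X DELETE http://localhost:8083/connectors/dbz-ewldb-connector

  # 3. Delete the Event Hub that backs the topic, e.g. with the Azure CLI
  az eventhubs eventhub delete \
    --resource-group <my-resource-group> \
    --namespace-name <my-eventhubs-namespace> \
    --name ewldb

  # 5. Re-register the connector from its original JSON definition
  curl -X POST -H "Content-Type: application/json" \
    --data @pg-source-connector.json \
    http://localhost:8083/connectors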

To permanently change the retry behaviour, change the following producer parameter (example below):

  • producer.retries=10 (the default is Integer.MAX_VALUE, i.e. over 2 billion, which is what causes the spam in kafka-connect.log)
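
As a sketch of where that setting goes, assuming a standard distributed Connect worker (the file name and the override-policy line are illustrative, not from the original answer):

  # connect-distributed.properties (worker level, applies to all connectors)
  # cap producer retries instead of the default Integer.MAX_VALUE
  producer.retries=10

  # optional: allow per-connector overrides, so that a single connector's
  # JSON config can set "producer.override.retries": "10"
  connector.client.config.override.policy=All

The worker has to be restarted for changes to its properties file to take effect, and producer.override.* settings in a connector config are only honoured when the worker's override policy permits them.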

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Tcheslav