Kafka producer can't create topics and throws continuous errors after creating a Debezium MySQL connector

I am using Debezium as a CDC tool to stream data from MySQL. After installing the Debezium MySQL connector into a Confluent OSS cluster, I am trying to capture MySQL bin_log changes in a Kafka topic. When I create the connector, after it takes the snapshot of the database, I am left with a continuous series of errors.

I checked that the MySQL bin_log is ON and tried restarting the schema registry and the connector with different serializers, but I keep getting the same errors.

Error logs show:

[2019-06-21 13:56:14,885] INFO Step 8: - Completed scanning a total of 955 rows from table 'mydb.test' after 00:00:00.086 (io.debezium.connector.mysql.SnapshotReader:565)
[2019-06-21 13:56:14,886] INFO Step 8: scanned 1758 rows in 2 tables in 00:00:00.383 (io.debezium.connector.mysql.SnapshotReader:601)
[2019-06-21 13:56:14,886] INFO Step 9: committing transaction (io.debezium.connector.mysql.SnapshotReader:635)
[2019-06-21 13:56:14,887] INFO Completed snapshot in 00:00:01.055 (io.debezium.connector.mysql.SnapshotReader:701)
[2019-06-21 13:56:14,965] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 11 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION} (org.apache.kafka.clients.NetworkClient:968)
[2019-06-21 13:56:15,066] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 12 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION} (org.apache.kafka.clients.NetworkClient:968)
[2019-06-21 13:56:15,168] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 13 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION} (org.apache.kafka.clients.NetworkClient:968)
[2019-06-21 13:56:15,269] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 14 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION} (org.apache.kafka.clients.NetworkClient:968)
[2019-06-21 13:56:15,370] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 15 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION}

The connector payload that I am sending is as follows:

{
      "name": "debezium-connector",
      "config": {
            "connector.class": "io.debezium.connector.mysql.MySqlConnector",
            "tasks.max": "1",
            "key.serializer": "io.confluent.connect.avro.AvroConverter",
            "value.serializer": "io.confluent.connect.avro.AvroConverter",
            "database.hostname": "localhost",
            "database.port": "3306",
            "database.user": "test",
            "database.password": "test@123",
            "database.whitelist": "mydb",
            "table.whitelist": "mydb.test",
            "database.server.id": "1",
            "database.server.name": "kbserver",
            "database.history.kafka.bootstrap.servers": "kafka:9092",
            "database.history.kafka.topic": "db-schema.mydb",
            "include.schema.changes": "true"
       }
    }
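For context, a connector payload like this is normally registered by POSTing it to the Kafka Connect REST API. A sketch, assuming Connect is listening on its default port 8083 and the payload is saved as debezium-connector.json (both assumptions, adjust for your setup):

```shell
# Register the connector with the Kafka Connect REST API.
# localhost:8083 is the Connect default listener; change it if yours differs.
curl -s -X POST \
  -H "Content-Type: application/json" \
  --data @debezium-connector.json \
  http://localhost:8083/connectors
```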

Does anyone know why this is happening, or how I can fix it?



Solution 1:[1]

Please check database.whitelist and table.whitelist: they are inconsistent. They should be either mydb and mydb.test, or db and db.test, depending on the actual name of the database.
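For example, assuming the database really is named mydb, the two properties in the connector config should line up like this (a config fragment, not a full payload):

```json
"database.whitelist": "mydb",
"table.whitelist": "mydb.test"
```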

Solution 2:[2]

I had the same error. In my case, I had not created the schema change topic ahead of time. See: https://debezium.io/documentation/reference/stable/connectors/sqlserver.html#about-the-debezium-sqlserver-connector-schema-change-topic

Once I created that topic, the error went away and data began streaming.
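For the config in the question, the schema change topic is named after database.server.name, i.e. kbserver, which matches the topic in the UNKNOWN_TOPIC_OR_PARTITION warnings. A sketch of creating it up front with the stock Kafka CLI (the partition and replication settings are assumptions, adjust to your cluster; --bootstrap-server requires Kafka 2.2+):

```shell
# Create the schema change topic before starting the connector,
# so the producer does not fail with UNKNOWN_TOPIC_OR_PARTITION.
kafka-topics --create \
  --bootstrap-server kafka:9092 \
  --topic kbserver \
  --partitions 1 \
  --replication-factor 1
```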

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Jiri Pechanec
Solution 2: mike d