Category "apache-kafka-connect"

Kafka S3 sink connector: keys and headers not written to S3 storage

I have enabled "store.kafka.keys" : "true", "store.kafka.headers" : "true", "keys.format.class" : "io.confluent.connect.s3.format.json.JsonFormat", "headers.for

Kafka Connect S3 sink with multiple partitions

I have multiple questions about the Kafka Connect S3 sink connector. 1. I was wondering if it's possible, using the S3 sink of Kafka Connect, to save records with mu…
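
Splitting S3 object paths by a record field is usually done with a partitioner; a hedged sketch using FieldPartitioner (the field name is a placeholder):

    "partitioner.class": "io.confluent.connect.storage.partitioner.FieldPartitioner",
    "partition.field.name": "customer_id"

This should produce object paths along the lines of topics/my-topic/customer_id=42/… in the bucket; TimeBasedPartitioner is the analogous choice for time-bucketed paths.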

Kafka Connect consumer group members with no assignments available

I have a Kafka Connect task which fetches data from a topic with 3 partitions and sends the data to a Cassandra sink, so I have Kafka Connect in distributed mode with…
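
Sink tasks all join the same consumer group, so a 3-partition topic can keep at most 3 members busy; any extra members show up with no assignments. A hedged sketch capping the task count (the DataStax Cassandra sink class is an assumption, as is the topic name):

    {
      "name": "cassandra-sink",
      "config": {
        "connector.class": "com.datastax.oss.kafka.sink.CassandraSinkConnector",
        "topics": "my-topic",
        "tasks.max": "3"
      }
    }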

How to run the Kafka S3 sink connector in Confluent 6.2.0

I have installed Confluent 6.2.0 on my 3 Kafka nodes, also installed confluentinc-kafka-connect-s3-10.0.1 on all 3 nodes, and modified the quickstart-s3.properties…
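
For comparison, a quickstart-s3.properties along the lines of the one the connector ships with (bucket and region are placeholders):

    name=s3-sink
    connector.class=io.confluent.connect.s3.S3SinkConnector
    tasks.max=1
    topics=s3_topic
    s3.region=us-east-1
    s3.bucket.name=my-bucket
    s3.part.size=5242880
    flush.size=3
    storage.class=io.confluent.connect.s3.storage.S3Storage
    format.class=io.confluent.connect.s3.format.avro.AvroFormat
    partitioner.class=io.confluent.connect.storage.partitioner.DefaultPartitioner
    schema.compatibility=NONE

On a single node this can be started with something like connect-standalone /etc/kafka/connect-standalone.properties quickstart-s3.properties.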

Confluent Connect 5.5.1 is throwing Exception: java.lang.OutOfMemoryError UncaughtExceptionHandler in thread kafka-coordinator-heartbeat-thread |…

I have a large Confluent Kafka cluster comprising multiple sub-clusters: one for ZooKeeper, another for the Kafka brokers with Schema Registry and KSQL str…
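
Connect workers take their JVM heap from KAFKA_HEAP_OPTS, and an OOM in the coordinator heartbeat thread often just means the worker heap is too small for the connector load. A sketch, with illustrative sizes:

    export KAFKA_HEAP_OPTS="-Xms1G -Xmx4G"
    connect-distributed /etc/kafka/connect-distributed.properties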

Kafka & Connect - how to fix AVRO Schema Data type

Setup: multiple independent source systems push AVRO events into a Kafka topic. A Kafka S3 sink connector reads AVRO events from this topic and writes into S3 pa…
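
If the divergent types can be coerced to a single target type at the sink, one hedged option is a Cast SMT on the S3 connector config (field names and target types below are placeholders):

    "transforms": "castTypes",
    "transforms.castTypes.type": "org.apache.kafka.connect.transforms.Cast$Value",
    "transforms.castTypes.spec": "amount:float64,quantity:int32"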

How to load multiple PostgreSQL tables into multiple Kafka topics in a Google Cloud environment?

Load multiple PostgreSQL tables into multiple Kafka topics in a Google Cloud environment using Pub/Sub or Kafka Connect.
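
On the Kafka Connect side, the Confluent JDBC source maps each whitelisted table to its own topic via topic.prefix; a sketch with placeholder connection details and table names:

    {
      "name": "postgres-jdbc-source",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://10.0.0.5:5432/mydb",
        "connection.user": "postgres",
        "connection.password": "********",
        "mode": "incrementing",
        "incrementing.column.name": "id",
        "table.whitelist": "customers,orders,payments",
        "topic.prefix": "pg."
      }
    }

This yields the topics pg.customers, pg.orders, and pg.payments.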

Kafka connector and Schema Registry - Error Retrieving Avro Schema - Subject not found

I have a topic that will eventually have lots of different schemas on it. For now it just has the one. I've created a Connect job via REST like this: { "name"…
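
"Subject not found" usually means the converter is looking up the subject <topic>-value while the producers registered their schemas under a different subject naming strategy. A hedged converter sketch that switches the lookup strategy (Schema Registry URL is a placeholder):

    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "value.converter.value.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy"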

java.lang.RuntimeException: Failed to resolve Oracle database version

I am using the Debezium Oracle connector in Kafka Connect. While starting the connector I am getting the error below: java.lang.RuntimeException: Failed to resolve Oracle da…
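
This error is typically raised before snapshotting, when the connector cannot reach or query the database at all (wrong host/port/credentials, or the Oracle JDBC driver missing from the plugin path). A minimal Debezium Oracle config sketch with placeholder credentials, using pre-Debezium-2.0 property names:

    {
      "name": "oracle-connector",
      "config": {
        "connector.class": "io.debezium.connector.oracle.OracleConnector",
        "database.hostname": "oracle-host",
        "database.port": "1521",
        "database.user": "c##dbzuser",
        "database.password": "dbz",
        "database.dbname": "ORCLCDB",
        "database.server.name": "server1",
        "database.history.kafka.bootstrap.servers": "localhost:9092",
        "database.history.kafka.topic": "schema-changes.oracle"
      }
    }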

Can Kafka Connect consume data from a separate kerberized Kafka instance and then route to Splunk?

My pipeline is: Kerberized Kafka --> Logstash (hosted on a different server) --> Splunk. Can I replace the Logstash component with Kafka Connect? Could…
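
In principle yes: point a Connect worker at the Kerberized cluster and run the Splunk sink connector on it. A hedged worker-config sketch, where the consumer. prefix applies the settings to sink-task consumers (the worker also needs its own security.protocol/sasl.* settings for its internal topics; JAAS and keytab details omitted):

    bootstrap.servers=kerberized-kafka:9092
    consumer.security.protocol=SASL_PLAINTEXT
    consumer.sasl.mechanism=GSSAPI
    consumer.sasl.kerberos.service.name=kafka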

Kafka producer can't create topics and throws continuous errors after creating a Debezium MySQL connector

I am using Debezium as a CDC tool to stream data from MySQL. After installing the Debezium MySQL connector on a Confluent OSS cluster, I am trying to capture MySQL bi…
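
If the brokers have auto.create.topics.enable=false, Debezium's history and change topics must exist before the connector starts. A hedged example pre-creating the history topic (the name must match database.history.kafka.topic; Debezium's docs recommend a single partition and unlimited retention; replication factor assumes 3 brokers):

    kafka-topics --bootstrap-server localhost:9092 --create \
      --topic dbhistory.mydb --partitions 1 --replication-factor 3 \
      --config retention.ms=-1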

Messages lost/missing from the same topic when read with a different consumer group in Kafka

I have been encountering a weird issue with Kafka and the Confluent sink connector I am using in my setup. I have a system in which I have two Kafka Connect s…

"The $changeStream stage is only supported on replica sets" error while using mongodb-source-connect

I get an error when running kafka-mongodb-source-connect. I was trying to run connect-standalone with connect-avro-standalone.properties and MongoSourceConnector…
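
Change streams (the $changeStream stage) require a replica set; for development a single-node replica set is enough. A hedged sketch:

    # mongod.conf
    replication:
      replSetName: rs0

    # then, once, in the mongo shell:
    rs.initiate()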

How can an org.apache.kafka.connect.data.Decimal stored in an Avro file be converted to a Python type?

I am trying to interpret an Avro record stored by Debezium in Kafka, using Python: { "name": "id", "type": {…
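
Connect's Decimal logical type stores the two's-complement, big-endian bytes of the unscaled value, with the scale carried in the Avro schema's parameters. A small Python sketch:

    from decimal import Decimal

    def decode_connect_decimal(raw: bytes, scale: int) -> Decimal:
        # Connect's Decimal is BigDecimal.unscaledValue().toByteArray():
        # two's-complement, big-endian bytes of the unscaled integer.
        unscaled = int.from_bytes(raw, byteorder="big", signed=True)
        return Decimal(unscaled).scaleb(-scale)

    print(decode_connect_decimal(b"\x04\xd2", 2))  # Decimal('12.34')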

Debezium MySQL connector not able to capture insert and update operations

I am trying to implement a CDC pipeline with the Debezium MySQL connector and Kafka, but the source connector is not able to publish events for insert and update operations in t…
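
The most common cause is the server's binlog configuration: Debezium only sees row-level events, so binlog_format must be ROW (check with SHOW VARIABLES LIKE 'binlog_format';). Roughly the my.cnf settings Debezium's docs call for (server_id is a placeholder):

    [mysqld]
    server_id         = 223344
    log_bin           = mysql-bin
    binlog_format     = ROW
    binlog_row_image  = FULL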

Multiple Postgres connections with the Debezium connector

When I create a Kafka Connect connector with the Debezium connector, it results in four database connections. Three of them remain idle, while one works as the…

Kafka Connect failed to start

I installed Confluent OSS 4.0 (Kafka) on a fresh Linux CentOS 7, but Kafka Connect failed to start. Steps to reproduce: - Install Oracle JDK 8 - Copy confluen…

How to stop Kafka producer messages (Debezium + Azure EventHub)

I have set up Debezium and Azure Event Hubs as a CDC engine from PostgreSQL, exactly like in this tutorial: https://dev.to/azure/tutorial-set-up-a-change-data-captur…
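
The producer stops as soon as the connector is paused or deleted through the Connect REST API (the connector name below is a placeholder):

    # pause, keeping the connector registered
    curl -X PUT http://localhost:8083/connectors/postgres-connector/pause
    # or remove it entirely
    curl -X DELETE http://localhost:8083/connectors/postgres-connector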

How to do type conversion when transferring data from MongoDB to Kafka with Debezium?

In MongoDB, the ObjectId is base64. I'm streaming these docs to Kafka using Debezium. How can I get the ObjectId to be written as a UUID in Kafka? Mongo example doc:…

How to stream data from Kafka to MongoDB with a Kafka connector

I want to stream data from Kafka to MongoDB by using a Kafka connector. I found this one: https://github.com/hpgrahsl/kafka-connect-mongodb. But there is no step t…
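
The hpgrahsl project is the community predecessor of the official MongoDB connector (mongodb/mongo-kafka); a minimal sink sketch for the official one, with placeholder topic, database, and collection names:

    {
      "name": "mongo-sink",
      "config": {
        "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
        "topics": "my-topic",
        "connection.uri": "mongodb://localhost:27017",
        "database": "mydb",
        "collection": "mycollection"
      }
    }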