I have a DB connector that detects updates to the timestamp column on the user table in my database and pushes them to my Kafka topic. Thus, my Kafka topic will ke…
I use this logback appender to send logs to Kafka: https://github.com/danielwegener/logback-kafka-appender. When Kafka was PLAINTEXT, everything worked correctly.
I am trying to bring my ksqldb-server Docker instance up and connect it to a remote Kafka cluster, but I am getting an error. Here are the details; docker-compose.yml: …
Trying to upgrade Confluent, I have checked the upgrade link: https://docs.confluent.io/platform/7.0.1/installation/upgrade.html. Inside, I found it said in the Prepar…
I am trying to understand Confluent Kafka and do some hands-on work with it, setting it up on my local machine. I have followed the instructions in this doc: http…
I have created a KSQL table with the following command: ksql> CREATE TABLE CUSTTABLE (id INT PRIMARY KEY, name VARCHAR, purchase VARCHAR) WITH (KAFKA_TOPIC = …
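A minimal sketch of issuing such a statement through the ksqlDB Java client; the host, port, topic name, format, and partition count below are assumed placeholders, not values from the question:

```java
import io.confluent.ksql.api.client.Client;
import io.confluent.ksql.api.client.ClientOptions;

public class CreateCustTable {
    public static void main(String[] args) throws Exception {
        // Assumed ksqlDB server address; adjust to your deployment.
        ClientOptions options = ClientOptions.create()
                .setHost("localhost")
                .setPort(8088);
        Client client = Client.create(options);

        // Topic name, value format, and partition count are hypothetical placeholders.
        String sql = "CREATE TABLE CUSTTABLE (id INT PRIMARY KEY, name VARCHAR, purchase VARCHAR) "
                + "WITH (KAFKA_TOPIC = 'custtopic', VALUE_FORMAT = 'JSON', PARTITIONS = 1);";
        client.executeStatement(sql).get(); // blocks until the server confirms
        client.close();
    }
}
```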
I'm new to Kafka, and I would like to know why there are database-specific connectors like the Redshift Sink Connector, and why we should not just go for the generic JDBC sink …
I am looking for a library or algorithm that implements serialization of a message in protobuf format, with the schema version retrieved from conf…
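Assuming "conf…" refers to Confluent Schema Registry, Confluent's KafkaProtobufSerializer already covers this: it resolves the schema version against the registry at serialization time. A sketch, where MyMessage is a hypothetical protoc-generated class and all addresses are placeholders:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer;

public class ProtobufProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", KafkaProtobufSerializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081"); // assumed registry
        // Look the schema up in the registry instead of registering from the client:
        props.put("auto.register.schemas", "false");
        props.put("use.latest.version", "true");

        // MyMessage is a hypothetical protoc-generated class.
        try (KafkaProducer<String, MyMessage> producer = new KafkaProducer<>(props)) {
            MyMessage msg = MyMessage.newBuilder().setId("42").build();
            producer.send(new ProducerRecord<>("my-topic", msg.getId(), msg)); // assumed topic
        }
    }
}
```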
I am new to Kafka, and I am trying to understand how a Kafka consumer can consume and process a message from a Kafka topic without losing messages.
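The usual at-least-once pattern is to disable auto-commit and commit offsets only after processing succeeds, so a crash causes redelivery rather than loss. A minimal sketch with assumed broker, group, and topic names:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AtLeastOnceConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("group.id", "demo-group");              // assumed group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("enable.auto.commit", "false");         // commit manually, after processing

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("demo-topic"));    // assumed topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    process(record);                      // your processing logic
                }
                // Offsets are committed only after the whole batch succeeded,
                // so a crash mid-batch means redelivery, not loss.
                consumer.commitSync();
            }
        }
    }

    private static void process(ConsumerRecord<String, String> record) {
        System.out.printf("%d: %s%n", record.offset(), record.value());
    }
}
```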
I'm trying to seek to some specific offset. It seems I need to use seekToTimestamp, and there is no seekToOffset or anything similar. I've researched a lot, but …
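For what it's worth, the plain consumer API has a direct seek(TopicPartition, long), and spring-kafka's ConsumerSeekCallback also exposes a seek(topic, partition, offset) overload next to seekToTimestamp. A sketch against the plain client, with broker, topic, partition, and offset all assumed:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class SeekToOffset {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("group.id", "seek-demo");               // assumed group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            TopicPartition tp = new TopicPartition("demo-topic", 0); // assumed topic/partition
            consumer.assign(List.of(tp)); // assign() instead of subscribe() for manual control
            consumer.seek(tp, 42L);       // jump straight to offset 42
            consumer.poll(Duration.ofSeconds(1))
                    .forEach(rec -> System.out.println(rec.offset() + ": " + rec.value()));
        }
    }
}
```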
auto.register.schemas=false doesn't work as I expect. If I read the documentation, it's supposed to stop the producer from registering new schemas: https://docs.c…
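For context, a sketch of how these settings usually sit on the producer side (broker and registry addresses assumed). With auto-registration off, the Confluent serializer only looks the schema up, and the send fails if the subject does not already hold a matching version:

```java
import java.util.Properties;

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class NoAutoRegisterProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", KafkaAvroSerializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081"); // assumed registry

        // Never register from this client; only look schemas up.
        props.put("auto.register.schemas", "false");
        // Optional companion setting: serialize against the latest registered
        // version rather than requiring a byte-for-byte local match.
        props.put("use.latest.version", "true");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // ... build a GenericRecord against the registered schema and send it
        }
    }
}
```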
I am trying to access a bitnami/kafka cluster deployed using Helm. I need to be able to send messages to Kafka from my local machine and then have my pods process them.
I use spring-cloud-stream-binder-kafka to connect to the Kafka broker. I updated the spring-cloud-dependencies version to 2021.0.1, and the dependency tree went from this: [INFO] +- org.spr…
How can I integrate a Kafka producer with Spark stateful streaming, which uses checkpointing along with StreamingContext.getOrCreate? I read this post: How to write s…
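A common workaround, sketched under assumptions: KafkaProducer is not serializable and should not be captured in the closure that StreamingContext.getOrCreate checkpoints, so create it lazily once per executor JVM and call it from foreachPartition. Broker and topic names are placeholders:

```java
import java.util.Iterator;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public final class KafkaSink {
    // Created lazily on the worker, one per executor JVM, so it is never
    // serialized into the checkpoint taken on the driver.
    private static KafkaProducer<String, String> producer;

    private static synchronized KafkaProducer<String, String> get() {
        if (producer == null) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed broker
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            producer = new KafkaProducer<>(props);
        }
        return producer;
    }

    // Call from foreachRDD/foreachPartition inside the stateful pipeline.
    public static void writePartition(Iterator<String> partition) {
        KafkaProducer<String, String> p = get();
        while (partition.hasNext()) {
            p.send(new ProducerRecord<>("out-topic", partition.next())); // assumed topic
        }
    }
}
```

From the stateful stream this would be invoked as stateStream.foreachRDD(rdd -> rdd.foreachPartition(KafkaSink::writePartition)); so nothing producer-related ends up in the checkpointed state.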
I'm implementing a Django web service which is about to serve different platform apps: Reactjs for computers, a Swift app for iOS, and Kotlin for Android devices …
Currently, I am having an issue where one of the consumer functions throws an error, which makes Kafka retry the records again and again. @Bean public Consumer<…
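One way to break that loop, sketched with a hypothetical binding name processOrder and a String payload: handle the exception inside the functional consumer, or cap retries and dead-letter the record through binder properties (shown in the comments):

```java
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class OrderConsumerConfig {

    // "processOrder" and the String payload are hypothetical illustrations.
    @Bean
    public Consumer<String> processOrder() {
        return payload -> {
            try {
                handle(payload); // your real processing
            } catch (Exception e) {
                // Handling the exception here keeps the binder from replaying
                // the same record over and over. Alternatively, cap retries and
                // dead-letter failures via configuration:
                //   spring.cloud.stream.bindings.processOrder-in-0.consumer.max-attempts=1
                //   spring.cloud.stream.kafka.bindings.processOrder-in-0.consumer.enable-dlq=true
                System.err.println("Dropping bad record: " + e.getMessage());
            }
        };
    }

    private void handle(String payload) {
        // ...
    }
}
```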
I'm new to PyFlink. I'm trying to write a Python program that reads data from a Kafka topic and prints it to stdout. I followed the link Flink Python Datastream API K…
We need to push JSON records from a Kafka topic to a PostgreSQL database. The JSON is compliant with https://json-schema.org/draft-07/json-schema-release-note…
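The standard route for this is Kafka Connect's JDBC Sink connector with a schema-aware converter; where that is not an option, a hand-rolled consumer can do the inserts directly. A sketch only, with assumed topic, table, columns, and connection details:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonToPostgres {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("group.id", "pg-sink");                 // assumed group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("enable.auto.commit", "false");

        ObjectMapper mapper = new ObjectMapper();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection db = DriverManager.getConnection(
                     "jdbc:postgresql://localhost:5432/mydb", "user", "pass")) { // assumed DB
            consumer.subscribe(List.of("json-topic")); // assumed topic
            PreparedStatement insert =
                    db.prepareStatement("INSERT INTO users(id, name) VALUES (?, ?)"); // assumed table
            while (true) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofSeconds(1))) {
                    JsonNode json = mapper.readTree(rec.value());
                    insert.setInt(1, json.get("id").asInt());       // assumed fields
                    insert.setString(2, json.get("name").asText());
                    insert.executeUpdate();
                }
                consumer.commitSync(); // commit only after the rows are written
            }
        }
    }
}
```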
Let's say we are using Kafka with manual commits. We are processing the incoming message, but if processing fails for any reason, we want to rer…
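A minimal sketch of that pattern with the plain client: cap each poll at one record so the rewind logic stays trivial, commit only on success, and seek back to the same offset on failure (broker, group, and topic names assumed):

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class ReprocessOnFailure {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("group.id", "retry-demo");              // assumed group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("enable.auto.commit", "false");
        // One record per poll keeps the rewind trivial; batch handling would
        // need per-partition offset bookkeeping instead.
        props.put("max.poll.records", "1");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("demo-topic"));    // assumed topic
            while (true) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofSeconds(1))) {
                    try {
                        process(rec);          // your processing logic
                        consumer.commitSync(); // commit only after success
                    } catch (Exception e) {
                        // Rewind to the failed record; the next poll() redelivers it.
                        consumer.seek(new TopicPartition(rec.topic(), rec.partition()), rec.offset());
                    }
                }
            }
        }
    }

    private static void process(ConsumerRecord<String, String> rec) {
        System.out.printf("%d: %s%n", rec.offset(), rec.value());
    }
}
```

Note that this retries forever on a record that can never succeed; a retry cap or a dead-letter topic avoids spinning on a poison pill.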
I'm trying to send bytes with a JSON object using a Java POJO and an Avro schema, with Schema Registry. So my question is: how should my code send the "bytes" type …
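In the Avro Java binding, the "bytes" type maps to java.nio.ByteBuffer, so the field is populated with a ByteBuffer rather than a byte[]. A sketch using GenericRecord, with broker and registry addresses, topic, and field names all assumed:

```java
import java.nio.ByteBuffer;
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class AvroBytesProducer {
    public static void main(String[] args) {
        // Avro "bytes" maps to java.nio.ByteBuffer in the Java binding,
        // so the corresponding POJO field should be a ByteBuffer, not byte[].
        Schema schema = SchemaBuilder.record("Payload").fields()
                .requiredString("id")
                .requiredBytes("body")
                .endRecord();

        GenericRecord record = new GenericData.Record(schema);
        record.put("id", "42");
        record.put("body", ByteBuffer.wrap(new byte[] {1, 2, 3})); // the raw bytes

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", KafkaAvroSerializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081"); // assumed registry

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("avro-topic", "42", record)); // assumed topic
        }
    }
}
```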