Category "apache-kafka"

Can we create a KSQL table from a topic together with the topic's existing data?

I have a DB connector which detects timestamp-column updates on the user table in my database and pushes them to my Kafka topic. Thus, my Kafka topic will ke…
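
A common gotcha here: ksqlDB only reads a topic from the point a query starts unless told otherwise, so pre-existing records are skipped by default. Setting 'auto.offset.reset' to 'earliest' changes that. A minimal sketch using the ksqlDB Java client; the host, port, topic, and column names are assumptions:

    import io.confluent.ksql.api.client.Client;
    import io.confluent.ksql.api.client.ClientOptions;

    import java.util.Map;

    public class TableFromExistingTopic {
        public static void main(String[] args) throws Exception {
            // Connect to a ksqlDB server (host/port are placeholders).
            Client client = Client.create(ClientOptions.create()
                    .setHost("localhost")
                    .setPort(8088));

            // 'auto.offset.reset' = 'earliest' makes ksqlDB start reading the
            // backing topic from the beginning, so pre-existing records are
            // picked up rather than only data arriving after the CREATE.
            String ddl = "CREATE TABLE USERS_BY_ID (id INT PRIMARY KEY, updated_at BIGINT) "
                    + "WITH (KAFKA_TOPIC='users', VALUE_FORMAT='JSON');";
            client.executeStatement(ddl, Map.of("auto.offset.reset", "earliest")).get();

            client.close();
        }
    }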

How to use Kafka with SSL via a logback appender?

I use this logback appender to send logs to Kafka: https://github.com/danielwegener/logback-kafka-appender When Kafka was PLAINTEXT, everything worked correctly.
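
With this appender, Kafka producer settings are passed through producerConfig entries (key=value) in logback.xml, so switching the broker to SSL means adding the standard Kafka client SSL properties there. A hedged sketch of the properties involved, expressed as a plain Java producer; broker address, store paths, and passwords are placeholders:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class SslProducerCheck {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker:9093");    // the broker's SSL listener
            props.put("security.protocol", "SSL");
            props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks");
            props.put("ssl.truststore.password", "changeit"); // placeholder
            // Only needed when the broker requires client authentication (mTLS):
            // props.put("ssl.keystore.location", "/etc/kafka/client.keystore.jks");
            // props.put("ssl.keystore.password", "changeit");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("logs", "ssl smoke test"));
            }
        }
    }

In the appender's logback.xml, each props.put(...) line above becomes one producerConfig entry, e.g. security.protocol=SSL.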

"Topic authorization failed" error while KSQL tries to connect to the cluster

I am trying to bring my ksqldb-server Docker instance up and connect it to a remote Kafka cluster, but I am getting an error. Here are the details: docker-compose.yml ---…
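
"Topic authorization failed" typically means the principal ksqlDB authenticates as lacks ACLs on its internal topics (the command topic and everything under the _confluent-ksql-<service id> prefix). A sketch granting a prefixed ACL with the Kafka Admin API; the principal User:ksql and the default_ service id are assumptions:

    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.common.acl.*;
    import org.apache.kafka.common.resource.PatternType;
    import org.apache.kafka.common.resource.ResourcePattern;
    import org.apache.kafka.common.resource.ResourceType;

    import java.util.List;
    import java.util.Properties;

    public class GrantKsqlAcls {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker:9092");
            try (Admin admin = Admin.create(props)) {
                // Allow the ksqlDB service principal everything on its internal
                // topics, which share the "_confluent-ksql-<service id>" prefix.
                AclBinding binding = new AclBinding(
                        new ResourcePattern(ResourceType.TOPIC,
                                "_confluent-ksql-default_", PatternType.PREFIXED),
                        new AccessControlEntry("User:ksql", "*",
                                AclOperation.ALL, AclPermissionType.ALLOW));
                admin.createAcls(List.of(binding)).all().get();
            }
        }
    }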

Confluent 5.0.1 upgrade to 7.0.1

Trying to upgrade Confluent, I have checked the upgrade guide: https://docs.confluent.io/platform/7.0.1/installation/upgrade.html I found that it says in the Prepar…

Not able to open Control Center UI on localhost

I am trying to understand Confluent Kafka and get some hands-on experience with it by setting it up on my local machine. I have followed the instructions in this doc: http…

KSQL REST INSERT not inserting

I have created a KSQL table with the following command: ksql> CREATE TABLE CUSTTABLE (id INT PRIMARY KEY, name VARCHAR, purchase VARCHAR) WITH (KAFKA_TOPIC =…
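
When an INSERT posted to the /ksql REST endpoint appears to do nothing, the response body usually carries the actual error (key mismatch, serialization failure, and so on), so it is worth printing. A minimal sketch with Java 11's HttpClient, assuming the server runs on localhost:8088 and the column names from the question:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class KsqlRestInsert {
        public static void main(String[] args) throws Exception {
            String body = "{\"ksql\":\"INSERT INTO CUSTTABLE (ID, NAME, PURCHASE) "
                    + "VALUES (1, 'alice', 'book');\",\"streamsProperties\":{}}";

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:8088/ksql"))
                    .header("Content-Type", "application/vnd.ksql.v1+json; charset=utf-8")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();

            // A non-200 status or an error entity here explains why nothing
            // shows up in the table's backing topic.
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }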

Difference between the JDBC Sink Connector and the Redshift Sink Connector

I'm new to Kafka, and I would like to know why there are database-specific connectors like the Redshift Sink Connector and why we should not go for the generic JDBC sink…

How to serialize a protobuf message with a schema ID [closed]

Looking for a library or algorithm that implements serialization of a message in protobuf format with the schema version retrieved from conf…
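
For Confluent Schema Registry, the wire format is a zero magic byte, a 4-byte big-endian schema ID, and, for Protobuf specifically, a varint-encoded message-index list before the payload; KafkaProtobufSerializer normally does this framing and the registry lookup for you. A hand-rolled sketch of the framing, assuming the schema ID has already been fetched from the registry:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import com.google.protobuf.Message;

    public final class ConfluentProtobufFraming {

        /** Frames a protobuf message in Confluent Schema Registry wire format. */
        public static byte[] frame(int schemaId, Message message) throws IOException {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            out.write(0);                                               // magic byte
            out.write(ByteBuffer.allocate(4).putInt(schemaId).array()); // schema id, big-endian
            // Protobuf payloads also carry a message-index list locating the
            // message inside its .proto file; the common case of the first
            // top-level message is encoded as the single byte 0.
            out.write(0);
            out.write(message.toByteArray());                           // the protobuf payload
            return out.toByteArray();
        }
    }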

Kafka - How to read and process messages in a fault-tolerant way? [closed]

I am new to Kafka, and I am trying to understand how a Kafka consumer can consume and process a message from a Kafka topic without losing mess…
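
The standard at-least-once recipe is to disable auto-commit and commit offsets only after processing succeeds, so a crash before the commit just means the records are redelivered. A minimal sketch; the broker address, group id, and topic are assumptions:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class AtLeastOnceConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker:9092");
            props.put("group.id", "orders-processor");
            props.put("enable.auto.commit", "false");         // commit manually
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("orders"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        process(record);                      // hypothetical handler
                    }
                    // Commit only after the whole batch succeeded: a crash before
                    // this line means the records are polled again (at-least-once).
                    consumer.commitSync();
                }
            }
        }

        private static void process(ConsumerRecord<String, String> record) {
            System.out.printf("%s-%d@%d: %s%n",
                    record.topic(), record.partition(), record.offset(), record.value());
        }
    }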

Kafka Consumer - Point to a specific offset in Spring Boot Kafka

I'm trying to seek to a specific offset. It seems I need to use seekToTimestamp, and there is no seekToOffset or anything similar. I've researched a lot but…
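
In Spring Kafka the usual route is to implement ConsumerSeekAware and call seek() from onPartitionsAssigned, which runs on the consumer thread. A sketch, with the target offset, topic, and group id as assumptions:

    import java.util.Map;
    import org.apache.kafka.common.TopicPartition;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.listener.ConsumerSeekAware;

    public class OffsetSeekingListener implements ConsumerSeekAware {

        private static final long START_OFFSET = 42L;   // assumed desired offset

        @Override
        public void onPartitionsAssigned(Map<TopicPartition, Long> assignments,
                                         ConsumerSeekCallback callback) {
            // Called on the consumer thread, so seeking here is safe.
            assignments.keySet().forEach(tp ->
                    callback.seek(tp.topic(), tp.partition(), START_OFFSET));
        }

        @KafkaListener(topics = "my-topic", groupId = "seek-demo")
        public void listen(String message) {
            System.out.println("received: " + message);
        }
    }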

auto.register.schemas set to false doesn't work as intended

auto.register.schemas=false doesn't work as I expect. According to the documentation, it is supposed to prevent the producer from registering new schemas. https://docs.c…
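
One frequent cause: auto.register.schemas is a serializer setting, so it must end up in the serializer's configuration, and with it set to false the schema must already be registered under the subject in a matching form or the send fails. A hedged sketch of the producer-side configuration; the URLs are placeholders:

    import java.util.Properties;
    import io.confluent.kafka.serializers.KafkaAvroSerializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class NoAutoRegisterConfig {
        public static Properties producerProps() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", KafkaAvroSerializer.class.getName());
            props.put("schema.registry.url", "http://localhost:8081");

            // The serializer will NOT register schemas; it only looks them up.
            // The schema must already exist under the subject (e.g. "orders-value"),
            // or the producer fails with a "Schema not found" style error.
            props.put("auto.register.schemas", false);
            // Optional companion setting: resolve against the latest registered
            // version instead of requiring an exact schema match.
            props.put("use.latest.version", true);
            return props;
        }
    }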

Access bitnami/kafka running in minikube from local machine

I am trying to access a bitnami/kafka cluster deployed using Helm. I need to be able to send messages to Kafka from my local machine and then have my pods proc…

JSON array serialization issue after spring-cloud-stream update

I use spring-cloud-stream-binder-kafka to connect to the Kafka broker. I updated the spring-cloud-dependencies version to 2021.0.1 and went from this: [INFO] +- org.spr…

Spark stateful streaming with checkpoint + Kafka producer

How can I integrate a Kafka producer with Spark stateful streaming that uses checkpointing along with StreamingContext.getOrCreate? I read this post: How to write s…
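
The usual obstacle is that KafkaProducer is not serializable, so it cannot be created on the driver and captured by a checkpointed closure; instead, each executor builds one lazily on first use, e.g. from inside foreachPartition. A sketch of that pattern; the broker address is an assumption:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.common.serialization.StringSerializer;

    /**
     * Lazily creates one KafkaProducer per executor JVM. Because the producer
     * is built on first use inside the executor (e.g. from foreachPartition),
     * nothing non-serializable is captured in the checkpointed closure.
     */
    public final class LazyKafkaSink {

        private static KafkaProducer<String, String> producer;

        public static synchronized KafkaProducer<String, String> get() {
            if (producer == null) {
                Properties props = new Properties();
                props.put("bootstrap.servers", "broker:9092");
                props.put("key.serializer", StringSerializer.class.getName());
                props.put("value.serializer", StringSerializer.class.getName());
                producer = new KafkaProducer<>(props);
                // Flush buffered records when the executor shuts down.
                Runtime.getRuntime().addShutdownHook(new Thread(producer::close));
            }
            return producer;
        }
    }

Called from foreachRDD/foreachPartition, only the class name travels with the closure, so this stays compatible with StreamingContext.getOrCreate restoring from a checkpoint.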

Integrate Django and ReactJS with Kafka to generate some analytical data for users?

I'm implementing a Django web service which will have apps on different platforms: ReactJS for computers, a Swift app for iOS, and Kotlin for Android device…

Spring Cloud Kafka does infinite retries when it fails

Currently, I am having an issue where one of the consumer functions throws an error, which makes Kafka retry the records again and again. @Bean public Consumer<…
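
By default the listener container keeps redelivering a failed record. With Spring Kafka 2.8+ the retries can be capped and the record dead-lettered via a DefaultErrorHandler; with spring-cloud-stream, the binder's max-attempts and enableDlq consumer properties are the equivalent knobs. A sketch of the error-handler bean; the interval and attempt count are arbitrary choices:

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
    import org.springframework.kafka.listener.DefaultErrorHandler;
    import org.springframework.util.backoff.FixedBackOff;

    @Configuration
    public class RetryPolicyConfig {

        @Bean
        public DefaultErrorHandler errorHandler(KafkaTemplate<Object, Object> template) {
            // Retry the failed record 3 times, one second apart, then publish
            // it to the <topic>.DLT dead-letter topic and move on instead of
            // retrying forever.
            return new DefaultErrorHandler(
                    new DeadLetterPublishingRecoverer(template),
                    new FixedBackOff(1000L, 3L));
        }
    }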

Flink Python Datastream API Kafka Consumer

I'm new to PyFlink. I'm trying to write a Python program that reads data from a Kafka topic and prints it to stdout. I followed the link Flink Python Datastream API K…

Consume Kafka topic JSON data into Postgres using the JDBC connector

We need to push the Kafka topic's JSON records to a PostgreSQL database. The JSON is compliant with https://json-schema.org/draft-07/json-schema-release-note…

How to reread uncommitted messages until they are committed?

Let's say we are using Kafka with manual commits. We are processing the incoming message, but if there is a failure in processing for any reason, we want to rer…
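
With enable.auto.commit=false, a record is not considered consumed until its offset is committed, and seeking the partition back to the record's offset makes the next poll() return it again. A sketch of that pattern; the process() handler is hypothetical:

    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;

    public class RereadUntilCommitted {

        // The consumer must run with enable.auto.commit=false for this to work.
        static void handle(KafkaConsumer<String, String> consumer,
                           ConsumerRecord<String, String> record) {
            TopicPartition tp = new TopicPartition(record.topic(), record.partition());
            try {
                process(record);                               // hypothetical handler
                // Success: commit the NEXT offset so this record is done.
                consumer.commitSync(Map.of(tp, new OffsetAndMetadata(record.offset() + 1)));
            } catch (Exception e) {
                // Failure: rewind so the very same record is returned by the
                // next poll() and processed again until it succeeds.
                consumer.seek(tp, record.offset());
            }
        }

        static void process(ConsumerRecord<String, String> record) { /* ... */ }
    }

Note that a permanently failing record will block its partition this way, so production code usually caps the retries or routes the record to a dead-letter topic.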

How to send a byte array in Kafka with the Avro Schema Registry

I'm trying to send bytes with a JSON object, using a Java POJO and an Avro schema, with Schema Registry. So my question is, how should my code send the "bytes" type…
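
In Avro's Java binding the "bytes" type maps to java.nio.ByteBuffer rather than byte[], which is the usual stumbling block. A sketch with a GenericRecord; the schema and field name are assumptions, and the record can then be sent with KafkaAvroSerializer as the value serializer:

    import java.nio.ByteBuffer;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;

    public class BytesFieldExample {
        public static GenericRecord build(byte[] payload) {
            Schema schema = new Schema.Parser().parse(
                    "{\"type\":\"record\",\"name\":\"Envelope\",\"fields\":["
                    + "{\"name\":\"data\",\"type\":\"bytes\"}]}");

            GenericRecord record = new GenericData.Record(schema);
            // Avro's Java mapping for "bytes" is ByteBuffer; passing a raw
            // byte[] fails serialization, so wrap the array.
            record.put("data", ByteBuffer.wrap(payload));
            return record;
        }
    }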