Hi, I am trying to execute a KSQL query using the ksqlDb-client npm package, but it throws a timeout error. I have attached the code as well; please let me know…
I have set up Apache MirrorMaker 3.0.0 with an active-active strategy for two Kafka clusters (named DC and DR), so a topic on DC is replicated by MirrorMaker 2 to DR as DC.<topic-name>…
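For reference, a minimal connect-mirror-maker.properties sketch for an active-active pair aliased DC and DR, following the layout from the Kafka MirrorMaker 2 docs; the bootstrap servers and topic pattern below are placeholders, not values from the question:

```
# cluster aliases
clusters = DC, DR

# placeholder bootstrap servers -- point these at the real brokers
DC.bootstrap.servers = dc-broker1:9092
DR.bootstrap.servers = dr-broker1:9092

# replicate in both directions (active-active)
DC->DR.enabled = true
DC->DR.topics = .*
DR->DC.enabled = true
DR->DC.topics = .*

# replicated topics are prefixed with the source cluster alias,
# e.g. a topic "orders" on DC shows up on DR as "DC.orders"
```

This file is then passed to `bin/connect-mirror-maker.sh mm2.properties`.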
I've tried to get the offset from a Kafka topic based on a timestamp, but when I tried to run it, it threw a null pointer error: Map<TopicPartition, Long> timest…
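A minimal sketch of an offsets-for-timestamp lookup; the topic name, group id, and timestamp are placeholders. The key point is that offsetsForTimes() returns a null OffsetAndTimestamp for any partition that has no record at or after the requested timestamp, which is a common source of the NullPointerException:

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndTimestamp;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class OffsetForTimestamp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "ts-lookup");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        long targetTs = System.currentTimeMillis() - 3_600_000L; // one hour ago (placeholder)

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            TopicPartition tp = new TopicPartition("my-topic", 0);
            consumer.assign(Collections.singletonList(tp));

            Map<TopicPartition, Long> timestampsToSearch = new HashMap<>();
            timestampsToSearch.put(tp, targetTs);

            Map<TopicPartition, OffsetAndTimestamp> offsets = consumer.offsetsForTimes(timestampsToSearch);
            OffsetAndTimestamp oat = offsets.get(tp);
            // null means no record with timestamp >= targetTs exists in this partition,
            // so guard before calling oat.offset() to avoid a NullPointerException
            if (oat != null) {
                consumer.seek(tp, oat.offset());
            } else {
                consumer.seekToEnd(Collections.singletonList(tp));
            }
        }
    }
}
```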
The image below is my topic. Every once in a while, a value that is neither empty nor valid JSON is returned. If it is null, it throws the error below and enters the…
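One way to keep the poll loop alive when a record value is null (a tombstone) or not valid JSON is to guard before parsing. This sketch assumes a String-deserialized consumer and Jackson for parsing; both are assumptions, not details taken from the question:

```java
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;

public class SafeJsonPoller {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    static void pollOnce(KafkaConsumer<String, String> consumer) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
        for (ConsumerRecord<String, String> record : records) {
            String value = record.value();
            // skip tombstones and empty payloads instead of letting them break the loop
            if (value == null || value.trim().isEmpty()) {
                continue;
            }
            try {
                JsonNode node = MAPPER.readTree(value);
                System.out.println(node); // real processing would go here
            } catch (JsonProcessingException e) {
                // log and skip values that are not valid JSON
            }
        }
    }
}
```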
I have to add encryption and authentication with SSL in Kafka. This is what I have done: generate a certificate for each Kafka broker: keytool -keystore server…
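For comparison, the keystore/truststore steps in the Kafka SSL documentation usually look like the following; hostnames, validity, and passwords here are placeholders:

```
# self-signed CA
openssl req -new -x509 -keyout ca-key -out ca-cert -days 365

# per-broker keystore and certificate signing request
keytool -keystore server.keystore.jks -alias localhost -validity 365 -genkey -keyalg RSA
keytool -keystore server.keystore.jks -alias localhost -certreq -file cert-file

# sign the broker certificate with the CA, then import CA cert and signed cert
openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days 365 -CAcreateserial
keytool -keystore server.keystore.jks -alias CARoot -import -file ca-cert
keytool -keystore server.keystore.jks -alias localhost -import -file cert-signed

# truststore holding the CA, used by brokers and clients
keytool -keystore server.truststore.jks -alias CARoot -import -file ca-cert
```

The matching broker settings then go in server.properties (paths and passwords are again placeholders):

```
listeners=SSL://broker-host:9093
security.inter.broker.protocol=SSL
ssl.keystore.location=/var/private/ssl/server.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
ssl.truststore.location=/var/private/ssl/server.truststore.jks
ssl.truststore.password=<truststore-password>
ssl.client.auth=required
```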
Producer sends messages 1, 2, 3, 4; consumer receives messages 1, 2, 3, 4; consumer crashes/disconnects; producer sends messages 5, 6, 7; consumer comes back up…
I am reading this: "Automatic Commit: The easiest way to commit offsets is to allow the consumer to do it for you. If you configure enable.auto.commit=true…"
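As a reference point, a minimal consumer with automatic offset commits enabled might look like this; the broker address, group id, and topic are placeholders:

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class AutoCommitConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
        props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "5000");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                }
                // with enable.auto.commit=true, the offsets of records returned by poll()
                // are committed in the background every auto.commit.interval.ms
            }
        }
    }
}
```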
I'm using Spring Kafka integration and I have my own generic value serializer/deserializer, as shown below. Serializer: public class KafkaSerializer<T>…
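The question's own KafkaSerializer<T> is not shown in full here, but a typical generic implementation of Kafka's Serializer interface looks roughly like the sketch below; the use of Jackson for JSON is an assumption, not necessarily what the asker's class does:

```java
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Serializer;

import java.util.Map;

public class KafkaSerializer<T> implements Serializer<T> {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // no configuration needed for this sketch
    }

    @Override
    public byte[] serialize(String topic, T data) {
        if (data == null) {
            return null; // Kafka treats a null value as a tombstone
        }
        try {
            return mapper.writeValueAsBytes(data);
        } catch (JsonProcessingException e) {
            throw new SerializationException("Failed to serialize value for topic " + topic, e);
        }
    }

    @Override
    public void close() {
        // nothing to close
    }
}
```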
I want to test end-to-end exactly-once processing in Flink. My job is: Kafka source -> mapper1 -> mapper2 -> Kafka sink. I had put a Thread.sleep(100…
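A sketch of the pieces that matter for exactly-once in this kind of job, using the KafkaSink API (Flink 1.14+, which is an assumption about the version): checkpointing must be enabled, the sink writes through Kafka transactions, and the consumer that verifies the output must read only committed data:

```java
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.base.DeliveryGuarantee;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ExactlyOnceJobSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // exactly-once sinks only commit Kafka transactions when a checkpoint completes
        env.enableCheckpointing(1000, CheckpointingMode.EXACTLY_ONCE);

        KafkaSink<String> sink = KafkaSink.<String>builder()
                .setBootstrapServers("localhost:9092")              // placeholder
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic("output-topic")                   // placeholder
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                .setDeliveryGuarantee(DeliveryGuarantee.EXACTLY_ONCE)
                .setTransactionalIdPrefix("e2e-test")
                .build();

        // source -> mapper1 -> mapper2 -> sink would be wired here;
        // the verifying consumer must set isolation.level=read_committed,
        // otherwise it will also see records from aborted transactions.
    }
}
```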
My biggest question is: is there a concept in Kafka of a consumer group being active/inactive? I am not sure how to achieve this in my local Kafka…
I am using the PHP Laravel framework to consume Kafka messages with the help of the mateusjunges/laravel-kafka package. Is it possible to save the offset by cons…
I am very new to Kafka. I am creating two topics and publishing to these two topics from two producers. I have one consumer which consumes the messages from both topics…
I'm using Kafka 0.10.2 and Avro for the serialization of my messages, both for the key and for the value data. Now I would like to use Kafka Streams, but I'm stuck…
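A configured Avro Serde is usually the missing piece in this setup. The sketch below assumes Confluent's kafka-streams-avro-serde artifact and a Schema Registry at a placeholder URL, and it uses the newer StreamsBuilder API rather than the 0.10.2-era KStreamBuilder, so it is a reference point rather than a drop-in answer for that exact version:

```java
import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;

import java.util.Collections;
import java.util.Map;

public class AvroStreamsSketch {
    public static void main(String[] args) {
        // both serdes need to know where the Schema Registry lives
        Map<String, String> serdeConfig =
                Collections.singletonMap("schema.registry.url", "http://localhost:8081");

        Serde<GenericRecord> keySerde = new GenericAvroSerde();
        keySerde.configure(serdeConfig, true);    // true  -> serde is used for record keys
        Serde<GenericRecord> valueSerde = new GenericAvroSerde();
        valueSerde.configure(serdeConfig, false); // false -> serde is used for record values

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("avro-input-topic", Consumed.with(keySerde, valueSerde))
               .foreach((key, value) -> System.out.println(key + " -> " + value));
        // builder.build() would then be passed to new KafkaStreams(topology, streamsConfig)
    }
}
```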
I'm a beginner with Kafka as well as Docker. I have been doing a course and working with a Kafka producer and consumer, but for some reason it is not working. When I…