I'm using a Python queue to insert data packets from MQTT listeners, but I'm not sure when this queue will be loaded with an MQTT packet. Can we put a listener on to th
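A queue.Queue already behaves this way: get() blocks until the MQTT callback puts a packet on the queue, so no extra listener on the queue itself is needed. A minimal sketch of the pattern, assuming paho-mqtt (1.x-style client constructor) and a hypothetical broker and topic:

```python
import queue
import threading
import paho.mqtt.client as mqtt

packet_queue = queue.Queue()

def on_message(client, userdata, msg):
    # Runs on paho's network-loop thread for every incoming publish;
    # hand the payload off to the worker via the queue.
    packet_queue.put(msg.payload)

def worker():
    while True:
        payload = packet_queue.get()   # blocks until a packet arrives
        print("processing", payload)   # replace with real processing
        packet_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost", 1883)      # hypothetical broker
client.subscribe("sensors/#")          # hypothetical topic filter
client.loop_forever()
```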
In the Docker image johnnypark/kafka-zookeeper, new topics are not created automatically. How do I set auto.create.topics.enable=true in this Docker image johnnypark
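I'm not certain how this particular image exposes broker settings, so as a workaround that sidesteps auto-creation entirely, the topics can be created explicitly from client code before producing. A sketch with confluent-kafka's AdminClient (broker address and topic name are hypothetical):

```python
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # hypothetical broker

# Create the topic up front instead of relying on auto.create.topics.enable.
futures = admin.create_topics([NewTopic("my-topic", num_partitions=1, replication_factor=1)])
for topic, future in futures.items():
    try:
        future.result()            # raises if creation failed
        print(f"created {topic}")
    except Exception as exc:
        print(f"{topic}: {exc}")   # e.g. the topic already exists
```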
I use macOS Big Sur 11.2.3 on an M1 Mac, but my Kafka is not running well and I cannot create/list topics. I don't know whether it's because of the OS or not, but the log for kafka
Hi, I am trying to run this code; it is working fine in one EC2 Azkaban instance but is giving the below error for another instance. private val adminprop
Setup: I've installed the latest (7.0.1) version of Confluent Platform in standalone mode on an Ubuntu virtual machine. Python producer for Avro format: using this sampl
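For reference, a minimal Avro producer sketch along these lines, assuming the confluent-kafka[avro] package and a local Schema Registry on its default port; the topic name and schema are hypothetical:

```python
from confluent_kafka import SerializingProducer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import StringSerializer

schema_str = """
{"type": "record", "name": "Payment",
 "fields": [{"name": "id", "type": "string"},
            {"name": "amount", "type": "double"}]}
"""

schema_registry = SchemaRegistryClient({"url": "http://localhost:8081"})  # default SR port
producer = SerializingProducer({
    "bootstrap.servers": "localhost:9092",
    "key.serializer": StringSerializer("utf_8"),
    "value.serializer": AvroSerializer(schema_registry, schema_str),
})

producer.produce(topic="payments", key="1", value={"id": "1", "amount": 9.99})
producer.flush()
```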
On setting the Kafka producer property enable.idempotence to true (kafkaProps.put("enable.idempotence", "true");), I am getting the below error: 2021-04-18 16:43:53.58
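An idempotent producer only starts if the related settings are compatible: acks must be all, retries must be allowed, and at most 5 in-flight requests per connection; overriding any of these typically fails with a configuration error at startup. A sketch of a consistent combination, shown with confluent-kafka in Python rather than the Java client from the question (the property names are the same):

```python
from confluent_kafka import Producer

# A self-consistent idempotent configuration: enable.idempotence requires
# acks=all, retries > 0 and max.in.flight.requests.per.connection <= 5.
producer = Producer({
    "bootstrap.servers": "localhost:9092",            # hypothetical broker
    "enable.idempotence": True,
    "acks": "all",
    "retries": 2147483647,
    "max.in.flight.requests.per.connection": 5,
})

producer.produce("my-topic", value=b"hello")           # hypothetical topic
producer.flush()
```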
My Kafka Consumer says the following: [TopicPartition{topic=my-topic,partition=0,offset=-1000,error=None}] Whenever I run poll(), it returns None. I want the o
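If this is the confluent-kafka Python client (the TopicPartition repr looks like it), offset=-1000 is the OFFSET_STORED sentinel ("use the committed offset"), not an actual position, and poll() returning None just means no message arrived within the timeout. A minimal sketch that reads the real watermark offsets and polls from the beginning of the partition (broker, group and topic are hypothetical):

```python
from confluent_kafka import Consumer, TopicPartition, OFFSET_BEGINNING

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # hypothetical broker
    "group.id": "my-group",                  # hypothetical group
    "auto.offset.reset": "earliest",
})

# Pin the consumer to partition 0 and start from the beginning.
consumer.assign([TopicPartition("my-topic", 0, OFFSET_BEGINNING)])

# The real low/high offsets of the partition, instead of the -1000 sentinel.
low, high = consumer.get_watermark_offsets(TopicPartition("my-topic", 0))
print(f"watermarks: low={low}, high={high}")

while True:
    msg = consumer.poll(1.0)     # None simply means nothing arrived within 1 s
    if msg is None:
        continue
    if msg.error():
        print(msg.error())
        continue
    print(msg.offset(), msg.value())
```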
I have files that will be coming in daily that I would like to process as they arrive and insert into existing SQL tables (using Postgres). What is the best wa
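One simple option, if the files are flat CSVs and the tables already exist, is to poll a drop directory and bulk-load each new file with COPY via psycopg2; the directory, table and connection details below are hypothetical:

```python
import time
from pathlib import Path
import psycopg2

INCOMING = Path("/data/incoming")        # hypothetical drop directory
PROCESSED = Path("/data/processed")      # hypothetical archive directory

conn = psycopg2.connect("dbname=mydb user=me")  # hypothetical connection string

def load_file(path: Path) -> None:
    # COPY bulk-loads a CSV into an existing table far faster than row-by-row inserts.
    with conn.cursor() as cur, path.open() as f:
        cur.copy_expert("COPY my_table FROM STDIN WITH CSV HEADER", f)
    conn.commit()

while True:
    for path in sorted(INCOMING.glob("*.csv")):
        load_file(path)
        path.rename(PROCESSED / path.name)   # move aside so it is not loaded twice
    time.sleep(60)                           # check for new files once a minute
```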
I am new to Apache Flume (https://flume.apache.org/). For one of the use cases, I need to move data from a Kafka topic on one cluster (bootstrap: bootstrap1, top
I want to use the PLC4X Connector (https://www.confluent.io/hub/apache/kafka-connect-plc4x-plc4j) to connect OPC UA (Prosys Simulation Server) with Kafka. Howev
I am setting up Kafka on my local Windows 10 machine. So I downloaded all the required binaries and updated the two settings files, server and zookeeper properties, as pe
In my application, I defined a global state store (backed by a topic "query-topic") in order to perform specific time-based operations such as "give
We have applications that work with Kafka (MSK). We noticed that once a pod starts to shut down (during autoscaling or deployment), the app container loses all
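Kubernetes sends SIGTERM at the start of pod shutdown and only sends SIGKILL after the termination grace period, so a common mitigation is to trap SIGTERM, stop polling, and close the consumer so it leaves the group cleanly. A minimal sketch assuming the confluent-kafka client (broker, group and topic names are hypothetical):

```python
import signal
from confluent_kafka import Consumer

running = True

def handle_sigterm(signum, frame):
    # Flip the flag so the poll loop exits before the pod is killed.
    global running
    running = False

signal.signal(signal.SIGTERM, handle_sigterm)

consumer = Consumer({
    "bootstrap.servers": "broker:9092",   # hypothetical MSK bootstrap servers
    "group.id": "my-app",                 # hypothetical consumer group
    "enable.auto.commit": False,
})
consumer.subscribe(["my-topic"])          # hypothetical topic

try:
    while running:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # ... process the message ...
        consumer.commit(msg)              # commit only after successful processing
finally:
    consumer.close()                      # leave the group gracefully on shutdown
```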
I am trying to set up a local kafka-connect stack with docker-compose, and I have a problem with my Scala producer that's supposed to send Avro messages to a kafk
Spring for Apache Kafka 2.8.4 under https://docs.spring.io/spring-kafka/reference/html shows some of the listener methods with @Payload annotation next to the m
I've made a simple pipeline in Python to read from Kafka. The thing is that the Kafka cluster is on Confluent Cloud, and I am having some trouble connecting
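Confluent Cloud brokers require SASL_SSL with the cluster API key and secret as the SASL credentials. A minimal consumer sketch assuming the confluent-kafka client; all values below are placeholders to be replaced with the ones from the cluster's client-configuration page:

```python
from confluent_kafka import Consumer

consumer = Consumer({
    # Placeholders: copy the real values from the Confluent Cloud cluster settings.
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    "group.id": "my-pipeline",               # hypothetical group
    "auto.offset.reset": "earliest",
})

consumer.subscribe(["my-topic"])              # hypothetical topic

while True:
    msg = consumer.poll(1.0)
    if msg is None:
        continue
    if msg.error():
        print(msg.error())
        continue
    print(msg.value())
```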
I am using Flink v1.11.2 and Avro v1.10.1. I am trying to deserialize an Avro record as a Specific record from a Kafka topic, but for some reason I keep getting t
I am trying to create a materialized table from a topic. I am creating and producing data into the topic as follows: kafka-topics.sh --create --bootstrap-server
What's the best way to run the following sequence of commands: kafka-console-producer --topic discounts --broker-list localhost:9092 --property parse.key=true --
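If the intent is to script keyed test records rather than type them into the console producer, the same messages can be sent from a few lines of Python; a sketch assuming the confluent-kafka client, with the topic and broker taken from the command above and hypothetical key/value pairs:

```python
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

# Equivalent of parse.key=true input lines of the form "key<separator>value".
records = [("discount-1", "10%"), ("discount-2", "25%")]   # hypothetical data

for key, value in records:
    producer.produce("discounts", key=key, value=value)

producer.flush()                                           # wait for delivery
```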
First of all, I have already tried setting unclean.leader.election.enable to true and I am still having the same problem. The brokers are still exiting with this ex