I am trying to set up a local Kafka Connect stack with Docker Compose, and I have a problem with my Scala producer that's supposed to send Avro messages to a Kafka…
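A minimal sketch of such an Avro producer, shown in Java (the same client API a Scala producer would call); the topic name, schema registry URL, and record schema are assumptions:

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed compose port mapping
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081"); // assumed schema-registry service

        // Hypothetical record schema with a single string field.
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Example\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");
        GenericRecord record = new GenericData.Record(schema);
        record.put("id", "42");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("example-topic", "key", record)); // topic name hypothetical
        }
    }
}
```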
Spring for Apache Kafka 2.8.4 (reference at https://docs.spring.io/spring-kafka/reference/html) shows some of the listener methods with the @Payload annotation next to the…
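For context, a minimal sketch of a listener with @Payload on the method parameter, as the reference documents it; topic and group names here are hypothetical:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.handler.annotation.Payload;
import org.springframework.stereotype.Component;

@Component
class ExampleListener {

    // @Payload marks which parameter carries the deserialized message body;
    // it is optional when the method has a single non-annotated parameter.
    @KafkaListener(topics = "example-topic", groupId = "example-group")
    public void listen(@Payload String message) {
        System.out.println("received: " + message);
    }
}
```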
I've made a simple pipeline in Python to read from Kafka; the thing is that the Kafka cluster is on Confluent Cloud, and I am having some trouble connecting.
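Connecting to Confluent Cloud typically comes down to the SASL_SSL settings below, shown as the librdkafka keys that Python's confluent-kafka client accepts in its config dict; the endpoint and credentials are placeholders:

```properties
bootstrap.servers=pkc-xxxxx.region.provider.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=<CLUSTER_API_KEY>
sasl.password=<CLUSTER_API_SECRET>
```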
I am using Flink v1.11.2 and Avro v1.10.1. I am trying to deserialize an Avro record as a SpecificRecord from a Kafka topic, but for some reason keep getting…
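A minimal sketch of the usual Flink 1.11 wiring for this, assuming a generated SpecificRecord class named MyRecord and hypothetical topic/broker names:

```java
import java.util.Properties;
import org.apache.flink.formats.avro.AvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class SpecificAvroJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "flink-avro");

        // forSpecific() expects records written with MyRecord's own Avro schema;
        // a writer/reader schema mismatch here is a common source of failures.
        FlinkKafkaConsumer<MyRecord> consumer = new FlinkKafkaConsumer<>(
                "example-topic",
                AvroDeserializationSchema.forSpecific(MyRecord.class),
                props);

        env.addSource(consumer).print();
        env.execute("specific-avro-job");
    }
}
```

If the messages were produced with the Confluent Schema Registry wire format, ConfluentRegistryAvroDeserializationSchema.forSpecific(...) from flink-avro-confluent-registry would be the variant to use instead.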
I am trying to create a materialized table from a topic. I am creating and producing data into the topic as follows: kafka-topics.sh create --bootstrap-server…
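For reference, the flag is --create (with dashes); a typical invocation, with the topic name and counts as placeholders:

```sh
kafka-topics.sh --create --bootstrap-server localhost:9092 \
  --topic example-topic --partitions 1 --replication-factor 1
```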
What's the best way to run the following sequence of commands? kafka-console-producer --topic discounts --broker-list localhost:9092 --property parse.key=true --…
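When parse.key=true is set, the console producer also needs a key/value delimiter; the original command is cut off after the second --property flag, so the key.separator value below is an assumption:

```sh
kafka-console-producer --topic discounts --broker-list localhost:9092 \
  --property parse.key=true --property key.separator=:
```

Note that --broker-list is the older flag name; recent Kafka releases accept --bootstrap-server instead.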
First of all, I have already tried setting unclean.leader.election.enable to true, and I am still having the same problem. The brokers are still exiting with this exception…
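For reference, that setting can be applied broker-wide or as a per-topic override; both forms are sketched below with hypothetical names:

```sh
# server.properties (broker-wide default)
unclean.leader.election.enable=true

# or as a per-topic override:
kafka-configs.sh --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name example-topic \
  --add-config unclean.leader.election.enable=true
```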
Does "retention.bytes" apply to "compact" topic? The reason why I came here is that in lenses in my current project, I saw the partition bytes is 2GB which is w
I'm building a CDC pipeline that reads the MySQL binlog through Maxwell and puts the events into Kafka; my compression type is snappy in the Maxwell config. But at the consumer end…
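In Maxwell, producer-side Kafka settings are passed through via kafka.-prefixed properties; a sketch of the relevant config.properties lines, with the broker address assumed:

```properties
# config.properties (Maxwell)
producer=kafka
kafka.bootstrap.servers=localhost:9092
kafka.compression.type=snappy
```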
I have a 3-node Kafka cluster with a single ZooKeeper node. My question is: how can I add a new Kafka node to this cluster without downtime?
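The usual pattern: give the new broker a unique broker.id pointing at the same ZooKeeper, then run a partition reassignment, since a new broker receives no existing partitions automatically. A sketch with hypothetical ids and hosts:

```sh
# server.properties on the new broker
broker.id=3
zookeeper.connect=zk-host:2181

# Generate a reassignment plan that includes the new broker,
# then re-run with --execute and the generated JSON:
kafka-reassign-partitions.sh --zookeeper zk-host:2181 \
  --topics-to-move-json-file topics-to-move.json \
  --broker-list "0,1,2,3" --generate
```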
I have a simple stream processor (not consumer/producer) that looks like this (Kotlin): @Bean fun processFoo(): Function&lt;KStream&lt;FooName, FooAddress&gt;, KStream&lt;…
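A sketch of how such a Spring Cloud Stream function bean is typically shaped, written in Java since the output types in the original are cut off; FooAddressEnriched and the mapValues body are assumptions:

```java
import java.util.function.Function;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;

// Inside a @Configuration class; the binder wires input/output topics to this function by name.
@Bean
public Function<KStream<FooName, FooAddress>, KStream<FooName, FooAddressEnriched>> processFoo() {
    return input -> input.mapValues(address -> enrich(address)); // enrich() is a placeholder
}
```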
We have a Debezium source connector working perfectly fine, and one of the properties set is, for example: "transforms.SetSchemaMetadata.schema.name": "myschem…
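For context, the SetSchemaMetadata SMT is normally configured with this trio of properties in the connector config (the schema name value here is hypothetical):

```json
"transforms": "SetSchemaMetadata",
"transforms.SetSchemaMetadata.type": "org.apache.kafka.connect.transforms.SetSchemaMetadata$Value",
"transforms.SetSchemaMetadata.schema.name": "com.example.MySchemaName"
```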
I'm using Spring Kafka and wrote a Producer class: @Component @RequiredArgsConstructor class Producer { private static final String TOPIC = "channels"; pri…
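A sketch of how such a class typically continues, assuming the truncated field is a KafkaTemplate; the value type and send method are assumptions:

```java
import lombok.RequiredArgsConstructor;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
@RequiredArgsConstructor
class Producer {
    private static final String TOPIC = "channels";
    private final KafkaTemplate<String, String> kafkaTemplate; // injected via Lombok's generated constructor

    void send(String message) {
        kafkaTemplate.send(TOPIC, message);
    }
}
```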
I am trying to create a bare-bones skeleton integration test for Kafka with Testcontainers: just publish a message to a topic and check that it arrives (entire set…
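A minimal round-trip sketch under those constraints, assuming JUnit 5 and the testcontainers kafka module; the image tag and topic name are placeholders:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

@Testcontainers
class KafkaRoundTripTest {

    @Container
    static final KafkaContainer kafka =
            new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.4.0"));

    @Test
    void publishedMessageArrives() {
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", kafka.getBootstrapServers());
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("test-topic", "key", "hello"));
            producer.flush();
        }

        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", kafka.getBootstrapServers());
        consumerProps.put("group.id", "test-group");
        consumerProps.put("auto.offset.reset", "earliest"); // read the message produced above
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of("test-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
            Assertions.assertFalse(records.isEmpty());
        }
    }
}
```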
We have a "microservices" platform and we are using debezium for change data capture from databases on these platforms which is working nicely. Now, we'd like t
When I launch a Docker container with a Kafka broker, it sometimes fails, but I can't tell from the logs what exactly happens. The logs always show: # docker-compose up…
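When the compose output itself is inconclusive, pulling the full per-container log and the exit state sometimes helps; the service and container names here are hypothetical:

```sh
docker-compose logs --no-color broker > broker.log
docker inspect -f '{{.State.ExitCode}} {{.State.OOMKilled}}' <container-id>
```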
I have enabled "store.kafka.keys" : "true", "store.kafka.headers" : "true", "keys.format.class" : "io.confluent.connect.s3.format.json.JsonFormat", "headers.for
I'm deploying Zeebe using Helm. With the extraInitContainers directive I managed to include the kafka-exporter 3.1.1, and it loads correctly. In the YAML file I set a…
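A sketch of how such an init container is commonly declared in the values file, assuming the exporter JAR is fetched into a volume shared with the broker; the names, image, URL, and paths are all hypothetical:

```yaml
extraInitContainers:
  - name: init-kafka-exporter
    image: busybox:1.35
    command:
      - sh
      - -c
      - wget -O /exporters/zeebe-kafka-exporter-3.1.1.jar <EXPORTER_JAR_URL>
    volumeMounts:
      - name: exporters          # volume shared with the Zeebe broker container
        mountPath: /exporters
```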
I have code where I am aggregating the data from a Kafka stream via: StreamsBuilder streamsBuilder = new StreamsBuilder(); streamsBuilder.table(AppConfigs.t…
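A sketch of what such a table-based aggregation usually looks like; the serdes, the stand-in for the truncated AppConfigs constant, and the grouping logic are all assumptions:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KTable;

StreamsBuilder streamsBuilder = new StreamsBuilder();

KTable<String, String> table = streamsBuilder.table(
        "input-topic", // stands in for the truncated AppConfigs.t… constant
        Consumed.with(Serdes.String(), Serdes.String()));

// Re-key by value and count occurrences; the grouping function is an assumption.
KTable<String, Long> counts = table
        .groupBy((key, value) -> KeyValue.pair(value, value),
                 Grouped.with(Serdes.String(), Serdes.String()))
        .count();
```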