Category "apache-kafka"

Failed to send HTTP request to schema-registry

I am trying to set up a local Kafka Connect stack with docker-compose, and I have a problem with my Scala producer that's supposed to send Avro messages to a Kafka…
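
A frequent cause is a registry URL that is unreachable from wherever the producer actually runs: inside the compose network the service name resolves, from the host only the published port does. A minimal sketch of a producer wired to a Schema Registry, in Java rather than Scala; addresses, topic, and schema are placeholders:

    import java.util.Properties;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class AvroProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Must be reachable from where the producer runs: inside the
            // compose network use the service name (e.g. schema-registry:8081),
            // from the host use the published port (e.g. localhost:8081).
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                    "io.confluent.kafka.serializers.KafkaAvroSerializer");
            // The Avro serializer issues its HTTP requests to this endpoint.
            props.put("schema.registry.url", "http://localhost:8081");

            Schema schema = new Schema.Parser().parse(
                    "{\"type\":\"record\",\"name\":\"Example\",\"fields\":"
                    + "[{\"name\":\"id\",\"type\":\"string\"}]}");
            GenericRecord record = new GenericData.Record(schema);
            record.put("id", "42");

            try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("example-topic", "key", record));
            }
        }
    }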

Do I have to use the Spring @Payload annotation to read a Kafka message?

Spring for Apache Kafka 2.8.4, under https://docs.spring.io/spring-kafka/reference/html, shows some of the listener methods with the @Payload annotation next to the m…
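
In Spring for Apache Kafka, @Payload is optional when the listener method has a single parameter; it earns its keep once header parameters or validation enter the picture. A small sketch, with hypothetical topic and group names:

    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.support.KafkaHeaders;
    import org.springframework.messaging.handler.annotation.Header;
    import org.springframework.messaging.handler.annotation.Payload;
    import org.springframework.stereotype.Component;

    @Component
    public class OrderListener {

        // Works without @Payload: the single parameter receives the value.
        @KafkaListener(topics = "orders", groupId = "demo")
        public void plain(String value) {
            System.out.println("received: " + value);
        }

        // @Payload disambiguates once headers are also injected.
        @KafkaListener(topics = "orders-annotated", groupId = "demo")
        public void annotated(@Payload String value,
                              @Header(KafkaHeaders.RECEIVED_TOPIC) String topic) {
            System.out.println("received " + value + " from " + topic);
        }
    }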

How to connect KafkaIO from Apache Beam to a cluster in Confluent Cloud

I've made a simple pipeline in Python to read from Kafka; the thing is that the Kafka cluster is on Confluent Cloud and I am having some trouble connecting…
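
Connecting to Confluent Cloud is mostly a matter of passing SASL_SSL settings through to KafkaIO's consumer. A sketch with the Beam Java SDK (the question uses Python, but the consumer config keys are identical in both SDKs); bootstrap server, topic, and credentials are placeholders:

    import java.util.Map;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class ConfluentCloudRead {
        public static void main(String[] args) {
            Pipeline pipeline = Pipeline.create();
            pipeline.apply(KafkaIO.<String, String>read()
                    .withBootstrapServers("BOOTSTRAP.confluent.cloud:9092") // placeholder
                    .withTopic("my-topic")                                  // placeholder
                    .withKeyDeserializer(StringDeserializer.class)
                    .withValueDeserializer(StringDeserializer.class)
                    // Confluent Cloud requires SASL_SSL with an API key/secret.
                    .withConsumerConfigUpdates(Map.<String, Object>of(
                            "security.protocol", "SASL_SSL",
                            "sasl.mechanism", "PLAIN",
                            "sasl.jaas.config",
                            "org.apache.kafka.common.security.plain.PlainLoginModule required "
                            + "username=\"API_KEY\" password=\"API_SECRET\";")));
            pipeline.run().waitUntilFinish();
        }
    }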

Deserializing Avro from Kafka as SpecificRecord failing: Expecting type to be a PojoTypeInfo

I am using Flink v1.11.2 and Avro v1.10.1. I am trying to deserialize an Avro record as a SpecificRecord from a Kafka topic, but for some reason I keep getting t…
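
That error usually means Flink is not recognizing the class as an Avro SpecificRecord (an Avro version mismatch between the generated class and the runtime is a common trigger) and is falling back to POJO analysis. A sketch of the specific-record path, where MyRecord stands in for a class generated from the Avro schema:

    import java.util.Properties;
    import org.apache.flink.api.common.serialization.DeserializationSchema;
    import org.apache.flink.formats.avro.AvroDeserializationSchema;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

    public class FlinkAvroConsumerSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env =
                    StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092");
            props.setProperty("group.id", "flink-demo");

            // MyRecord is a hypothetical Avro-generated class; it must
            // implement SpecificRecord and match the writer schema.
            DeserializationSchema<MyRecord> schema =
                    AvroDeserializationSchema.forSpecific(MyRecord.class);

            env.addSource(new FlinkKafkaConsumer<>("my-topic", schema, props))
               .print();
            env.execute("avro-specific-read");
        }
    }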

Materialized table from a topic in ksqlDB

I am trying to create a materialized table from a topic. I am creating and producing data into the topic as follows: kafka-topics.sh --create --bootstrap-server…
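
A table over an existing topic needs a PRIMARY KEY declaration, and a queryable materialization then typically comes from a CREATE TABLE ... AS SELECT over it. A sketch using the ksqlDB Java client; server address, columns, and topic are placeholders:

    import io.confluent.ksql.api.client.Client;
    import io.confluent.ksql.api.client.ClientOptions;

    public class KsqlTableSketch {
        public static void main(String[] args) throws Exception {
            Client client = Client.create(
                    ClientOptions.create().setHost("localhost").setPort(8088));

            // Source table over the topic; columns/formats are placeholders
            // for the real topic's schema.
            client.executeStatement(
                    "CREATE TABLE users_table (id VARCHAR PRIMARY KEY, name VARCHAR) "
                  + "WITH (KAFKA_TOPIC='users', VALUE_FORMAT='JSON');").get();

            // CTAS materializes the table so it can serve pull queries.
            client.executeStatement(
                    "CREATE TABLE queryable_users AS SELECT * FROM users_table;").get();

            client.close();
        }
    }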

How to write a bash command that opens a Kafka producer and creates the topic in one step?

What's the best way to run the following sequence of commands: kafka-console-producer --topic discounts --broker-list localhost:9092 --property parse.key=true --…
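
One option that sidesteps shell sequencing entirely is to create the topic programmatically and then produce. A hedged equivalent with the Java AdminClient and producer, reusing the discounts topic from the question:

    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class CreateThenProduce {
        public static void main(String[] args) throws Exception {
            Properties admin = new Properties();
            admin.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            // Create the topic first, so producing does not rely on
            // auto.create.topics.enable; throws if the topic already exists.
            try (AdminClient adminClient = AdminClient.create(admin)) {
                adminClient.createTopics(
                        List.of(new NewTopic("discounts", 1, (short) 1))).all().get();
            }

            Properties producerProps = new Properties();
            producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                    "org.apache.kafka.common.serialization.StringSerializer");
            producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                    "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer =
                         new KafkaProducer<>(producerProps)) {
                // Keyed record, mirroring parse.key=true on the console producer.
                producer.send(new ProducerRecord<>("discounts", "key1", "10%")).get();
            }
        }
    }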

Kafka 1.0.2: Exiting because log truncation is not allowed for partition X

First of all, I have already tried setting unclean.leader.election.enable to true and I am still having the same problem. The brokers are still exiting with this ex…
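
Worth double-checking that the full key, unclean.leader.election.enable, is what was actually set, and at the right level (topic overrides win over the broker default). A sketch applying it per topic via the AdminClient; the legacy alterConfigs call is used here because pre-2.3 brokers such as 1.0.2 do not support the incremental variant:

    import java.util.List;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.Config;
    import org.apache.kafka.clients.admin.ConfigEntry;
    import org.apache.kafka.common.config.ConfigResource;

    public class EnableUncleanElection {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                ConfigResource topic =
                        new ConfigResource(ConfigResource.Type.TOPIC, "X");
                // Note the full key: unclean.leader.election.enable.
                admin.alterConfigs(Map.of(topic, new Config(List.of(
                        new ConfigEntry("unclean.leader.election.enable", "true")))))
                     .all().get();
            }
        }
    }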

Does the Kafka config "retention.bytes" apply to a compacted topic?

Does "retention.bytes" apply to "compact" topic? The reason why I came here is that in lenses in my current project, I saw the partition bytes is 2GB which is w

org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] no native library is found for os.name=Mac and os.arch=aarch64

I'm building a CDC pipeline that reads the MySQL binlog through Maxwell and puts it into Kafka; my compression type is snappy in the Maxwell config. But at the consumer end…

Add a new Kafka Node to an Existing Kafka Cluster without downtime

I have a 3-node Kafka cluster with a single ZooKeeper node. My question is: how can I add a new Kafka node to this cluster without downtime?
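
A new broker with a fresh broker.id joins the cluster as soon as it points at the same ZooKeeper, but it receives no existing partitions until they are reassigned. A sketch of moving one replica onto the hypothetical new broker 4 via the Admin API (kafka-reassign-partitions.sh does the same from the CLI):

    import java.util.List;
    import java.util.Map;
    import java.util.Optional;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewPartitionReassignment;
    import org.apache.kafka.common.TopicPartition;

    public class MovePartitionToNewBroker {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // New replica list for partition 0 of "events": keep broker 1,
                // add broker 4; replication proceeds in the background, so
                // the cluster stays up throughout.
                admin.alterPartitionReassignments(Map.of(
                        new TopicPartition("events", 0),
                        Optional.of(new NewPartitionReassignment(List.of(1, 4)))))
                     .all().get();
            }
        }
    }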

Spring Cloud Stream with Kafka Streams binder: how to set `trusted.packages` for a stream processor (which is different from a consumer or producer)

I have a simple stream processor (not a consumer/producer) that looks like this (Kotlin): @Bean fun processFoo(): Function<KStream<FooName, FooAddress>, KS…
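
With the Kafka Streams binder, the trusted-packages property belongs to the JSON serde (or the binder configuration), not to a listener container. A Java sketch; FooAddress stands in for the hypothetical event type from the question:

    import java.util.Map;
    import org.springframework.kafka.support.serializer.JsonDeserializer;
    import org.springframework.kafka.support.serializer.JsonSerde;

    public class TrustedPackagesSketch {
        public static void main(String[] args) {
            JsonSerde<FooAddress> serde = new JsonSerde<>(FooAddress.class);
            // TRUSTED_PACKAGES is the "spring.json.trusted.packages" key;
            // false = configure the value (not the key) side of the serde.
            serde.configure(Map.of(JsonDeserializer.TRUSTED_PACKAGES,
                    "com.example.events"), false);
        }
    }

The same key should also be settable declaratively, under the binder's configuration properties (spring.cloud.stream.kafka.streams.binder.configuration.*), rather than in code.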

Accessing schema.name from a Debezium sink connector to Postgres

We have Debezium source connectors working perfectly fine, and one of the properties set is, for example: "transforms.SetSchemaMetadata.schema.name": "myschem…

Can one Kafka Producer Class have multiple @EventListener methods?

I'm using Spring Kafka and wrote a producer class: @Component @RequiredArgsConstructor class Producer { private static final String TOPIC = "channels"; pri…
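
Yes: Spring dispatches each application event to the listener method whose parameter type matches, so one component can hold several. A sketch fleshing out the class from the excerpt; ChannelCreated and ChannelDeleted are hypothetical event types:

    import lombok.RequiredArgsConstructor;
    import org.springframework.context.event.EventListener;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.stereotype.Component;

    @Component
    @RequiredArgsConstructor
    class Producer {
        private static final String TOPIC = "channels";
        private final KafkaTemplate<String, String> kafkaTemplate;

        // Each @EventListener method is selected by its parameter type,
        // so both can live in the same class.
        @EventListener
        public void onCreated(ChannelCreated event) {
            kafkaTemplate.send(TOPIC, "created", event.toString());
        }

        @EventListener
        public void onDeleted(ChannelDeleted event) {
            kafkaTemplate.send(TOPIC, "deleted", event.toString());
        }
    }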

No broker/node available in test with Kafka in TestContainers

I am trying to create a bare-bones skeleton integration test for Kafka with Testcontainers: just publish a message to a topic and check that it arrives (entire set…
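
The usual culprit for "no broker available" is connecting to a fixed localhost:9092 instead of the container's randomly mapped port. A self-contained round-trip sketch (the image tag is an assumption):

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.testcontainers.containers.KafkaContainer;
    import org.testcontainers.utility.DockerImageName;

    public class KafkaRoundTripTest {
        public static void main(String[] args) throws Exception {
            try (KafkaContainer kafka = new KafkaContainer(
                    DockerImageName.parse("confluentinc/cp-kafka:7.4.0"))) {
                kafka.start();
                // Connect to the mapped address, not localhost:9092.
                String bootstrap = kafka.getBootstrapServers();

                Properties p = new Properties();
                p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
                p.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                        "org.apache.kafka.common.serialization.StringSerializer");
                p.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                        "org.apache.kafka.common.serialization.StringSerializer");
                try (KafkaProducer<String, String> producer = new KafkaProducer<>(p)) {
                    producer.send(new ProducerRecord<>("test-topic", "k", "v")).get();
                }

                Properties c = new Properties();
                c.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
                c.put(ConsumerConfig.GROUP_ID_CONFIG, "it");
                c.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
                c.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                        "org.apache.kafka.common.serialization.StringDeserializer");
                c.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                        "org.apache.kafka.common.serialization.StringDeserializer");
                try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(c)) {
                    consumer.subscribe(List.of("test-topic"));
                    ConsumerRecords<String, String> records =
                            consumer.poll(Duration.ofSeconds(10));
                    System.out.println("received " + records.count() + " record(s)");
                }
            }
        }
    }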

Best way to join two (or more) Kafka topics in KSQL, emitting changes from all topics?

We have a "microservices" platform and we are using debezium for change data capture from databases on these platforms which is working nicely. Now, we'd like t

Kafka broker fails inside Docker container without meaningful logs

When I launch a Docker container with a Kafka broker, it sometimes fails, but I can't tell from the logs what exactly happens; the logs are always: # docker-compose up b…

Kafka S3 sink connector: keys and headers S3 storage write not working

I have enabled "store.kafka.keys" : "true", "store.kafka.headers" : "true", "keys.format.class" : "io.confluent.connect.s3.format.json.JsonFormat", "headers.for…

Unable to set a topic with a camelCase name via an environment variable

I'm deploying Zeebe using Helm. With the extraInitContainers directive I manage to include kafka-exporter 3.1.1 and it loads correctly. In the YAML file I set a…

Avro Definition for Custom Aggregator

I have code where I am aggregating data from a Kafka stream via: StreamsBuilder streamsBuilder = new StreamsBuilder(); streamsBuilder.table(AppConfigs.t…
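
The aggregate value generally needs its own Avro record and Serde, separate from the input type, so the result of the aggregator has a schema of its own. A sketch where Order and OrderTotal are hypothetical Avro-generated classes and the registry URL is a placeholder:

    import java.util.Map;
    import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Grouped;
    import org.apache.kafka.streams.kstream.Materialized;

    public class AggregateSketch {
        static StreamsBuilder build() {
            StreamsBuilder streamsBuilder = new StreamsBuilder();

            // OrderTotal is a hypothetical Avro record holding the aggregate;
            // it has its own schema, distinct from the input Order's.
            SpecificAvroSerde<Order> orderSerde = new SpecificAvroSerde<>();
            orderSerde.configure(
                    Map.of("schema.registry.url", "http://localhost:8081"), false);
            SpecificAvroSerde<OrderTotal> totalSerde = new SpecificAvroSerde<>();
            totalSerde.configure(
                    Map.of("schema.registry.url", "http://localhost:8081"), false);

            streamsBuilder.<String, Order>stream("orders",
                            Consumed.with(Serdes.String(), orderSerde))
                    .groupByKey(Grouped.with(Serdes.String(), orderSerde))
                    .aggregate(
                            OrderTotal::new,
                            (key, order, total) -> {
                                total.setSum(total.getSum() + order.getAmount());
                                return total;
                            },
                            Materialized.with(Serdes.String(), totalSerde));

            return streamsBuilder;
        }
    }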