Spring Kafka serialiser config

I upgraded from spring-kafka 2.7.2 to 2.8.4 and found that my code broke where I had created my serialiser up front in my tests:


    var serialiser = new KafkaAvroSerializer(schemaRegistryClient, mapOfProperties);

    // I then pass this serialiser to my KafkaTemplate
    ...

KafkaTemplate then passes the serialiser to DefaultKafkaProducerFactory. The DefaultKafkaProducerFactory does not know that the serialiser has already been configured, so it calls configure() on it again, this time with an empty map, and it falls over because mandatory properties such as schema.registry.url are missing.
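For context, this is roughly the shape of the wiring that now breaks. It is a simplified sketch with imports omitted; producerProps, schemaRegistryClient and mapOfProperties stand in for my real test fixtures, and only mapOfProperties contains schema.registry.url.

    // the serialiser is built and configured up front via its constructor
    var serialiser = new KafkaAvroSerializer(schemaRegistryClient, mapOfProperties);

    // the pre-configured instance is handed to the factory
    var producerFactory = new DefaultKafkaProducerFactory<String, Object>(
            producerProps, new StringSerializer(), serialiser);
    var template = new KafkaTemplate<>(producerFactory);

    // with 2.8.x the factory calls configure() on the supplied serialiser again
    // when it creates the producer, and that second call is the one that falls over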

Is the way forward to stop pre-configuring the serialiser and to delegate the serialiser configuration to Spring, as in the sketch below?
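By delegating to Spring I mean something like the following (a sketch, imports omitted, bootstrap and registry URLs are placeholders): the factory instantiates and configures the serialiser itself from the config map, so the single configure() call sees all the mandatory properties.

    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
    props.put("schema.registry.url", "http://localhost:8081");

    var producerFactory = new DefaultKafkaProducerFactory<String, Object>(props);
    var template = new KafkaTemplate<>(producerFactory);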

I did some digging and found this PR, which I think is related: https://github.com/spring-projects/spring-kafka/pull/1907

It's a shame the serialiser doesn't have an isConfigured() flag to prevent reconfiguration.



Solution 1:[1]

Well, I suggest you extend that KafkaAvroSerializer class and introduce a configured property into it, like we do in the JsonSerializer.
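A minimal sketch of what that could look like (the class name here is made up; it simply mirrors the "configured" guard idea from Spring's JsonSerializer):

    import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
    import io.confluent.kafka.serializers.KafkaAvroSerializer;

    import java.util.Map;

    // Remembers that it was fully configured via the constructor and ignores
    // the later configure() call made by DefaultKafkaProducerFactory.
    public class PreConfiguredKafkaAvroSerializer extends KafkaAvroSerializer {

        private volatile boolean configured;

        public PreConfiguredKafkaAvroSerializer(SchemaRegistryClient client, Map<String, ?> props) {
            super(client, props);
            this.configured = true;
        }

        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            if (this.configured) {
                return; // already configured up front; skip reconfiguration with an incomplete map
            }
            super.configure(configs, isKey);
            this.configured = true;
        }
    }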

Also: feel free to raise a GH issue so we may introduce something like setConfigureSerializers(boolean). Apparently Apache Kafka itself does not configure serializers when they are provided explicitly rather than via configs. Plus it looks like not all serializers are immune to reconfiguration.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1: Artem Bilan