SCDF Kubernetes custom source is writing data to the "output" channel

I have a custom source application that reads data from an external Kafka server and passes the information to the next processor in the stream. Locally everything works perfectly. I built a Docker image of the code, and when I deploy the stream in the Kubernetes environment I do see that a topic named stream.source-app gets created, but the messages produced by the source actually go to an "output" topic. I don't see this issue in the local environment.
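For context on why the topic is literally called "output": in Spring Cloud Stream, when no destination is configured for a binding, the binder falls back to the channel name itself as the topic name. When SCDF deploys a stream it normally injects the destination as a deployment-time property; the sketch below shows what that injected configuration is equivalent to (the stream and app names here are assumptions based on the topic name mentioned above):

```yaml
# Equivalent of what SCDF normally injects at deployment time (a sketch;
# "stream" and "source-app" are assumed names, not taken from a real deployment):
spring:
  cloud:
    stream:
      bindings:
        output:
          # Convention: <streamName>.<appLabel>, e.g. "stream.source-app"
          destination: stream.source-app
# If this override never reaches the container, Spring Cloud Stream falls
# back to the channel name, i.e. it produces to a topic called "output".
```

So a source sending to an "output" topic is a symptom that the SCDF-supplied destination property is not reaching the application at runtime.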

application.yaml

spring:
  cloud:
    stream:
      defaultBinder: kafka1
      bindings:
        Workitemconnector_subscribe:
          destination: Workitemconnector
          contentType: application/json
          group: SCDFMessagingSourceTestTool1
          consumer:
            partitioned: true
            concurrency: 1
            headerMode: embeddedHeaders
        output:
#          destination: dataOut
          binder: kafka2
      binders:
        kafka1:
          type: kafka
          environment:
            spring:
              cloud:
                stream:
                  kafka:
                    binder:
                      brokers: xx.xxx.xx.xxx:9092
                      zkNodes: xx.xxx.xx.xxx:2181
        kafka2:
          type: kafka
          environment:
            spring:
              cloud:
                stream:
                  kafka:
                    binder:
                      brokers: server1:9092
                      zkNodes: server1:2181

Locally, without defining any parameters during stream deployment, I notice the source consumes messages from the xx.xxx.xx.xxx server and produces data to server1, to the topic named "stream.sourceapp". In the Kubernetes environment, however, it acts strangely: it always sends data to the "output" topic even though the "stream.sourceapp" topic exists.
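One way to rule out the missing-property theory is to pin the destination explicitly at deployment time, so the app does not fall back to the "output" topic. A possible workaround from the SCDF shell (a sketch; the stream name my-stream and app label source-app are assumptions, and deployment properties are scoped per app with the app.<label>. prefix):

```shell
# Deploy the stream with an explicit output destination for the source app.
# Replace my-stream / source-app with the real stream name and app label.
stream deploy --name my-stream --properties \
  "app.source-app.spring.cloud.stream.bindings.output.destination=stream.source-app"
```

Alternatively, uncommenting the destination line under the output binding in application.yaml (the dataOut line above) would also pin the topic name, at the cost of hard-coding it in the image.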



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
