Is it possible to pass kafka topics in Runtime config?
Hi,
We are currently using version 13.9.
We have multiple Kafka topics, named with a suffix representing the environment (e.g. topic.sit, topic.uat). When we deploy to prod, we have to make sure that the comp files where the topics are specified, such as event handlers or streaming consumers, are cherry-picked manually.
Since fetching values from environment variables is not available in MDM or ONE Desktop, I was wondering whether we could create aliases for these topics and define them in the runtime config, since the runtime config is not merged during deployments anyway.
If anyone has a workaround for this kind of situation, please suggest it.
Thanks,
Hi @hkulkarni,
There's a workaround for this type of situation (thanks to @tomas.reichel). You can wrap the Kafka Reader/Writer step into a component and create separate versions for dev, test, and prod environments in distinct folders. Then, you can reference these components using a path variable that you adjust in the runtime configuration depending on the environment.
Detailed description:
1) Create two folders (env_dev and env_prod in my case) and place a copy of the same component in each, with different Topic values: a Dev component and a Prod component.
2) Create a path variable COMPONENTS_ENV that points to a different folder depending on the environment.
3) Now you can reference your component through the path variable wherever needed, and the server will pick up the correct version of the component depending on the environment.
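For illustration, the runtime config entry for step 2 could look like the sketch below. This is an assumption about the file layout: the element and attribute names follow the typical Ataccama server runtime configuration format, and the folder paths are hypothetical examples.

```xml
<?xml version='1.0' encoding='UTF-8'?>
<config>
  <pathVariables>
    <!-- Point COMPONENTS_ENV at env_dev on dev servers
         and at env_prod on the production server. -->
    <pathVariable name="COMPONENTS_ENV" value="../components/env_dev"/>
  </pathVariables>
</config>
```

In the plan, the Kafka Writer component would then be referenced as something like `COMPONENTS_ENV/kafka_writer.comp`, so only the runtime config differs between environments.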
Hi @AKislyakov,
Thanks for your prompt response.
This is a great solution for a normal comp file, but it looks like it might not work for us, as we are dealing with notifications going through event handlers under the output interface, as well as Kafka streaming consumers.
There is an option to map the topic as a parameter in the Kafka Writer within an event handler, but I cannot figure out how to pass the parameter to the event handler, since it runs automatically as soon as message processing completes; no workflow is involved. Can parameters only be passed from a workflow?
Thank you,
Hi @hkulkarni,
This solution works seamlessly with Event Handlers. Event Handler plans are standard plans and follow the same rules.
To reiterate: In the publisher plan, reference the component with the actual Kafka Writer step through a path variable. This path variable should point to a different folder depending on the environment. Therefore, at startup, your server will select the appropriate version of your Kafka Writer component based on the configuration in the runtime config file. The same applies when the plan is triggered through the Event Handler mechanism: the server will retrieve the actual path from the Runtime config and utilize the corresponding variant of the Kafka Writer component.
Hi @hkulkarni, I’m closing this thread for now. If you have any follow-up questions, please feel free to share them in the comments or create a new post.