
Hi, we would like to publish the profiling results of a scan to an Azure service so that we can pick up those results and push them on to other reporting systems via Kafka or Azure Pub/Sub.

What we are mainly interested in is how to publish the results to Azure after a scan, at runtime, without impacting the performance of the PostgreSQL database where the profiling results are logged.

Has anyone encountered a similar scenario, and how did you achieve it?

Regards,

Uma

Hello @Uma!

To automate sending profiling results to a Kafka queue, you would do the following:

1. Set up a notification handler with the entity type “profilingJob”. Whenever a profiling job is created, the handler is called with information about the profiling run, including its ID, which you need to pass into the next step so you can fetch the profiling results.

2. Create a component that is triggered by the notification handler. This component needs a ONE Metadata Reader or a JSON Call step that takes the profiling ID and reads the profiling results into ONE Desktop.

3. Pass the data from the reader or JSON Call step into a Kafka Writer step, where you can configure the message template to the format you want (see the sketch after these steps for the general idea).
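To make the end of that flow more concrete, here is a minimal standalone Java sketch of publishing one profiling-result message to Kafka. This is only an illustration of the message flow, not the Kafka Writer step configuration itself; the broker address, topic name, and payload fields are assumptions for the example, and in practice the payload would come from the profiling results read in step 2.

```java
// Minimal sketch: publish a profiling-result message to Kafka.
// Broker address, topic name, and payload fields are assumptions.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProfilingResultPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");              // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Example payload shape only; the real fields come from the profiling
        // results read via the ONE Metadata Reader / JSON Call step.
        String profilingJobId = "12345";                                // passed in from the notification handler
        String payload = "{"
                + "\"profilingJobId\":\"" + profilingJobId + "\","
                + "\"catalogItem\":\"customer_table\","
                + "\"completenessPct\":98.7"
                + "}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key by job ID so messages from one profiling run land in the same partition.
            producer.send(new ProducerRecord<>("profiling-results", profilingJobId, payload));
            producer.flush();
        }
    }
}
```

In the actual component you would not write this code yourself; the Kafka Writer step handles the producer, and the message template you configure there defines the message body.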

The diagram of the component flow would look like this: [diagram not shown]

For information on the notification handler, please see this link: Notifications Handler :: Ataccama ONE

Thanks :) 

