
Hey Everyone!

We have a use case involving DataStage and I'm curious if anyone has worked with this before. One of our teams has DataStage jobs that are triggered when a table pushes information downstream. We're wondering whether, either in the web or desktop client, we can use the trigger files that DataStage creates, plug into the DataStage flow somehow, or find another way to kick off a monitoring project at the same time to run data quality checks against the load. If certain data doesn't load correctly, it breaks the join between the two tables, which in turn breaks the job, and they want to see the population of records that is breaking the join. Hope this makes sense!
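For the "population breaking the join" part, here is a minimal sketch of the kind of check a monitoring step could run: an anti-join that surfaces the rows in the incoming load whose key has no match in the parent table. The table layout and column name (`cust_id`) are hypothetical placeholders, not anything from the actual jobs.

```python
# Sketch: find rows in the child load with no match in the parent table,
# i.e. the population that would break an inner join on the key column.
# All names here (parent, load, cust_id) are illustrative placeholders.

def breaking_population(parent_rows, child_rows, key):
    """Return child rows whose key value has no counterpart in parent_rows."""
    parent_keys = {row[key] for row in parent_rows}
    return [row for row in child_rows if row[key] not in parent_keys]

parent = [{"cust_id": 1}, {"cust_id": 2}]
load = [{"cust_id": 1, "amt": 10}, {"cust_id": 3, "amt": 99}]

orphans = breaking_population(parent, load, "cust_id")
print(orphans)  # the row with cust_id 3 has no parent, so it breaks the join
```

In a real setup this logic would live in the data quality tool's rule rather than in standalone Python, but the anti-join is the shape of the check either way.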

 

Thanks,

Thomas

Hello @thomas.prociuk,

 

I have an idea of how DataStage works. We can achieve the use case mentioned above with the steps below:

1) Whenever a DataStage job runs, include a step in the job that places a dummy trigger file in a known location, alongside the process that runs as part of the job.

2) Create a plan that triggers the monitoring project.

3) Use that plan in a workflow that is triggered when the dummy file appears in that location. After the monitoring project runs, delete the dummy file so the same mechanism works for the next run.

4) To break the job when results are invalid, we might use a similar approach. I'm hoping a post-processing plan that runs after the monitoring project can be used to break the job (this still needs exploring).
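The trigger-file pattern in the steps above can be sketched as a simple polling loop. This is only an illustration of the mechanism; the path and function names are hypothetical, and in practice the workflow tool's own file-event trigger would normally do the waiting:

```python
import os
import time

# Hypothetical location where the DataStage job drops its dummy file.
TRIGGER_FILE = "/data/triggers/ds_job_done.trg"

def wait_and_run(trigger_path, run_monitoring, poll_seconds=30, max_polls=None):
    """Poll for the dummy trigger file. When it appears, run the monitoring
    step, then delete the file so the next DataStage run can recreate it.
    Returns True if the monitoring step ran, False if polling timed out."""
    polls = 0
    while max_polls is None or polls < max_polls:
        if os.path.exists(trigger_path):
            try:
                run_monitoring()  # placeholder for launching the monitoring project
            finally:
                os.remove(trigger_path)  # reset for the next run (step 3)
            return True
        time.sleep(poll_seconds)
        polls += 1
    return False
```

Deleting the file inside `finally` keeps the trigger from firing repeatedly even if the monitoring step raises, which matches the "delete the dummy file so it can be used for the next run" part of step 3.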

 

Thanks & Regards,

Srija Piratla

