Solved

Writing data to a Snowflake database as part of post-processing

  • September 3, 2025
  • 1 reply
  • 27 views

Hi Team, I have a requirement to write the post-processing results (post-DQ failed records) into a Snowflake database, in a dedicated schema.

If I set up a user X to perform this write operation in the Snowflake schema, does it also need to have read access to the data objects that I am monitoring for data quality?

Let’s say the data objects are in schema s1 of database ABC (Snowflake) and the post-processing results are to be saved in schema s2 of database PQR (Snowflake).

Thanks for your response!

Best answer by anna.spakova

Hello @mp_ataccamauser,

Thank you for your question. I don’t have much hands-on experience with Snowflake specifically, but speaking generally about how projects and post-processing work, it should be fine if the user writing the post-processing results does not have access to the objects being monitored.

For the monitoring itself and DQ evaluation, the connection and credentials defined under Source → Connection are used. The post-processing JDBC Writer step, however, will most likely use a connection defined in the Global runtime configuration in the DPM Admin Console, and that connection can have different credentials. As long as that user has access to the objects you are writing into as part of the post-processing, you should be fine. Very often, clients send the results to, say, S3, while the monitored objects are in Snowflake, MSSQL, etc., so completely different technologies.
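
To illustrate the separation on the Snowflake side, a minimal grant sketch could look like the following. The role name DQ_WRITER_ROLE, the warehouse DQ_WH, and the exact set of privileges are only assumptions for illustration; the key point is that the write user X receives grants on PQR.s2 only and nothing on ABC.s1.

  -- Minimal sketch (names are illustrative): a role that can write DQ results
  -- into PQR.s2 but has no grants at all on ABC.s1.
  CREATE ROLE IF NOT EXISTS DQ_WRITER_ROLE;

  -- Allow the role to reach the target database/schema and write into it.
  GRANT USAGE ON DATABASE PQR TO ROLE DQ_WRITER_ROLE;
  GRANT USAGE ON SCHEMA PQR.s2 TO ROLE DQ_WRITER_ROLE;
  GRANT CREATE TABLE ON SCHEMA PQR.s2 TO ROLE DQ_WRITER_ROLE;
  GRANT SELECT, INSERT ON ALL TABLES IN SCHEMA PQR.s2 TO ROLE DQ_WRITER_ROLE;
  GRANT SELECT, INSERT ON FUTURE TABLES IN SCHEMA PQR.s2 TO ROLE DQ_WRITER_ROLE;

  -- A warehouse is needed to run the writes (DQ_WH is a placeholder).
  GRANT USAGE ON WAREHOUSE DQ_WH TO ROLE DQ_WRITER_ROLE;

  -- Assign the role to the dedicated write user X.
  GRANT ROLE DQ_WRITER_ROLE TO USER X;

  -- Note: nothing is granted on database ABC or schema ABC.s1, so user X
  -- cannot read the monitored objects.

If the result tables are created up front, you can also drop the CREATE TABLE grant and keep only INSERT (plus SELECT if the writer needs to read back what it wrote).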

Let me know in case you run into any issues, and I can forward this to a Snowflake expert.

Kind regards,

Anna

1 reply

anna.spakova
  • Ataccamer
  • 211 replies
  • Answer
  • September 10, 2025
