Best Practice

Data Quality and Pipeline Execution


Hello, here is a short video illustrating how an organization might use Ataccama Data Quality to run checks as part of an existing data pipeline and verify data integrity before further processing. Ataccama works alongside a variety of data warehouses and ETL tools, and its GraphQL API enables conditional scheduling and more complex orchestration.

In this demonstration, we use a Databricks notebook to trigger DQ execution and retrieve the results, which determine whether the pipeline halts or continues. In-app dashboards display the metrics, and all results and metadata can also be exported to a data mart table for further analysis and visualization.
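The video does not walk through the notebook code itself, so here is a minimal sketch of what such a gate could look like in a Databricks Python notebook. The endpoint URL, GraphQL operation names, field names, run states, and the quality threshold below are all hypothetical placeholders, not the actual Ataccama schema; consult the Ataccama GraphQL API documentation for the real operations.

```python
import time
import requests

# Hypothetical values: replace with your Ataccama instance URL, token,
# and the ID of the monitoring project to run.
GRAPHQL_URL = "https://your-ataccama-instance/graphql"
HEADERS = {"Authorization": "Bearer <token>"}
PROJECT_ID = "<monitoring-project-id>"

def run_graphql(query, variables=None):
    """POST a GraphQL request and return its `data` payload."""
    resp = requests.post(
        GRAPHQL_URL,
        json={"query": query, "variables": variables or {}},
        headers=HEADERS,
    )
    resp.raise_for_status()
    body = resp.json()
    if body.get("errors"):
        raise RuntimeError(body["errors"])
    return body["data"]

# 1. Trigger the DQ run (illustrative mutation name).
start = run_graphql(
    """mutation Start($id: ID!) {
         startMonitoringProject(projectId: $id) { runId }
       }""",
    {"id": PROJECT_ID},
)
run_id = start["startMonitoringProject"]["runId"]

# 2. Poll until the run reaches a terminal state (illustrative states).
while True:
    status = run_graphql(
        """query Status($runId: ID!) {
             monitoringRun(id: $runId) { state overallQuality }
           }""",
        {"runId": run_id},
    )["monitoringRun"]
    if status["state"] in ("FINISHED", "FAILED"):
        break
    time.sleep(30)

# 3. Gate the rest of the pipeline on the DQ outcome:
#    raising an exception fails the notebook task, halting downstream steps.
if status["state"] == "FAILED" or status["overallQuality"] < 0.95:
    raise Exception("Data quality checks failed - halting pipeline")

print("DQ checks passed - continuing pipeline")
```

In a Databricks Workflows job, the raised exception marks the task as failed, so downstream tasks simply never run; the same pattern works from any orchestrator that can execute a notebook or script and branch on its exit status.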

The platform is designed to make data quality insights visible and usable, and flexible integration capabilities let it fit into any data stack. Thanks for watching.


https://www.youtube.com/watch?v=H24Un9V_46E

 
