Hello, here is a short video illustrating how an organization might use Ataccama Data Quality to run checks as part of an existing data pipeline and verify data integrity before further processing. Ataccama works with a variety of data warehouses and ETL tools, and its GraphQL API integration enables conditional scheduling and more complex orchestration.
In this demonstration, we use a Databricks notebook to trigger DQ execution and retrieve the results. The results determine whether to halt or continue the pipeline, and in-app dashboards display the metrics. All results and metadata can also be exported to a data mart table for further analysis and visualization.
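As a rough sketch, the gating step in such a notebook might look like the following. Note that the endpoint URL, the GraphQL query, and the response field names here are all hypothetical placeholders, not the actual Ataccama API schema; the real query and fields will differ.

```python
import json
from urllib import request

# Hypothetical GraphQL endpoint and query -- the real Ataccama
# API URL, query, and field names will differ.
GRAPHQL_URL = "https://ataccama.example.com/graphql"
QUERY = """
query {
  dqCheckRun(id: "latest") {
    status
    passedRecords
    totalRecords
  }
}
"""

def fetch_dq_result(url: str = GRAPHQL_URL) -> dict:
    """POST the GraphQL query and return the parsed run payload."""
    req = request.Request(
        url,
        data=json.dumps({"query": QUERY}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["data"]["dqCheckRun"]

def should_proceed(result: dict, threshold: float = 0.95) -> bool:
    """Gate the pipeline: continue only if the DQ run finished
    and the record pass rate meets the threshold."""
    if result["status"] != "FINISHED":
        return False
    return result["passedRecords"] / result["totalRecords"] >= threshold

# Example with a stubbed response (no network call):
sample = {"status": "FINISHED", "passedRecords": 980, "totalRecords": 1000}
print(should_proceed(sample))  # True: a 98% pass rate clears the 95% threshold
```

In a notebook, `should_proceed` would typically feed a conditional that either raises an exception (halting the job) or lets the downstream cells run.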
The platform is designed to make data quality insights visible and usable, with flexible integration capabilities to suit virtually any data stack. Thanks for watching.