Hi @onufry,
I'm afraid we don't have an automated way to keep environments in sync either, so I'm as curious as you are about suggestions from others. Still, I'd like to respond with some related considerations:
- We also have a dev environment with the same purpose as you describe. If you overwrite the dev environment with the prod state every week, as you suggest, you would lose your new development work. Alternatively, you would need a strict process requiring that all dev changes be deployed to prod by the end of the week, before the prod-to-dev copy takes place.
- In your prod data catalog you will use connections that point to production systems. Your organization likely has a policy that no connections to prod can be made from lower environments. So once you have copied from prod to dev, what would that mean for all your prod sources and connections?
- I once tried the export functionality for terms, but looking at the generated JSON, our customizations to the term metadata model were not taken into account, so that wouldn't help us. Besides, and maybe more importantly, the IDs of the term instances will not match those in another environment.
And creating export and import plans in ONE Desktop for the whole metadata model would be unworkable.
The only way I know of to try to keep environments in sync regarding the metadata model is to export changes in the System Changes screen and import those changes into another environment. In effect you are repeating the steps, including all erroneous steps and their subsequent corrections. It would of course be much nicer if you could deploy just the final state of something. And even then, you can be sure that over time the environments will drift out of sync. (Besides, what you describe as the scope of the synchronization is far more than just the model.)
So hopefully more replies will be posted.
Kind regards,
Albert
Hello @Albert de Ruiter, thank you for the thoughtful considerations. To clarify, our goal isn’t to overwrite all dev work but to sync critical configurations/metadata (e.g., data quality rules, monitoring projects, catalog structures) while preserving ongoing development. Based on your experience:
- Logical Environments vs. Physical Separation:
- Monitoring Project Lifecycles:
- Connection Management:
- Metadata Model Drift:
Ideal Outcome: A process where dev can selectively pull prod configurations (without overwriting active work) and push tested changes back. Any examples of scripting partial syncs or leveraging Ataccama APIs for this?
Hi @onufry,
As @Albert de Ruiter mentioned, asset promotion between environments is unfortunately not fully supported at the moment.
There are some features, such as Export/Import, which allow you to export metadata for specific assets. A set of Export Plans is available out of the box, but technically more can be created. However, it’s important to mention that asset IDs may differ across environments (for example, dqDimensionId and dqDimensionResultId values may not match), so if you were to export and import rules, mismatches in the ID references of the various assets could lead to the rules being imported in a broken state.
One way to deal with this could be custom-built plans/components in ONE Desktop. These can be used to export metadata from the source environment and then to re-create the same assets in the target environment from the exported metadata. However, the IDs will be different in this case as well, and in addition you’ll need to make sure that all references to other entities are updated with IDs valid for the target system. This can be achieved by creating lookups based on the metadata for the same entities from both environments to translate the IDs. So while it is technically doable, it will require a large number of very complex plans/components that take all the possible conflicts into consideration. Developing such plans requires quite extensive knowledge of the metadata model and the relations between the different entities.
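To illustrate the ID-translation idea, here is a minimal Python sketch (not ONE Desktop plan logic; the file names, the JSON layout, and fields such as `name`, `id`, and `dqDimensionId` are assumptions made for the example). It builds a source-to-target ID mapping by joining on names, then remaps a reference in an exported rule before it would be re-created:

```python
import json

def build_id_map(source_entities, target_entities):
    """Map source IDs to target IDs by joining on the entity name.

    Assumes names are unique and identical in both environments;
    unmatched entities are reported so they can be created first.
    """
    target_by_name = {e["name"]: e["id"] for e in target_entities}
    id_map, missing = {}, []
    for entity in source_entities:
        target_id = target_by_name.get(entity["name"])
        if target_id is None:
            missing.append(entity["name"])
        else:
            id_map[entity["id"]] = target_id
    return id_map, missing

# Hypothetical exports of DQ dimensions from both environments.
with open("source_dq_dimensions.json") as f:
    source_dims = json.load(f)
with open("target_dq_dimensions.json") as f:
    target_dims = json.load(f)

dim_map, missing = build_id_map(source_dims, target_dims)
if missing:
    raise SystemExit(f"Create these dimensions in the target first: {missing}")

# Remap the dimension reference in an exported rule before re-creating it.
with open("exported_rule.json") as f:
    rule = json.load(f)
rule["dqDimensionId"] = dim_map[rule["dqDimensionId"]]
```

Every reference type would need its own mapping of this kind, which is where the complexity comes from.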
Technically it is also possible to copy the MMM database with all its content from one environment to another, but this is a rather complex procedure that requires a lot of planning, cross-team effort, and collaboration. It is not something you would do on a weekly basis, but if there’s a huge gap between the environments you might explore this option as well.
I would suggest reaching out to your Ataccama contact, who can put you in touch with our Professional Services team so you can discuss all the possible options.
I hope this helps.
Ivan
Hi,
Thanks for all the ideas.
@onufry, I wasn't aware of the logical and physical separation, but it sounds interesting. I will look into its meaning and implications.
Regarding the connections: as we have physically separated environments, each meant to connect to the corresponding application/data warehouse/lake environment, we have no such thing as parameters, but simply an environment-specific connection definition.
Regarding the metadata model drift, when acting via the System Changes screen, I didn't mean to suggest a full export of the model. Suppose you added one property/attribute to a metadata entity; then, in the Applied Changes tab of the System Changes screen, you can download just that one change. You can then add that download as a new change in the other environment.
@ivan.kozlov, having different IDs is indeed one of the things to keep in mind when exporting/importing metadata between environments. We do indeed use ONE Desktop for exporting/importing metadata.
But I want to suggest the following approach in order to avoid the translation of IDs, assuming that the content of the referenced tables is equal:
- Export the metadata from the entity in a plan, including the Name properties of the referenced entities rather than the referenced IDs.
- Import the metadata also with a plan, where the ID of the referenced entity is found via a join based on that Name property.
This of course describes just the ‘happy flow’, but it is enough to make my point. As you state, things can become complex quite easily, for instance when the entity is also used in relationship types. But if you understand the metadata model very well, it should be doable.
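For illustration, here is a minimal Python sketch of that happy flow (the field names and the export layout are assumptions made for the example, not the actual metadata model):

```python
# Source side: export the reference by Name instead of by ID.
def export_rule(rule, dimensions_by_id):
    """Replace the referenced dimension ID with its Name property."""
    return {
        "name": rule["name"],
        "dqDimensionName": dimensions_by_id[rule["dqDimensionId"]]["name"],
    }

# Target side: resolve the Name back to the ID valid in this environment.
def import_rule(exported, dimensions_by_name):
    """Join on the Name property to find the target environment's ID."""
    dimension = dimensions_by_name.get(exported["dqDimensionName"])
    if dimension is None:
        raise KeyError(f"No dimension named {exported['dqDimensionName']!r} in target")
    return {"name": exported["name"], "dqDimensionId": dimension["id"]}

# Tiny usage example with made-up IDs.
dims_by_id = {"d1": {"id": "d1", "name": "Accuracy"}}
dims_by_name = {"Accuracy": {"id": "D-42", "name": "Accuracy"}}
exported = export_rule({"name": "Rule A", "dqDimensionId": "d1"}, dims_by_id)
imported = import_rule(exported, dims_by_name)  # {'name': 'Rule A', 'dqDimensionId': 'D-42'}
```

In a real ONE Desktop plan the resolution would of course be a join step on the Name column rather than a dictionary lookup, but the principle is the same.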
Kind regards,
Albert
@Albert de Ruiter yes, you’re right, Name joins will have to be involved in the process. Since the IDs differ, the name is the only thing that can tie things together. So it is important that all dependency items exist in the target environment under the same naming convention.
However, since references between different entities are based on IDs, names play an important role in identifying the right IDs in the new environment. I'd use names to join the assets from the two systems and get the source and target IDs in one place, which can then be used to build reusable lookups for ID translation/mapping.
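As a small sketch of that last step (again with assumed field names and layout), joining the two environments' entity lists on name and persisting the source and target IDs side by side could look like this in Python; the resulting file could then serve as the basis for a reusable lookup:

```python
import csv

def write_id_lookup(source_entities, target_entities, path):
    """Persist a (name, source_id, target_id) mapping built by joining
    the two environments' entity lists on name. Rows with an empty
    target_id reveal assets that are missing in the target system."""
    target_by_name = {e["name"]: e["id"] for e in target_entities}
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["name", "source_id", "target_id"])
        for entity in source_entities:
            writer.writerow([
                entity["name"],
                entity["id"],
                target_by_name.get(entity["name"], ""),
            ])
```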