Solved

Migrating Production Environment to Development


Hello Ataccama Community,

We currently operate two Ataccama ONE environments: dev (for testing and experimentation) and prod (for business use). Our goal is to ensure that the dev environment closely mirrors prod, especially for configuration and metadata, so that testing is as realistic as possible.

Ideally, we would like to perform a weekly migration from prod to dev—essentially overwriting the dev environment with the latest prod state. However, as I understand it, there is no automated way to fully clone the prod environment to dev, and migration of certain elements (like metadata model changes or monitoring projects) is either limited or requires manual intervention.

Questions:

  1. Migration Feasibility:

    • Is there any way to automate or script a full environment migration from prod to dev, or are we limited to manual processes for specific components?

    • What are the main obstacles or unsupported features when trying to migrate Ataccama ONE environments?

  2. Best Practices for Environment Synchronization:

    • How do you recommend managing configuration drift between dev and prod?

    • Are there tools or scripts you use to keep environments in sync, or do you rely on the import/export functionality for selected components?

  3. Monitoring Projects & Data Quality Rules:

    • For monitoring projects and DQ rules, are there new capabilities in recent Ataccama ONE versions that allow for easier migration between environments? I see that DQ rules can be exported and imported as of version 13.9.3, but what about monitoring projects?

  4. Manual vs. Automated Migration:

    • What are your real-world experiences with migrating environments? Do you find that manual migration is manageable, or does it become a bottleneck?

    • Are there plans from Ataccama to improve this workflow in future releases?

Best answer by Albert de Ruiter (quoted in full further down in the thread).

Albert de Ruiter
Rocket Pioneer L1

Hi ​@onufry ,

I'm afraid we also don't have an automated way to keep environments in sync, so I'm as curious as you about suggestions from others. But I am responding to your question in order to give some related considerations:

  • We also have a dev environment with the same purpose as you describe. If you overwrite the dev environment with the prod state every week, as you suggest, you would lose your new development work. Alternatively, you would need a strict process ensuring that all dev changes are deployed to prod by the end of the week, before the prod-to-dev copy runs.
  • In your prod data catalog you will use connections that point to production systems. Your organization will likely have a policy that no connections to prod can be made from lower environments. So having copied from prod to dev, what would that mean for all your prod sources and connections?
  • I once tried the export functionality for terms. But looking at the generated JSON, our customization of the term metadata model is not taken into account, so that wouldn't help us. Perhaps more importantly, the IDs of the term instances will not match those in another environment.
    And creating export and import plans in ONE Desktop for the whole metadata model would be unfeasible.

The only way I know of to try to keep environments in sync regarding the metadata model is by exporting changes in the System Changes screen and importing those changes in the other environment. In effect, you are repeating the steps, including all erroneous steps and their subsequent corrections. It would of course be much nicer if you could deploy just the final state of something. Even then, you can be sure that over time the environments will drift out of sync. (And besides, what you describe as the scope of the synchronization is much more than just the model.)

So hopefully more replies will be posted.

Kind regards,

Albert


Universe Traveller
June 3, 2025

Hello @Albert de Ruiter, thank you for the thoughtful considerations. To clarify, our goal isn’t to overwrite all dev work but to sync critical configurations/metadata (e.g., data quality rules, monitoring projects, catalog structures) while preserving ongoing development. Based on your experience:

  1. Logical Environments vs. Physical Separation:

    • The Ataccama documentation mentions using logical environments within a single physical instance, separating prod/non-prod via permissions. Have you tried this approach instead of maintaining separate dev/prod instances? Does it reduce sync complexity while allowing safe testing?

  2. Monitoring Project Lifecycles:

    • Search results highlight using Import Configuration to replicate rules from dev to prod. Could this functionality be adapted for partial synchronization (e.g., migrating only approved DQ rules or catalog changes)?

  3. Connection Management:

    • You noted that prod connections might conflict in dev. How do teams typically handle connection templates (e.g., parameterizing data source URLs/credentials) to avoid manual adjustments during migrations?

  4. Metadata Model Drift:

    • For metadata model changes, you mentioned exporting via System Changes. Are there workflows to cherry-pick specific changes (e.g., new attributes) rather than full-model exports?

Ideal Outcome: A process where dev can selectively pull prod configurations (without overwriting active work) and push tested changes back. Any examples of scripting partial syncs or leveraging Ataccama APIs for this?


ivan.kozlov
Ataccamer

Hi @onufry,

As @Albert de Ruiter mentioned, asset promotion between environments is unfortunately not fully supported at the moment.
There are some features, such as Export/Import, that allow you to export metadata for specific assets. A set of Export Plans is available out of the box, and technically more can be created. However, it's important to mention that asset IDs might differ across environments, for example different dqDimensionIds and dqDimensionResultIds, so if you try to export and import rules, mismatches in ID references for various assets might leave the rules imported in a broken state.

One way to deal with this could be through custom-built plans/components in ONE Desktop. These can be used to export metadata from the source environment and then re-create the same assets in the target environment. However, IDs in this case will also differ, and in addition you'll need to make sure that you can update all references to other entities with IDs relevant for the target system. This can be achieved by creating lookups, based on the metadata for the same entities from both environments, to translate the IDs. So while it is technically doable, it will require a large number of very complex plans/components that take into consideration all the possible conflicts. Developing such plans requires quite extensive knowledge of the metadata model and the relations between different entities.
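To make the lookup idea concrete, here is a minimal sketch of the ID translation logic in plain Python (the dimension names, IDs, and field names are made up for illustration; in practice the exports and lookups would be built with ONE Desktop plans, not hand-written code):

```python
# Exports of the same entities from both environments (hypothetical data).
source_dims = [{"id": "s-10", "name": "Completeness"},
               {"id": "s-11", "name": "Validity"}]
target_dims = [{"id": "t-77", "name": "Completeness"},
               {"id": "t-78", "name": "Validity"}]

# Join both exports on the shared name to build a source->target ID lookup.
target_by_name = {d["name"]: d["id"] for d in target_dims}
id_map = {d["id"]: target_by_name[d["name"]] for d in source_dims}

# Translate the references carried by an exported asset before import.
rule = {"name": "Email format check", "dqDimensionId": "s-11"}
rule["dqDimensionId"] = id_map[rule["dqDimensionId"]]
print(rule)  # the reference now points at the target-environment ID
```

The sketch also shows why this breaks down quickly: every reference type (dimensions, terms, relationship types, and so on) needs its own lookup, and any name that is missing or duplicated in the target environment leaves a reference untranslatable.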

Technically it is also possible to copy the MMM database with all the content from one environment to another but this is a rather complex procedure that requires a lot of planning, cross-team effort and collaboration. This is not something you would do on a weekly basis, but if there’s a huge gap between environments you might explore this option as well.

I would suggest reaching out to your Ataccama contact, who can put you in touch with our Professional Services team so you can discuss all the possible options.

I hope this helps.
Ivan


Albert de Ruiter
Rocket Pioneer L1

Hi,

Thanks for all the ideas.

@onufry , I wasn't aware of the logical and physical separation, but it sounds interesting. I will look into its meaning and implications.

Regarding the connections: since our environments are physically separated, each meant to connect to the corresponding application/data warehouse/lake environment, we have no parameters as such, simply an environment-specific connection definition.

Regarding the metadata model drift and acting via System Changes: I didn't mean to suggest a full export of the model. Suppose you added one property/attribute to a metadata entity; then via the System Changes screen, in the Applied Changes tab, you can download just that one change. That download you can add as a new change in the other environment.

@ivan.kozlov, having different IDs is indeed one of the things to keep in mind when exporting/importing metadata between environments. We do indeed use ONE Desktop for exporting/importing metadata.

But I want to suggest an approach that avoids the translation of IDs altogether, assuming that the content of the referenced tables is equal:

  1. Export the metadata from the entity in a plan, including the Name properties of the referenced entities rather than the referenced IDs.
  2. Import the metadata, also with a plan, where the ID of each referenced entity is found via a join based on that Name property.

This of course describes just the ‘happy flow’, but it makes my point. As you state, things can become complex quite easily, for instance when the entity is also used in relationship types. But if you understand the metadata model very well, it should be doable.
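The two-step plan above could be sketched like this (purely to illustrate the join logic, not ONE Desktop syntax; the field names such as `term_name` are hypothetical):

```python
# Step 1 output: rows exported from the source environment, carrying the
# *name* of each referenced entity instead of its source-environment ID.
exported_rows = [
    {"attribute": "customer_email", "term_name": "Email Address"},
    {"attribute": "customer_phone", "term_name": "Phone Number"},
]

# Referenced entities as they exist in the target environment,
# each with its own local ID.
target_terms = [
    {"id": "t-901", "name": "Email Address"},
    {"id": "t-902", "name": "Phone Number"},
]

# Step 2: resolve each referenced name to the target-environment ID
# via a join on the Name property, so no ID translation table is needed.
name_to_id = {t["name"]: t["id"] for t in target_terms}

resolved = [
    {"attribute": r["attribute"], "term_id": name_to_id[r["term_name"]]}
    for r in exported_rows
]
print(resolved)
```

The happy-flow assumption is visible in the code: the lookup only works if every referenced name exists exactly once in the target environment.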

Kind regards,

Albert

 


ivan.kozlov
Ataccamer

@Albert de Ruiter yes, you're right, Name-based joins will have to be involved in the process. Since IDs differ, the name is the only thing that can tie things together, so it is important that all dependency items exist in the target environment under the same naming convention.
However, references between entities are based on IDs, so names mainly play a role in identifying the right IDs in the new environment. I'd use names to join assets from the two systems, putting source and target IDs in one place, which can then be used to build reusable lookups for ID translation/mapping.



