
Importing catalog item descriptions and attribute descriptions from dbt docs


Prasad Rani
Data Pioneer

We are using dbt for our data pipelines. For our tables and fields, descriptions are maintained in dbt, and we would like to export them (as JSON) and use them to update the Ataccama Data Catalog.

Is there a known, trusted, supported way to automate this?

I already have a desktop plan I created that can update catalog objects and attributes based on an Excel input file. I may be able to modify that if that's the only option. I am reaching out to the community to see if there is a preferred way to get descriptions and documentation from dbt to Ataccama.

April 4, 2024


ivan.kozlov
Ataccamer

Hi Prasad,
You should be able to update the descriptions for catalog items and attributes using the Metadata Writer step.
First, read the IDs and names of all existing catalog items (CIs) and attributes using the Metadata Reader step. Then join that metadata from the platform with a second data stream from dbt containing the CI and attribute names and the relevant descriptions. Finally, update the existing CIs and attributes using the Metadata Writer step.
Below are examples of the configuration of the Metadata Reader/Writer steps:

[screenshots of the Metadata Reader and Metadata Writer step configuration]

And here's a high-level example of how the plan logic might look:

[screenshot of the example plan logic]

Updating CI descriptions is a rather straightforward task, but in the case of attributes you need to make sure that you reference the right parent catalogItemId for each attribute ID.
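To make the join and parent-ID point concrete, here is a minimal sketch in plain Python of the matching logic the plan performs. The field names (catalogItemId, attributeId, tableName, columnName) are illustrative placeholders, not the exact Ataccama metadata model, and the actual work would be done by the Join and Metadata Writer steps rather than code:

```python
def build_attribute_updates(platform_attrs, dbt_descriptions):
    """Join platform metadata with dbt descriptions by (table, column) name.

    platform_attrs:   rows as read via the Metadata Reader step, e.g.
                      {"catalogItemId": ..., "attributeId": ...,
                       "tableName": ..., "columnName": ...}
    dbt_descriptions: {(table, column): description} parsed from dbt.
    Field names are illustrative, not the exact Ataccama model.
    """
    updates = []
    for attr in platform_attrs:
        key = (attr["tableName"].lower(), attr["columnName"].lower())
        desc = dbt_descriptions.get(key)
        if desc:
            # Keep the parent catalogItemId alongside each attributeId so
            # the writer updates the attribute under the correct CI.
            updates.append({
                "catalogItemId": attr["catalogItemId"],
                "attributeId": attr["attributeId"],
                "description": desc,
            })
    return updates
```

Matching on lowercased names avoids case mismatches between dbt and the catalog; attributes with no dbt description are simply left untouched.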

To parse the JSON data coming from dbt, you should be able to use something like the JSON Reader or JSON Parser step in ONE Desktop (IDE).
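If it helps to see what needs extracting, dbt writes a manifest.json artifact into its target/ directory on every run, and model and column descriptions live under its "nodes" key. The sketch below pulls them into flat rows; the manifest schema can vary between dbt versions, so verify the keys against your own artifact:

```python
import json

def extract_dbt_descriptions(manifest_path):
    """Pull model- and column-level descriptions from a dbt manifest.json.

    Returns flat rows suitable for joining against catalog metadata:
    column is None for the model-level (catalog item) description.
    The manifest schema can differ between dbt versions -- treat this
    as a sketch and check the keys against your artifact.
    """
    with open(manifest_path, encoding="utf-8") as f:
        manifest = json.load(f)

    rows = []
    for unique_id, node in manifest.get("nodes", {}).items():
        if node.get("resource_type") != "model":
            continue  # skip tests, seeds, snapshots, etc.
        table = node.get("name", "")
        # Model-level description -> catalog item description
        rows.append({"table": table, "column": None,
                     "description": node.get("description", "")})
        # Column-level descriptions -> attribute descriptions
        for col_name, col in node.get("columns", {}).items():
            rows.append({"table": table, "column": col_name,
                         "description": col.get("description", "")})
    return rows
```

Writing these rows out to CSV or JSON gives you the second input stream for the join described above.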

I hope this will be helpful.

Ivan


Cansu
Community Manager
April 5, 2024

Hi @Prasad Rani, I’m closing this thread for now. If you have any follow-up questions, please feel free to share them here or create a new post 🙋‍♀️


Data Voyager
March 10, 2025

@ivan.kozlov thanks for your instructions above; our metadata has a similar structure to Prasad's, and I was able to get what you described working in my environment.
A question for you, though: I created a plan which reads metadata from an XML file and successfully writes it to Ataccama ONE, using the logic you provided of joining on the catalogueItemId and locationId. However, the plan only reads one XML file and writes to one catalogue item. I have 1,000+ XMLs related to 1,000+ catalogue items. Is there a way to iterate this workflow for every XML (or dbt file)? They all have exactly the same formatting, so the plan I have created will work for any XML; I just can't find a way to provide multiple XMLs as input.
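One possible workaround, offered as a sketch rather than the Ataccama-recommended approach: since every XML has the same structure, you could merge all of them into a single flat file up front and feed that one file to the existing plan. The element names here (<table>, <column>, <name>, <description>) are placeholders for whatever your XMLs actually contain:

```python
import csv
import glob
import xml.etree.ElementTree as ET

def merge_xml_metadata(xml_glob, out_csv):
    """Flatten many identically-structured XML files into one CSV.

    Element names (<table>, <column>, <name>, <description>) are
    placeholders -- adjust them to your real XML structure before use.
    """
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["table", "column", "description"])
        for path in sorted(glob.glob(xml_glob)):
            root = ET.parse(path).getroot()
            table = root.findtext("table", default="")
            for col in root.iter("column"):
                writer.writerow([table,
                                 col.findtext("name", default=""),
                                 col.findtext("description", default="")])
```

Running this once over all 1,000+ files (e.g. merge_xml_metadata("exports/*.xml", "merged.csv")) would let the single-input plan process every catalogue item in one pass, without restructuring the plan itself.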



