Hi, I notice that for some of my terms the visual Insights are visible and for others they are not. What is the reason for this?

Second question: I would like to automatically assign terms. If a column is named wnplts, I want to assign the term Residence to it. Is there an easy way to do this? Detection rules seem to look at the value and not at the column name.

Kind regards, Jur Dördregter
I have created a rule in the OneWeb application that detects whether a record occurs in the table more than once, based on an aggregation of 2 attributes. This gives a proper result in the Invalid samples data and in exports: 2 errors if the record is in the table 2 times, and so on. But the customer requires that only 1 record per duplicate should be visible in Invalid samples and in the export to the database; it does not matter which one. Is it possible to configure the result to do this?
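For reference, the desired "one representative per duplicate group" result can be sketched in plain Python (not Ataccama syntax; the attribute names attr_a and attr_b are hypothetical stand-ins for the two aggregation attributes):

```python
from collections import Counter

def one_per_duplicate_group(records, keys=("attr_a", "attr_b")):
    """Return one representative record per duplicate group.

    A record belongs to a duplicate group when its combination of
    key attributes occurs more than once in `records`.
    """
    counts = Counter(tuple(r[k] for k in keys) for r in records)
    seen = set()
    out = []
    for r in records:
        key = tuple(r[k] for k in keys)
        if counts[key] > 1 and key not in seen:
            seen.add(key)
            out.append(r)  # first occurrence represents the whole group
    return out
```

In a plan, the same effect might be achievable by grouping on the two attributes after the rule evaluation and keeping only the first record of each group before the export step, but that is an assumption about your setup rather than a confirmed configuration option.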
Hi, I have certain fields in my data that should be masked for all users, such as social security number and IBAN. Currently users can see this data in the Sample data tab and in the profile. Can I apply masking to these attributes without having to create a VCI or SCI for them? Ideally, masking should be applied based on the term assignment: as soon as I map the data to IBAN, it should be masked for all users.

Regards, Jur Dördregter
Hi Community,

Would there be a way in Ataccama Desktop to select records from, for instance, the term entity that have been inserted/updated/deleted since a certain datetime, like the Published on datetime in the History tab? My intention is to use the Metadata Reader to read the details of a term and then use this mutation datetime in the Metadata Reader's filter.

Kind regards, Albert
Hello all,

A basic (and rookie) question, but I couldn't find the answer in the manual, the tutorials, or this forum. After I've loaded source data into Ataccama ONE Desktop, I'm looking to create a new column and fill it with the concatenated content of two or more other columns, maybe with a bit of text in between. In Excel I'd use the CONCATENATE function. As you can see in the Excel example, I can then look up information based on column D and use that to bring in various lookup values (the real use case is not name-based; this is just a simple example).

Note: I'm not looking to merge different tables (with Join, Union, or Representative Creator, unless those tools can do what I'm looking for). This is simply creating a new column/field which is a concatenation of two or more other fields.

Context: I'm building a lookup based on multiple attributes. I'll combine them into a single attribute to give me a single lookup key, and then I can use that for various other purposes.
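The lookup-key logic being asked for can be illustrated in plain Python (the field values are made up; in Ataccama ONE Desktop the equivalent would be a string expression in a transformation step such as Alter Format or Column Assigner, not this code):

```python
def make_lookup_key(*fields, sep=" "):
    """Concatenate field values into a single lookup key,
    trimming whitespace and skipping empty/missing values."""
    parts = [str(f).strip() for f in fields
             if f is not None and str(f).strip()]
    return sep.join(parts)
```

A distinct separator (e.g. `"|"`) is often a safer choice than a space for a composite lookup key, since it avoids collisions when the individual values themselves contain spaces.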
Hi,

We would like to publish the profiling results of a scan to an Azure service so that we can pick up those results and push them to other reporting systems via Kafka or Azure Pub/Sub. What we are mainly interested in is how to send the profiling results to Azure after scanning, at runtime, without impacting the performance of the PostgreSQL DB where the profiling results are logged. Did anyone encounter a similar scenario, and how did you achieve it?

Regards, Uma
Hello,

I hoped somebody has already worked this out so I don't have to 🤣. I have an Epoch Unix Timestamp value that I want to convert into a normal DATETIME format in Ataccama. E.g. for 1711015200 I was expecting the converted value to be 2024-03-21 10:00:00 (GMT). But when I run toDateTime() on it, it gives me 1970-01-20 20:16:55, so I guessed wrong. I thought about using dateadd(), but that one does not support seconds. Any ideas?
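A plausible explanation (an assumption, not confirmed by the post): a date near 1970-01-20 is exactly what you get when 1711015200 is interpreted as milliseconds rather than seconds since the epoch, so multiplying the value by 1000 before converting may give the expected result. The arithmetic, shown in plain Python rather than Ataccama expression syntax:

```python
from datetime import datetime, timezone

def epoch_seconds_to_datetime(ts_seconds):
    # Interpret the value as seconds since 1970-01-01 00:00:00 UTC
    return datetime.fromtimestamp(ts_seconds, tz=timezone.utc)

def epoch_millis_to_datetime(ts_millis):
    # Interpret the value as milliseconds since the epoch
    return datetime.fromtimestamp(ts_millis / 1000, tz=timezone.utc)

# 1711015200 read as seconds      -> 2024-03-21 10:00:00 UTC (expected)
# 1711015200 read as milliseconds -> 1970-01-20 (the observed result,
#                                    modulo the display time zone)
```

In other words, if toDateTime() expects milliseconds, something like toDateTime(value * 1000) may produce the expected 2024-03-21 10:00:00, though that should be verified against your version's function reference.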
Is it possible to read/extract Data Quality results using the Metadata Reader? Has anyone tried that? Can someone share a template if you have one? I don't want it to be a post-processing component. I'd rather have it as an independent plan that can be scheduled to run daily at a specific time and that extracts all the DQ results of all projects for that day. Then, using a JDBC Writer, I want to add them to my database and run some Power BI reports on top.

Thanks.
The response message was: null, and the response error was:

    errorMsg: java.net.NoRouteToHostException: No route to host
        at java.base/sun.nio.ch.Net.connect0(Native Method)
        at java.base/sun.nio.ch.Net.connect(Net.java:579)
        at java.base/sun.nio.ch.Net.connect(Net.java:568)
        at java.base/sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:593)
        at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:327)
        at java.base/java.net.Socket.connect(Socket.java:633)
        at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:121)
        at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
        at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:326)
        at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:605)
        at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:440)
Hello hive mind,

I am calling an API that gets me some foreign exchange rate data (all from USD). The data comes back in this format:

    timestamp            AUD       EUR       GBP
    2024-04-24 12:00:00  1.538766  0.935636  0.803964

I've highly simplified this: there are possibly 169 currencies here, and I'm only showing 3. I need to unpivot it into this format:

    timestamp            from_ccy  to_ccy  rate
    2024-04-24 12:00:00  USD       AUD     1.538766
    2024-04-24 12:00:00  USD       EUR     0.935636
    2024-04-24 12:00:00  USD       GBP     0.803964

My current plan is fine for 3 currencies, but not for 169! Has anyone got ideas for steps I can use to keep my plan clean and tidy? (And so that I don't have to make 169 copies of Alter Format 😱)
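The transformation being asked for is a classic wide-to-long unpivot. As a language-neutral illustration (plain Python, not an Ataccama step; in a plan this role would be played by a single transposing/unpivoting step rather than 169 Alter Format copies), one wide record expands into one long record per currency column:

```python
def unpivot_row(row, id_col="timestamp", from_ccy="USD"):
    """Turn one wide record {timestamp, AUD, EUR, ...} into a list of
    long records {timestamp, from_ccy, to_ccy, rate} - one per currency."""
    return [
        {"timestamp": row[id_col], "from_ccy": from_ccy,
         "to_ccy": col, "rate": rate}
        for col, rate in row.items() if col != id_col
    ]
```

The advantage of this shape is that it works unchanged whether the source has 3 currency columns or 169, since the column list is discovered from the record rather than hard-coded.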
Hello Ataccama team,

I am working with Ataccama 14.5. I would like to know if there is documentation to learn more about the DPE service.

Kind regards
Hi,

We have several Monitoring Projects that are scheduled to run at 07:00. If, for instance, 1000 jobs are still in the queue, those scheduled MPs will run after the 1000 jobs (assuming the priority is the same: 0). To change this, I have to go to DPM and change the priority of the scheduled MPs. Is there a way to schedule a Monitoring Project at a specific time, 07:00, with a certain priority automatically?
Hello,

I would like to know if there is documentation about the architecture of Ataccama. I know Ataccama has different integrated services (dqf, comment, audit, dmm, dqp, task, workflow, ...). Does Ataccama have a document describing how the services are integrated and communicate with each other?
Hi Community,

The purpose of the following scenario is to update multiple term names and definitions. I have created a plan that writes the term names, definitions, and GIDs to a CSV file. In that file I can update the names and definitions. With another plan I read the file and write it back into Ataccama.

That works well, except in the following case. The text in the Definition can be entered on multiple lines by using the Enter key, so OneWeb will show:

Text on line 1.
Text on line 2.
Text on line 3.

In the export (the CSV file), the text looks like this:

"[{""type"":""paragraph"",""children"":[{""text"":""Text on line 1.""}]},{""type"":""paragraph"",""children"":[{""text"":""Text on line 2.""}]},{""type"":""paragraph"",""children"":[{""text"":""Text on line 3.""}]}]"

If you import the text like this, the content does not appear on 3 lines, but literally as above, on one line. How can I import the text so it appears on 3 lines again?

Thanks for any suggestion.

Kind regards, Albert
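The exported value looks like the JSON representation of a rich-text editor document: a list of paragraph nodes, each with text children. Assuming that structure (inferred from the snippet above, not from Ataccama documentation), a round-trip between the JSON and plain multi-line text can be sketched in Python:

```python
import json

def rich_text_to_plain(rich_json):
    """Flatten the paragraph-node JSON into newline-separated text."""
    nodes = json.loads(rich_json)
    return "\n".join(
        "".join(child.get("text", "") for child in node.get("children", []))
        for node in nodes
    )

def plain_to_rich_text(text):
    """Build the paragraph-node JSON back from multi-line text."""
    nodes = [{"type": "paragraph", "children": [{"text": line}]}
             for line in text.split("\n")]
    return json.dumps(nodes)
```

The practical implication for the import plan: the Definition likely has to be written back in this JSON shape (one paragraph node per line) rather than as a plain string, since importing the raw JSON as literal text is what produces the one-line result described above.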
Hey everyone!

We have a use case involving DataStage, and I'm curious if anyone has worked with this in the past. We have a team whose DataStage jobs are triggered by a table pushing information downstream. We are wondering whether there is a way, either in the web or desktop, to use the trigger files created by DataStage, plug into the DataStage flow somehow, or find another way to trigger a monitoring project to run data quality checks against the load at the same time. If certain data is not loaded correctly, it will break the join between the two tables and in turn break the job, and they want to see the population of records that is breaking the join.

Hope this makes sense! Thanks, Thomas
Hello everyone,

We recently upgraded to version 13.9.3 and gained the ability to export DQ rules from one Ataccama environment and import them into another. This works great, but I need to do the same thing for our monitoring projects: I would like to export them from DEV and import them into a higher environment. Unfortunately, there doesn't seem to be a plan available in my version to export monitoring projects. Has anybody figured out a way to migrate monitoring projects without recreating them in each environment? Alternatively, I am considering writing my own plan. Is there any documentation on how to write the JSON for export plans?

Thanks!