
Ataccama documentation has a new home

We're thrilled to announce that Ataccama product documentation can now be found at https://docs.ataccama.com for version 14.5.x and later.

In addition to the visual refresh and updated branding, the new documentation portal features better search, improved navigation, and feedback options for a smoother and more comfortable user experience. Moreover, our documentation is now open to the public*, so you can find answers to your questions even faster.

*Exceptions include upgrade guides and security-related information, such as Security Advisories, which remain available only to Ataccama customers after authentication.

What if I'm using version 14.4.0 or earlier?

Versions prior to 14.5.0 are available only to Ataccama customers. You can find them at https://support.ataccama.com/downloads. Log in using your existing credentials.

What if I can't find my product at https://docs.ataccama.com?

This means that the migration for this product is still in progress; it will take us a few more weeks to complete. In the meantime, you can access the documentation in PDF format at https://support.ataccama.com/downloads. Log in using your existing credentials.

Where do I find installation and upgrade packages?

Log in to https://support.ataccama.com/downloads using your existing credentials, then find the version you're using.

Give us feedback!

Given the scope of the project, there is a chance you'll run into some hiccups and spot imperfections. Please let us know about these issues and help us make the portal even better. We hope you enjoy the new documentation experience and look forward to your feedback!

❗Some of the links shared on the community might not be accessible due to the documentation portal transition. Please get in touch with us to request an up-to-date documentation link or access to previous resources.


Data Integration in v15.1 🚀

Hi community 👋

Curious about what's new in the Data Integration features in v15.1? We have updates bringing you streamlined deployment, enhanced customization, and extended capabilities to elevate your data processing experience. Read on to find out more 😎

Self-service deployment in managed Kubernetes

Easily deploy data-processing capabilities to your Kubernetes infrastructure with no manual setup required. Our new, entirely self-service update allows you to integrate with the platform's cloud services much more easily. This feature primarily focuses on ease of deployment if you are operating your own Kubernetes cluster. It also minimizes the amount of data leaving your premises, adding another layer of privacy and security 🔐

Custom file import settings 📂

Tailor CSV and Excel file import settings for individual files, folders, or entire connections, providing enhanced flexibility. You can now populate your Data Catalog with correct metadata for an object storage connection and fix the file import options so that every file is imported and processed correctly each time.

How does it work? Grant access only to the file import settings. Delegate updates and edits on catalog items and the file import settings for a file, any (sub)folder, or the whole connection. It's also possible to change the settings for a file while only having a preview of the data read with the modified settings.

Orchestration Server updates

The Portal Orchestration Server now offers more, yet simplified, capabilities, enabling additional connectivity to Azure Data Lake Storage 2, Google Cloud Storage, Google Drive, Kafka, and Salesforce. That's not all! There is an extended monitoring update with business metrics for workflows, tasks, and plans, including running/max and successful/failed indicators. Learn more about the improved configurations, file handling, and debugging.

Questions? Thoughts? Share them in the comments below 👇
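To picture how the file import settings described above cascade from a connection down to a single file, here is a minimal sketch. This is not the Ataccama API, just an illustration of the file → (sub)folder → connection precedence; the setting names are made up.

```python
# Hypothetical sketch (not the Ataccama API): how per-file CSV import
# settings can override folder- and connection-level defaults, mirroring
# the file -> (sub)folder -> connection precedence described above.

CONNECTION_DEFAULTS = {"delimiter": ",", "encoding": "utf-8", "header": True}

def effective_settings(connection, folder=None, file=None):
    """Merge import settings, with the most specific level winning."""
    merged = dict(connection)
    merged.update(folder or {})
    merged.update(file or {})
    return merged

# A single semicolon-delimited file in an otherwise comma-delimited folder:
settings = effective_settings(
    CONNECTION_DEFAULTS,
    folder={"encoding": "windows-1250"},
    file={"delimiter": ";"},
)
print(settings)  # {'delimiter': ';', 'encoding': 'windows-1250', 'header': True}
```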

Related products: DI

Updates to Data Governance v15.1 🏢

Hello community! 👋 We hope that you are excited about all the updates, and here is another one! Get ready for the latest in Data Governance with version 15.1, in which we are introducing new features while improving some existing functionalities 🚀

What's new?

Tableau Server Connection in Report Catalog

In addition to Tableau Cloud connectivity, you can now connect ONE Catalog with Tableau Server (on-premise). The connection is unidirectional and allows you to store previews of the report assets created in Tableau.

How does it work? Create the connection from ONE to your BI tool (Tableau). You can follow the instructions described in Tableau Connection, and import reports. These can be on several levels: the whole Project, a Workbook of a particular project, a Data Source of a workbook, a particular Sheet, a Dashboard, or a Story. You can either select the whole project or click to expand the list of workbooks.

Review & Sign Off Term Suggestions in One Place

Save time with the new Term Suggestions section under the Knowledge Catalog for processing AI term suggestions. You can now see, approve, and reject them from a single place. Suggestions processed here are automatically published.

How does it work? Filter the suggestion results according to your business needs and review the available term suggestions. You can approve or reject each one in the Action column with one click.

Review Workflow Usability Improvements

You can now use more granular task assignments in the Review Request Workflow and have reviews done by specific roles or users. If there are any changes in the team structure or access levels, the right people can see the task and reassign it. Streamline approval of the changes you make and minimize room for errors with the enhanced workflow.

How does it work? You can now assign roles in the Review Request Workflows and ensure security and access are given to the right users.
If there are changes in the team structure or access levels, you can reassign the task as well.

Enhanced Metadata Import

You can now reliably scan even very large data sources and access the results faster than before.

How does it work? If you have a large-scale data source in your data landscape, you can easily import its structure (metadata) in one go. The limits on assets in one data source were improved by an order of magnitude: for example, for common databases you can import up to 850,000 catalog items (tables) at once. Metadata importing is also significantly faster, up to 3x faster than before ⚡

Let us know if you have any questions or feedback on the latest updates in Data Governance in the comments below 👇

Related products: DG

Updates to ONE Data v15.1 ✨🎨

Hi everyone,

We are so happy to share more exciting updates in v15.1 🚀 Read on to find out what's new in ONE Data in the latest version 👇 Let's dive in 🤿

Remediate DQ issues from Monitoring Projects

You can now import data and DQ results from a Monitoring Project (MP) into ONE Data in a self-service way, without any IT support to define post-processing plans using ONE Desktop. Your data and DQ results are loaded into ONE Data on each Monitoring Project run, with the option to import All Records or Only Invalid Records.

UX improvements 🎨

We also have multiple UX improvements, from the ability to fill in multiple cells at once to enhanced keyboard shortcuts, improved DQ filter behavior, and more!

Stewardship in ONE Data tables 👥

You can now manage access to ONE Data tables through stewardship and also change the access levels when you create a new dataset. If you are creating a table from scratch instead of loading an existing catalog item or importing a file, change the stewardship from the table Overview tab.

Top tip ✨ Assign ownership of your tables when creating a new ONE Data table to prevent any unauthorized access. If stewardship is not set on the table level, the configuration is inherited from the data source.

Overwrite existing ONE Data tables ✍️

When loading data to ONE Data from another catalog item in ONE, you can now overwrite an existing table instead of creating a new one. To do this, select Overwrite existing ONE Data table and choose the table you want to replace. Keep in mind this deletes all existing data from the selected table.

Data operations within the platform have been updated to distinguish them from operations that export data outside of it, with a new control design and updated terminology. Load to ONE Data now functions as a standalone operation for easy access.

Let us know what you think of the newest improvements in the comments below 👇

Related products: ONE Data

Updates to Data Stories v15.1🎇

Hi community 👋

In this post, find out about the latest updates for Data Stories in v15.1, such as seamless navigation between Data Stories and the Knowledge Catalog for a unified experience and the enhanced Visualization Builder with condensed layouts and new functionalities. Now you can use Data Stories more intuitively and efficiently for data storytelling 🚀 Let's dive in!

What's new with Data Stories?

Seamless navigation to Knowledge Catalog 📚

Link to Knowledge Catalog in Data Stories navigation: Navigating between Data Stories and the Knowledge Catalog was never easier, with direct access to your Knowledge Catalog.

Visualization Builder 🎨

Updated navigation and attribute list: A more condensed layout designed to fit displays with smaller resolutions, ensuring a more efficient and visually pleasing experience.

Enhanced data view: Source catalog item or Data Quality (DQ) result names are prominently displayed above raw data in the table, providing a clear reference to data sources.

Ability to rearrange tabs in the Visualization Builder: When you have multiple visualizations in your collection, it's not always easy to find the ones you need. Now, you can organize your tabs in collections for easy management and group them by topic or visualization type.

Ability to define segments in the KPI widget: The KPI widget now allows you to define specific performance ranges visually, providing a clearer representation of key data points.

Top/Bottom values feature refresh: The feature formerly named Top N has been rebranded to Top/Bottom Values, accompanied by a refreshed UI and an information box explaining its impact on visualization results. Support has also been extended to the Pivot Table.

Additional Improvements

Effortless visualization copying: Managing visualizations has been made easier with new features that allow you to copy or move a visualization from one collection to another.
This feature not only saves you time but also helps you better organize your visualizations by grouping them according to your needs.

Localization support: You can now use Data Stories in English, German, French, Italian, and Czech 🗣️ Your ONE language preference is reflected in Data Stories, and you can always change it under User Settings.

We hope that you are as excited about these enhancements as we are! They are built to make your Data Stories experience even more intuitive, efficient, and tailored to your specific needs. Stay tuned for more exciting announcements and share your thoughts on the latest Data Stories features in the comments 👇
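To illustrate the idea behind the KPI widget segments mentioned above, here is a minimal, purely hypothetical sketch of classifying a KPI value into named performance ranges; the thresholds and segment names are invented, not part of Data Stories.

```python
# Hypothetical sketch: classify a KPI value into named performance ranges,
# similar in spirit to the segments you can define in the KPI widget.
# The thresholds and segment names below are made up for illustration.

def kpi_segment(value, segments):
    """Return the name of the first segment whose upper bound covers value."""
    for upper_bound, name in segments:
        if value <= upper_bound:
            return name
    return "out of range"

# e.g. an overall DQ score from 0 to 100:
DQ_SEGMENTS = [(50.0, "critical"), (80.0, "needs attention"), (100.0, "healthy")]

print(kpi_segment(42.0, DQ_SEGMENTS))  # critical
print(kpi_segment(93.5, DQ_SEGMENTS))  # healthy
```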

Related products: DQ, DS

ONE AI is here 🚀

Hi everyone 👋

Get ready for some exciting news 🥁🥁🥁 This release is probably one of the most awaited updates for Ataccama ONE: we are so excited to share that ONE AI is finally here! ONE AI can generate DQ rules, create and interpret SQL queries, automate routine tasks, and more, allowing your teams to devote their time to more strategic and value-driven initiatives while simultaneously making trusted data easily accessible to all users.

AI-Powered Data Quality ✅

Generate complex DQ rules from plain text or apply AI-driven rule suggestions. Empower any user to improve DQ effortlessly with recommendations and actionable DQ rules created via plain-text conversion, no need to code. ONE AI can generate rule suggestions based on your metadata and profiling results, and you can approve new rules according to their relevancy and your needs. It's also possible to use ONE AI to automatically detect the most relevant DQ rules for your specific datasets and create ONE Expressions in rule implementations.

Assisted Rule Management, Anomaly Detection, Time Series Analysis, and Freshness Monitoring are just a few other highlights of the AI-driven data quality features, making data rule creation and assignment much easier.

AI-powered Data Governance 📖

Generate Table Descriptions. Automatically generate descriptions for your table catalog items to streamline data catalog management and enhance data asset documentation. Now, you can document data much faster by leveraging the new ONE AI for automated asset categorization, classification, and description creation. Use AI to fill in a table description in the database or catalog, and help your team members navigate through the various tables with ease. Don't forget to save before leaving!
SQL generation 🪄

Create SQL queries using natural language. Interpret existing SQL queries for better understanding. Perform SQL-to-text explanations. Say goodbye to the complexities of SQL! When you need to create specific data assets for tracking data quality and anomalies on a partition of a table, you'd initially use SQL. Now, you can express your data needs in plain text and let ONE AI generate the correct SQL syntax for the specific database. ONE AI can also interpret an existing SQL query that might have been created by any of your colleagues and turn it into plain text for ease of understanding by anyone.

Chat with our documentation 💬

Get simplified documentation access: chat with 100+ docs available on any app page. Ask your questions and let the AI help you navigate through our documentation with follow-ups and resources. You can now ask questions and query data using plain language at ⚡ speed. Don't dig through the long product documentation; simply ask what you want to know and ONE AI will answer it for you.

For a detailed overview of the ONE AI v15.1 features, check out Generative AI in ONE. Stay tuned for more updates and share your questions and feedback on ONE AI below! 👇 To find out more, please contact us here, or reach out to your Account Executive. 👋
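To make the text-to-SQL idea above concrete, here is a purely illustrative sketch: the table, request wording, and SQL are hand-written examples of the kind of translation involved, and `generated_sql` is a stand-in lookup, not ONE AI's API.

```python
# Illustrative only: the kind of plain-text request you might type and the
# kind of SQL that could be generated for it. EXAMPLES is a hand-written
# mapping, not a real model; in ONE you would review the generated query.

EXAMPLES = {
    "monthly order counts for 2023, newest month first": (
        "SELECT DATE_TRUNC('month', order_date) AS month,\n"
        "       COUNT(*) AS order_count\n"
        "FROM orders\n"
        "WHERE order_date >= DATE '2023-01-01'\n"
        "  AND order_date <  DATE '2024-01-01'\n"
        "GROUP BY DATE_TRUNC('month', order_date)\n"
        "ORDER BY month DESC;"
    ),
}

def generated_sql(request):
    """Stand-in for the AI call: look up a canned example."""
    return EXAMPLES[request]

print(generated_sql("monthly order counts for 2023, newest month first"))
```

The reverse direction (SQL-to-text) would hand the query back and receive a plain-language summary such as "count orders per month in 2023, newest first".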

Related products: DG, DQ

14.5 Updates: MDM & RDM 🤟

Hi all!

We hope that you have enjoyed learning more about the recent updates and improvements in version 14.5. If you haven't had the chance to check them out, you can find them below 👇 In this post, I'll be covering the latest changes in MDM & RDM in v14.5. Read on to find out what's new 👀

Master Data Management

Task Management REST API 📃

We have introduced a REST API for task management in MDM. Using REST interfaces, you can perform operations on individual or multiple tasks. The REST API currently supports the following operations:

- Retrieve tasks for a specific record or based on specific parameters.
- Retrieve details, comments, and history for a specific task.
- Assign a task.
- Add a comment to a task.
- Delete a task.
- Move a task to a different workflow state.
- Discard a task.
- Create multiple tasks.
- Update multiple tasks.

Server Operations and Resetting the Environment from Admin Center 🚜

New functionality enables you to stop and start the MDM Server from the Admin Center, reset the environment, and enable automatic synchronization of the lookups from MinIO to locally available folders after every restart. The new options are available on the Server Dashboard tab under Administration in the Admin Center.

Reference Data Management

UX Enhancements 🎨

Publish when Creating or Editing a Record

You can now publish changes immediately after creating or editing a record, instead of first moving the record to publish and then publishing. This is particularly useful for smaller teams with no dedicated roles for approving changes. To publish changes from the create or edit dialog, select Publish or choose Publish from the actions menu.

Timepicker Added for Datetime Attributes ⏲️

Use the timepicker to select a time more easily when creating or editing a record.
The following elements are supported:

- Hours (either 12-hour or 24-hour clock)
- Minutes
- Seconds
- AM/PM switch

Revert Single Record Change ⏺️

For more flexibility when editing, you can undo a change made to a specific attribute value and restore it to the last published state. To do this, in the Edit dialog, select the orange dot to compare the current and the published value, then choose Revert to published. In addition, reverting all changes made to records is now called Restore instead of Undo.

Checkbox for Boolean Values

Attributes of Boolean type can now also be set using a checkbox.

Other Improvements ↗️

- Edited records are consistently labeled with an orange dot when viewing record details.
- Information about record validity is clearly visible: valid records are marked with a green tick and invalid records with a red exclamation mark.
- We have renamed the following elements: the Confirmed viewing mode is now called Published (in the RDM REST API, while Published is the preferred term, you can continue using Confirmed interchangeably); the Waiting for confirmation state in workflows is now called Waiting for publishing; and when comparing the current and the published version of an edited record, Old value and Original value are now called Edited value and Published value.

Improved Record Validation ✔️

Schedule full validation on all records to run when you restart the application. Previously, full validation was performed when RDM was first started as well as on every restart. To schedule it, go to the RDM Admin Console and select Schedule revalidation, then restart the application. Records are now validated at application restart only if a previous validation failed on a particular table or the table structure was modified, which helps speed up the startup time. If you change your mind before restarting RDM, you can cancel revalidation after you schedule it by selecting Disable revalidation from the same screen.
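Going back to the timepicker for datetime attributes: the 12-hour vs 24-hour distinction it supports is easy to demonstrate with standard Python datetime parsing.

```python
# A quick illustration of the 12-hour (with AM/PM) vs 24-hour clock formats
# the timepicker supports, using Python's standard datetime parsing.
from datetime import datetime

t12 = datetime.strptime("02:30:15 PM", "%I:%M:%S %p")  # 12-hour clock + AM/PM
t24 = datetime.strptime("14:30:15", "%H:%M:%S")        # 24-hour clock

print(t12.time() == t24.time())  # True: both represent 14:30:15
```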
Fixed Permissions Editable in RDM Webapp

If your RDM permissions are provided using the configuration model, you can now edit them directly in the web application. Once you make any changes to permissions, this custom configuration is applied instead of the roles defined in the configuration model.

And that's all for the 14.5 updates. Let us know what you think of them in the comments below 👇
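As a closing illustration of the Task Management REST API covered above, here is a minimal sketch of building task operations as HTTP requests. The base URL, endpoint paths, and payload fields are invented for illustration; consult the MDM documentation for the real interface.

```python
# Hypothetical sketch of task-management REST calls. The host, paths, and
# payload fields below are invented; the real routes are in the MDM docs.
# Requests are constructed but not sent, so the sketch runs offline.
import json
from urllib.request import Request

BASE_URL = "https://mdm.example.com/api/tasks"  # placeholder host

def build_request(method, path, payload=None):
    """Construct (but do not send) an HTTP request for a task operation."""
    body = json.dumps(payload).encode() if payload is not None else None
    return Request(BASE_URL + path, data=body, method=method,
                   headers={"Content-Type": "application/json"})

# Assign a task, add a comment, and move it to a different workflow state:
assign = build_request("PUT", "/42/assignee", {"user": "jane.doe"})
comment = build_request("POST", "/42/comments", {"text": "Please review."})
move = build_request("POST", "/42/state", {"state": "IN_REVIEW"})

print(assign.get_method(), assign.full_url)
```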

Related products: MDM, RDM

14.5 Updates: ONE Data 🟪

Hi all!

In this post, I'll cover the latest features and updates for ONE Data in 14.5 ⚡ If you have missed the previous posts on the 14.5 updates and new features, you can check them out here 👇 Let's get into it!

Data Deduplication ♊

You can now deduplicate your datasets to easily create managed reference data. Once your reference data is ready, use it in DQ rules to continuously improve the quality of your data.

How? Start by opening the catalog item from which you want to extract reference data, then expand the three dots menu for a particular attribute and select Create reference table. Choose your deduplication key (one or a combination of several attributes) and select any additional attributes that help you better describe your data. In addition to these, the table will also contain a frequency attribute, which stores the number of occurrences for each record. Review and update your data as needed and apply it in DQ rules.

DQ Metadata Automatically Updated 🔁

DQ metadata used in DQ filters is automatically updated each time you run a DQ evaluation and each time you edit a record (except for bulk edits). This ensures you're working with the latest DQ results available, which in turn facilitates the process of data remediation. It also allows you to use DQ filters on any ONE Data table, not only those created by importing invalid records from an existing catalog item. If you edit one of the filtered records, it is validated on the fly and removed from your filtering results if it no longer fits the criteria. Additionally, a warning is displayed, informing you that the overall quality of your table might be outdated.

UX Improvements 🎨

Changes to the Multi-edit Values Option

The Multi-edit values option is now called Bulk edit.
To help prevent unwanted edits, you can see how many records are affected by your action, as well as whether you are editing an entire attribute or a selection of records (filtered or not). If you are editing a selection of records, you can also easily select which attribute to edit without having to select the records again.

Support for Copy-pasting

Copy and paste cells for faster inline data editing. Pasted cells are evaluated on the fly, allowing you to quickly spot issues in the data. When copying data between attributes of different types, cells are converted to the corresponding data type if possible; otherwise, your action is ignored. In addition, pasting cells overwrites any existing data and automatically adds additional rows if needed.

Improved Support for Microsoft Excel Files

When importing a Microsoft Excel file with more than one sheet, you can now choose which sheet to import. Use the Select sheet option in the file import configuration; otherwise, the first sheet in the file is selected.

Improved Table Creation

Creating an empty table now produces a template that you can further edit. Rename the table and modify the table structure by adding, removing, or renaming attributes, then start entering your data. We have also reorganized the table creation menu: selecting Create table in ONE Data now instantly generates a new table.

Easier Attribute and Table Renaming

We have made it easier to rename ONE Data tables and attributes. To edit, double-click the name in the header, provide a new value, and press Enter to save your changes.

Stay tuned for more!
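The deduplication flow described at the top of this post, collapsing records by a deduplication key and keeping a frequency attribute, can be sketched in a few lines. This is a conceptual illustration with made-up data, not how ONE Data implements it; the normalization step is an added assumption.

```python
# Hypothetical sketch of deduplication into a reference table: collapse
# records by a deduplication key and count occurrences, similar to the
# frequency attribute described above. Sample data is made up.
from collections import Counter

records = [
    {"country": "Germany"}, {"country": "germany "},
    {"country": "France"}, {"country": "Germany"},
]

def build_reference_table(records, key):
    """Normalize the key, deduplicate, and count occurrences per value."""
    counts = Counter(r[key].strip().lower() for r in records)
    return [{key: value, "frequency": n} for value, n in counts.items()]

print(build_reference_table(records, "country"))
# [{'country': 'germany', 'frequency': 3}, {'country': 'france', 'frequency': 1}]
```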

Related products: DG, DQ

14.5 Updates: ONE 🟣

Hi everyone!

This week, we'll continue covering the latest features and updates in 14.5 ⚡ Today on the menu we have ONE updates and features in 14.5. Let's dive in 🤿

DQ Firewalls 🧯

DQ Firewalls allow you to apply data quality rules to your data using API calls, specifically the evaluation APIs provided by the new DQF Service. Both GraphQL and REST options are available. This allows you to maintain one central rule library and use Ataccama data quality evaluation rules on your data in real time in the data stream. For example, say you have an ETL pipeline in Python that processes data, and you want to make sure that it filters out invalid records. After defining the DQ rule in ONE, the pipeline can call the DQF endpoint for each record (or batch of records), and records will be split up by their validity.

DQF Service

With 14.5, we have a new DQF service that handles rule debugging in rules and in monitoring projects, allows live data quality validation in ONE Data, and powers the new DQ Firewall feature. The service provides two types of API:

- Management APIs: These APIs configure the service itself. They are primarily called by the Metadata Management Module (MMM) to create a new DQ firewall, get DQ firewall statistics (such as average execution time, number of calls, and pass/fail ratio), configure global default authentication, and perform ad hoc rule evaluation.
- Evaluation APIs: APIs that accept actual data and return the results of DQ evaluation, that is, the calls used for the DQ firewall feature. They are always bound to a specific firewall configuration, which also defines authentication.

Freshness Checks Added to Data Observability ✅

Freshness is an essential factor for good data quality: it indicates how often data is updated at the source.
You can now set thresholds for freshness either manually or using AI, and the system checks your data source for updates, alerting you when a table is updated late or not at all. On the dashboards, you can see information such as the number of missing updates, the time between the last update and the last check, the expected time between updates, and the detection type (AI or manual).

Home Page 🏡

We now support two additional widgets:

- Getting started with Ataccama: A widget containing links to onboarding sections within our documentation and community webpages.
- Tasks: Manage and track your tasks, assign tasks to others, set due dates, and monitor progress.

Additionally, access to landing pages is now managed through a new View page access level. Viewers with roles that include this access level can view landing pages and their content but cannot make any changes.

New Default Templates for Notifications 🔔

Slack, MS Teams, and email notifications now use new default templates. Custom templates from older versions will still be usable after the upgrade, but if you want to make further edits to custom templates, you must convert them to the new template syntax.

Improvements to MANTA Lineage Integration 🖇️

Thanks to all community members who have contributed their time and feedback here, we have updated how you import and explore lineage metadata from MANTA in ONE. Previously, browsing lineage in ONE required a direct connection to MANTA. With 14.5, you first generate a compatible snapshot of lineage metadata in MANTA and then import and map it to the relevant data sources in ONE. We have also expanded the support for lineage to additional connection types. We have redesigned the catalog item Lineage tab to make navigation more intuitive and introduced new exploration options, such as searching (or filtering) source or target items, opening the selected item in the MANTA viewer, or viewing transformation details for attributes that are subject to complex transformations.
Group Isolation 🌿

You can now create isolated groups or branches, which consist of an isolated group and its children. When a group is isolated, members of the group or branch are restricted from sharing assets and assigning stewardship to assets outside of their group or branch.

Support for Apache Parquet 👍

We have added support for the following Apache Parquet assets: files, tables, and partitioned tables. When assets other than Parquet files are imported, ONE analyzes the asset and then creates a catalog item with attributes based on the asset.

Support for Microsoft Excel 🔢

You can now work with Microsoft Excel files in ONE! When you import a Microsoft Excel file, each sheet is loaded as a separate catalog item, while the file itself corresponds to a location within your data source. This way, you can see all catalog items imported from the same Microsoft Excel file. Check the catalog item Relationships or Overview tab to see a list of other catalog items from the same file import. You can profile and evaluate catalog items imported from Microsoft Excel files as you would any other catalog item. Previously, Microsoft Excel files could be imported to ONE, but no further processing was possible.

Support for Italian Language

You can now change the language of the application to Italian.

Audit Log Retention Changes 🪵

By default, audit logs are now stored in an internal audit database for 90 days in newly installed environments and in upgraded environments without existing audit logs. In environments with existing audit logs, the default retention is extended to one year. You can now change the default retention and cleanup settings for the audit database directly from Audit. You can also set up a schedule for exporting, retention, and cleanup of audit logs in a designated ONE Object Storage (MinIO) bucket.
You can also manage access to the Audit module with newly added identity provider (Keycloak) roles.

Connect to Multiple Databricks Clusters from a Single DPE

You can now connect to multiple Databricks clusters from a single DPE using the new plugin.metastoredatasource.ataccama.one.cluster.<clusterId> property patterns, which you can specify in the dpe/etc/application.properties file. To use multiple Databricks clusters, the properties related to each cluster configuration should use the cluster ID associated with that cluster.

Cancel All Jobs

A Cancel all jobs button has been added to the DPM Admin Console. When selected, all jobs that are not in the state KILLED, FAILURE, or SUCCESS are cancelled and can't be resumed.

DPE Label in the Run DPM Job Workflow Task

When creating a Run DPM Job workflow task, you have the option to set a DPE label that is then matched against the labels assigned to available DPEs.

What are your thoughts on the latest updates and features? Anything you are excited to try? Let us know in the comments below 👇
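The DQ Firewall scenario described at the top of this post, a Python ETL step splitting records by validity, can be sketched as follows. `evaluate_batch` is a local stand-in for the real DQF evaluation API call; the payload and response shapes are invented for illustration.

```python
# Hypothetical sketch of the DQ Firewall scenario: an ETL step sends a batch
# of records to a DQF evaluation endpoint and splits them by validity.
# evaluate_batch fakes the API locally with a toy "email must contain @" rule.

def evaluate_batch(records):
    """Stand-in for POSTing records to a DQF evaluation endpoint."""
    return [{"record": r, "valid": "@" in r.get("email", "")} for r in records]

def filter_invalid(records):
    """Split records by validity, as a DQ firewall step in a pipeline would."""
    results = evaluate_batch(records)
    valid = [r["record"] for r in results if r["valid"]]
    invalid = [r["record"] for r in results if not r["valid"]]
    return valid, invalid

batch = [{"email": "a@example.com"}, {"email": "not-an-email"}]
valid, invalid = filter_invalid(batch)
print(len(valid), len(invalid))  # 1 1
```

In the real feature, the rule logic lives centrally in ONE and the pipeline only makes the evaluation call, so the rule library stays in one place.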


14.5 Updates: Data Stories 📊🤝✏️

Hi everyone!

This week is an exciting one at Ataccama: version 14.5 is out and about! We will be sharing what's new on the community, starting today with Data Stories. So read on to find out what's new ✨

What are Data Stories?

Data Stories is a data visualization tool designed to create, present, and share complex data as dynamic and self-explanatory reports. Data Stories offers two types of reports: Dashboards and Stories. Both Dashboards and Stories have sections with visualizations and static content. While static content allows you to add text, images, or video, visualizations allow you to create bar, line, or pie charts, in addition to pivot tables and more.

You can use Data Stories when you need a tool:

- to create data-driven, dynamic presentations where it's easy to showcase results and ideas and display dynamic insights.
- to build dashboards with real-time data, which allows stakeholders to understand what is happening at the moment and quickly find answers to potential questions.

What's new?

📊 Enhanced Visualizations

Create visualizations and share them across multiple reports. Visualizations are now standalone, and each visualization can be used in multiple reports. The updated wizard-led process makes it much easier to build, duplicate, customize, or remove visualizations.

Streamlined user collaboration using Collections. Collections are folders for storing unlimited visualizations that you can engage with or contribute to. You can also manage multiple visualizations at once using tabs and apply visualization-specific filters.

Build visualizations on top of DQ results and customize your attributes. DQ results are now available in Data Stories! Use them as out-of-the-box datasets that help you assess data quality across monitoring projects and catalog items.

Improved customization. You can define both custom metrics and dimensions, pin certain values, and formulate custom queries.

📈 Changes to Reports

Revamped Dashboard creation.
You can now create dynamic dashboards with an interactive, responsive grid layout, add visualizations, and enrich them with static widgets such as videos, images, and text.

Enhanced Stories creation. With 14.5, you can craft compelling, data-driven narratives through slide-based presentations, customize transitions, and incorporate visualizations and static widgets.

Refreshed UI. We have uplifted the look with updated features such as advanced filters and expanded customization options to make creating reports even easier.

🤝 Sharing and Collaboration

You can now control how collections and visualizations are shared through role assignment and ownership. As the collection creator, you are assigned as its owner and can grant or remove roles such as Reader or Editor within the collection. If needed, you can also transfer ownership to another user or group of users. Another improvement: you can now share your reports with your team members and other users to increase collaboration and visibility.

📌 New icon in the Ataccama ONE DQG suite

Data Stories is now seamlessly integrated with ONE GEN2, which allows building custom reports on top of data quality information.

⚠️ After the upgrade, your existing reports are migrated. Please check out the documentation to find out more about the migration process.

What do you think of the new updates? Let us know in the comments below 👇


Updates to our Data Stories 🥁

Hi community!

Today, we are very excited to share the latest updates to our Data Stories tool. Now, you can connect it with Snowflake, utilize our new Data Explorer, and benefit from improved filters! In case you're not familiar, Data Stories is our rapid data visualization tool designed to create, present, and share data-driven narratives using dynamic reports, animations, and tooltips.

So, what's new? 👀

Data Explorer

We have a brand new feature: Data Explorer! Using the Data Explorer, you can perform quick data explorations and understand your data much faster. You can also:

- Save explorations for reuse as answers to your data questions
- Utilize textual annotations and overlays on data visuals
- Explore your data and identify patterns, trends, and outliers to get insights about customer behavior, market trends, business performance, and more

Connection to Snowflake

Using our new native connector, it's now possible to create data visualizations on your Snowflake data without moving it anywhere. You can simply connect to your Snowflake instance directly from Data Stories, explore the data, and start creating stories and visualizations on top of it.

Improved filters

We've made several improvements to our Data Stories filters as well, including simplified selection of filtered values, a new date picker for easier date selection, and new filtering conditions. These updates will help you focus on a specific data segment and extract meaningful insights from the data.

We hope you like our latest updates to Data Stories! Please let us know if you have any feedback or questions in the comments below 👇

Related products: DS