
🖇️ Ataccama default connections - Part 2️⃣


Hi everyone, 

 

Following up on our previous post about Ataccama Default Connections, we're back with Part 2 to explore more technologies and connections available.

 

Whether you're looking to fine-tune connection behavior, manage credentials, or better understand how default connections are leveraged across ONE, this post will walk you through it all. Let’s get into it! 👇

 

Data Sources: API and Filesystems 🗂️
 

Azure Data Lake Storage Gen2 and Blob Storage


Azure ADLS Gen2 (Azure Data Lake Storage Gen2) and Azure Blob Storage both store unstructured data, but ADLS Gen2 is optimized for big data analytics, while Blob Storage is a general-purpose object storage.

Both are natively supported connections in Ataccama.

Connection to Azure ADLS Gen2 or Azure Blob Storage requires read-only access.

Additional requirement:

  • Soft delete for blobs must be disabled to avoid access issues; if left enabled, requests may fail with a Status code 409 error message.

Authentication options:

  • Azure AD client credential - Client ID, Client Secret, and Tenant ID need to be provided and are stored securely upon saving. Providing details through Azure Key Vault is possible.
  • Azure AD managed identity - The managed identity Client ID needs to be provided and is stored securely upon saving.
  • Storage account access key - The ADLS shared key needs to be provided and is stored securely upon saving. Providing details through Azure Key Vault is possible.
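
The endpoint Ataccama targets differs between the two services. As a minimal sketch (the account name below is hypothetical; real values come from your Azure subscription and are entered in the connection form):

```python
# Sketch: the two Azure storage flavours expose different endpoints.

def adls_gen2_url(account: str) -> str:
    # ADLS Gen2 is addressed through the Data Lake (dfs) endpoint
    return f"https://{account}.dfs.core.windows.net"

def blob_storage_url(account: str) -> str:
    # Blob Storage is addressed through the blob endpoint
    return f"https://{account}.blob.core.windows.net"

print(adls_gen2_url("mydatalake"))
print(blob_storage_url("mydatalake"))
```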

📚 Read more on our documentation

 

Amazon S3 Buckets


Ataccama uses the AWS SDK to connect to Amazon S3 via the S3 REST API, which requires a properly configured network connection. S3 REST API Connection requires DNS resolution, outbound internet access, or a VPC endpoint if private access is needed. Ensure firewall rules, VPC endpoints, and S3 regional availability allow access.

Ataccama ONE supports multiple authentication options for connecting to Amazon S3, including Access Keys, AWS Web Identity IAM Role, and AWS EC2 VM Instance IAM Role.

Access Keys consist of an Access Key ID and Secret Access Key, providing direct authentication but requiring careful management to avoid security risks. AWS Web Identity IAM Roles allow authentication using federated identities, enabling secure access without storing static credentials. AWS EC2 VM Instance IAM Roles provide seamless authentication for applications running on EC2 instances, automatically assuming permissions assigned to the instance profile. Each method ensures secure access to S3, depending on the deployment environment and security requirements.

💡 Note that for AWS EC2 VM Instance IAM Role authentication, additional configuration is required in the DPE application.properties:

constraints.com.ataccama.dpe.plugin.dataconnect.aws.instance.iam=true

The S3 REST API URL is created automatically based on the information provided during source configuration.
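
As a sketch of what that automatic URL construction looks like (the bucket and region values are hypothetical), virtual-hosted-style S3 addressing combines the bucket name and region into the host:

```python
def s3_rest_url(bucket: str, region: str) -> str:
    # Virtual-hosted-style addressing: the bucket name becomes part of the host
    return f"https://{bucket}.s3.{region}.amazonaws.com"

print(s3_rest_url("my-data-bucket", "eu-west-1"))
```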

📚 Read more on our documentation

 

Google Cloud Storage


Google Cloud Storage (GCS) provides REST API access for managing objects and buckets. Authentication requires a service account key, and network access to Google APIs must be ensured.

IAM Permissions: The user or service account must have roles/storage.objectViewer (read-only) permissions

API Access: Ensure Cloud Storage JSON API is enabled in Google Cloud Console.

Networking: Outbound access to https://storage.googleapis.com on port 443 (HTTPS).

VPC Service Controls (If Used): Adjust policies if GCS is restricted within a private network.

Authentication options:

  • Google service account key credentials - the service account key file needs to be provided and is stored securely upon saving.
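
Before pasting the key file into the connection form, it can help to sanity-check that it is a valid service account key. A minimal sketch (the field names follow the standard Google key-file layout; the sample values are hypothetical):

```python
import json

# Fields every Google service account key file contains
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_key(raw_json: str) -> dict:
    key = json.loads(raw_json)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError("not a service account key")
    return key

# Hypothetical (truncated) key for illustration only
sample = json.dumps({
    "type": "service_account",
    "project_id": "my-project",
    "private_key": "-----BEGIN PRIVATE KEY-----...",
    "client_email": "reader@my-project.iam.gserviceaccount.com",
})
print(check_service_account_key(sample)["project_id"])
```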

📚 Read more on our documentation

 

Salesforce


Salesforce REST API enables secure integration with external applications for querying, updating, and managing Salesforce data. To establish a connection, authentication via OAuth 2.0 is required, and proper network access to Salesforce endpoints must be ensured.

To connect Salesforce with Ataccama ONE, authentication can be established using OAuth 2.0 credentials or standard login credentials. When using OAuth, a Client ID (OAuth 2.0 consumer key) and Client Secret are required, along with a Refresh Token to maintain continuous access without user intervention. This method is recommended for secure and automated connections.

Alternatively, authentication can be performed using standard login credentials, which require a Salesforce username and password. If accessing Salesforce from an untrusted IP address, a Security Token must be included to verify the connection. Ensuring the correct authentication method is used allows Ataccama ONE to securely access Salesforce data for integration, monitoring, and data quality evaluation.

API Access Permission: The Salesforce user must have "API Enabled" in their profile.

Network Access: Ensure outbound requests can reach *.salesforce.com on port 443 (HTTPS).

IP Whitelisting (Optional): If IP restrictions are enabled, whitelist client IPs in Setup > Network Access.

By fulfilling these prerequisites, Ataccama ONE can effectively connect to Salesforce, enabling seamless data integration and management.

Authentication options:

  • OAuth credentials - Client ID, Client secret, and Refresh token need to be provided and are stored securely upon saving.
  • Login credentials - A username and password, along with a Security token, are required to establish a connection. Credentials are provided in the Ataccama UI and stored securely. A read-only user is sufficient, as Ataccama never writes back to the client’s datasource.
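
As a sketch of the two mechanisms (all credential values are hypothetical): the OAuth refresh-token grant posts a form body to Salesforce's token endpoint, while password login from an untrusted IP expects the security token appended to the password:

```python
def refresh_token_body(client_id: str, client_secret: str, refresh_token: str) -> dict:
    # Form body POSTed to https://login.salesforce.com/services/oauth2/token
    return {
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }

def password_for_login(password: str, security_token: str) -> str:
    # From an untrusted IP, Salesforce expects password + security token
    return password + security_token

print(refresh_token_body("my-consumer-key", "my-secret", "my-refresh-token")["grant_type"])
print(password_for_login("hunter2", "TOKEN123"))
```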

📚Read more on our documentation

 

Power BI


To connect Power BI with Ataccama ONE, several prerequisites must be met. First, an Azure Active Directory (AD) application must be available with at least one security group assigned. The Ataccama application should be added as a member of this security group to enable authentication and authorization.

In the Power BI Admin Portal, the setting "Service principals can use Fabric APIs" must be enabled under Tenant settings, and the correct security groups must be assigned to allow Ataccama to access Power BI data.

Ataccama ONE also requires the Power BI application URL (https://app.powerbi.com) to be included in the Content Security Policy (CSP) directives, ensuring that Power BI reports can be accessed and embedded properly.

Network Access: Ensure outbound requests can reach *.powerbi.com on port 443 (HTTPS).

Authentication options:

  • The Client ID, Client secret, and Tenant ID need to be provided and are stored securely upon saving. Providing details through Azure Key Vault is possible.
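
Under the hood this is the standard Azure AD client-credentials flow; a sketch (tenant and client values are hypothetical) of the token request Ataccama's service principal would make:

```python
def powerbi_token_request(tenant_id: str, client_id: str, client_secret: str):
    # Client-credentials grant against Azure AD, scoped to the Power BI API
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    }
    return url, body

url, body = powerbi_token_request("my-tenant-id", "my-client-id", "my-secret")
print(url)
```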

📚Read more on our documentation

 

Power BI Report Server


To connect Ataccama ONE with the Power BI Report Server, ensure the following prerequisites are met:

  1. Power BI Report Server URL: The web portal URL of your Power BI Report Server must be accessible
  2. Content Security Policy (CSP) Configuration: Add the Power BI Report Server URL to the CSP directives in Ataccama ONE to allow secure integration.
  3. NTLM Authentication: Ataccama ONE supports only NTLM authentication for connecting to the Power BI Report Server. Ensure you have valid credentials (username and password) with appropriate access rights to the server
  4. Network Accessibility: Both Ataccama ONE and Power BI Report Server should have network configurations that allow them to communicate effectively.

Network Access: Ensure Ataccama ONE can reach your Power BI Report Server URL over the network (typically on port 443 for HTTPS).

Meeting these prerequisites ensures a successful connection between Ataccama ONE and the Power BI Report Server, enabling you to import and manage your reports within Ataccama's platform.

Authentication options:

  • Username and password - A username and password are required to establish a connection to the Report Server. Credentials are provided in the Ataccama UI and are stored securely. A read-only user is sufficient, as Ataccama never writes back to the client’s datasource.

📚Read more on our documentation

 

Tableau


To integrate Tableau with Ataccama ONE, ensure the following prerequisites are met:

  1. Supported Tableau Version: Only Tableau versions 2022.3 and later are supported in Ataccama ONE.
  2. Personal Access Token (PAT): Create a Personal Access Token in Tableau for a user with at least an Explorer site role. This token is used for browsing and importing data.
  3. Connected App Configuration: Set up a Connected App in Tableau with direct trust to enable data preview. This configuration can be assigned to all projects or a specific one.
  4. Content Security Policy (CSP) in Ataccama ONE: Add the Tableau application URL (e.g., https://<your-tableau-server>) to the Content Security Policy directives in Ataccama ONE to allow secure integration.
  5. Tableau Server Specifics: If using Tableau Server, ensure that Metadata Services are enabled for GraphQL, and Personal Access Token (PAT) impersonation is activated. Apply all changes using Tableau Services Manager (TSM) commands.

Meeting these prerequisites ensures a successful connection between Ataccama ONE and Tableau, enabling seamless import and management of Tableau reports within the Ataccama platform.

Authentication options:

  • The Connected app client ID, Connected app secret ID, Connected app secret value, Personal access token name, Personal access token secret, and Username need to be provided and are stored securely upon saving. Providing details through Azure Key Vault is possible.
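
For reference, PAT-based sign-in against the Tableau REST API posts a credentials payload shaped like the following sketch (the token name and secret are hypothetical):

```python
def pat_signin_payload(token_name: str, token_secret: str, site_content_url: str = "") -> dict:
    # Body for POST https://<server>/api/<version>/auth/signin
    return {
        "credentials": {
            "personalAccessTokenName": token_name,
            "personalAccessTokenSecret": token_secret,
            "site": {"contentUrl": site_content_url},
        }
    }

payload = pat_signin_payload("ataccama-pat", "s3cret-value")
print(payload["credentials"]["personalAccessTokenName"])
```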

📚Read more on our documentation

 

OneLake


To connect Ataccama ONE to OneLake, certain authentication and permission requirements must be met. Users must prepare a Client ID, Client Secret, and Tenant ID, as authentication is handled through Azure Active Directory (AAD). Managed Identity authentication is also supported, allowing secure, passwordless access for applications running within Azure without managing credentials manually.

The user must ensure that read-only permissions are granted to OneLake to allow data access while preventing unintended modifications. This requires assigning the appropriate Azure RBAC roles, such as Storage Blob Data Reader or equivalent permissions, to the application or Managed Identity. Additionally, the correct OneLake Workspace Name must be specified to establish a valid connection. Proper firewall and network configurations should be in place to allow connectivity between Ataccama ONE and OneLake.

Authentication options:

  • Azure AD client credential - Client ID, Client Secret and Tenant ID need to be provided and are stored securely upon saving. Providing details through the Azure KeyVault is possible.
  • Azure AD managed identity - The managed identity Client ID needs to be provided and is stored securely upon saving.
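
As an illustration, OneLake exposes a DFS-compatible endpoint, so the workspace named in the connection resolves to a URL like the following sketch (the workspace name is hypothetical):

```python
def onelake_workspace_url(workspace: str) -> str:
    # OneLake's DFS-compatible endpoint; item paths under it follow
    # <workspace>/<item>.<itemtype>/Files/...
    return f"https://onelake.dfs.fabric.microsoft.com/{workspace}"

print(onelake_workspace_url("SalesWorkspace"))
```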

📚Read more on our documentation

 

Local File System


To connect Ataccama ONE to a local file system, a Hybrid DPE is required with a properly configured File System location on the VM where the DPE is installed. To ensure proper access and operations, the target folder must be owned by the DPE Service technical user. This folder can be a local directory on the VM, a mounted storage location, or connected to external storage solutions such as Amazon S3 or Azure Data Lake Storage Gen2 (ADLS Gen2), etc.

Once the File System location is configured in the DPE settings, it will become available in the DQ&C UI for creating a new Source. A default File System configuration is already in place, which can either be reused or adjusted based on specific requirements. This setup ensures that Ataccama ONE can access and process files from the designated storage location seamlessly.

Authentication is not required, as the DPE application accesses files stored locally on the same VM and owned by the same Service user.
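
A quick way to sanity-check that a candidate folder would work as a File System location (a sketch only; the real requirement is that the DPE Service technical user owns the folder):

```python
import os
import tempfile

def folder_is_usable(path: str) -> bool:
    # The user running this check must be able to enter and read the folder,
    # mirroring what the DPE service user needs at runtime
    return os.path.isdir(path) and os.access(path, os.R_OK | os.X_OK)

print(folder_is_usable(tempfile.gettempdir()))  # a readable temp directory
print(folder_is_usable("/no/such/folder"))
```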

📚Read more on our documentation

 

Post-Processing: API Services
 

Services that can be connected via REST API or JSON call for Monitoring Project Results Post-Processing.

 

ServiceNow


ServiceNow is a cloud-based platform that provides IT service management (ITSM), business process automation, and enterprise workflow solutions. It allows integration with external systems using the ServiceNow REST API, which follows RESTful principles for querying, updating, and managing ServiceNow data.

The base URL for ServiceNow REST API follows this format: https://<instance>.service-now.com/api/now/v1/<resource>

The ServiceNow user or API client must have the appropriate roles (such as rest_api_explorer or itil) to access specific resources. ACLs (Access Control Lists) in ServiceNow can restrict which tables and fields are accessible via API.

To successfully connect via REST API, ensure that network access to ServiceNow is available, proper authentication is configured, and the user or API client has the necessary permissions.
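
Putting the URL format above into code, a small sketch (the instance name is hypothetical) that builds the REST endpoint for a given resource:

```python
def servicenow_url(instance: str, resource: str, version: str = "v1") -> str:
    # Mirrors https://<instance>.service-now.com/api/now/v1/<resource>
    return f"https://{instance}.service-now.com/api/now/{version}/{resource}"

print(servicenow_url("mycompany", "table/incident"))
```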

 

Hope you find these useful! Let us know if you have any questions in the comments below.
