Search External Azure Data Lake Storage Objects in Salesforce Connect
Use CData Connect Cloud to securely provide OData feeds of Azure Data Lake Storage data to smart devices and cloud-based applications. Use CData Connect Cloud and Salesforce Connect to create Azure Data Lake Storage objects that you can access from Salesforce apps and dashboards.
CData Connect Cloud enables you to access Azure Data Lake Storage data from cloud-based applications like the Salesforce console and mobile applications like the Salesforce Mobile App. In this article, you will use CData Connect Cloud and Salesforce Connect to access external Azure Data Lake Storage objects alongside standard Salesforce objects.
Connect to Azure Data Lake Storage from Salesforce
To work with live Azure Data Lake Storage data in Salesforce Connect, we need to connect to Azure Data Lake Storage from Connect Cloud, provide user access to the connection, and create a Workspace for the Azure Data Lake Storage data.
Connect to Azure Data Lake Storage from Connect Cloud
CData Connect Cloud uses a straightforward, point-and-click interface to connect to data sources.
- Log into Connect Cloud, click Sources, and then click Add Connection
- Select "Azure Data Lake Storage" from the Add Connection panel
- Enter the necessary authentication properties to connect to Azure Data Lake Storage.
Authenticating to a Gen 1 DataLakeStore Account
Gen 1 uses OAuth 2.0 in Entra ID (formerly Azure AD) for authentication.
For this, an Active Directory (Entra ID) web application is required; if you do not already have one, create it in the Azure portal before continuing.
To authenticate against a Gen 1 DataLakeStore account, the following properties are required:
- Schema: Set this to ADLSGen1.
- Account: Set this to the name of the account.
- OAuthClientId: Set this to the application Id of the app you created.
- OAuthClientSecret: Set this to the key generated for the app you created.
- TenantId: Set this to the Id of the Entra ID tenant in which the application was registered.
- Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
Authenticating to a Gen 2 DataLakeStore Account
To authenticate against a Gen 2 DataLakeStore account, the following properties are required (an optional verification sketch follows these connection steps):
- Schema: Set this to ADLSGen2.
- Account: Set this to the name of the account.
- FileSystem: Set this to the file system which will be used for this account.
- AccessKey: Set this to the access key used to authenticate calls to the API. You can copy it from the Access keys section of the storage account in the Azure portal.
- Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
- Click Create & Test
- Navigate to the Permissions tab in the Add Azure Data Lake Storage Connection page and update the user-based permissions.
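If you want to confirm the Gen 2 values before entering them in Connect Cloud, the optional Python sketch below lists a few paths in the file system using Microsoft's azure-storage-file-datalake SDK. The account, file system, and access key shown are placeholders, and the check runs entirely outside of Connect Cloud:

```python
# Optional sanity check for the ADLS Gen 2 connection properties.
# Requires: pip install azure-storage-file-datalake
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT = "mystorageaccount"          # maps to the Account property (placeholder)
FILE_SYSTEM = "myfilesystem"          # maps to the FileSystem property (placeholder)
ACCESS_KEY = "<storage-access-key>"   # maps to the AccessKey property (placeholder)

# Authenticate directly to the storage account with the shared access key
service = DataLakeServiceClient(
    account_url=f"https://{ACCOUNT}.dfs.core.windows.net",
    credential=ACCESS_KEY,
)

# List paths in the file system; an error here means the account name,
# file system, or access key is incorrect
file_system = service.get_file_system_client(FILE_SYSTEM)
for path in file_system.get_paths():
    print(path.name)
```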


Add a Personal Access Token
When connecting to Connect Cloud through the REST API, the OData API, or the Virtual SQL Server, a Personal Access Token (PAT) is used to authenticate the connection to Connect Cloud. It is best practice to create a separate PAT for each service to maintain granularity of access.
- Click the Gear icon at the top right of the Connect Cloud app to open the Settings page.
- On the Settings page, go to the Access Tokens section and click Create PAT.
- Give the PAT a name and click Create.
- The personal access token is only visible at creation, so be sure to copy it and store it securely for future use.
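Because the PAT serves as the password for REST, OData, and Virtual SQL Server connections, it is best kept out of source code. Below is a minimal Python sketch, assuming you store the username and token in environment variables (the variable names CDATA_USER and CDATA_PAT are arbitrary choices for this example):

```python
# Keep the PAT out of source code: read it from an environment variable,
# falling back to an interactive prompt. CDATA_USER and CDATA_PAT are
# placeholder variable names chosen for this example.
import os
from getpass import getpass

cdata_user = os.environ.get("CDATA_USER") or input("Connect Cloud username: ")
cdata_pat = os.environ.get("CDATA_PAT") or getpass("Personal Access Token: ")

# This username/PAT pair is sent as HTTP Basic credentials whenever a
# script calls the Connect Cloud OData API (see the query sketch below).
```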
Configure Azure Data Lake Storage Endpoints for Salesforce Connect
After connecting to Azure Data Lake Storage, create a workspace for your desired table(s).
- Navigate to the Workspaces page and click Add to create a new Workspace (or select an existing workspace).
- Click Add to add new assets to the Workspace.
- Select the Azure Data Lake Storage connection (e.g. ADLS1) and click Next.
- Select the table(s) you wish to work with and click Confirm.
- Make note of the OData Service URL for your workspace, e.g. https://cloud.cdata.com/api/odata/{workspace_name}
With the connection, PAT, and Workspace configured, you are ready to connect to Azure Data Lake Storage data from Salesforce Connect.
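Before wiring the feed into Salesforce, you can verify it by requesting the OData Service URL directly, sending your Connect Cloud username and PAT as HTTP Basic credentials. The minimal sketch below uses the Python requests library; the workspace and table names are placeholders for whatever you configured above, and the credentials are read from the environment variables introduced earlier:

```python
# Query the Connect Cloud OData endpoint for a workspace.
# Requires: pip install requests
# WORKSPACE and TABLE are placeholders for your own workspace and table names.
import os
import requests

WORKSPACE = "my_adls_workspace"
TABLE = "MyTable"

base_url = f"https://cloud.cdata.com/api/odata/{WORKSPACE}"
auth = (os.environ["CDATA_USER"], os.environ["CDATA_PAT"])  # username + PAT

# 1. Service document: lists the entity sets (tables) in the workspace
service_doc = requests.get(base_url, auth=auth, headers={"Accept": "application/json"})
service_doc.raise_for_status()
print([entry["name"] for entry in service_doc.json().get("value", [])])

# 2. First few rows of a table, using a standard OData query option
rows = requests.get(
    f"{base_url}/{TABLE}",
    params={"$top": 5},
    auth=auth,
    headers={"Accept": "application/json"},
)
rows.raise_for_status()
for row in rows.json().get("value", []):
    print(row)
```

A successful JSON response confirms that the same URL and username/PAT pair will work in the Salesforce Connect configuration below.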
Connect to Azure Data Lake Storage Data as an External Data Source
Follow the steps below to connect to the feed produced by Connect Cloud.
- Log into Salesforce and click Setup -> Integrations -> External Data Sources.
- Click New External Data Source.
- Enter values for the following properties:
- External Data Source: Enter a label to be used in list views and reports.
- Name: Enter a unique identifier.
- Type: Select the option "Salesforce Connect: OData 4.0".
- URL: Enter the URL to the OData endpoint of Connect Cloud: https://cloud.cdata.com/api/odata/{workspace_name}
- Format: Select JSON.
- In the Authentication section, set the following properties:
- Identity Type: If all members of your organization will use the same credentials to access Connect Cloud, select "Named Principal". If the members of your organization will connect with their own credentials, select "Per User".
- Authentication Protocol: Select Password Authentication to use basic authentication.
- Certificate: Enter or browse to the certificate to be used to encrypt and authenticate communications from Salesforce to your server.
- Username: Enter your CData Connect Cloud username (the e-mail address you use to sign in to Connect Cloud).
- Password: Enter the user's PAT.

Synchronize Azure Data Lake Storage Objects
After you have created the external data source, follow the steps below to create Azure Data Lake Storage external objects that reflect any changes in the data source. You will synchronize the definitions for the Azure Data Lake Storage external objects with the definitions for Azure Data Lake Storage tables.
- Click the link for the external data source you created.
- Click Validate and Sync.
- Select the Azure Data Lake Storage tables you want to work with as external objects.

Access Azure Data Lake Storage Data as Salesforce Objects
After adding Azure Data Lake Storage data as an external data source and syncing Azure Data Lake Storage tables as external objects, you can use the external Azure Data Lake Storage objects just as you would standard Salesforce objects, whether in tabs, reports, or API queries (see the query sketch after the examples below).
- Create a new tab with a filtered list view:
- Create reports of external objects:
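Synced external objects are also reachable through the Salesforce APIs. The sketch below uses the simple_salesforce library to run a SOQL query against a hypothetical external object named ADLS_Files__x (external object API names end in __x); substitute the API name and fields Salesforce generated when you synced your tables:

```python
# Query a synced external object through the Salesforce REST API.
# Requires: pip install simple-salesforce
# ADLS_Files__x is a hypothetical object name; use the API name
# generated during Validate and Sync.
import os
from simple_salesforce import Salesforce

sf = Salesforce(
    username=os.environ["SF_USERNAME"],
    password=os.environ["SF_PASSWORD"],
    security_token=os.environ["SF_TOKEN"],
)

# SOQL works on external objects just like standard objects; Salesforce
# Connect fetches the rows from Connect Cloud on demand.
result = sf.query("SELECT Id, ExternalId, DisplayUrl FROM ADLS_Files__x LIMIT 5")
for record in result["records"]:
    print(record)
```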
Simplified Access to Azure Data Lake Storage Data from Cloud Applications
At this point, you have a direct, cloud-to-cloud connection to live Azure Data Lake Storage data from Salesforce. For more information on gaining simplified access to data from more than 100 SaaS, Big Data, and NoSQL sources in cloud applications like Salesforce, refer to our Connect Cloud page.