Consume Azure Data Lake Storage OData Feeds in SAP Lumira
Use the API Server to create data visualizations in SAP Lumira on Azure Data Lake Storage feeds that reflect any changes to the data.
You can use the CData API Server to create data visualizations based on Azure Data Lake Storage data in SAP Lumira. The API Server enables connectivity to live data: dashboards and reports can be refreshed on demand. This article shows how to create a chart that is always up to date.
Set Up the API Server
If you have not already done so, download the CData API Server. Once you have installed the API Server, follow the steps below to begin producing secure Azure Data Lake Storage OData services:
Connect to Azure Data Lake Storage
To work with Azure Data Lake Storage data from SAP Lumira, start by creating and configuring an Azure Data Lake Storage connection. Follow the steps below to configure the API Server to connect to Azure Data Lake Storage data:
- First, navigate to the Connections page.
- Click Add Connection and then search for and select the Azure Data Lake Storage connection.
- Enter the necessary authentication properties to connect to Azure Data Lake Storage.
Authenticating to a Gen 1 DataLakeStore Account
Gen 1 uses OAuth 2.0 in Entra ID (formerly Azure AD) for authentication.
For this, an Active Directory web application (a Microsoft Entra app registration) is required. You can create one in the Azure portal: under Microsoft Entra ID, open App registrations and register a new application, note its application (client) ID and directory (tenant) ID, and generate a client secret under Certificates & secrets.
To authenticate against a Gen 1 DataLakeStore account, the following properties are required:
- Schema: Set this to ADLSGen1.
- Account: Set this to the name of the account.
- OAuthClientId: Set this to the application (client) ID of the app you created.
- OAuthClientSecret: Set this to the client secret (key) generated for the app you created.
- TenantId: Set this to the tenant ID. See the help documentation for this property for more information on how to acquire it.
- Directory: Set this to the path that will be used to store the replicated file. If not specified, the root directory is used.
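For orientation, here is a rough sketch of the same Gen 1 settings expressed as a CData-style connection string; every value below is a placeholder, not a working credential:

```
Schema=ADLSGen1;Account=myadlsaccount;OAuthClientId=<application-id>;OAuthClientSecret=<client-secret>;TenantId=<tenant-id>;Directory=/replication/path
```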
Authenticating to a Gen 2 DataLakeStore Account
To authenticate against a Gen 2 DataLakeStore account, the following properties are required:
- Schema: Set this to ADLSGen2.
- Account: Set this to the name of the account.
- FileSystem: Set this to the file system that will be used for this account.
- AccessKey: Set this to the access key used to authenticate calls to the API. See the help documentation for this property for more information on how to acquire it.
- Directory: Set this to the path that will be used to store the replicated file. If not specified, the root directory is used.
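Likewise, a hypothetical Gen 2 configuration might look like the following sketch, again with placeholder values throughout:

```
Schema=ADLSGen2;Account=myadlsaccount;FileSystem=myfilesystem;AccessKey=<access-key>;Directory=/replication/path
```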
- After configuring the connection, click Save & Test to confirm a successful connection.
Configure API Server Users
Next, create a user to access your Azure Data Lake Storage data through the API Server. You can add and configure users on the Users page. Follow the steps below to configure and create a user:
- On the Users page, click Add User to open the Add User dialog.
- Next, set the Role, Username, and Privileges properties and then click Add User.
- An authtoken is then generated for the user. You can find the authtoken and other information for each user on the Users page.
Creating API Endpoints for Azure Data Lake Storage
Having created a user, you are ready to create API endpoints for the Azure Data Lake Storage tables:
- First, navigate to the API page and then click Add Table.
- Select the connection you wish to access and click Next.
- With the connection selected, create endpoints by selecting each table and then clicking Confirm.
Gather the OData URL
Having configured a connection to Azure Data Lake Storage data, created a user, and added resources to the API Server, you now have an easily accessible REST API, based on the OData protocol, for those resources. From the API page in the API Server, you can view and copy the endpoint URLs.
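If you want to verify the endpoints before opening SAP Lumira, a short script can request the OData service document and an entity set directly. The sketch below uses Python's requests library; the host, credentials, and the Resources entity set are placeholders drawn from this article's example, and the "value" key assumes an OData v4-style JSON payload:

```python
# Minimal sketch: query the API Server's OData endpoints directly.
# Host, credentials, and the "Resources" entity set are placeholders.
import requests

BASE_URL = "https://your-server:8080/api.rsc"
AUTH = ("api_user", "your-authtoken")  # username and authtoken (HTTP Basic)

# The service document at the API root lists every entity set (endpoint).
service_doc = requests.get(BASE_URL, params={"$format": "json"}, auth=AUTH)
service_doc.raise_for_status()
print(service_doc.json())

# Fetch a few rows from the Resources endpoint created above.
rows = requests.get(
    f"{BASE_URL}/Resources",
    params={"$format": "json", "$top": 5},  # standard OData query options
    auth=AUTH,
)
rows.raise_for_status()
for row in rows.json().get("value", []):  # "value" assumes an OData v4 payload
    print(row)
```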

Connect to Azure Data Lake Storage from SAP Lumira
Follow the steps below to retrieve Azure Data Lake Storage data into SAP Lumira. You can execute an SQL query or use the UI.
- In SAP Lumira, click File -> New -> Query with SQL. The Add New Dataset dialog is displayed.
- Expand the Generic section and click the Generic OData 2.0 Connector option.
- In the Service Root URI box, enter the OData endpoint of the API Server. This URL will resemble the following:
  https://your-server:8080/api.rsc
- In the User Name and Password boxes, enter the username and authtoken of an API user. These credentials will be used in HTTP Basic authentication.
- Select entities in the tree or enter an SQL query. This article imports the Azure Data Lake Storage Resources entities.
- When you click Connect, SAP Lumira will generate the corresponding OData request and load the results into memory. You can then use any of the data processing tools available in SAP Lumira, such as filters, aggregates, and summary functions.
Create Data Visualizations
After you have imported the data, you can create data visualizations in the Visualize room. Follow the steps below to create a basic chart.
- In the Measures and Dimensions pane, drag measures and dimensions onto the x-axis and y-axis fields in the Visualization Tools pane. SAP Lumira automatically detects dimensions and measures from the metadata service of the API Server (see the sketch after these steps).
- By default, the SUM function is applied to all measures. Click the gear icon next to a measure to change the default summary.
- In the Visualization Tools pane, select the chart type.
- In the Chart Canvas pane, apply filters, sort by measures, add rankings, and update the chart with the current Azure Data Lake Storage data.
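As noted above, SAP Lumira infers dimensions and measures from the API Server's metadata service. If you are curious what Lumira sees, the sketch below (same placeholder host and credentials as earlier) retrieves the standard OData $metadata document:

```python
# Sketch: fetch the OData $metadata document (EDMX/XML) describing the
# entity types and properties that SAP Lumira uses to detect dimensions
# and measures. Host and credentials are placeholders.
import requests

resp = requests.get(
    "https://your-server:8080/api.rsc/$metadata",
    auth=("api_user", "your-authtoken"),
)
resp.raise_for_status()
print(resp.text)
```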