
Connect to Azure Data Lake Storage Data in DigitalSuite Studio through RunMyProcess DSEC



Use DigitalSuite EnterpriseConnect (DSEC), part of Arkobi Digital's low-code, cloud-native RunMyProcess platform, to connect to Azure Data Lake Storage.

The CData JDBC Driver for Azure Data Lake Storage implements JDBC standards, enabling applications ranging from BI tools to IDEs to connect to Azure Data Lake Storage. In this article, we describe how to connect to Azure Data Lake Storage data through DSEC and work with it in RunMyProcess DigitalSuite Studio.
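
To illustrate, a minimal standalone Java program using the driver might look like the following (a sketch: the class name is ours, the connection properties are placeholders matching the Gen 2 example later in this article, and cdata.jdbc.adls.jar must be on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class AdlsJdbcExample {
    public static void main(String[] args) throws Exception {
        // Placeholder Gen 2 connection properties; substitute your own values.
        String url = "jdbc:adls:Schema=ADLSGen2;Account=myAccount;"
                + "FileSystem=myFileSystem;AccessKey=myAccessKey;InitiateOAuth=GETANDREFRESH";
        // Standard JDBC: open a connection, run a query, iterate the results.
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // Resources is the sample table queried throughout this article.
             ResultSet rs = stmt.executeQuery("SELECT * FROM Resources")) {
            while (rs.next()) {
                // Print the first column of each row.
                System.out.println(rs.getString(1));
            }
        }
    }
}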

Setting up EnterpriseConnect Agent

Configure the EnterpriseConnect Agent following the EnterpriseConnect page in the RunMyProcess documentation.

Setting up JDBC Adapter

The JDBC Adapter section of the RunMyProcess documentation describes the steps to connect to an RDBMS through JDBC. Follow those steps and open the JDBC.config file.

  1. Create a JSON entry for the CData JDBC Driver for Azure Data Lake Storage, e.g. ADLS = { "sqlDriver" : "...", "sqlSource" : "...", "sqlDriverPath" : "..." }
  2. Set the "sqlDriver" field to the Class name for the CData JDBC Driver, e.g.
    cdata.jdbc.adls.ADLSDriver
  3. Set the "sqlSource" field to a JDBC URL for connecting to Azure Data Lake Storage, e.g.
    jdbc:adls:Schema=ADLSGen2;Account=myAccount;FileSystem=myFileSystem;AccessKey=myAccessKey;InitiateOAuth=GETANDREFRESH

    Built-in Connection String Designer

    For assistance in constructing the JDBC URL, use the connection string designer built into the Azure Data Lake Storage JDBC Driver. Either double-click the JAR file or execute it from the command line.

    java -jar cdata.jdbc.adls.jar

    Fill in the connection properties and copy the connection string to the clipboard.

    Authenticating to a Gen 1 DataLakeStore Account

    Gen 1 uses OAuth 2.0 in Azure AD for authentication.

    For this, an Active Directory web application is required. You can create one as follows:

    1. Sign in to your Azure Account through the Azure Portal.
    2. Select "Azure Active Directory".
    3. Select "App registrations".
    4. Select "New application registration".
    5. Provide a name and URL for the application. Select Web app for the type of application you want to create.
    6. Select "Required permissions" and change the required permissions for this app. At a minimum, "Azure Data Lake" and "Windows Azure Service Management API" are required.
    7. Select "Key" and generate a new key. Add a description, a duration, and take note of the generated key. You won't be able to see it again.

    To authenticate against a Gen 1 DataLakeStore account, the following properties are required (a sample connection URL is shown after the list):

    • Schema: Set this to ADLSGen1.
    • Account: Set this to the name of the account.
    • OAuthClientId: Set this to the application Id of the app you created.
    • OAuthClientSecret: Set this to the key generated for the app you created.
    • TenantId: Set this to the tenant Id. See the help documentation for more information on how to acquire this.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
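
    Combining these properties, a Gen 1 JDBC URL takes a form like the following (a sketch using placeholder values for the properties listed above):

    jdbc:adls:Schema=ADLSGen1;Account=myAccount;OAuthClientId=myClientId;OAuthClientSecret=myClientSecret;TenantId=myTenantId;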

    Authenticating to a Gen 2 DataLakeStore Account

    To authenticate against a Gen 2 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen2.
    • Account: Set this to the name of the account.
    • FileSystem: Set this to the file system which will be used for this account.
    • AccessKey: Set this to the access key which will be used to authenticate the calls to the API. See the help documentation for more information on how to acquire this.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
  4. Set the "sqlDriverPath" field to the name of the CData JDBC Driver JAR file, e.g.
    cdata.jdbc.adls.jar

Sample JDBC.config File

#DBAgent Configuration
ADLS = { "sqlDriver" : "cdata.jdbc.adls.ADLSDriver", "sqlSource" : "jdbc:adls:Schema=ADLSGen2;Account=myAccount;FileSystem=myFileSystem;AccessKey=myAccessKey;", "sqlDriverPath" : "cdata.jdbc.adls.jar" }

Put the JDBC driver JAR file (cdata.jdbc.adls.jar) into the same directory as unified-adapter-[version].jar.

Note: Make sure to put the CData license file (cdata.jdbc.adls.lic) into the same directory. Since the license is generated from the unique identifier of the machine where the product is installed, you will need an offline activation if you want to use the driver on another machine.

Starting DigitalSuite EnterpriseConnect Agent

On Windows, start the RunMyProcess DigitalSuite EnterpriseConnect Agent from Windows Services. To start the application from the command line, see Starting the EnterpriseConnect Agent in the RunMyProcess documentation.

Starting the JDBC Adapter

Start the JDBC Adapter with runAdapter.bat. Once the Adapter is running, you can access the application through the agent address (e.g. 127.0.0.1:8080). Below is an example of executing the command in Windows.

... > java -Djava.util.logging.config.file=./log.properties -cp lib/* org.runmyprocess.sec2.AdapterHandler
2021-06-09 14:37:58|INFO|correlationId=|Searching for config file...
2021-06-09 14:37:58|INFO|correlationId=|Adapter Handler started with [JDBC] configuration
2021-06-09 14:37:59|INFO|correlationId=|agent address: 127.0.0.1:8080
2021-06-09 14:38:00.251:INFO::ConnectionThread: Logging initialized @1820ms to org.eclipse.jetty.util.log.StdErrLog
2021-06-09 14:38:00|INFO|correlationId=|onConnect() websocket connection between Agent and Adapter established

Once the DigitalSuite EnterpriseConnect Agent and JDBC Adapter are running, access http://localhost:(specified-port-number)/ in your browser to open the adapter's status page.

Check the availability of the JDBC Adapter using tools such as Postman or cURL. Here, we use Postman to send the HTTP POST request.

Configure the RequestHeader as follows:

Content-Type: application/json

Configure the RequestBody as follows:

{ "protocol":"JDBC", "data":{ "DBType":"ADLS", "sqlUsername":"", "sqlPassword":"", "sqlStatement":"SELECT * FROM Resources" } }

If the JDBC.config file contains credential information, sqlUsername and sqlPassword can be left empty. If you are not sure of the table name, you can retrieve the list of tables with the query SELECT * FROM sys_tables.
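
For reference, the same request can be sent with cURL (a sketch assuming the adapter listens at 127.0.0.1:8080, as in the log above, and accepts POSTs at the root path; adjust the address to your configuration):

curl -X POST http://127.0.0.1:8080/ \
  -H "Content-Type: application/json" \
  -d '{ "protocol":"JDBC", "data":{ "DBType":"ADLS", "sqlUsername":"", "sqlPassword":"", "sqlStatement":"SELECT * FROM Resources" } }'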

The request is successful if the Status is 200 and the Body contains Azure Data Lake Storage data in JSON format.

Connect to Azure Data Lake Storage through DSEC Agent in DigitalSuite Studio

Create a DigitalSuite Studio project and then create a Provider in the project.

  • URL: The URL for accessing the JDBC Agent (e.g. http://localhost:8080/)
  • Authentication Scheme: Login/password
  • Login: The value from agent.user in the application.properties file
  • Password: The value from agent.password in the application.properties file
  • Secured: Checked
  • Use DigitalSuite EnterpriseConnect: Checked
  • With domain: The value from agent.domain in the application.properties file

Next, create a Connector in the Provider.

  • Connector URL: Leave this empty
  • Architecture: REST/XML-RPC
  • Method: POST
  • Result format: JSON
  • Accept media type: application/json
  • Character set: Automatic
  • Content: Same as the Request body used in the JDBC Adapter
  • Content type: application/json

The JSON data we used as the Request body in the JDBC Adapter:

{ "protocol":"JDBC", "data":{ "DBType":"ADLS", "sqlUsername":"", "sqlPassword":"", "sqlStatement":"SELECT * FROM Resources" } }

Open Launch Test to perform the test. The test is successful if Azure Data Lake Storage data is shown in the Result pane on the right.

Now you can use Azure Data Lake Storage data in RunMyProcess DigitalSuite Studio through DSEC.

For detailed information on supported SQL commands, refer to the SQL Compliance section in our help documentation. For information on tables, refer to the Data Model section.