Build AI/ML Models with Live Azure Data Lake Storage Data using Dataiku



Connect Azure Data Lake Storage Data with Dataiku using the CData JDBC Driver for Azure Data Lake Storage.

Dataiku is a data science and machine learning platform used for data preparation, analysis, visualization, and AI/ML model deployment, enabling collaborative and efficient data-driven decision-making. When paired with the CData JDBC Driver for Azure Data Lake Storage, Dataiku enhances data integration, preparation, real-time analysis, and reliable model deployment for Azure Data Lake Storage data.

With built-in optimized data processing, the CData JDBC Driver offers unmatched performance for interacting with live Azure Data Lake Storage data. When you issue complex SQL queries to Azure Data Lake Storage, the driver pushes supported SQL operations, like filters and aggregations, directly to Azure Data Lake Storage and utilizes the embedded SQL engine to process unsupported operations client-side (often SQL functions and JOIN operations). Its built-in dynamic metadata querying allows you to work with and analyze Azure Data Lake Storage data using native data types.
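
For example, a query like the one below pushes the WHERE filter and the GROUP BY aggregation down to Azure Data Lake Storage where supported, while the embedded SQL engine handles anything the service cannot process. Treat this as an illustrative sketch: it assumes the driver's Resources table, and the exact table and column names may vary by driver version.

    SELECT Type, COUNT(*) AS ResourceCount
    FROM Resources
    WHERE FullPath LIKE '/raw/%'
    GROUP BY Type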

This article shows how to connect to Azure Data Lake Storage using the CData JDBC Driver for Azure Data Lake Storage in the Dataiku DSS (Data Science Studio) platform, so you can prepare the data and build custom AI/ML models.

Preparing the Dataiku DSS environment

In this section, we set up Dataiku to work with Azure Data Lake Storage data. Before you begin, install Dataiku DSS (the on-premises version) for your preferred operating system.

Install the CData JDBC Driver for Azure Data Lake Storage

First, install the CData JDBC Driver for Azure Data Lake Storage on the same machine as Dataiku. By default, the JDBC Driver is installed at the following path:

C:\Program Files\CData\CData JDBC Driver for Azure Data Lake Storage 20xx\lib\cdata.jdbc.adls.jar

Connecting the JDBC Driver in Dataiku DSS

To use the CData JDBC driver in Dataiku, you must create a new SQL database connection and add the JDBC Driver JAR file in the DSS connection settings.

  1. Log into the Dataiku DSS platform. It opens locally in your browser (e.g., localhost:11200).
  2. Click the Navigate to other sections of Dataiku menu in the top-right corner of the platform and select Administration.
  3. Select the Connections tab.
  4. In Connections, click the New Connection button.
  5. Now, scroll down and select Other SQL databases.
  6. Generate a JDBC URL for connecting to Azure Data Lake Storage, beginning with jdbc:adls: followed by a series of semicolon-separated connection string properties.

    Authenticating to a Gen 1 DataLakeStore Account

    Gen 1 uses OAuth 2.0 in Azure AD for authentication.

    For this, an Active Directory web application is required. You can create one as follows:

    1. Sign in to your Azure Account through the Azure Portal.
    2. Select "Azure Active Directory".
    3. Select "App registrations".
    4. Select "New application registration".
    5. Provide a name and URL for the application. Select Web app for the type of application you want to create.
    6. Select "Required permissions" and change the required permissions for this app. At a minimum, "Azure Data Lake" and "Windows Azure Service Management API" are required.
    7. Select "Key" and generate a new key. Add a description, a duration, and take note of the generated key. You won't be able to see it again.

    To authenticate against a Gen 1 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen1.
    • Account: Set this to the name of the account.
    • OAuthClientId: Set this to the application Id of the app you created.
    • OAuthClientSecret: Set this to the key generated for the app you created.
    • TenantId: Set this to the tenant Id of your Azure Active Directory instance (shown on the Azure Active Directory overview page in the Azure portal).
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
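
    For example, a Gen 1 JDBC URL built from these properties might look like the following (all values are placeholders):

    jdbc:adls:Schema=ADLSGen1;Account=myAccount;OAuthClientId=myClientId;OAuthClientSecret=myClientSecret;TenantId=myTenantId;Directory=myDirectory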

    Authenticating to a Gen 2 DataLakeStore Account

    To authenticate against a Gen 2 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen2.
    • Account: Set this to the name of the account.
    • FileSystem: Set this to the file system which will be used for this account.
    • AccessKey: Set this to the access key used to authenticate calls to the API. You can find it under Access keys for your storage account in the Azure portal.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.

    Built-in Connection String Designer

    For assistance in constructing the JDBC URL, use the connection string designer built into the Azure Data Lake Storage JDBC Driver. Either double-click the JAR file or execute it from the command line:

    java -jar cdata.jdbc.adls.jar

    Fill in the connection properties and copy the connection string to the clipboard.

    A typical JDBC URL is given below:

    jdbc:adls:Schema=ADLSGen2;Account=myAccount;FileSystem=myFileSystem;AccessKey=myAccessKey;InitiateOAuth=GETANDREFRESH
  7. On the New SQL database (JDBC) connection screen, enter a name in the New connection name field and specify the basic parameters:
    • JDBC Driver Class: cdata.jdbc.adls.ADLSDriver
    • JDBC URL: JDBC connection URL obtained in the previous step
    • Driver jars directory: the folder path where the JAR file is installed on your system

    Next, select the SQL dialect of your choice; here, we selected 'SQL Server'. Click Create. If the connection is successful, a 'Connection OK' message is displayed.

  8. The Data Catalog window will appear. Select the desired connection, catalog, and schema from the Connection to browse, Restrict to catalog, and Restrict to schema dropdowns, then click List Tables. Dataiku lists the available tables.
  9. Select any table from the list and click Preview to view the table data. Click Close to exit the window.

Creating a new project

To prepare data flows, create dashboards, analyze the Azure Data Lake Storage data, and build AI and ML models in the Dataiku DSS platform, you need to first create a new project.

  1. Select Projects from the Navigate to other sections of Dataiku menu.
  2. In the Projects screen, click New Project and select + Blank Project.
  3. In the New Project window, assign a Name and Project Key. Click Create. The new project dashboard opens up.
  4. Select Notebooks from the menu at the top of the project screen.
  5. Click the + Create Your First Notebook dropdown and select the Write your own option.
  6. In the New Notebook window, select SQL.
  7. Now, select the required connection from the Connection dropdown and enter a name in the Notebook Name field.

Testing the connection

To test the connection and analyze the Azure Data Lake Storage data, write a query in the query editor and click Run. The queried (and filtered) Azure Data Lake Storage results then appear on the screen.
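
For example, a simple test query might look like the one below. It assumes the driver's Resources table; the exact table and column names may differ in your driver version, so check the tables listed in the Data Catalog.

    SELECT FullPath, Type
    FROM Resources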

Get Started Today

Download a free, 30-day trial of the CData JDBC Driver for Azure Data Lake Storage to integrate with Dataiku, and effortlessly build custom AI/ML models from Azure Data Lake Storage data.

Reach out to our Support Team if you have any questions.
