Create Informatica Mappings From/To a JDBC Data Source for Azure Data Lake Storage



Create Azure Data Lake Storage data objects in Informatica using the standard JDBC connection process: Copy the JAR and then connect.

Informatica provides a powerful, elegant means of transporting and transforming your data. With the CData JDBC Driver for Azure Data Lake Storage, you get a driver based on industry-proven standards that integrates seamlessly with Informatica's data transportation and manipulation features. This tutorial shows how to transfer and browse Azure Data Lake Storage data in Informatica PowerCenter.

Deploy the Driver

To deploy the driver to the Informatica PowerCenter server, copy the CData JAR and .lic file, located in the lib subfolder in the installation directory, to the following folder: Informatica-installation-directory\services\shared\jars\thirdparty.
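
For example, on Windows you might copy the files to the server with commands like the following (substitute your actual CData and Informatica installation directories; the file names assume the driver's default naming):

  copy "CData-installation-directory\lib\cdata.jdbc.adls.jar" "Informatica-installation-directory\services\shared\jars\thirdparty"
  copy "CData-installation-directory\lib\cdata.jdbc.adls.lic" "Informatica-installation-directory\services\shared\jars\thirdparty"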

To work with Azure Data Lake Storage data in the Developer tool, you will need to copy the CData JAR and .lic file, located in the lib subfolder in the installation directory, into the following folders:

  • Informatica-installation-directory\client\externaljdbcjars
  • Informatica-installation-directory\externaljdbcjars

Create the JDBC Connection

Follow the steps below to connect from Informatica Developer:

  1. In the Connection Explorer pane, right-click your domain and click Create a Connection.
  2. In the New Database Connection wizard that is displayed, enter a name and Id for the connection, and select JDBC in the Type menu.
  3. In the JDBC Driver Class Name property, enter: cdata.jdbc.adls.ADLSDriver
  4. In the Connection String property, enter the JDBC URL, using the connection properties for Azure Data Lake Storage.

    Authenticating to a Gen 1 DataLakeStore Account

    Gen 1 uses OAuth 2.0 in Azure AD for authentication.

    For this, an Active Directory web application is required. You can create one as follows:

    1. Sign in to your Azure Account through the Azure Portal.
    2. Select "Azure Active Directory".
    3. Select "App registrations".
    4. Select "New application registration".
    5. Provide a name and URL for the application. Select Web app for the type of application you want to create.
    6. Select "Required permissions" and change the required permissions for this app. At a minimum, "Azure Data Lake" and "Windows Azure Service Management API" are required.
    7. Select "Key" and generate a new key. Add a description, a duration, and take note of the generated key. You won't be able to see it again.

    To authenticate against a Gen 1 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen1.
    • Account: Set this to the name of the account.
    • OAuthClientId: Set this to the application Id of the app you created.
    • OAuthClientSecret: Set this to the key generated for the app you created.
    • TenantId: Set this to the tenant Id. See the driver help documentation for more information on how to acquire this.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
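
    For example, a Gen 1 connection string built from these properties might look like the following (placeholder values shown):

    jdbc:adls:Schema=ADLSGen1;Account=myAccount;OAuthClientId=myClientId;OAuthClientSecret=myClientSecret;TenantId=myTenantId;InitiateOAuth=GETANDREFRESH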

    Authenticating to a Gen 2 DataLakeStore Account

    To authenticate against a Gen 2 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen2.
    • Account: Set this to the name of the account.
    • FileSystem: Set this to the file system which will be used for this account.
    • AccessKey: Set this to the access key which will be used to authenticate the calls to the API. See the driver help documentation for more information on how to acquire this.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.

    Built-in Connection String Designer

    For assistance in constructing the JDBC URL, use the connection string designer built into the Azure Data Lake Storage JDBC Driver. Either double-click the JAR file or execute it from the command line.

    java -jar cdata.jdbc.adls.jar

    Fill in the connection properties and copy the connection string to the clipboard.

    A typical connection string is below:

    jdbc:adls:Schema=ADLSGen2;Account=myAccount;FileSystem=myFileSystem;AccessKey=myAccessKey;InitiateOAuth=GETANDREFRESH
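
    Outside of Informatica, you can also verify the connection string from a short Java program. The following is a minimal sketch, assuming the driver JAR is on your classpath and reusing the placeholder values above; it opens a connection and lists the tables the driver exposes via standard JDBC metadata.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class AdlsConnectionTest {
        public static void main(String[] args) throws Exception {
            // Placeholder connection string: substitute your own account, file system, and access key
            String url = "jdbc:adls:Schema=ADLSGen2;Account=myAccount;"
                + "FileSystem=myFileSystem;AccessKey=myAccessKey;InitiateOAuth=GETANDREFRESH";

            try (Connection conn = DriverManager.getConnection(url)) {
                // List the tables exposed by the driver using standard JDBC metadata
                try (ResultSet tables = conn.getMetaData().getTables(null, null, "%", null)) {
                    while (tables.next()) {
                        System.out.println(tables.getString("TABLE_NAME"));
                    }
                }
            }
        }
    }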

Browse Azure Data Lake Storage Tables

After you have added the driver JAR to the classpath and created a JDBC connection, you can access Azure Data Lake Storage entities in Informatica. Follow the steps below to connect to Azure Data Lake Storage and browse its tables:

  1. Connect to your repository.
  2. In the Connection Explorer, right-click the connection and click Connect.
  3. Clear the Show Default Schema Only option.

You can now browse Azure Data Lake Storage tables in the Data Viewer: Right-click the node for the table and then click Open. On the Data Viewer view, click Run.
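
The preview in the Data Viewer corresponds to a standard SQL query issued through the driver. For example, assuming the connection exposes a table named Resources (used here purely for illustration; pick a table listed under your own connection), the equivalent query is:

  SELECT * FROM Resources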

Create Azure Data Lake Storage Data Objects

Follow the steps below to add Azure Data Lake Storage tables to your project:

  1. Select the tables you want in Azure Data Lake Storage, then right-click one of the selected tables and click Add to Project.
  2. In the resulting dialog, select the option to create a data object for each resource.
  3. In the Select Location dialog, select your project.

Create a Mapping

Follow the steps below to add the Azure Data Lake Storage source to a mapping:

  1. In the Object Explorer, right-click your project and then click New -> Mapping.
  2. Expand the node for the Azure Data Lake Storage connection and then drag the data object for the table onto the editor.
  3. In the dialog that appears, select the Read option.

Follow the steps below to map Azure Data Lake Storage columns to a flat file:

  1. In the Object Explorer, right-click your project and then click New -> Data Object.
  2. Select Flat File Data Object -> Create as Empty -> Fixed Width.
  3. In the properties for the Azure Data Lake Storage object, select the rows you want, right-click, and then click Copy. Paste the rows into the flat file properties.
  4. Drag the flat file data object onto the mapping. In the dialog that appears, select the Write option.
  5. Click and drag to connect columns.

To transfer Azure Data Lake Storage data, right-click in the workspace and then click Run Mapping.