Bridge Azure Data Lake Storage Connectivity with Apache NiFi



Access and process Azure Data Lake Storage data in Apache NiFi using the CData JDBC Driver.

Apache NiFi supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic. When paired with the CData JDBC Driver for Azure Data Lake Storage, NiFi can work with live Azure Data Lake Storage data. This article describes how to connect to and query Azure Data Lake Storage data from an Apache NiFi Flow.

With built-in optimized data processing, the CData JDBC driver offers unmatched performance for interacting with live Azure Data Lake Storage data. When you issue complex SQL queries to Azure Data Lake Storage, the driver pushes supported SQL operations, like filters and aggregations, directly to Azure Data Lake Storage and utilizes the embedded SQL engine to process unsupported operations client-side (often SQL functions and JOIN operations). Its built-in dynamic metadata querying allows you to work with and analyze Azure Data Lake Storage data using native data types.
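
For a quick way to exercise the driver and its query pushdown outside of NiFi, the minimal JDBC sketch below connects and runs a filtered query. The connection properties are placeholders, and the Resources table with its Type column is an illustrative assumption; check the driver's documentation for the tables your connection actually exposes.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class AdlsJdbcSmokeTest {
        public static void main(String[] args) throws Exception {
            // Placeholder Gen 2 connection string; substitute your own account details.
            String url = "jdbc:adls:Schema=ADLSGen2;Account=myAccount;"
                    + "FileSystem=myFileSystem;AccessKey=myAccessKey;InitiateOAuth=GETANDREFRESH";
            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement();
                 // A WHERE filter like this is the kind of operation the driver can
                 // push down to Azure Data Lake Storage. "Resources" and "Type" are
                 // assumed names used for illustration only.
                 ResultSet rs = stmt.executeQuery(
                         "SELECT * FROM Resources WHERE Type = 'FILE'")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }

Compile and run with the driver JAR on the classpath (for example, java -cp .;cdata.jdbc.adls.jar AdlsJdbcSmokeTest on Windows).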

Connecting to Azure Data Lake Storage Data in Apache NiFi

  1. Download the CData JDBC Driver for Azure Data Lake Storage installer, unzip the package, and run the .exe file to install the driver.
  2. Copy the CData JDBC Driver JAR file, cdata.jdbc.adls.jar (and the license file, cdata.jdbc.adls.lic, if it exists), to the Apache NiFi lib subfolder, for example, C:\nifi-1.3.0-bin\nifi-1.3.0\lib.

    On Windows, the default location for the CData JDBC Driver is C:\Program Files\CData\CData JDBC Driver for Azure Data Lake Storage.

  3. Start Apache NiFi by running the run-nifi.bat file in the bin subfolder, for example, C:\nifi-1.3.0-bin\nifi-1.3.0\bin.

    Alternatively, use the command prompt to navigate to the bin directory and run the run-nifi.bat file, for example:

    cd C:\nifi-1.3.0-bin\nifi-1.3.0\bin
    .\run-nifi.bat
    
  4. Navigate to the Apache NiFi UI in your web browser; by default, this is https://localhost:8443/nifi.

    Note: If you are running an older version of Apache NiFi, access the UI at http://localhost:8080/nifi instead. Earlier versions served the UI over HTTP, while recent versions use HTTPS by default; HTTP runs on port 8080 and HTTPS on port 8443.

  5. When you open the Apache NiFi URL, you are prompted to enter a username and password to log in.

    To retrieve the generated login credentials, check the nifi-app.log file in the logs directory of your NiFi installation; the generated username and password are written there on first startup.

  6. Right-click on the NiFi Flow workspace and click "Controller Services".
  7. Click the "+" button to create a new controller service and select "DBCPConnectionPool" from the list.
  8. In the Controller Services section, locate the newly created "DBCPConnectionPool" and open its menu (the three vertical dots), then click Edit to configure the new connection.
  9. Fill in the properties:

    • Database Connection URL: jdbc:adls:Schema=ADLSGen2;Account=myAccount;FileSystem=myFileSystem;AccessKey=myAccessKey;InitiateOAuth=GETANDREFRESH
    • Database Driver Class Name: cdata.jdbc.adls.ADLSDriver
    • Database Driver Location(s): Path to the Apache NiFi lib folder where the driver JAR file is located.

    Built-in Connection String Designer

    For assistance in constructing the JDBC URL, use the connection string designer built into the Azure Data Lake Storage JDBC Driver. Either double-click the JAR file or execute the JAR file from the command-line.

    java -jar cdata.jdbc.adls.jar

    Fill in the connection properties and copy the connection string to the clipboard.

    Authenticating to a Gen 1 DataLakeStore Account

    Gen 1 uses OAuth 2.0 in Azure AD for authentication.

    For this, an Active Directory web application is required. You can create one as follows:

    1. Sign in to your Azure Account through the Azure Portal.
    2. Select "Azure Active Directory".
    3. Select "App registrations".
    4. Select "New application registration".
    5. Provide a name and URL for the application. Select Web app for the type of application you want to create.
    6. Select "Required permissions" and change the required permissions for this app. At a minimum, "Azure Data Lake" and "Windows Azure Service Management API" are required.
    7. Select "Key" and generate a new key. Add a description, a duration, and take note of the generated key. You won't be able to see it again.

    To authenticate against a Gen 1 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen1.
    • Account: Set this to the name of the account.
    • OAuthClientId: Set this to the application Id of the app you created.
    • OAuthClientSecret: Set this to the key generated for the app you created.
    • TenantId: Set this to the tenant Id. See the driver's help documentation for more information on how to acquire this.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
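
    Assembled into a JDBC URL, these Gen 1 properties might look like the following; all values are placeholders, and InitiateOAuth=GETANDREFRESH is included on the assumption that you want the driver to manage the OAuth token exchange, as in the Gen 2 URL above.

    jdbc:adls:Schema=ADLSGen1;Account=myAccount;OAuthClientId=myClientId;OAuthClientSecret=myClientSecret;TenantId=myTenantId;InitiateOAuth=GETANDREFRESH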

    Authenticating to a Gen 2 DataLakeStore Account

    To authenticate against a Gen 2 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen2.
    • Account: Set this to the name of the account.
    • FileSystem: Set this to the file system which will be used for this account.
    • AccessKey: Set this to the access key which will be used to authenticate the calls to the API. See the driver's help documentation for more information on how to acquire this.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
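
    For reference, a Gen 2 URL that also sets the optional Directory property might look like this (placeholder values):

    jdbc:adls:Schema=ADLSGen2;Account=myAccount;FileSystem=myFileSystem;AccessKey=myAccessKey;Directory=myFolder/mySubfolder;InitiateOAuth=GETANDREFRESH
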
  10. In the Controller Services section, locate the newly created DBCPConnectionPool and open its menu (the three vertical dots), then click Enable to enable the new connection.
  11. In the "Enable Controller Service" window, set Scope to "Service and referencing components"
  12. To establish a connection and execute a SELECT query, drag and drop the Processor icon from the toolbar onto the workspace.
  13. In the Add Processor dialog, select the "ExecuteSQL" processor and click "Add" to place it in the workspace.
  14. Double-click the added ExecuteSQL processor to open its configuration page.
  15. In the Properties section, fill in the required information: set Database Connection Pooling Service to the DBCPConnectionPool you created, and enter the SQL query you want executed in the "SQL select query" property (a sample query is shown after these steps).
  16. Go to the Relationships tab and choose how the component should proceed on success and on failure of the execution.
  17. Enable the ExecuteSQL component either by selecting it and clicking Enable in the Operate section, or by right-clicking it and selecting Enable.
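
As referenced in step 15, the statement below is the kind of query you might set as the SQL select query. The Resources table and its FullPath and Type columns are assumptions carried over from the earlier sketch; substitute a table exposed by your own connection.

    SELECT FullPath, Type FROM Resources WHERE Type = 'FILE'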

Your Azure Data Lake Storage data is now available for use in Apache NiFi. For example, you can use the DBCPConnectionPool as the source for a QueryDatabaseTable processor.

Download a free, 30-day trial of the CData JDBC Driver for Azure Data Lake Storage and start working with your live Azure Data Lake Storage data in Apache NiFi. Reach out to our Support Team if you have any questions.
