Bridge Kafka Connectivity with Apache NiFi

Access and process Kafka data in Apache NiFi using the CData JDBC Driver.

Apache NiFi supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic. When paired with the CData JDBC Driver for Kafka, NiFi can work with live Kafka data. This article describes how to connect to and query Kafka data from an Apache NiFi Flow.

With built-in optimized data processing, the CData JDBC Driver offers unmatched performance for interacting with live Kafka data. When you issue complex SQL queries to Kafka, the driver pushes supported SQL operations, like filters and aggregations, directly to Kafka and utilizes the embedded SQL engine to process unsupported operations client-side (often SQL functions and JOIN operations). Its built-in dynamic metadata querying allows you to work with and analyze Kafka data using native data types.
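
Because the driver speaks standard JDBC, this pushdown behavior can be exercised from a plain Java program before any NiFi flow exists. The sketch below is a minimal, hypothetical example that assumes the driver JAR is on the classpath; the server address, credentials, topic name, and column name are placeholders for your own environment.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class KafkaPushdownExample {
        public static void main(String[] args) throws Exception {
            // Driver class shipped in cdata.jdbc.apachekafka.jar
            Class.forName("cdata.jdbc.apachekafka.ApacheKafkaDriver");

            // Same URL format used throughout this article; values are placeholders
            String url = "jdbc:apachekafka:User=admin;Password=pass;"
                    + "BootStrapServers=https://localhost:9091;Topic=MyTopic;";

            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement();
                 // A simple filter like this can be pushed to Kafka by the driver;
                 // operations it cannot push down are evaluated client-side
                 ResultSet rs = stmt.executeQuery(
                         "SELECT * FROM MyTopic WHERE Column1 = 'value'")) {
                while (rs.next()) {
                    // Column1 is a placeholder for a column your topic exposes
                    System.out.println(rs.getString("Column1"));
                }
            }
        }
    }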

Connecting to Kafka Data in Apache NiFi

  1. Download the CData JDBC Driver for Kafka installer, unzip the package, and run the JAR file to install the driver.
  2. Copy the CData JDBC Driver JAR file, cdata.jdbc.apachekafka.jar (along with the license file, cdata.jdbc.apachekafka.lic, if it exists), to the Apache NiFi lib subfolder, for example, C:\nifi-1.3.0-bin\nifi-1.3.0\lib.

    On Windows, the default location for the CData JDBC Driver is C:\Program Files\CData\CData JDBC Driver for Kafka.

  3. Start Apache NiFi. For example:

    cd C:\nifi-1.3.0-bin\nifi-1.3.0\bin
    run-nifi.bat
    
  4. Navigate to the Apache NiFi UI in your web browser: typically http://localhost:8080/nifi
  5. Click the Configuration (gear) button in the Operate sidebar.
  6. In the NiFi Flow Configuration page, navigate to the Controller Services tab.
  7. Click the + button to create a new controller service.
  8. In the Add Controller Service page, select DBCPConnectionPool, and then click Add.
  9. Click the gear button to configure the new DBCPConnectionPool.
  10. In the Configure Controller Service page, navigate to the Properties tab. Fill in the properties:

    Property                     Value
    Database Connection URL      jdbc:apachekafka:User=admin;Password=pass;BootStrapServers=https://localhost:9091;Topic=MyTopic;
    Database Driver Class Name   cdata.jdbc.apachekafka.ApacheKafkaDriver

    Built-in Connection String Designer

    For assistance in constructing the JDBC URL, use the connection string designer built into the Kafka JDBC Driver. Either double-click the JAR file or execute it from the command line.

    java -jar cdata.jdbc.apachekafka.jar

    Fill in the connection properties and copy the connection string to the clipboard.

    Set the BootstrapServers and Topic properties to specify the address of your Apache Kafka server and the topic you want to interact with.

    Authorization Mechanisms

    • SASL Plain: Set AuthScheme to 'Plain' and specify the User and Password properties.
    • SASL SSL: Set AuthScheme to 'Scram', set UseSSL to true, and specify the User and Password properties.
    • SSL: Set UseSSL to true and specify the SSLCert and SSLCertPassword properties.
    • Kerberos: Set AuthScheme to 'Kerberos' and specify the User and Password properties.

    You may also need to trust the server certificate; in that case, specify the TrustStorePath and TrustStorePassword properties. (A standalone check of these connection settings appears after this list.)

  11. Click the lightning bolt (Enable) button to enable the new DBCPConnectionPool.
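
Before building a flow on top of the pool, it can help to confirm the same connection settings outside NiFi. The following is a minimal sketch, assuming the driver JAR is on the classpath; it reuses the URL and driver class from step 10 (plus whichever authorization properties apply from the list above) and simply lists the tables the driver exposes, which is what a processor such as QueryDatabaseTable will select from.

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class VerifyKafkaConnection {
        public static void main(String[] args) throws Exception {
            // Same driver class and connection URL entered in the DBCPConnectionPool
            // properties above; credentials, server, and topic are placeholders
            Class.forName("cdata.jdbc.apachekafka.ApacheKafkaDriver");
            String url = "jdbc:apachekafka:User=admin;Password=pass;"
                    + "BootStrapServers=https://localhost:9091;Topic=MyTopic;";

            try (Connection conn = DriverManager.getConnection(url)) {
                // List the tables the driver exposes for this topic
                DatabaseMetaData md = conn.getMetaData();
                try (ResultSet tables = md.getTables(null, null, "%", new String[] {"TABLE"})) {
                    while (tables.next()) {
                        System.out.println(tables.getString("TABLE_NAME"));
                    }
                }
            }
        }
    }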

Your Kafka data is now available for use in Apache NiFi. For example, you can use the DBCPConnectionPool as the source for a QueryDatabaseTable processor.

Download a free, 30-day trial of the CData JDBC Driver for Kafka and start working with your live Kafka data in Apache NiFi. Reach out to our Support Team if you have any questions.