Query Kafka Data in DataGrip
Create a Data Source for Kafka in DataGrip and use SQL to query live Kafka data.
DataGrip is a database IDE that allows SQL developers to query, create, and manage databases. When paired with the CData JDBC Driver for Apache Kafka, DataGrip can work with live Kafka data. This article shows how to establish a connection to Kafka data in DataGrip and use the table editor to load Kafka data.
Create a New Driver Definition for Kafka
The steps below describe how to create a new Data Source in DataGrip for Kafka.
- In DataGrip, click File -> New -> Project and name the project.
- In the Database Explorer, click the plus icon (+) and select "Driver."
- In the Driver tab:
- Set Name to a user-friendly name (e.g. "CData Kafka Driver")
- Set Driver Files to the appropriate JAR file. To add the file, click the plus icon (+), select "Add Files," navigate to the "lib" folder in the driver's installation directory, and select the JAR file (e.g. cdata.jdbc.apachekafka.jar).
- Set Class to cdata.jdbc.apachekafka.ApacheKafkaDriver
- Additionally, in the Advanced tab you can change driver properties and other settings such as VM Options, VM environment, VM home path, and DBMS. In most cases, set the DBMS type to "Unknown" under Expert options to prevent DataGrip from issuing native SQL Server (Transact-SQL) queries, which can result in an invalid function error.
- Click "Apply" then "OK" to save the driver definition.
Configure a Connection to Kafka
- Once the driver is saved, click the plus icon (+), then "Data Source," then "CData Kafka Driver" to create a new Kafka Data Source.
- In the new window, configure the connection to Kafka with a JDBC URL.
Built-in Connection String Designer
For assistance in constructing the JDBC URL, use the connection string designer built into the Kafka JDBC Driver. Either double-click the JAR file or execute it from the command line:
java -jar cdata.jdbc.apachekafka.jar
Fill in the connection properties and copy the connection string to the clipboard.
Set BootstrapServers and the Topic properties to specify the address of your Apache Kafka server, as well as the topic you would like to interact with.
Authorization Mechanisms
- SASL Plain: Set the User and Password properties, and set AuthScheme to 'Plain'.
- SASL SSL: Set the User and Password properties, set AuthScheme to 'Scram', and set UseSSL to true.
- SSL: Set the SSLCert and SSLCertPassword properties, and set UseSSL to true.
- Kerberos: Set the User and Password properties, and set AuthScheme to 'Kerberos'.
You may be required to trust the server certificate. In such cases, specify the TrustStorePath and the TrustStorePassword if necessary.
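As an illustration, the four schemes above map to connection strings like the following. The server address, credentials, and certificate path are placeholders; substitute your own values:

```
SASL Plain: jdbc:apachekafka:BootstrapServers=localhost:9092;Topic=MyTopic;AuthScheme=Plain;User=admin;Password=pass;
SASL SSL:   jdbc:apachekafka:BootstrapServers=localhost:9092;Topic=MyTopic;AuthScheme=Scram;UseSSL=true;User=admin;Password=pass;
SSL:        jdbc:apachekafka:BootstrapServers=localhost:9092;Topic=MyTopic;UseSSL=true;SSLCert=C:\certs\client.pfx;SSLCertPassword=certpass;
Kerberos:   jdbc:apachekafka:BootstrapServers=localhost:9092;Topic=MyTopic;AuthScheme=Kerberos;User=admin;Password=pass;
```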
- Set URL to the connection string, e.g.,
jdbc:apachekafka:User=admin;Password=pass;BootstrapServers=localhost:9092;Topic=MyTopic;
- Click "Apply" and "OK" to save the connection string.
At this point, you will see the data source in the Data Explorer.
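The JDBC URL format used by the Data Source is a simple sequence of Name=Value pairs after the jdbc:apachekafka: prefix. As a minimal sketch, the hypothetical helper below assembles such a URL from individual properties (property names and values are the same placeholders used in this article); the resulting string is what DataGrip stores, and what you would pass to DriverManager.getConnection if using the driver from your own Java code with cdata.jdbc.apachekafka.jar on the classpath:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class KafkaUrlBuilder {

    // Assemble a CData Kafka JDBC URL from Name=Value connection properties.
    // A LinkedHashMap preserves insertion order, so the URL is deterministic.
    static String buildUrl(Map<String, String> props) {
        StringBuilder sb = new StringBuilder("jdbc:apachekafka:");
        for (Map.Entry<String, String> e : props.entrySet()) {
            sb.append(e.getKey()).append('=').append(e.getValue()).append(';');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> props = new LinkedHashMap<>();
        props.put("User", "admin");
        props.put("Password", "pass");
        props.put("BootstrapServers", "localhost:9092");
        props.put("Topic", "MyTopic");

        // Prints: jdbc:apachekafka:User=admin;Password=pass;BootstrapServers=localhost:9092;Topic=MyTopic;
        System.out.println(buildUrl(props));
    }
}
```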
Execute SQL Queries Against Kafka
To browse through the Kafka entities (available as tables) accessible through the JDBC Driver, expand the Data Source.
To execute queries, right-click on any table and select "New" -> "Query Console."
In the console, write the SQL query you wish to execute, for example:
SELECT Id, Column1 FROM SampleTable_1 WHERE Column2 = '100'
Download a free, 30-day trial of the CData JDBC Driver for Apache Kafka and start working with your live Kafka data in DataGrip. Reach out to our Support Team if you have any questions.