Build AI/ML Models with Live Databricks Data using Dataiku
Connect Databricks Data with Dataiku using the CData JDBC Driver for Databricks.
Dataiku is a data science and machine learning platform used for data preparation, analysis, visualization, and AI/ML model deployment, enabling collaborative and efficient data-driven decision-making. When paired with the CData JDBC Driver for Databricks, Dataiku enhances data integration, preparation, real-time analysis, and reliable model deployment for Databricks data.
With built-in optimized data processing, the CData JDBC Driver offers unmatched performance for interacting with live Databricks data. When you issue complex SQL queries to Databricks, the driver pushes supported SQL operations, like filters and aggregations, directly to Databricks and utilizes the embedded SQL engine to process unsupported operations client-side (often SQL functions and JOIN operations). Its built-in dynamic metadata querying allows you to work with and analyze Databricks data using native data types.
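For instance, against a hypothetical Orders table, a filtered aggregation like SELECT Country, SUM(Amount) FROM Orders WHERE Status = 'Shipped' GROUP BY Country would typically be pushed down to Databricks in its entirety, while a query that relies on an unsupported SQL function or a JOIN would be completed client-side by the embedded SQL engine.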
This article shows how to integrate with Databricks using the CData JDBC Driver for Databricks in the Dataiku DSS (Data Science Studio) platform, allowing you to prepare the data and build custom AI/ML models.
About Databricks Data Integration
Accessing and integrating live data from Databricks has never been easier with CData. Customers rely on CData connectivity to:
- Access all versions of Databricks, from Runtime Versions 9.1 - 13.X to both the Pro and Classic Databricks SQL versions.
- Leave Databricks in their preferred environment thanks to compatibility with any hosting solution.
- Securely authenticate in a variety of ways, including personal access token, Azure Service Principal, and Azure AD.
- Upload data to Databricks using Databricks File System, Azure Blob Storage, and AWS S3 Storage.
While many customers use CData's solutions to migrate data from different systems into their Databricks data lakehouse, several use our live connectivity solutions to federate connectivity between their databases and Databricks. These customers use SQL Server Linked Servers or PolyBase to get live access to Databricks from within their existing RDBMS.
Read more about common Databricks use-cases and how CData's solutions help solve data problems in our blog: What is Databricks Used For? 6 Use Cases.
Getting Started
Preparing the Dataiku DSS environment
In this section, we will walk through setting up Dataiku to work with Databricks data. Be sure to install Dataiku DSS (the on-premises version) for your operating system beforehand.
Install the CData JDBC Driver for Databricks
First, install the CData JDBC Driver for Databricks on the same machine as Dataiku. The JDBC Driver will be installed in the following path:
C:\Program Files\CData[product_name] 20xx\lib\cdata.jdbc.databricks.jar
Connecting the JDBC Driver in Dataiku DSS
To use the CData JDBC driver in Dataiku, you must create a new SQL database connection and add the JDBC Driver JAR file in the DSS connection settings.
- Log into the Dataiku DSS platform. It should open locally in your browser (e.g., localhost:11200).
- Click the Navigate to other sections of Dataiku menu in the top-right corner of the platform and select Administration.
- Select the Connections tab.
- In Connections, click the New Connections button.
- Now, scroll down and select Other SQL databases.
Generate a JDBC URL for connecting to Databricks, beginning with jdbc:databricks: followed by a series of semicolon-separated connection string properties.
To connect to a Databricks cluster, set the properties as described below.
Note: The needed values can be found in your Databricks instance by navigating to Clusters, selecting the desired cluster, and opening the JDBC/ODBC tab under Advanced Options.
- Server: Set to the Server Hostname of your Databricks cluster.
- HTTPPath: Set to the HTTP Path of your Databricks cluster.
- Token: Set to your personal access token (this value can be obtained by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab).
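As a simple illustration, a token-based JDBC URL built from these three properties might look like the following (the server hostname, HTTP path, and token shown are placeholders):
jdbc:databricks:Server=MyServerHostname;HTTPPath=MyHTTPPath;Token=MyToken;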
Built-in Connection String Designer
For assistance in constructing the JDBC URL, use the connection string designer built into the Databricks JDBC Driver. Either double-click the JAR file or execute it from the command line.
java -jar cdata.jdbc.databricks.jar
Fill in the connection properties and copy the connection string to the clipboard.
A typical JDBC URL is given below:
jdbc:databricks:Server=127.0.0.1;Port=443;TransportMode=HTTP;HTTPPath=MyHTTPPath;UseSSL=True;User=MyUser;Password=MyPassword;
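If you want to verify the URL before configuring Dataiku, a quick JDBC test outside of DSS can help. The snippet below is a minimal sketch, assuming cdata.jdbc.databricks.jar is on the classpath; the URL values and the Customers table name are placeholders to replace with your own.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DatabricksConnectionTest {
    public static void main(String[] args) throws Exception {
        // Driver class name, as used later in the Dataiku connection settings
        Class.forName("cdata.jdbc.databricks.DatabricksDriver");

        // Replace with the JDBC URL generated by the connection string designer
        String url = "jdbc:databricks:Server=MyServerHostname;HTTPPath=MyHTTPPath;Token=MyToken;";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // Customers is a hypothetical table; use any table visible to your cluster
             ResultSet rs = stmt.executeQuery("SELECT * FROM Customers")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

Compile and run the class with the driver JAR on the classpath; if rows print, the same URL can be used in the Dataiku connection settings below.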
- On the New SQL database (JDBC) connection screen, enter a name in the New connection name field and specify the basic parameters:
- JDBC Driver Class: cdata.jdbc.databricks.DatabricksDriver
- JDBC URL: JDBC connection URL obtained in the previous step
- Driver jars directory: the folder path where the JAR file is installed on your system
Next, select the SQL dialect of your choice (here, we have selected 'SQL Server'). Click Create. If the connection is successful, a 'Connection OK' prompt is displayed.
- The Data Catalog window will appear. Select the desired connection, catalog, and schema from the Connection to browse, Restrict to catalog, and Restrict to schema dropdowns, then click on List Tables. The Dataiku platform will list all the required tables.
- Select any table from the list and click Preview to view the table data. Click Close to exit the window.
Creating a new project
To prepare data flows, create dashboards, analyze the Databricks data, and build AI and ML models in the Dataiku DSS platform, you need to first create a new project.
- Select Projects from the Navigate to other sections of Dataiku menu.
- In the Projects screen, click New Project and select + Blank Project.
- In the New Project window, assign a Name and Project Key. Click Create. The new project dashboard opens up.
- Select Notebooks from the menu at the top of the project screen.
- Click the + Create Your First Notebook dropdown menu and select the Write your own option.
- In the New Notebook window, select SQL.
- Now, select the required connection from the Connection dropdown and enter a name in the Notebook Name field.
Testing the connection
To test the Databricks connection and analyze the Databricks data, write a query in the query editor and click Run. The queried or filtered Databricks data will then appear on the screen.
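For example, a simple query such as SELECT * FROM Customers (where Customers is a placeholder for any table listed earlier in the Data Catalog) returns live Databricks rows directly in the notebook, confirming that the connection works end to end.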
Get Started Today
Download a free, 30-day trial of the CData JDBC Driver for Databricks to integrate with Dataiku, and effortlessly build custom AI/ML models from Databricks data.
Reach out to our Support Team if you have any questions.