How to Work with Lakebase Data in ETL Validator via JDBC
ETL Validator provides data movement and transformation capabilities for integrating data platforms across your organization. CData's JDBC driver seamlessly integrates with ETL Validator and extends its native connectivity to include Lakebase data.
This tutorial walks through the process of building a simple ETL Validator dataflow to extract Lakebase data and load it into an example data storage solution: SQL Server.
Add a new ETL Validator data source via CData
CData extends ETL Validator's data connectivity capabilities by providing the ability to add data sources that connect via CData's JDBC drivers. Connecting to Lakebase data simply requires creating a new data source in ETL Validator through CData's connectivity suite, as described below.
Log in to ETL Validator
Begin by logging into ETL Validator to view the application dashboard.
Click on Add a DataSource
CData extends the data source options within ETL Validator.
Click on CData
CData's connectivity is embedded within ETL Validator's data source options.
Configure the CData Driver Connection String
You will need a JDBC connection string to establish a connection to Lakebase in ETL Validator.
To connect to Databricks Lakebase, start by setting the following properties:
- DatabricksInstance: The Databricks instance or server hostname, provided in the format instance-abcdef12-3456-7890-abcd-abcdef123456.database.cloud.databricks.com.
- Server: The host name or IP address of the server hosting the Lakebase database.
- Port (optional): The port of the server hosting the Lakebase database, set to 5432 by default.
- Database (optional): The database to connect to after authenticating to the Lakebase Server, set to the authenticating user's default database by default.
OAuth Client Authentication
To authenticate using OAuth client credentials, you need to configure an OAuth client in your service principal. In short, you need to do the following (see the sample connection string below):
- Create and configure a new service principal
- Assign permissions to the service principal
- Create an OAuth secret for the service principal
For more information, refer to the Setting Up OAuthClient Authentication section in the Help documentation.
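Once the service principal is configured, the client credentials are typically supplied in the connection string. The sketch below is illustrative only: the AuthScheme value and the OAuthClientId and OAuthClientSecret property names follow CData's usual conventions and should be confirmed against the Help documentation, and all values are placeholders.

jdbc:lakebase:AuthScheme=OAuthClient;OAuthClientId=my-service-principal-id;OAuthClientSecret=my-oauth-secret;DatabricksInstance=instance-abcdef12-3456-7890-abcd-abcdef123456.database.cloud.databricks.com;Database=my_database;InitiateOAuth=GETANDREFRESH;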
OAuth PKCE Authentication
To authenticate using the OAuth code type with PKCE (Proof Key for Code Exchange), set the following properties:
- AuthScheme: OAuthPKCE.
- User: The authenticating user's user ID.
For more information, refer to the Help documentation.
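For example, a PKCE connection string might look like the following (the User, DatabricksInstance, and Database values are placeholders):

jdbc:lakebase:AuthScheme=OAuthPKCE;User=my_user_id;DatabricksInstance=instance-abcdef12-3456-7890-abcd-abcdef123456.database.cloud.databricks.com;Database=my_database;InitiateOAuth=GETANDREFRESH;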
Built-in Connection String Designer
For assistance in constructing the JDBC URL, use the connection string designer built into the Lakebase JDBC Driver. Either double-click the JAR file or execute it from the command line:
java -jar cdata.jdbc.lakebase.jar
A typical connection string looks like this:
jdbc:lakebase:DatabricksInstance=lakebase;Server=127.0.0.1;Port=5432;Database=my_database;InitiateOAuth=GETANDREFRESH;
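Before saving the connection in ETL Validator, you can sanity-check the connection string with a small standalone Java program. The sketch below assumes the driver class follows CData's usual cdata.jdbc.<product>.<Product>Driver naming (cdata.jdbc.lakebase.LakebaseDriver) and queries a placeholder table named my_table; add cdata.jdbc.lakebase.jar to the classpath when compiling and running.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class LakebaseConnectionTest {
    public static void main(String[] args) throws Exception {
        // Register the CData driver (class name assumed from CData's usual convention).
        Class.forName("cdata.jdbc.lakebase.LakebaseDriver");

        // The same connection string configured above.
        String url = "jdbc:lakebase:DatabricksInstance=lakebase;Server=127.0.0.1;"
                + "Port=5432;Database=my_database;InitiateOAuth=GETANDREFRESH;";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // my_table is a placeholder; substitute one of your own tables.
             ResultSet rs = stmt.executeQuery("SELECT * FROM my_table LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}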
Licensing the Driver
To ensure the JDBC driver is licensed appropriately, copy the license file to the appropriate location:
Copy the JDBC Driver for Lakebase (.jar) and its license (.lic) file from "C:\Program Files\CData\[product_name]\lib" to "C:\Datagaps\ETLValidator\Server\apache-tomcat\bin":
cdata.jdbc.lakebase.jar
cdata.jdbc.lakebase.lic
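For example, on Windows the files can be copied from a Command Prompt (assuming the default installation and ETL Validator paths shown above):

copy "C:\Program Files\CData\[product_name]\lib\cdata.jdbc.lakebase.jar" "C:\Datagaps\ETLValidator\Server\apache-tomcat\bin"
copy "C:\Program Files\CData\[product_name]\lib\cdata.jdbc.lakebase.lic" "C:\Datagaps\ETLValidator\Server\apache-tomcat\bin"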
Note: If you do not copy the .lic file with the jar, you will see a licensing error that indicates you do not have a valid license installed. This is true for both the trial and full versions.
Save the connection
Should you encounter any difficulties loading the CData JDBC driver class, contact the Datagaps team, and they will provide instructions on how to load the JAR file for the relevant driver.
Add SQL Server as a Target
This example will use SQL Server as a destination for Lakebase data, but any preferred destination can be used instead.
Go to DataSources and select MS_SQL_SERVER
This option is the default.
Fill in the necessary connection details and test the connection
The details depend on the specific target but may include a URL, authentication credentials, etc.
Create a Dataflow in ETL Validator
Open the Dataflows tab
Configured data flows will appear in this window.
Select Create Dataflow
Name your new dataflow and save it.
Open the Dataflow to view the Dataflow Diagram
The details of the data movement will be configured in this panel.
Drag & drop the JDBC as a source from the right side
Give the new source an appropriate name and save it.
Fill in the Query section of the new source
Select the table from the Schema option that reflects which data should be pulled from Lakebase.
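For example, to pull every row from a hypothetical table, the query can be as simple as the following (public.sample_table is a placeholder; use the schema and table that hold your Lakebase data):

SELECT * FROM public.sample_table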
View the expected results of your query
The anticipated outcome of the configured query is displayed in the Result tab.
Add the destination to the Dataflow
Select Switch to Diagram, then drag & drop the DB Sink as a target from the right side (under Sink options). Give the sink an appropriate name and save it.
Set the appropriate Schema for the destination
Choose the Schema and table that match the structure of the source table. For this example, the table on the target side was created to match the source so that data flows seamlessly. More advanced schema transformation operations are beyond the scope of this article.
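As an illustration, if the source table exposed an id and a name column, a matching target table could be created in SQL Server with DDL like the following (a hypothetical sketch; adjust the table name, columns, and types to your actual source schema):

CREATE TABLE dbo.sample_table (
    id INT,
    name NVARCHAR(255)
);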
Hit the RUN option to begin replication
Running the job will take some time.
View the finished Dataflow
Return to the diagram to see the finished job replicating Lakebase data to SQL Server.
Get Started Today
Download a free, 30-day trial of the CData JDBC Driver for Lakebase and start building Lakebase-connected applications with ETL Validator. Reach out to our Support Team if you have any questions.