Connect to Azure Data Lake Storage Data as an External Source in Dremio
Use the CData JDBC Driver to connect to Azure Data Lake Storage as an External Source in Dremio.
The CData JDBC Driver for Azure Data Lake Storage implements JDBC standards and allows various applications, including Dremio, to work with live Azure Data Lake Storage data. Dremio is a data lakehouse platform designed to empower self-service, interactive analytics on the data lake. With the CData JDBC Driver, you can include live Azure Data Lake Storage data as part of your enterprise data lake. This article describes how to connect to Azure Data Lake Storage data from Dremio as an External Source.
The CData JDBC Driver enables high-speed access to live Azure Data Lake Storage data in Dremio. Once you install the driver and authenticate with Azure Data Lake Storage, you gain immediate access to Azure Data Lake Storage data within your data lake. By surfacing Azure Data Lake Storage data using native data types and handling complex filters, aggregations, and other operations automatically, the CData JDBC Driver provides seamless access to Azure Data Lake Storage data.
Build the ARP Connector
To use the CData JDBC Driver in Dremio, you need to build an Advanced Relational Pushdown (ARP) Connector. You can view the source code for the Connector on GitHub or download the ZIP file (GitHub.com) directly. Once you copy or extract the files, run the following command from the root directory of the connector (the directory containing the pom.xml file) to build the connector:
mvn clean install
Once the JAR file for the connector is built (in the target directory), you are ready to copy the ARP connector and JDBC Driver to your Dremio instance.
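To confirm the build succeeded, you can check the target directory for the plugin JAR; the exact file name depends on the Dremio version you built against, for example:
ls target/dremio-adls-plugin-20.0.0.jar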
Installing the Connector and JDBC Driver
Install the ARP Connector to %DREMIO_HOME%/jars/ and the JDBC Driver for Azure Data Lake Storage to %DREMIO_HOME%/jars/3rdparty. You can use commands similar to the following:
ARP Connector
docker cp PATH\TO\dremio-adls-plugin-20.0.0.jar dremio_image_name:/opt/dremio/jars/
JDBC Driver for Azure Data Lake Storage
docker cp PATH\TO\cdata.jdbc.adls.jar dremio_image_name:/opt/dremio/jars/3rdparty/
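After copying both files, restart Dremio so it loads the new connector and driver. If you are running Dremio in Docker as in the examples above, a command similar to the following should work (using the same container name as above):
docker restart dremio_image_name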
Connecting to Azure Data Lake Storage
Azure Data Lake Storage will now appear as an External Source option in Dremio. The ARP Connector you built uses a JDBC URL to connect to Azure Data Lake Storage data. The JDBC Driver has a built-in connection string designer that you can use (see below).
Built-in Connection String Designer
For assistance in constructing the JDBC URL, use the connection string designer built into the Azure Data Lake Storage JDBC Driver. Double-click the JAR file or execute it from the command line:
java -jar cdata.jdbc.adls.jar
Fill in the connection properties and copy the connection string to the clipboard.
Authenticating to a Gen 1 DataLakeStore Account
Gen 1 uses OAuth 2.0 in Azure AD for authentication. This requires an Azure Active Directory web application, which you can create in the Azure portal.
To authenticate against a Gen 1 DataLakeStore account, the following properties are required (an example connection string follows the list):
- Schema: Set this to ADLSGen1.
- Account: Set this to the name of the account.
- OAuthClientId: Set this to the application Id of the app you created.
- OAuthClientSecret: Set this to the key generated for the app you created.
- TenantId: Set this to the tenant Id. See the driver documentation for more information on how to acquire this.
- Directory: Set this to the path that will be used to store the replicated file. If not specified, the root directory will be used.
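For example, a Gen 1 connection string built from these properties looks similar to the following (all values are placeholders):
jdbc:adls:Schema=ADLSGen1;Account=myAccount;OAuthClientId=myClientId;OAuthClientSecret=myClientSecret;TenantId=myTenantId;InitiateOAuth=GETANDREFRESH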
Authenticating to a Gen 2 DataLakeStore Account
To authenticate against a Gen 2 DataLakeStore account, the following properties are required:
- Schema: Set this to ADLSGen2.
- Account: Set this to the name of the account.
- FileSystem: Set this to the file system which will be used for this account.
- AccessKey: Set this to the access key that will be used to authenticate calls to the API. See the driver documentation for more information on how to acquire this.
- Directory: Set this to the path that will be used to store the replicated file. If not specified, the root directory will be used.
NOTE: To use the JDBC Driver in Dremio, you will need a license (full or trial) and a Runtime Key (RTK). For more information on obtaining this license (or a trial), contact our sales team.
Add the Runtime Key (RTK) to the JDBC URL. You will end up with a JDBC URL similar to the following:
jdbc:adls:RTK=5246...;Schema=ADLSGen2;Account=myAccount;FileSystem=myFileSystem;AccessKey=myAccessKey;InitiateOAuth=GETANDREFRESH
Access Azure Data Lake Storage as an External Source
To add Azure Data Lake Storage as an External Source, click to add a new source and select ADLS. Copy the JDBC URL and paste it into the New ADLS Source wizard.
Save the connection and you are ready to query live Azure Data Lake Storage data in Dremio, easily incorporating Azure Data Lake Storage data into your data lake.
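As a quick test, you can run a query against the new source from the Dremio SQL editor. The source name and table below are placeholders; the tables you see depend on your Azure Data Lake Storage account and the objects the driver exposes:
SELECT * FROM ADLS."Resources" LIMIT 10;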
More Information & Free Trial
Using the CData JDBC Driver for Azure Data Lake Storage in Dremio, you can incorporate live Azure Data Lake Storage data into your data lake. Check out our CData JDBC Driver for Azure Data Lake Storage page for more information about connecting to Azure Data Lake Storage. Download a free, 30-day trial of the CData JDBC Driver for Azure Data Lake Storage and get started today.