
Databricks ODBC Driver

The Databricks ODBC Driver is a powerful tool that allows you to connect with live data from Databricks, directly from any applications that support ODBC connectivity.

Access Databricks data like you would a database - read, write, and update through a standard ODBC Driver interface.

How to Access Live Databricks Data in Power Automate Desktop via ODBC



The CData ODBC Driver for Databricks enables you to integrate Databricks data into workflows built using Microsoft Power Automate Desktop.

With the driver, you can access live Databricks data in workflow automation tools like Power Automate. This article shows how to integrate Databricks data into a simple workflow that moves Databricks data into a CSV file.

Through optimized data processing, CData ODBC Drivers offer unmatched performance for interacting with live Databricks data in Microsoft Power Automate. When you issue complex SQL queries from Power Automate to Databricks, the driver pushes supported SQL operations, like filters and aggregations, directly to Databricks and utilizes the embedded SQL engine to process unsupported operations client-side (e.g. SQL functions and JOIN operations).
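For example, in a statement like the one below, the driver can push the WHERE filter and the GROUP BY aggregation down to Databricks so that only the summarized rows are returned to Power Automate (the column names here are purely illustrative):

  SELECT Country, COUNT(*) AS CustomerCount
  FROM Customers
  WHERE City = 'Raleigh'
  GROUP BY Country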

Connect to Databricks as an ODBC Data Source

If you have not already, first specify connection properties in an ODBC DSN (data source name). This is the last step of the driver installation. You can use the Microsoft ODBC Data Source Administrator to create and configure ODBC DSNs.

To connect to a Databricks cluster, set the properties as described below.

Note: The needed values can be found in your Databricks instance by navigating to Clusters, selecting the desired cluster, and selecting the JDBC/ODBC tab under Advanced Options.

  • Server: Set to the Server Hostname of your Databricks cluster.
  • HTTPPath: Set to the HTTP Path of your Databricks cluster.
  • Token: Set to your personal access token (this value can be obtained by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab).

When you configure the DSN, you may also want to set the Max Rows connection property. This will limit the number of rows returned, which is especially helpful for improving performance when designing workflows.
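As a rough illustration, the key properties of a configured DSN might look like the following (the hostname, path, and token below are placeholders, not working values):

  Server=dbc-a1b2c3d4-e5f6.cloud.databricks.com
  HTTPPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh
  Token=dapi0123456789abcdef0123456789abcdef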

Integrate Databricks Data into Power Automate Workflows

After configuring the DSN for Databricks, you are ready to integrate Databricks data into your Power Automate workflows. Open Microsoft Power Automate Desktop, add a new flow, and name the flow.

In the flow editor, you can add the actions to connect to Databricks, query Databricks using SQL, and write the query results to a CSV document.

Add an Open SQL Connection Action

Add an "Open SQL connection" action (Actions -> Database) and configure the properties.

  • Connection string: DSN=CData Databricks Source

After configuring the action, click Save.

Add an Execute SQL Statement Action

Add an "Execute SQL statement" action (Actions -> Database) and configure the properties.

  • Get connection by: SQL connection variable
  • SQL connection: %SQLConnection% (the variable from the "Open SQL connection" action above)
  • SQL statement: SELECT * FROM Customers

After configuring the action, click Save.
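SELECT * FROM Customers is a good first test, but the action accepts any SQL supported by the driver. A narrower statement like the one below (the column names are illustrative) keeps the result set, and the resulting CSV file, small:

  SELECT City, CompanyName, Country
  FROM Customers
  WHERE Country = 'US'
  ORDER BY CompanyName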

Add a Write to CSV File Action

Add a "Write to CSV file" action (Actions -> File) and configure the properties.

  • Variable to write to: %QueryResult% (the variable from the "Execute SQL statement" action above)
  • File path: set to a file on disk
  • Configure Advanced settings as needed.

After configuring the action, click Save.

Add a Close SQL Connection Action

Add a "Close SQL connection" action (Actions -> Database) and configure the properties.

  • SQL connection: %SQLConnection% (the variable from the "Open SQL connection" action above)

After configuring the action, click Save.

Save & Run the Flow

Once you have configured all the actions for the flow, click the disk icon to save the flow. Click the play icon to run the flow.

Now you have a workflow to move Databricks data into a CSV file.

With the CData ODBC Driver for Databricks, you get live connectivity to Databricks data within your Microsoft Power Automate workflows.

Related Power Automate Articles

This article walks through using the CData ODBC Driver for Databricks with Power Automate Desktop. Check out our other articles for more ways to work with Power Automate (Desktop & Online):