Integrating Gumloop with Databricks Data via CData Connect AI
Gumloop is a visual automation platform designed to create AI-powered workflows by combining triggers, AI nodes, APIs, and data connectors. By integrating Gumloop with CData Connect AI through the built-in MCP (Model Context Protocol) Server, workflows can seamlessly access and interact with live Databricks data.
The platform provides a low-code environment, making it easier to orchestrate complex processes without heavy development effort. Its flexibility allows integration across multiple business applications, enabling end-to-end automation with live data.
This article outlines the steps required to configure Databricks connectivity in Connect AI, register the MCP server in Gumloop, and build a workflow that queries Databricks data.
About Databricks Data Integration
Accessing and integrating live data from Databricks has never been easier with CData. Customers rely on CData connectivity to:
- Access all versions of Databricks from Runtime Versions 9.1 - 13.X to both the Pro and Classic Databricks SQL versions.
- Leave Databricks in their preferred environment thanks to compatibility with any hosting solution.
- Securely authenticate in a variety of ways, including personal access token, Azure Service Principal, and Azure AD.
- Upload data to Databricks using Databricks File System, Azure Blob Storage, and AWS S3 Storage.
While many customers are using CData's solutions to migrate data from different systems into their Databricks data lakehouse, several customers use our live connectivity solutions to federate connectivity between their databases and Databricks. These customers are using SQL Server Linked Servers or PolyBase to get live access to Databricks from within their existing RDBMS.
Read more about common Databricks use-cases and how CData's solutions help solve data problems in our blog: What is Databricks Used For? 6 Use Cases.
Getting Started
Step 1: Configure Databricks Connectivity for Gumloop
Connectivity to Databricks from Gumloop is made possible through CData Connect AI's Remote MCP Server. To interact with Databricks data from Gumloop, we start by creating and configuring a Databricks connection in CData Connect AI.
- Log into Connect AI, click Sources, and then click Add Connection
- Select "Databricks" from the Add Connection panel
- Enter the necessary authentication properties to connect to Databricks. To connect to a Databricks cluster, set the properties described below (an optional sketch for verifying these values outside of Connect AI appears after this list).
  Note: The required values can be found in your Databricks instance by navigating to Clusters, selecting the desired cluster, and opening the JDBC/ODBC tab under Advanced Options.
- Server: Set to the Server Hostname of your Databricks cluster.
- HTTPPath: Set to the HTTP Path of your Databricks cluster.
- Token: Set to your personal access token (this value can be obtained by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab).
- Click Save & Test
-
Navigate to the Permissions tab in the Add Databricks Connection page and update the User-based permissions.
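If you want to confirm the Server, HTTPPath, and Token values before (or after) entering them in Connect AI, the short sketch below connects to the cluster directly using the databricks-sql-connector Python package. The hostname, HTTP path, and token shown are placeholders, and the check itself is optional; Connect AI's Save & Test performs an equivalent validation.

```python
# Optional sanity check of the Databricks connection values used in Connect AI.
# Requires: pip install databricks-sql-connector
from databricks import sql

connection = sql.connect(
    server_hostname="your-workspace.cloud.databricks.com",              # Server
    http_path="sql/protocolv1/o/0000000000000000/0000-000000-abcdefgh", # HTTPPath
    access_token="dapiXXXXXXXXXXXXXXXXXXXXXXXX",                        # Token (personal access token)
)

with connection.cursor() as cursor:
    cursor.execute("SELECT 1")   # trivial query just to confirm connectivity
    print(cursor.fetchall())     # a single row containing 1 indicates success

connection.close()
```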
Add a Personal Access Token
A Personal Access Token (PAT) is used to authenticate the connection to Connect AI from Gumloop. It is best practice to create a separate PAT for each service to maintain granularity of access.
- Click the gear icon at the top right of the Connect AI app to open the Settings page.
- On the Settings page, go to the Access Tokens section and click Create PAT.
- Give the PAT a name and click Create.
- The personal access token is only visible at creation, so be sure to copy it and store it securely for future use.
With the Databricks connection configured and a PAT generated, Gumloop is prepared to connect to Databricks data through the CData MCP server.
Step 2: Connect to the MCP server in Gumloop
The MCP server endpoint and authentication values from Connect AI must be added as a credential in Gumloop.
- Sign in to Gumloop (or create an account if you do not already have one)
- Visit the Gumloop Credentials page to configure the MCP server
- Click Add Credentials, then search for and select MCP Server
- Provide the following details:
- URL: https://mcp.cloud.cdata.com/mcp
- Label: A descriptive name such as Databricks-mcp-server
- Access Token / API Key: Leave blank
- Additional Header: Authorization: Basic YOUR_EMAIL:YOUR_PAT (your Connect AI login email and the PAT created in Step 1; a sketch for sanity-checking this header appears after this list)
- Save the credentials
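With the credentials saved, Gumloop handles all MCP communication for you. If you want an independent check that the endpoint and Authorization header are accepted, the rough sketch below sends a single request to the MCP URL. The environment variable names (CDATA_EMAIL, CDATA_PAT) are placeholders, the header follows the format described in the list above, and a real MCP client performs a full initialize handshake rather than one ad-hoc call, so treat any non-401 response as a sign that the credentials were at least not rejected.

```python
# Rough, optional check that the Connect AI MCP endpoint accepts the credentials.
# Assumes the login email and PAT are exported as CDATA_EMAIL and CDATA_PAT.
import os
import requests

MCP_URL = "https://mcp.cloud.cdata.com/mcp"

email = os.environ["CDATA_EMAIL"]   # Connect AI login email
pat = os.environ["CDATA_PAT"]       # PAT created in Step 1

headers = {
    # Header format as described in the credential list above
    "Authorization": f"Basic {email}:{pat}",
    "Content-Type": "application/json",
    # Streamable-HTTP MCP servers generally expect both content types here
    "Accept": "application/json, text/event-stream",
}

# Minimal JSON-RPC message; a proper MCP client would send initialize first,
# so this only verifies that the endpoint is reachable and auth is not refused.
payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

response = requests.post(MCP_URL, headers=headers, json=payload, timeout=30)
print(response.status_code)  # 401 would indicate rejected credentials
print(response.text[:500])
```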
The MCP server is now available to build workflows in Gumloop.
Step 3: Build a workflow and explore live Databricks data with Gumloop
- Visit your Gumloop Personal workspace and click Create Flow
- Select the add (+) icon or press Ctrl + B to add a node or a subflow
- Search for Ask AI and select it
- Click Show More Options and enable the Connect MCP Server? option
- From the MCP Servers dropdown, choose the saved MCP credential
- Add a Prompt and Choose an AI Model according to your requirements
- After configuring the required details, click Run to run the pipeline
With the workflow run completed, Gumloop demonstrates successful retrieval of Databricks data through the CData Connect AI MCP server, with the MCP Client node providing the ability to ask questions, retrieve records, and perform actions on the data.
Get CData Connect AI
To get live data access to 300+ SaaS, Big Data, and NoSQL sources directly from your cloud applications, try CData Connect AI today!