Create SSAS Tabular Models from Databricks Data
How to build a SQL Server Analysis Services Tabular Model from Databricks data using CData drivers.
SQL Server Analysis Services (SSAS) is an analytical data engine used in decision support and business analytics. It provides enterprise-grade semantic data models for business reports and client applications, such as Power BI, Excel, Reporting Services reports, and other data visualization tools. When paired with the CData ODBC Driver for Databricks, you can create a tabular model from Databricks data for deeper and faster data analysis.
Create a Connection to Databricks Data
If you have not already done so, first specify connection properties in an ODBC DSN (data source name). This is the last step of the driver installation. You can use the Microsoft ODBC Data Source Administrator to create and configure ODBC DSNs.
To connect to a Databricks cluster, set the properties as described below.
Note: The needed values can be found in your Databricks instance by navigating to Clusters, selecting the desired cluster, and opening the JDBC/ODBC tab under Advanced Options.
- Server: Set to the Server Hostname of your Databricks cluster.
- HTTPPath: Set to the HTTP Path of your Databricks cluster.
- Token: Set to your personal access token (this value can be obtained by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab).
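With the DSN configured, you can optionally verify connectivity from outside Visual Studio before building the model. The following is a minimal Python sketch using pyodbc; the DSN name "CData Databricks Source" is only a placeholder for whatever name you gave your DSN.

```python
import pyodbc

# Connect through the ODBC DSN configured above.
# Replace the DSN name with the one you created.
conn = pyodbc.connect("DSN=CData Databricks Source")

# Run a trivial query to confirm the Server, HTTPPath, and Token values are valid.
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())

conn.close()
```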
About Databricks Data Integration
Accessing and integrating live data from Databricks has never been easier with CData. Customers rely on CData connectivity to:
- Access all versions of Databricks, from Runtime versions 9.1 - 13.X to the Pro and Classic editions of Databricks SQL.
- Leave Databricks in their preferred environment thanks to compatibility with any hosting solution.
- Securely authenticate in a variety of ways, including personal access tokens, Azure Service Principal, and Azure AD.
- Upload data to Databricks using the Databricks File System (DBFS), Azure Blob Storage, and AWS S3.
While many customers use CData's solutions to migrate data from different systems into their Databricks data lakehouse, several use our live connectivity solutions to federate access between their databases and Databricks. These customers use SQL Server Linked Servers or PolyBase to get live access to Databricks from within their existing RDBMSs.
Read more about common Databricks use-cases and how CData's solutions help solve data problems in our blog: What is Databricks Used For? 6 Use Cases.
Getting Started
Creating a Data Source for Databricks
Start by creating a new Analysis Services Tabular Project in Visual Studio. Next, create a Data Source for Databricks in the project.
- In the Tabular Model Explorer, right-click Data Sources and select "New Data Source"
- Select "ODBC" from the Other tab and click "Connect"
- Select the DSN you previously configured
- Choose "Default or Custom" as the authentication option and click "Connect"
Add Tables & Relationships
After creating the data source you are ready to import tables and define the relationships between the tables.
- Right-click the new data source, click "Import New Tables" and select the tables to import
- After importing the tables, right-click "Relationships" and click "Create Relationships"
- Select the table(s), then choose the foreign keys, cardinality, and filter direction
Create Measures
After importing the tables and defining the relationships, you are ready to create measures.
- Select the column in the table for which you wish to create a measure
- In the Extensions menu, click "Columns" -> "AutoSum" and select your aggregation method
Deploy the Model
Once you have created your measures, you are ready to deploy the model. Configure the target server and database by right-clicking the project in Solution Explorer and selecting "Properties." Configure the "Deployment Server" properties and click "OK."
After configuring the deployment server, open the "Build" menu and click "Deploy Solution." You now have a tabular model for Databricks data in your SSAS instance, ready to be analyzed, reported, and viewed. Get started with a free, 30-day trial of the CData ODBC Driver for Databricks.