Model, Search, and Visualize Live Azure Data Lake Storage Data in ThoughtSpot
Use CData Connect Cloud to connect to live Azure Data Lake Storage data for modeling, searching, and visualizing.
ThoughtSpot is a cloud-based analytics platform that uses artificial intelligence (AI) and natural language processing (NLP) to help users analyze data and make decisions. When paired with CData Connect Cloud, you get instant, cloud-to-cloud access to Azure Data Lake Storage data for visualizations, dashboards, and more. This article shows how to connect to Azure Data Lake Storage and build visualizations from Azure Data Lake Storage data in ThoughtSpot.
CData Connect Cloud provides a pure SQL Server, cloud-to-cloud interface for Azure Data Lake Storage, allowing you to easily build models and visualizations from live Azure Data Lake Storage data in ThoughtSpot. As you build visualizations, ThoughtSpot generates SQL queries to gather data. Using optimized data processing out of the box, CData Connect Cloud pushes all supported SQL operations (filters, JOINs, etc.) directly to Azure Data Lake Storage, leveraging server-side processing to quickly return Azure Data Lake Storage data.
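As a purely illustrative example, the Python snippet below sketches the kind of filtered query ThoughtSpot might issue against the Connect Cloud virtual database. The table and column names are hypothetical placeholders, not a documented schema; the point is that the filter travels to Azure Data Lake Storage instead of being applied after a full data transfer.

```python
# A minimal sketch of a query ThoughtSpot could generate against the
# Connect Cloud virtual database. Table and column names are hypothetical.
sample_query = """
    SELECT Name, LastModified
    FROM Resources                        -- placeholder table name
    WHERE LastModified >= '2024-01-01'    -- filter pushed down to Azure Data Lake Storage
"""
# Connect Cloud executes supported operations (filters, JOINs, etc.) server-side,
# so only the matching rows are returned to ThoughtSpot.
```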
Configure Azure Data Lake Storage Connectivity for ThoughtSpot
Connectivity to Azure Data Lake Storage from ThoughtSpot is made possible through CData Connect Cloud. To work with Azure Data Lake Storage data from ThoughtSpot, we start by creating and configuring an Azure Data Lake Storage connection.
- Log into Connect Cloud, click Connections and click Add Connection
- Select "Azure Data Lake Storage" from the Add Connection panel
- Enter the necessary authentication properties to connect to Azure Data Lake Storage.
Authenticating to a Gen 1 DataLakeStore Account
Gen 1 uses OAuth 2.0 in Azure AD for authentication.
This requires an Active Directory web application, which you can create in the Azure portal under App registrations.
To authenticate against a Gen 1 DataLakeStore account, the following properties are required:
- Schema: Set this to ADLSGen1.
- Account: Set this to the name of the account.
- OAuthClientId: Set this to the application Id of the app you created.
- OAuthClientSecret: Set this to the key generated for the app you created.
- TenantId: Set this to the Id of the Azure AD tenant (directory) in which the application is registered.
- Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
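For reference, a Gen 1 configuration might use values like the sketch below. These are hypothetical placeholders for the Add Connection form fields, not code that is run against Connect Cloud.

```python
# Hypothetical Gen 1 (ADLSGen1) connection properties, collected here only as
# a compact reference for the form fields above. Replace every placeholder.
adls_gen1_properties = {
    "Schema": "ADLSGen1",
    "Account": "my-datalake-account",                          # Gen 1 account name
    "OAuthClientId": "00000000-0000-0000-0000-000000000000",   # Azure AD application (client) Id
    "OAuthClientSecret": "<client-secret>",                    # key generated for the app
    "TenantId": "11111111-1111-1111-1111-111111111111",        # Azure AD tenant Id
    "Directory": "/analytics/",                                # optional; defaults to the root directory
}
```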
Authenticating to a Gen 2 DataLakeStore Account
To authenticate against a Gen 2 DataLakeStore account, the following properties are required:
- Schema: Set this to ADLSGen2.
- Account: Set this to the name of the account.
- FileSystem: Set this to the file system which will be used for this account.
- AccessKey: Set this to the access key used to authenticate calls to the API. You can find it under the storage account's Access keys settings in the Azure portal.
- Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
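Likewise, a Gen 2 configuration might look like the following sketch; again, these are placeholder values for the Add Connection form.

```python
# Hypothetical Gen 2 (ADLSGen2) connection properties, shown as a compact
# reference for the form fields above. Replace every placeholder.
adls_gen2_properties = {
    "Schema": "ADLSGen2",
    "Account": "mystorageaccount",        # storage account name
    "FileSystem": "my-file-system",       # ADLS Gen 2 file system (container)
    "AccessKey": "<storage-access-key>",  # from the storage account's Access keys settings
    "Directory": "/analytics/",           # optional; defaults to the root directory
}
```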
- Click Create & Test
- Navigate to the Permissions tab in the Add Azure Data Lake Storage Connection page and update the User-based permissions.
Add a Personal Access Token
If you are connecting from a service, application, platform, or framework that does not support OAuth authentication, you can create a Personal Access Token (PAT) to use for authentication. As a best practice, create a separate PAT for each service to maintain granular control over access.
- Click on your username at the top right of the Connect Cloud app and click User Profile.
- On the User Profile page, scroll down to the Personal Access Tokens section and click Create PAT.
- Give your PAT a name and click Create.
- The personal access token is only visible at creation, so be sure to copy it and store it securely for future use.
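Because the PAT acts as a password wherever the Virtual SQL Server endpoint is used, avoid hard-coding it in scripts. One common approach (a suggestion, not a Connect Cloud requirement) is to keep it in an environment variable and read it at runtime; the variable name below is arbitrary.

```python
import os

# Read the Connect Cloud PAT from an environment variable instead of hard-coding it.
# "CDATA_CONNECT_PAT" is an arbitrary name chosen for this example.
pat = os.environ.get("CDATA_CONNECT_PAT")
if pat is None:
    raise RuntimeError("Set the CDATA_CONNECT_PAT environment variable first.")
```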
With the connection configured, you are ready to connect to Azure Data Lake Storage data from ThoughtSpot.
Model, Search, and Visualize Live Azure Data Lake Storage Data in ThoughtSpot
To establish a connection from ThoughtSpot to the CData Connect Cloud Virtual SQL Server API, follow these steps.
- Log into ThoughtSpot
- On the top navigation bar, click Data.
- Click Create new > Connection.
- Name the connection and select "SQL Server" as the data warehouse.
- Click Continue on the top right.
- Enter the connection settings:
- Host: enter the Virtual SQL Server endpoint: tds.cdata.com
- Port: enter 14333
- Username: enter your CData Connect Cloud username, which is displayed in the top-right corner of the CData Connect Cloud interface.
- Password: enter the PAT you generated earlier on your User Profile page.
- Database: enter the Connection Name of the CData Connect Cloud data source you want to connect to (for example, ADLS1).
- Click Continue.
- After connecting successfully, you will be able to choose which tables to include.
- Click Create Connection.
After you successfully configure your connection, you can build models, search, and visualize your Azure Data Lake Storage data.
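If you want to verify the same credentials outside of ThoughtSpot, the sketch below opens a connection to the Virtual SQL Server endpoint from Python with pyodbc. It assumes the Microsoft "ODBC Driver 18 for SQL Server" is installed, that the connection is named ADLS1 as in the example above, that your username and PAT are stored in environment variables, and that the virtual database exposes the standard INFORMATION_SCHEMA metadata views; adjust these details to match your environment.

```python
import os

import pyodbc  # assumes the Microsoft ODBC Driver 18 for SQL Server is installed

# The same values ThoughtSpot uses: the Connect Cloud Virtual SQL Server endpoint,
# your Connect Cloud username, and a PAT as the password.
conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=tds.cdata.com,14333;"
    "DATABASE=ADLS1;"                              # the Connect Cloud connection name
    f"UID={os.environ['CDATA_CONNECT_USER']};"     # your Connect Cloud username
    f"PWD={os.environ['CDATA_CONNECT_PAT']};"      # the PAT created earlier
    "ENCRYPT=yes;"
)

conn = pyodbc.connect(conn_str, timeout=30)
cursor = conn.cursor()

# List the tables the connection exposes (assumes the standard SQL Server
# metadata views are available through the virtual database).
cursor.execute("SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES")
for (table_name,) in cursor.fetchall():
    print(table_name)

conn.close()
```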
Real-Time Access to Azure Data Lake Storage Data from Cloud Applications
At this point, you have a direct, cloud-to-cloud connection to live Azure Data Lake Storage data from ThoughtSpot, where you can model, search, and visualize it. For more information on gaining live access to more than 100 SaaS, Big Data, and NoSQL sources from cloud applications like ThoughtSpot, refer to our Connect Cloud page.