Create Azure Data Lake Storage-Connected Visualizations in Klipfolio

Use CData Connect Cloud to connect to Azure Data Lake Storage from Klipfolio and build custom visualizations using live Azure Data Lake Storage data.

Klipfolio is an online dashboard platform for building real-time business dashboards for your team or your clients. When paired with CData Connect Cloud, you get instant, cloud-to-cloud access to Azure Data Lake Storage data for visualizations, reports, and more. This article shows how to create a virtual database for Azure Data Lake Storage in Connect Cloud and build visualizations from Azure Data Lake Storage data in Klipfolio.

CData Connect Cloud provides a pure MySQL, cloud-to-cloud interface for Azure Data Lake Storage, allowing you to build reports from live Azure Data Lake Storage data in Klipfolio — without replicating the data to a natively supported database. As you create visualizations, Klipfolio generates SQL queries to gather data. Using optimized data processing out of the box, CData Connect Cloud pushes all supported SQL operations (filters, JOINs, etc.) directly to Azure Data Lake Storage, leveraging server-side processing to return the requested Azure Data Lake Storage data quickly.
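For illustration, the short script below runs an aggregate query against the virtual database created later in this article. It is a minimal sketch, not the exact SQL Klipfolio generates: it assumes the mysql-connector-python package, and the host, database, credentials, and the Type and Length columns are placeholder values. Because the filter and grouping are pushed to Azure Data Lake Storage where supported, only the summarized rows travel back to the client.

    # Sketch: query live Azure Data Lake Storage data through the Connect Cloud
    # MySQL endpoint. Host, database, credentials, and the Type/Length columns
    # are placeholders.
    import mysql.connector

    conn = mysql.connector.connect(
        host="myinstance.cdatacloud.net",  # your Connect Cloud instance
        port=3306,
        database="ADLS1",                  # the virtual database for ADLS
        user="your_connect_cloud_username",
        password="your_connect_cloud_password",
    )
    cursor = conn.cursor()

    # The WHERE filter and GROUP BY are handed off to Azure Data Lake Storage
    # where supported, so only aggregated rows are returned to the client.
    cursor.execute(
        "SELECT Type, COUNT(*) AS ResourceCount "
        "FROM Resources "
        "WHERE Length > 0 "
        "GROUP BY Type"
    )
    for row in cursor.fetchall():
        print(row)

    cursor.close()
    conn.close()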

Create a Virtual MySQL Database for Azure Data Lake Storage Data

CData Connect Cloud uses a straightforward, point-and-click interface to connect to data sources and generate APIs.

  1. Log into Connect Cloud and click Databases.
  2. Select "Azure Data Lake Storage" from Available Data Sources.
  3. Enter the necessary authentication properties to connect to Azure Data Lake Storage (illustrative placeholder values for both account types are sketched after this list).

    Authenticating to a Gen 1 DataLakeStore Account

    Gen 1 uses OAuth 2.0 in Azure AD for authentication.

    For this, an Active Directory web application is required. You can create one as follows:

    1. Sign in to your Azure account through the Azure portal.
    2. Select "Azure Active Directory".
    3. Select "App registrations".
    4. Select "New application registration".
    5. Provide a name and URL for the application. Select Web app for the type of application you want to create.
    6. Select "Required permissions" and change the required permissions for this app. At a minimum, "Azure Data Lake" and "Windows Azure Service Management API" are required.
    7. Select "Key" and generate a new key. Add a description, a duration, and take note of the generated key. You won't be able to see it again.

    To authenticate against a Gen 1 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen1.
    • Account: Set this to the name of the account.
    • OAuthClientId: Set this to the application Id of the app you created.
    • OAuthClientSecret: Set this to the key generated for the app you created.
    • TenantId: Set this to the tenant Id of the Azure Active Directory tenant that hosts the app registration (available on the Azure Active Directory overview page in the Azure portal).
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.

    Authenticating to a Gen 2 DataLakeStore Account

    To authenticate against a Gen 2 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen2.
    • Account: Set this to the name of the account.
    • FileSystem: Set this to the file system which will be used for this account.
    • AccessKey: Set this to the access key used to authenticate calls to the API (found under Access keys for the storage account in the Azure portal).
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
  4. Click Test Database.
  5. Click Privileges -> Add and add the new user (or an existing user) with the appropriate permissions.
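For reference, the authentication properties entered in step 3 might be filled in along the lines of the sketch below. Every value is a placeholder, not a working credential; substitute the details for your own account.

    # Placeholder values only -- substitute your own account details.
    gen1_properties = {
        "Schema": "ADLSGen1",
        "Account": "myadlsgen1account",
        "OAuthClientId": "app-id-of-the-registered-application",
        "OAuthClientSecret": "key-generated-for-the-application",
        "TenantId": "azure-ad-tenant-id",
        "Directory": "/data",  # optional; the root directory is used if omitted
    }

    gen2_properties = {
        "Schema": "ADLSGen2",
        "Account": "mystorageaccount",
        "FileSystem": "myfilesystem",
        "AccessKey": "storage-account-access-key",
        "Directory": "/data",  # optional; the root directory is used if omitted
    }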

With the virtual database created, you are ready to connect to Azure Data Lake Storage data from Klipfolio.

Connect to Azure Data Lake Storage in Klipfolio

The steps below outline connecting to CData Connect Cloud from Klipfolio to create a new Azure Data Lake Storage data source.

  1. Open Klipfolio
  2. Go to Data Sources and add a new data source
  3. Search for and select MySQL as the Service
  4. Click "Create a custom MySQL data source"
  5. Configure the data source by setting the MySQL connection properties (these same settings can be tested outside of Klipfolio, as sketched after this list):
    • Host: your instance (e.g., myinstance.cdatacloud.net)
    • Port: 3306
    • Database: your database (e.g., ADLS1)
    • Driver: MySQL
    • Username: your Connect Cloud username
    • Password: your Connect Cloud password
    • SQL Query: any query to retrieve data (e.g., SELECT * FROM Resources)
    • Select the checkbox to "Include column headers"
    • Select the checkbox to "Use SSL/TLS"
  6. Click "Get data" to preview the Azure Data Lake Storage data before building a data model.

Build a Data Model

After retrieving the data, click the checkbox to "Model your data" and click "Continue." In the new window, configure your data model.

  1. Confirm that the model includes all columns you wish to work with
  2. Name your model
  3. (optional) Set the Description
  4. Set "Header in row" to 1
  5. Click the toggle to "Exclude data before row" and set the value to 2
  6. Click "Save and Exit"

Create a Klip

With the data modeled, we are ready to create a Klip (or visualization) of the data to be used in the Klipfolio platform for dashboards, reporting, and more.

  1. Click "Create a Klip"
  2. Drag in a component
  3. For the Series, select the column whose count or aggregation you wish to visualize
  4. For the X Axis, select the column you wish to group by
  5. In the Properties tab for the X Axis, click the checkbox to "Group repeating labels"
  6. Click "Save"
  7. Set the Name & Description and click Save

SQL Access to Azure Data Lake Storage Data from Cloud Applications

Now you have a Klip built from live Azure Data Lake Storage data. You can add it to a new dashboard, share it, and more. Easily create more data sources and new visualizations, produce reports, and more — all without replicating Azure Data Lake Storage data.

To get SQL data access to 200+ SaaS, Big Data, and NoSQL sources directly from your cloud applications, try CData Connect Cloud.