Migrating data from Azure Data Lake Storage to Snowflake using CData SSIS Components.



Easily push Azure Data Lake Storage data to Snowflake using the CData SSIS Tasks for Azure Data Lake Storage and Snowflake.

Snowflake is a leading cloud data warehouse and a popular backbone for enterprise BI, analytics, data management, and governance initiatives. Snowflake offers features such as data sharing, real-time data processing, and secure data storage, which make it a common choice for cloud data consolidation.

The CData SSIS Components enhance SQL Server Integration Services by enabling users to easily import and export data across a wide variety of sources and destinations.

In this article, we explore the data type mapping considerations when exporting to Snowflake and walk through how to migrate Azure Data Lake Storage data to Snowflake using the CData SSIS Components for Azure Data Lake Storage and Snowflake.

Data Type Mapping

Snowflake Schema                                                             CData Schema
NUMBER, DECIMAL, NUMERIC, INT, INTEGER, BIGINT, SMALLINT, TINYINT, BYTEINT   decimal
DOUBLE, FLOAT, FLOAT4, FLOAT8, DOUBLE PRECISION, REAL                        real
VARCHAR, CHAR, STRING, TEXT, VARIANT, OBJECT, ARRAY, GEOGRAPHY               varchar
BINARY, VARBINARY                                                            binary
BOOLEAN                                                                      bool
DATE                                                                         date
DATETIME, TIMESTAMP, TIMESTAMP_LTZ, TIMESTAMP_NTZ, TIMESTAMP_TZ              datetime
TIME                                                                         time
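
To make the mapping concrete, the hypothetical Snowflake table below pairs a column of each Snowflake type family from the table above with the CData schema type the components expose for it (shown in the comments). The table and column names are invented for illustration only.

    CREATE TABLE ADLS_FILES (
        FILE_ID      NUMBER,          -- exposed as decimal
        FILE_SIZE    DOUBLE,          -- exposed as real
        FILE_NAME    VARCHAR,         -- exposed as varchar
        CONTENT_HASH BINARY,          -- exposed as binary
        IS_DIRECTORY BOOLEAN,         -- exposed as bool
        CREATED_DATE DATE,            -- exposed as date
        MODIFIED_AT  TIMESTAMP_NTZ,   -- exposed as datetime
        UPLOAD_TIME  TIME             -- exposed as time
    );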

Special Considerations

  • Casing: Snowflake enforces an exact case match for identifiers by default, so it is common to run into issues that can be attributed to mismatched casing. Set the IgnoreCase property to True in your CData SSIS Components for Snowflake connection to resolve these issues. This property maps directly to the QUOTED_IDENTIFIERS_IGNORE_CASE session parameter in Snowflake, which controls whether Snowflake treats quoted identifiers as case-sensitive (see the SQL sketch after this list).
  • Timestamps: Snowflake supports three timestamp types:

    • TIMESTAMP_NTZ: This timestamp stores "wallclock" time with a specified precision. All operations are performed without taking any time zone into account.
    • TIMESTAMP_LTZ: This timestamp stores UTC time with a specified precision. However, all operations are performed in the current session's time zone, controlled by the TIMEZONE session parameter.
    • TIMESTAMP_TZ: This timestamp stores UTC time together with an associated time zone offset. When a time zone isn't provided, the session time zone offset is used.

    Unless configured otherwise, the CData SSIS Components write timestamps to Snowflake as TIMESTAMP_NTZ.
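
The sketch below, in plain Snowflake SQL with a hypothetical table name, shows the session parameters and timestamp types these considerations refer to. It is illustrative only; the SSIS components do not require you to run it.

    -- Session-level equivalent of the IgnoreCase connection property:
    -- treat quoted identifiers as case-insensitive.
    ALTER SESSION SET QUOTED_IDENTIFIERS_IGNORE_CASE = TRUE;

    -- The session time zone used for TIMESTAMP_LTZ operations and as the
    -- default offset for TIMESTAMP_TZ values written without a time zone.
    ALTER SESSION SET TIMEZONE = 'America/Los_Angeles';

    -- Hypothetical table with one column of each timestamp variant.
    CREATE TABLE TIMESTAMP_DEMO (
        TS_NTZ TIMESTAMP_NTZ,  -- "wallclock" time, no time zone handling
        TS_LTZ TIMESTAMP_LTZ,  -- stored as UTC, operated on in the session time zone
        TS_TZ  TIMESTAMP_TZ    -- stored as UTC plus a time zone offset
    );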

Prerequisites

  • Visual Studio with SQL Server Integration Services (SSIS) project support (SQL Server Data Tools or the SQL Server Integration Services Projects extension).
  • The CData SSIS Components for Azure Data Lake Storage and the CData SSIS Components for Snowflake, installed on the machine where you build and run the package.

Create the project and add components

  1. Open Visual Studio and create a new Integration Services Project.
  2. Add a new Data Flow Task to the Control Flow screen and open the Data Flow Task.
  3. Add a CData Azure Data Lake Storage Source control and a CData Snowflake Destination control to the data flow task.

Configure the Azure Data Lake Storage source

Follow the steps below to specify properties required to connect to Azure Data Lake Storage.

  1. Double-click the CData Azure Data Lake Storage Source to open the source component editor and add a new connection.
  2. In the CData Azure Data Lake Storage Connection Manager, configure the connection properties, then test and save the connection.

    Authenticating to a Gen 1 DataLakeStore Account

    Gen 1 uses OAuth 2.0 in Entra ID (formerly Azure AD) for authentication.

    For this, an Active Directory web application is required. You can create one as follows:

    1. Sign in to your Azure account through the Azure portal.
    2. Select "Entra ID" (formerly Azure AD).
    3. Select "App registrations".
    4. Select "New application registration".
    5. Provide a name and URL for the application. Select Web app for the type of application you want to create.
    6. Select "Required permissions" and change the required permissions for this app. At a minimum, "Azure Data Lake" and "Windows Azure Service Management API" are required.
    7. Select "Key" and generate a new key. Add a description, a duration, and take note of the generated key. You won't be able to see it again.

    To authenticate against a Gen 1 DataLakeStore account, the following properties are required (see the illustrative property sketch after these steps):

    • Schema: Set this to ADLSGen1.
    • Account: Set this to the name of the account.
    • OAuthClientId: Set this to the application Id of the app you created.
    • OAuthClientSecret: Set this to the key generated for the app you created.
    • TenantId: Set this to the Id of the Entra ID (Azure AD) tenant in which the application is registered.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.

    Authenticating to a Gen 2 DataLakeStore Account

    To authenticate against a Gen 2 DataLakeStore account, the following properties are required (see the illustrative property sketch after these steps):

    • Schema: Set this to ADLSGen2.
    • Account: Set this to the name of the account.
    • FileSystem: Set this to the file system which will be used for this account.
    • AccessKey: Set this to an access key for the storage account, which will be used to authenticate the calls to the API. Access keys are available under Access keys for the storage account in the Azure portal.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
  3. After saving the connection, select "Table or view" and select the table or view to export into Snowflake, then close the CData Azure Data Lake Storage Source Editor.
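
For reference, the Gen 1 and Gen 2 property lists above might look like the following when written out in a generic Property=Value form (the semicolon-delimited layout is an assumption; the property names are the ones listed above, and every value is a placeholder). In this walkthrough you enter the same properties in the CData Azure Data Lake Storage Connection Manager UI instead.

    Gen 1 (placeholder values):
    Schema=ADLSGen1;Account=myadlsgen1account;OAuthClientId=00000000-0000-0000-0000-000000000000;OAuthClientSecret=myAppKey;TenantId=00000000-0000-0000-0000-000000000000;Directory=exports/;

    Gen 2 (placeholder values):
    Schema=ADLSGen2;Account=mystorageaccount;FileSystem=myfilesystem;AccessKey=myAccessKey;Directory=exports/;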

Configure the Snowflake destination

With the Azure Data Lake Storage Source configured, we can configure the Snowflake connection and map the columns.

  1. Double-click the CData Snowflake Destination to open the destination component editor and add a new connection.
  2. In the CData Snowflake Connection Manager, configure the connection properties, then test and save the connection.
    • The component supports Snowflake user authentication, federated authentication, and SSL client authentication. To authenticate, set User and Password, and select the authentication method in the AuthScheme property. Starting with accounts created using Snowflake’s bundle 2024_08 (October 2024), password-based authentication is no longer supported due to security concerns. Instead, use alternative authentication methods such as OAuth or Private Key authentication.

    Other helpful connection properties (see the illustrative property sketch after these steps)

    • QueryPassthrough: When this is set to True, queries are passed through directly to Snowflake.
    • ConvertDateTimetoGMT: When this is set to True, the components will convert date-time values to GMT, instead of the local time of the machine.
    • IgnoreCase: A session parameter that specifies whether Snowflake treats identifiers as case-sensitive. Default: false (identifiers are treated as case-sensitive).
    • BindingType: There are two binding types: DEFAULT and TEXT. DEFAULT uses the binding type DATE for the Date type, TIME for the Time type, and TIMESTAMP_* for the Timestamp_* types. TEXT uses the binding type TEXT for the Date, Time, and Timestamp_* types.
  3. After saving the connection, select a table in the Use a Table menu and in the Action menu, select Insert.
  4. On the Column Mappings tab, configure the mappings from the input columns to the destination columns.
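
As with the source, the Snowflake connection settings referenced in step 2 might look like the following in the same Property=Value form (the layout is an assumption, the property names are those discussed above, all values are placeholders, and account, warehouse, and database details are omitted). In this walkthrough you set these in the CData Snowflake Connection Manager UI instead.

    User=SNOWFLAKE_USER;Password=**********;AuthScheme=<your auth method>;QueryPassthrough=False;ConvertDateTimetoGMT=True;IgnoreCase=True;BindingType=DEFAULT;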

Run the project

You can now run the project. After the SSIS task has finished executing, data from Azure Data Lake Storage will be exported to the chosen table in Snowflake.

Ready to get started?

Download a free trial of the Azure Data Lake Storage SSIS Component to get started:

 Download Now

Learn more:

Azure Data Lake Storage SSIS Components

Powerful SSIS Source & Destination Components that allow you to easily connect SQL Server with Azure Data Lake Storage through SSIS Workflows.

Use the Azure Data Lake Storage Data Flow Components to synchronize with Azure Data Lake Storage data. Perfect for data synchronization, local back-ups, workflow automation, and more!