How to integrate SingleStore with Apache Airflow



Access and process SingleStore data in Apache Airflow using the CData JDBC Driver.

Apache Airflow supports the creation, scheduling, and monitoring of data engineering workflows. When paired with the CData JDBC Driver for SingleStore, Airflow can work with live SingleStore data. This article describes how to connect to and query SingleStore data from an Apache Airflow instance and store the results in a CSV file.

With built-in optimized data processing, the CData JDBC Driver offers unmatched performance for interacting with live SingleStore data. When you issue complex SQL queries to SingleStore, the driver pushes supported SQL operations, like filters and aggregations, directly to SingleStore and utilizes the embedded SQL engine to process unsupported operations client-side (often SQL functions and JOIN operations). Its built-in dynamic metadata querying allows you to work with and analyze SingleStore data using native data types.
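
To make the pushdown behavior concrete, the short sketch below issues a filtered aggregation through the driver from plain Python, outside of Airflow. This is a minimal illustration rather than part of the article's workflow: it assumes the third-party jaydebeapi package is installed, and the connection values, jar path, and the Account table's Name column are all placeholders.

    import jaydebeapi

    # Connect through the CData JDBC Driver (placeholder URL and jar path)
    conn = jaydebeapi.connect(
        "cdata.jdbc.singlestore.SingleStoreDriver",
        "jdbc:singlestore:Server=myServer;Port=3306;Database=NorthWind;User=myUser;Password=myPassword;",
        jars="/PATH/TO/cdata.jdbc.singlestore.jar")

    # The WHERE filter and COUNT aggregation are supported operations the
    # driver can push directly to SingleStore instead of evaluating client-side.
    curs = conn.cursor()
    curs.execute("SELECT COUNT(*) FROM Account WHERE Name LIKE 'A%'")
    print(curs.fetchall())
    curs.close()
    conn.close()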

Configuring the Connection to SingleStore

Built-in Connection String Designer

For assistance in constructing the JDBC URL, use the connection string designer built into the SingleStore JDBC Driver. Either double-click the JAR file or execute it from the command line:

java -jar cdata.jdbc.singlestore.jar

Fill in the connection properties and copy the connection string to the clipboard.

The following connection properties are required to connect to data:

  • Server: The host name or IP of the server hosting the SingleStore database.
  • Port: The port of the server hosting the SingleStore database.
  • Database (Optional): The default database to connect to when connecting to the SingleStore Server. If this is not set, tables from all databases will be returned.

Connect Using Standard Authentication

To authenticate using standard authentication, set the following:

  • User: The user account used to authenticate with the SingleStore server.
  • Password: The password used to authenticate with the SingleStore server.
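
For example, combining the required connection properties above with standard credentials yields a JDBC URL like the following (server, database, and credentials are placeholders):

    jdbc:singlestore:Server=myServer;Port=3306;Database=NorthWind;User=myUser;Password=myPassword;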

Connect Using Integrated Security

As an alternative to providing the standard username and password, you can set IntegratedSecurity to True to authenticate trusted users to the server via Windows Authentication.
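
With integrated security enabled, the credentials drop out of the connection string; a minimal sketch using the same placeholder server and database:

    jdbc:singlestore:Server=myServer;Port=3306;Database=NorthWind;IntegratedSecurity=True;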

Connect Using SSL Authentication

You can leverage SSL authentication to connect to SingleStore data via a secure session. Configure the following connection properties to connect to data:

  • SSLClientCert: Set this to the name of the certificate store for the client certificate. Used in the case of 2-way SSL, where truststore and keystore are kept on both the client and server machines.
  • SSLClientCertPassword: If a client certificate store is password-protected, set this value to the store's password.
  • SSLClientCertSubject: The subject of the TLS/SSL client certificate. Used to locate the certificate in the store.
  • SSLClientCertType: The certificate type of the client store.
  • SSLServerCert: The certificate to be accepted from the server.
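
Put together, a two-way SSL connection string might look like the sketch below; the certificate store path, store password, certificate type, and server certificate shown here are placeholder values to adapt to your own environment:

    jdbc:singlestore:Server=myServer;Port=3306;Database=NorthWind;User=myUser;Password=myPassword;SSLClientCert=C:\certs\client.pfx;SSLClientCertType=PFXFILE;SSLClientCertPassword=myStorePassword;SSLServerCert=C:\certs\server.cer;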

Connect Using SSH Authentication

Using SSH, you can securely login to a remote machine. To access SingleStore data via SSH, configure the following connection properties:

  • SSHClientCert: Set this to the name of the certificate store for the client certificate.
  • SSHClientCertPassword: If a client certificate store is password-protected, set this value to the store's password.
  • SSHClientCertSubject: The subject of the TLS/SSL client certificate. Used to locate the certificate in the store.
  • SSHClientCertType: The certificate type of the client store.
  • SSHPassword: The password that you use to authenticate with the SSH server.
  • SSHPort: The port used for SSH operations.
  • SSHServer: The SSH authentication server you are trying to authenticate against.
  • SSHServerFingerPrint: The SSH Server fingerprint used for verification of the host you are connecting to.
  • SSHUser: Set this to the username that you use to authenticate with the SSH server.
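
For example, a sketch of a connection string that tunnels to SingleStore over SSH with password authentication (all host names, ports, and credentials below are placeholders):

    jdbc:singlestore:Server=myServer;Port=3306;Database=NorthWind;User=myUser;Password=myPassword;SSHServer=mySSHServer;SSHPort=22;SSHUser=mySSHUser;SSHPassword=mySSHPassword;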

To host the JDBC driver in clustered environments or in the cloud, you will need a license (full or trial) and a Runtime Key (RTK). For more information on obtaining this license (or a trial), contact our sales team.

The following are essential properties needed for our JDBC connection.

  • Database Connection URL: jdbc:singlestore:RTK=5246...;User=myUser;Password=myPassword;Database=NorthWind;Server=myServer;Port=3306;
  • Database Driver Class Name: cdata.jdbc.singlestore.SingleStoreDriver

Establishing a JDBC Connection within Airflow

  1. Log into your Apache Airflow instance.
  2. On the navbar of your Airflow instance, hover over Admin and then click Connections.
  3. Next, click the + sign on the following screen to create a new connection.
  4. In the Add Connection form, fill out the required connection properties:
    • Connection Id: A name for the connection, e.g., singlestore_jdbc
    • Connection Type: JDBC Connection
    • Connection URL: The JDBC connection URL from above, e.g., jdbc:singlestore:RTK=5246...;User=myUser;Password=myPassword;Database=NorthWind;Server=myServer;Port=3306;
    • Driver Class: cdata.jdbc.singlestore.SingleStoreDriver
    • Driver Path: PATH/TO/cdata.jdbc.singlestore.jar
  5. Test your new connection by clicking the Test button at the bottom of the form.
  6. After saving the new connection, you should see a green banner on the connections screen confirming that a new row was added to the list of connections.
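
If you prefer to manage connections from the command line, newer Airflow releases can register the same connection via the CLI. The sketch below assumes Airflow 2.3+ for the --conn-json flag; note that the extra field names for the driver path and class name vary by JDBC provider version (older releases prefix them with extra__jdbc__), so check your provider's documentation:

    airflow connections add 'singlestore_jdbc' --conn-json '{
        "conn_type": "jdbc",
        "host": "jdbc:singlestore:RTK=5246...;User=myUser;Password=myPassword;Database=NorthWind;Server=myServer;Port=3306;",
        "extra": {
            "drv_path": "PATH/TO/cdata.jdbc.singlestore.jar",
            "drv_clsname": "cdata.jdbc.singlestore.SingleStoreDriver"
        }
    }'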

Creating a DAG

A DAG (Directed Acyclic Graph) in Airflow defines the tasks of a workflow and can be triggered to run that workflow. Our workflow simply runs a SQL query against SingleStore data and stores the results in a CSV file.

  1. To get started, locate the "airflow" folder in your home directory and create a new directory inside it named "dags". This is where we store the Python files that Airflow loads as the DAGs shown in the UI.
  2. Next, create a new Python file titled singlestore_hook.py and insert the following code into it:
    import pandas as pd
    from datetime import datetime
    from airflow.decorators import dag, task
    from airflow.providers.jdbc.hooks.jdbc import JdbcHook

    # Declare the DAG and its daily 10:00 schedule
    @dag(dag_id="singlestore_hook", schedule_interval="0 10 * * *", start_date=datetime(2022, 2, 15), catchup=False, tags=['load_csv'])
    def extract_and_load():
        # Define tasks
        @task()
        def jdbc_extract():
            try:
                # Reference the Connection Id created earlier (singlestore_jdbc)
                hook = JdbcHook(jdbc_conn_id="singlestore_jdbc")
                sql = """ select * from Account """
                df = hook.get_pandas_df(sql)
                # Write the results to CSV; fill in your own path and file name
                df.to_csv("/{some_file_path}/{name_of_csv}.csv", header=False, index=False, quoting=1)
                print(df)
                tbl_dict = df.to_dict('dict')
                return tbl_dict
            except Exception as e:
                print("Data extract error: " + str(e))

        jdbc_extract()

    sf_extract_and_load = extract_and_load()
    
  3. Save this file and refresh your Airflow instance. Within the list of DAGs, you should see a new DAG titled "singlestore_hook".
  4. Click on this DAG and, on the new screen, click the unpause switch to make it turn blue, then click the trigger (i.e. play) button to run the DAG. This executes the SQL query from singlestore_hook.py and exports the results as a CSV to the file path designated in the code (the same flow can also be driven from the Airflow CLI, as shown after these steps).
  5. After triggering the new DAG, check the output folder chosen in the Python script to see that the CSV file has been created; in this case, account.csv.
  6. Open the CSV file to see that your SingleStore data is now available for use in CSV format thanks to Apache Airflow.
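
As an alternative to the UI switches in step 4, the same unpause-and-trigger flow can be driven from the Airflow CLI using the dag_id declared in the code:

    airflow dags unpause singlestore_hook
    airflow dags trigger singlestore_hook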

More Information & Free Trial

Download a free, 30-day trial of the CData JDBC Driver for SingleStore and start working with your live SingleStore data in Apache Airflow. Reach out to our Support Team if you have any questions.