Integrate Airtable Data in Pentaho Data Integration
Build ETL pipelines based on Airtable data in the Pentaho Data Integration tool.
The CData JDBC Driver for Airtable enables access to live Airtable data from data pipelines. Pentaho Data Integration is an Extraction, Transformation, and Loading (ETL) engine that collects data, cleanses it, and stores it in a uniform, accessible format. This article shows how to connect to Airtable data as a JDBC data source and build jobs and transformations based on Airtable data in Pentaho Data Integration.
Configure Connectivity to Airtable
The APIKey, BaseId, and TableNames parameters are required to connect to Airtable. ViewNames is an optional parameter for specifying views of the tables.
- APIKey: The API key for your account. To obtain this value, log in and go to Account; in the API section, click Generate API key.
- BaseId: The Id of your base. This value is found in the same section as the APIKey: click Airtable API, or navigate to https://airtable.com/api and select a base. The introduction section displays a line such as "The ID of this base is appxxN2ftedc0nEG7."
- TableNames: A comma-separated list of table names for the selected base. These are the same table names shown in the UI.
- ViewNames: A comma-separated list of views in table.view format. These are the same view names shown in the UI.
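If you want to verify these values before working in Pentaho, a quick JDBC test from Java can help. The sketch below is illustrative only: the APIKey, BaseId, and table names are placeholders taken from this article, and it assumes cdata.jdbc.airtable.jar is on the classpath.

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class AirtableConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder values -- substitute your own APIKey, BaseId, and table names
        String url = "jdbc:airtable:APIKey=keymz3adb53RqsU;"
                   + "BaseId=appxxN2fe34r3rjdG7;"
                   + "TableNames=Table1;"
                   + "ViewNames=Table1.View1;";

        // Usually optional with JDBC 4+ drivers, but harmless
        Class.forName("cdata.jdbc.airtable.AirtableDriver");

        try (Connection conn = DriverManager.getConnection(url)) {
            // List the tables and views the driver exposes for the configured base
            DatabaseMetaData meta = conn.getMetaData();
            try (ResultSet tables = meta.getTables(null, null, "%", null)) {
                while (tables.next()) {
                    System.out.println(tables.getString("TABLE_NAME"));
                }
            }
        }
    }
}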
Built-in Connection String Designer
For assistance in constructing the JDBC URL, use the connection string designer built into the Airtable JDBC Driver. Either double-click the JAR file or execute it from the command line:
java -jar cdata.jdbc.airtable.jar
Fill in the connection properties and copy the connection string to the clipboard.

When you configure the JDBC URL, you may also want to set the Max Rows connection property. This will limit the number of rows returned, which is especially helpful for improving performance when designing reports and visualizations.
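For example, the limit can be appended to the URL (this assumes the property is written as MaxRows in the connection string; the value 100 is just illustrative):
jdbc:airtable:APIKey=keymz3adb53RqsU;BaseId=appxxN2fe34r3rjdG7;TableNames=Table1,...;MaxRows=100;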
Below is a typical JDBC URL:
jdbc:airtable:APIKey=keymz3adb53RqsU;BaseId=appxxN2fe34r3rjdG7;TableNames=Table1,...;ViewNames=Table1.View1,...;
Save your connection string for use in Pentaho Data Integration.
Connect to Airtable from Pentaho DI
Open Pentaho Data Integration and select "Database Connection" to configure a connection to the CData JDBC Driver for Airtable:
- Click "General"
- Set Connection name (e.g. Airtable Connection)
- Set Connection type to "Generic database"
- Set Access to "Native (JDBC)"
- Set Custom connection URL to your Airtable connection string, e.g.:
jdbc:airtable:APIKey=keymz3adb53RqsU;BaseId=appxxN2fe34r3rjdG7;TableNames=Table1,...;ViewNames=Table1.View1,...;
- Set Custom driver class name to "cdata.jdbc.airtable.AirtableDriver"
- Test the connection and click "OK" to save.
Create a Data Pipeline for Airtable
Once the connection to Airtable is configured using the CData JDBC Driver, you are ready to create a new transformation or job.
- Click "File" >> "New" >> "Transformation/job"
- Drag a "Table input" object into the workflow panel and select your Airtable connection.
- Click "Get SQL select statement" and use the Database Explorer to view the available tables and views.
- Select a table and optionally preview the data for verification.
At this point, you can continue your transformation or job by selecting a suitable destination and adding any transformations to modify, filter, or otherwise alter the data during replication.
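If you want to sanity-check the query generated by "Get SQL select statement" outside of Pentaho, a statement of the same shape can be run directly through the JDBC driver. The snippet below is a minimal sketch: Table1 and the placeholder connection properties are assumptions, and SELECT * simply mirrors the default statement the Database Explorer produces.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class AirtableQueryPreview {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string -- substitute your own values
        String url = "jdbc:airtable:APIKey=keymz3adb53RqsU;BaseId=appxxN2fe34r3rjdG7;TableNames=Table1;";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM Table1")) {

            // Print column names followed by each row, similar to Pentaho's data preview
            ResultSetMetaData md = rs.getMetaData();
            int cols = md.getColumnCount();
            for (int i = 1; i <= cols; i++) {
                System.out.print(md.getColumnName(i) + (i < cols ? ", " : "\n"));
            }
            while (rs.next()) {
                for (int i = 1; i <= cols; i++) {
                    System.out.print(rs.getString(i) + (i < cols ? ", " : "\n"));
                }
            }
        }
    }
}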

Free Trial & More Information
Download a free, 30-day trial of the CData JDBC Driver for Airtable and start working with your live Airtable data in Pentaho Data Integration today.