Pipe USPS Data in Google Data Fusion

Load the CData JDBC Driver into Google Data Fusion and pipe live USPS data to any supported data platform.

Google Data Fusion allows users to perform self-service data integration to consolidate disparate data. Uploading the CData JDBC Driver for USPS enables users to access live USPS data from within their Google Data Fusion pipelines. While the CData JDBC Driver enables piping USPS data to any data platform natively supported in Google Data Fusion, this article walks through piping data from USPS to Google BigQuery.

Upload the CData JDBC Driver for USPS to Google Data Fusion

Upload the CData JDBC Driver for USPS to your Google Data Fusion instance to work with live USPS data. Due to the naming restrictions for JDBC drivers in Google Data Fusion, create a copy of (or rename) the JAR file to match the format <name>-<version>.jar. For example: cdata.jdbc.usps-2019.jar

  1. Open your Google Data Fusion instance
  2. Click the + button to add an entity and upload a driver
  3. On the "Upload driver" tab, drag or browse to the renamed JAR file.
  4. On the "Driver configuration" tab:
    • Name: Create a name for the driver (cdata.jdbc.usps) and make note of the name
    • Class name: Set the JDBC class name (cdata.jdbc.usps.USPSDriver)
  5. Click "Finish"

Pipe USPS Data in Google Data Fusion

With the JDBC Driver uploaded, you are ready to work with live USPS data in Google Data Fusion Pipelines.

  1. Navigate to the Pipeline Studio to create a new Pipeline
  2. From the "Source" options, click "Database" to add a source for the JDBC Driver
  3. Click "Properties" on the Database source to edit the properties
    • Set the Label
    • Set Reference Name to a value for any future references (e.g., cdata-usps)
    • Set Plugin Type to "jdbc"
    • Set Connection String to the JDBC URL for USPS. For example:

      jdbc:usps:RTK=5246...;PostageProvider=ENDICIA;RequestId=12345;Password='abcdefghijklmnopqr';AccountNumber='12A3B4C';

      To authenticate with USPS, set the following connection properties.

      • PostageProvider: The postage provider to use to process requests. Available options are ENDICIA and STAMPS. If unspecified, this property will default to ENDICIA.
      • UseSandbox: This controls whether requests are sent to the production (live) or sandbox (test) servers. If set to true, the Password, AccountNumber, and StampsUserId properties are ignored.
      • StampsUserId: This value is used to authenticate with the Stamps servers. It is not applicable for Endicia and is optional if UseSandbox is true.
      • Password: This value is used for logging into Endicia and Stamps servers. If the postage provider is Endicia, this will be the pass phrase associated with your postage account. It is optional if UseSandbox is true.
      • AccountNumber: The shipper's account number. It is optional if UseSandbox is true.
      • PrintLabelLocation: This property is required to use the GenerateLabels or GenerateReturnLabels stored procedures. This should be set to the folder location where generated labels should be stored.
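
      Before entering these values in Data Fusion, it can help to confirm them with a quick standalone JDBC connection. Below is a minimal sketch, assuming the driver JAR is on the classpath and reusing the placeholder credentials from the example connection string above (substitute your own values):

        import java.sql.Connection;
        import java.sql.DriverManager;

        public class UspsConnectionCheck {
            public static void main(String[] args) throws Exception {
                // Placeholder credentials from the example above -- replace with your own
                String url = "jdbc:usps:PostageProvider=ENDICIA;RequestId=12345;"
                           + "Password='abcdefghijklmnopqr';AccountNumber='12A3B4C';";
                try (Connection conn = DriverManager.getConnection(url)) {
                    // If no exception is thrown, the connection properties are valid
                    System.out.println("Connected to USPS: " + !conn.isClosed());
                }
            }
        }

      Compile and run it with the renamed driver JAR on the classpath, for example (Linux/macOS): java -cp .:cdata.jdbc.usps-2019.jar UspsConnectionCheck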

      The Cache Database

      Many of the useful tasks available from USPS require a lot of data. To ensure that this data is easy to input and recall later, use a cache database when making requests. Set the following cache connection properties to use the cache:

      • CacheLocation: The path to the cache location, for which a connection will be configured with the default cache provider. For example, C:\users\username\documents\uspscache

      As an alternative to CacheLocation, set the combination of CacheConnection and CacheProvider to configure a cache connection using a provider separate from the default.
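
      For example, appending the cache location to the example connection string above might look like the following (path and credentials are placeholders):

        jdbc:usps:RTK=5246...;PostageProvider=ENDICIA;RequestId=12345;Password='abcdefghijklmnopqr';AccountNumber='12A3B4C';CacheLocation='C:\users\username\documents\uspscache';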

      To use the JDBC Driver in Google Data Fusion, you will need to set the RTK property in the JDBC URL. You can view the licensing file included in the installation for information on how to set this property.

      Built-in Connection String Designer

      For assistance in constructing the JDBC URL, use the connection string designer built into the USPS JDBC Driver. Either double-click the JAR file or execute it from the command line:

      java -jar cdata.jdbc.usps.jar

      Fill in the connection properties and copy the connection string to the clipboard.

    • Set Import Query to a SQL query that extracts the data you want from USPS (you can sanity-check the query outside Data Fusion; see the sketch after these steps), for example:
      SELECT * FROM Senders
  4. From the "Sink" tab, click to add a destination sink (we use Google BigQuery in this example)
  5. Click "Properties" on the BigQuery sink to edit the properties
    • Set the Label
    • Set Reference Name to a value like usps-bigquery
    • Set Project ID to a specific Google BigQuery Project ID (or leave as the default, "auto-detect")
    • Set Dataset to a specific Google BigQuery dataset
    • Set Table to the name of the table you wish to insert USPS data into
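
Before deploying, you can verify that the Import Query returns the rows you expect by running it through the JDBC driver outside of Data Fusion. The following is a minimal sketch, assuming the driver JAR is on the classpath and reusing the placeholder connection string from the source configuration above:

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.ResultSet;
      import java.sql.Statement;

      public class UspsImportQueryCheck {
          public static void main(String[] args) throws Exception {
              // Placeholder connection string -- replace with the one used in the Database source
              String url = "jdbc:usps:PostageProvider=ENDICIA;RequestId=12345;"
                         + "Password='abcdefghijklmnopqr';AccountNumber='12A3B4C';";
              try (Connection conn = DriverManager.getConnection(url);
                   Statement stmt = conn.createStatement();
                   ResultSet rs = stmt.executeQuery("SELECT * FROM Senders")) {
                  int columns = rs.getMetaData().getColumnCount();
                  // Print each row returned by the Import Query
                  while (rs.next()) {
                      StringBuilder row = new StringBuilder();
                      for (int i = 1; i <= columns; i++) {
                          row.append(rs.getString(i));
                          if (i < columns) row.append(", ");
                      }
                      System.out.println(row);
                  }
              }
          }
      }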

With the Source and Sink configured, you are ready to pipe USPS data into Google BigQuery. Save and deploy the pipeline. When you run the pipeline, Google Data Fusion will request live data from USPS and import it into Google BigQuery.

While this is a simple pipeline, you can create more complex USPS pipelines with transforms, analytics, conditions, and more. Download a free, 30-day trial of the CData JDBC Driver for USPS and start working with your live USPS data in Google Data Fusion today.