Connect to Apache Spark Data from DBArtisan via JDBC

Use the DBArtisan wizard to create a JDBC data source for Apache Spark.

The CData JDBC Driver for Apache Spark enables you to connect to Apache Spark data as you would a database, integrating Apache Spark data seamlessly into database management tools such as DBArtisan. This article shows how to create a JDBC data source for Apache Spark in DBArtisan so you can query the data intuitively with standard SQL.

Connect Apache Spark Data to DBArtisan Projects

Follow the steps below to register Apache Spark data as a database instance in your project:

  1. In DBArtisan, click Data Source -> Register Datasource.
  2. Select Generic JDBC.
  3. Click Manage.
  4. In the resulting dialog, click New. Enter a name for the driver and click Add. In the resulting dialog, navigate to the driver JAR, which is located in the lib subfolder of the installation directory.
  5. In the Connection URL box, enter credentials and other required connection properties in the JDBC URL.

    Set the Server, Database, User, and Password connection properties to connect to SparkSQL.

    Built-in Connection String Designer

    For assistance in constructing the JDBC URL, use the connection string designer built into the Apache Spark JDBC Driver. Either double-click the JAR file or execute it from the command line.

    java -jar cdata.jdbc.sparksql.jar

    Fill in the connection properties and copy the connection string to the clipboard.

    Below is a typical connection string (the sketch after these steps uses the same URL):

    jdbc:sparksql:Server=127.0.0.1;
  6. Finish the wizard to connect to Apache Spark data. Apache Spark entities are displayed in the Datasource Explorer.
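
DBArtisan manages the JDBC connection for you, but the same connection string works from any JDBC client. Below is a minimal sketch, assuming the driver JAR (cdata.jdbc.sparksql.jar) is on the classpath; the server address and the Customers table name are example values, so substitute your own.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SparkSqlJdbcExample {
        public static void main(String[] args) throws Exception {
            // Example connection string copied from the connection string designer.
            String url = "jdbc:sparksql:Server=127.0.0.1;";

            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement();
                 // Hypothetical table name; use a table that exists in your Spark SQL instance.
                 ResultSet rs = stmt.executeQuery("SELECT * FROM Customers LIMIT 10")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }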

You can now work with Apache Spark data as you work with any other database. See the driver help documentation for more information on the queries supported by the Apache Spark API.
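
As a rough illustration of what the Datasource Explorer does when it enumerates Apache Spark entities, the sketch below lists the available tables through the standard JDBC metadata API. The connection string is the same example value as above and should be replaced with your own.

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class ListSparkSqlTables {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:sparksql:Server=127.0.0.1;"; // example server address

            try (Connection conn = DriverManager.getConnection(url)) {
                DatabaseMetaData meta = conn.getMetaData();
                // Enumerate the tables exposed by the driver.
                try (ResultSet rs = meta.getTables(null, null, "%", new String[] { "TABLE" })) {
                    while (rs.next()) {
                        System.out.println(rs.getString("TABLE_NAME"));
                    }
                }
            }
        }
    }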
