Getting Started

You can download the latest version of the SingleStore Spark Connector from Maven Central or Spark Packages. You can download the source code from its GitHub repository. The group ID is com.singlestore and the artifact ID is singlestore-spark-connector_2.12.
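If you use sbt, the dependency can be declared roughly as follows; this is only a sketch, and the version string shown is an example that must be replaced with the release matching your Spark version (see the compatibility matrix and naming convention below).

    // build.sbt (sketch). The group and artifact are the Maven Central coordinates noted above.
    // "4.1.1-spark-3.3.0" is an example version; pick the release that matches your Spark version.
    libraryDependencies += "com.singlestore" % "singlestore-spark-connector_2.12" % "4.1.1-spark-3.3.0"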

The following matrix shows currently supported versions of the connector and their compatibility with different Spark versions:

Connector version   Supported Spark versions                      Required JDBC driver
3.2.0               Spark 3.0, Spark 3.1                          MariaDB JDBC driver version 2.+
3.2.1+              Spark 3.0, Spark 3.1, Spark 3.2               MariaDB JDBC driver version 2.+
4.0.x               Spark 3.0, Spark 3.1, Spark 3.2               SingleStore JDBC driver version 1.0.1
4.1.0               Spark 3.0, Spark 3.1, Spark 3.2               SingleStore JDBC driver version 1.1.0
4.1.1               Spark 3.0, Spark 3.1, Spark 3.2, Spark 3.3    SingleStore JDBC driver version 1.1.0

Note

We recommend using the latest version of the connector compatible with the corresponding Spark version. See Migrate between SingleStore Spark Connector Versions for more information.

The connector follows the x.x.x-spark-y.y.y naming convention, where x.x.x represents the connector version and y.y.y represents the corresponding Spark version. For example, connector 3.0.0-spark-3.2.0 is connector version 3.0.0, compiled and tested against Spark version 3.2.0. It is critical to select the connector version that corresponds to the Spark version in use.
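For instance, when starting a Spark shell you can pass the full Maven coordinate with --packages; the version below is an example only and should be replaced with the release that matches your Spark installation:

    # Example only: pair the connector version with your Spark version as described above.
    spark-shell --packages com.singlestore:singlestore-spark-connector_2.12:4.1.1-spark-3.3.0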

Release Highlights

Version 4.1.x

  • Added support for JWT-based authentication (see the connection sketch after this list).

  • Added support for connection pooling.

  • Added support for Spark 3.3 for connector version 4.1.1.

  • Added support for the clientEndpoint option.

  • Added multi-partition support to the parallel read feature.
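
As a rough illustration of the 4.1.x options mentioned above, the sketch below configures a client endpoint and a JWT through the connector's spark.datasource.singlestore.* prefix. The option names clientEndpoint and credentialType, as well as the endpoint value, are assumptions used for illustration; verify them against the connector's configuration reference.

    import org.apache.spark.sql.SparkSession

    // Sketch only: the option names below are assumptions, not confirmed by this page.
    val spark = SparkSession.builder()
      .appName("singlestore-connector-example")
      // Hypothetical workspace endpoint; replace with your own.
      .config("spark.datasource.singlestore.clientEndpoint", "svc-example.svc.singlestore.com:443")
      .config("spark.datasource.singlestore.user", "app_user")
      // With JWT-based authentication, the token is assumed to be passed as the password.
      .config("spark.datasource.singlestore.credentialType", "JWT")
      .config("spark.datasource.singlestore.password", sys.env.getOrElse("SINGLESTORE_JWT", ""))
      .getOrCreate()

    // Read a table through the connector's "singlestore" data source.
    val df = spark.read.format("singlestore").load("my_database.my_table")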

Version 4.0.x

  • The connector uses the SingleStore JDBC driver instead of the MariaDB JDBC driver.

Version 3.2

  • Added support for parallel reads from aggregator nodes.

  • Added support for repartitioning results by columns when using parallel read from aggregators, as sketched after this list.
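
As a rough sketch of how these parallel read settings are exposed as data source options (the names parallelRead.Features, parallelRead.repartition, and parallelRead.repartition.columns are assumptions here; confirm them in the connector's option reference), and assuming an existing SparkSession named spark:

    // Sketch only: option names are assumptions; the database, table, and column names are placeholders.
    val df = spark.read
      .format("singlestore")
      // Enable parallel reads through the aggregator nodes.
      .option("parallelRead.Features", "readFromAggregators")
      // Repartition the result by specific columns before returning it to Spark.
      .option("parallelRead.repartition", "true")
      .option("parallelRead.repartition.columns", "customer_id,order_date")
      .load("my_database.orders")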

Version 3.1

  • The connector uses the MariaDB JDBC driver and is rebranded from memsql-spark-connector to singlestore-spark-connector.

  • Adopts the rebranding from memsql to singlestore. For example, the configuration prefix changed from spark.datasource.memsql.<config_name> to spark.datasource.singlestore.<config_name>.
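
For example, assuming the ddlEndpoint option (used here purely for illustration) and an existing SparkSession named spark, a setting that was previously supplied under the memsql prefix is now supplied under the singlestore prefix:

    // Before (memsql-spark-connector):
    //   spark.conf.set("spark.datasource.memsql.ddlEndpoint", "singlestore-host:3306")
    // After (singlestore-spark-connector):
    spark.conf.set("spark.datasource.singlestore.ddlEndpoint", "singlestore-host:3306")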

Last modified: August 23, 2022