Getting Started

The SingleStore Spark Connector supports SingleStore versions 7.1 and later. The supported Spark versions vary by connector version.

You can download the latest version of the SingleStore Spark Connector from Maven Central or Spark Packages, and the source code from its GitHub repository. The group ID is com.singlestore and the artifact ID is singlestore-spark-connector_2.12.
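For example, with sbt the dependency can be declared as follows. This is an illustrative sketch: the version string shown is one possible release, and you should pick the release that matches your Spark version from the compatibility matrix below.

```scala
// build.sbt fragment (illustrative). %% appends the Scala binary version,
// yielding the artifact singlestore-spark-connector_2.12.
// Replace the version with the release matching your Spark version.
libraryDependencies += "com.singlestore" %% "singlestore-spark-connector" % "4.0.0-spark-3.2.0"
```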

The following matrix shows currently supported versions of the connector and their compatibility with different Spark versions:

Connector version | Supported Spark versions         | Required JDBC driver
3.1.0 - 3.1.2     | Spark 3.0                        | MariaDB JDBC driver version 2.+
3.1.3             | Spark 3.0, Spark 3.1             | MariaDB JDBC driver version 2.+
3.2.0             | Spark 3.0, Spark 3.1             | MariaDB JDBC driver version 2.+
3.2.1+            | Spark 3.0, Spark 3.1, Spark 3.2  | MariaDB JDBC driver version 2.+
4.0.x             | Spark 3.0, Spark 3.1, Spark 3.2  | SingleStore JDBC driver version 1.0.1

Note

We recommend using the latest version of the connector compatible with the corresponding Spark version. See Migrating between Spark Versions for more information.

The connector follows the x.x.x-spark-y.y.y naming convention, where x.x.x is the connector version and y.y.y is the Spark version the connector was compiled and tested against. For example, connector 4.0.0-spark-3.2.0 is connector version 4.0.0, compiled and tested against Spark version 3.2.0. It is critical to select the connector version that corresponds to the Spark version in use.
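In Maven terms, the full coordinates might look like the following fragment. This is a hypothetical illustration: the version suffix shown targets Spark 3.2.0, and you should adjust it to your Spark version.

```xml
<!-- pom.xml fragment (illustrative version) -->
<dependency>
  <groupId>com.singlestore</groupId>
  <artifactId>singlestore-spark-connector_2.12</artifactId>
  <version>4.0.0-spark-3.2.0</version>
</dependency>
```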

Release Highlights

Version 4.x

  • The connector uses the SingleStore JDBC driver instead of the MariaDB JDBC driver.

Version 3.2

  • Added support for parallel reads from aggregator nodes.

  • Added support for repartitioning results by columns when reading in parallel from aggregators.
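A minimal sketch of how these features are typically enabled through read options follows. The option names reflect the parallel-read settings of the 3.2+ connector as best understood here; the database, table, and column names are placeholders, and an existing SparkSession named spark is assumed.

```scala
// Hypothetical sketch: forcing a parallel read from aggregator nodes and
// repartitioning the result by a column. Assumes a SparkSession `spark`
// already configured with SingleStore connection options.
val df = spark.read
  .format("singlestore")
  .option("enableParallelRead", "forced")                 // force a parallel read
  .option("parallelRead.Features", "readFromAggregators") // read from aggregators
  .option("parallelRead.repartition", "true")             // repartition the result
  .option("parallelRead.repartition.columns", "id")       // placeholder column name
  .load("example_db.example_table")                       // placeholder table
```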

Version 3.1

  • The connector uses the MariaDB JDBC driver and was rebranded from memsql-spark-connector to singlestore-spark-connector.

  • Adopts the rebranding from memsql to singlestore throughout. For example, the configuration prefix changed from spark.datasource.memsql.<config_name> to spark.datasource.singlestore.<config_name>.
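Concretely, options that were previously set under the memsql prefix now use the singlestore prefix. A hypothetical spark-defaults.conf fragment is shown below; the endpoint and credentials are placeholders, and ddlEndpoint is one commonly used connection option.

```
# spark-defaults.conf fragment (placeholder values)
spark.datasource.singlestore.ddlEndpoint  singlestore-host:3306
spark.datasource.singlestore.user         admin
spark.datasource.singlestore.password     secret
```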