SingleStore DB

Security and Permissions
SQL Permissions

The spark user must have access to the Master Aggregator.

Additionally, SingleStore has a Permissions Matrix which describes the permissions required to run each command.

To perform SQL operations through the Spark connector, you need different permissions depending on the type of operation. The matrix below describes the minimum permissions required to perform each operation. As an alternative to the minimum required permissions, ALL PRIVILEGES allows you to perform any operation.


Operation                        Min. Permission           Alternative Permission

READ from collection             SELECT                    ALL PRIVILEGES

WRITE to collection              SELECT, INSERT            ALL PRIVILEGES

DROP database or collection      SELECT, INSERT, DROP      ALL PRIVILEGES

CREATE database or collection    SELECT, INSERT, CREATE    ALL PRIVILEGES
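For example, a user that only reads data through the connector needs nothing beyond SELECT on the databases it queries. A minimal sketch, assuming a hypothetical database example_db and user spark_user (adjust both to your deployment):

```sql
-- Hypothetical database and user names; adjust to your deployment.
-- Minimum permission for READ from a collection:
GRANT SELECT ON example_db.* TO 'spark_user'@'%';

-- Or, as the broader alternative, allow any operation:
-- GRANT ALL PRIVILEGES ON example_db.* TO 'spark_user'@'%';
```

Remember that these grants must be made on the Master Aggregator, since that is the node the spark user connects to.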

SSL Support

The SingleStore Spark Connector uses the MariaDB JDBC Driver under the hood and thus supports SSL configuration out of the box.

In order to configure SSL, first ensure that your SingleStore cluster has SSL configured. Documentation on how to set this up can be found in SSL Secure Connections.

Once you have set up SSL on your server, you can enable it in the connector by setting the following options:

spark.conf.set("spark.datasource.singlestore.useSSL", "true")
spark.conf.set("spark.datasource.singlestore.serverSslCert", "PATH/TO/CERT")

Note: the serverSslCert option may be either the server's certificate in DER form or the server's CA certificate. It can be specified in one of three forms:

  • serverSslCert=/path/to/cert.pem (full path to certificate)

  • serverSslCert=classpath:relative/cert.pem (relative to current classpath)

  • or as a verbatim DER-encoded certificate string -----BEGIN CERTIFICATE-----...

You may also want to set these additional options depending on your SSL configuration:

spark.conf.set("spark.datasource.singlestore.trustServerCertificate", "true")
spark.conf.set("spark.datasource.singlestore.disableSslHostnameVerification", "true")
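Putting the options together, a complete SSL-enabled connector configuration might look like the following sketch. The endpoint, credentials, and certificate path are hypothetical placeholders; substitute the values for your own cluster:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical endpoint, credentials, and certificate path -- adjust to your cluster.
val spark = SparkSession.builder()
  .config("spark.datasource.singlestore.ddlEndpoint", "singlestore-master:3306")
  .config("spark.datasource.singlestore.user", "s2user")
  .config("spark.datasource.singlestore.password", "s2password")
  .config("spark.datasource.singlestore.useSSL", "true")
  .config("spark.datasource.singlestore.serverSslCert", "/etc/ssl/certs/singlestore-ca.pem")
  .getOrCreate()
```

The trustServerCertificate and disableSslHostnameVerification options would be added the same way if your certificate is self-signed or its hostname does not match the endpoint.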

More information on the above parameters can be found in MariaDB's documentation for their JDBC driver.

Connecting with a Kerberos-authenticated User

You can use the SingleStore Spark Connector with a Kerberized user without any additional configuration. To use a Kerberized user, configure the connector with the SingleStore database user that is authenticated with Kerberos (via the user option). Visit our Kerberos Authentication documentation to learn how to configure SingleStore users with Kerberos.

Here is an example of configuring the Spark connector globally with a Kerberized SingleStore user called krb_user.

val spark = SparkSession.builder()
    .config("spark.datasource.singlestore.user", "krb_user")
    .getOrCreate()

You do not need to provide a password when configuring a Spark Connector user that is Kerberized. The connector driver (MariaDB) will be able to authenticate the Kerberos user from the cache by the provided username. Other than omitting a password with this configuration, using a Kerberized user with the Connector is no different than using a standard user. Note that if you do provide a password, it will be ignored.
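Because the driver authenticates from the Kerberos credential cache, a valid ticket must exist before the Spark application starts. A minimal sketch, assuming a hypothetical realm EXAMPLE.COM:

```shell
# Obtain a ticket for the Kerberized database user (hypothetical realm).
kinit krb_user@EXAMPLE.COM

# Verify the ticket is in the credential cache the MariaDB driver will read.
klist
```

If the ticket has expired or the cache is empty, the connector's login attempt will fail even though the user option is set correctly.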