Getting Started with Confluent Cloud (Self-Managed)

This guide shows how to use the SingleStore Kafka Sink connector ("the connector") on Confluent Cloud by deploying it on a self-managed Kafka Connect cluster and connecting to your SingleStore deployments.

SingleStore recommends having hands-on experience with Confluent Cloud and an understanding of its concepts. Refer to Connect Self-Managed Kafka Connect to Confluent Cloud for related information.

Prerequisites

Configure the Connection

To connect to your SingleStore deployment from Confluent Cloud using the SingleStore Kafka Sink connector, perform the following tasks:

  1. Download the latest ZIP or TAR distribution of Confluent Platform and extract it.
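
    For example, the following sketch downloads and extracts Confluent Platform 7.6.0; substitute the archive URL for the release you want:

      curl -O https://packages.confluent.io/archive/7.6/confluent-7.6.0.tar.gz
      tar -xzf confluent-7.6.0.tar.gz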

  2. Launch a Confluent Cloud cluster. Refer to Quick Start for Confluent Cloud for more information.

  3. Create a topic.

    1. On the Confluent dashboard, select Topics > Add topic.

    2. On the New topic page, enter a name for the topic and the number of partitions.

    3. Select Create with defaults > Skip.
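
    Alternatively, the topic can be created from the command line. This sketch assumes the confluent CLI is installed and logged in to your Confluent Cloud cluster; the partition count is an example value:

      confluent kafka topic create SingleStore-quickstart --partitions 1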

    4. Add data to the Kafka topic.

      1. Select Topics > <your_Kafka_Topic> > Messages > Produce new message. This example uses a Kafka topic named SingleStore-quickstart.

      2. In the Produce a new message dialog, add a message. This example uses the following key and value:

        Key: 1

        Value: {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "1", "Artist": "Rick Astley", "Song": "Never Gonna Give You Up"}}

      3. Select Produce.
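
      A message can also be produced from the command line. This sketch assumes the confluent CLI is logged in and set to your cluster; a pipe delimiter is used instead of the default colon so that the colons inside the JSON value are not mistaken for the key separator. Press Ctrl-C to exit after the message is sent:

        confluent kafka topic produce SingleStore-quickstart --parse-key --delimiter "|"
        1|{"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "1", "Artist": "Rick Astley", "Song": "Never Gonna Give You Up"}}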

  4. Generate a Kafka Connect properties file.

    1. On the Confluent Platform, select Connectors.

    2. Search for and select SingleStore Sink Connector in the list of connectors.

    3. On the Configure SingleStore Sink Connector page, select Next.

    4. Under Connect to Confluent Cloud, select Standalone.

    5. Select Create Kafka cluster API key & secret.

    6. Select Create Schema Registry API key & secret.

    7. Select Generate Config.

    8. Copy the generated file.

  5. In your local environment, create the /<path_to_confluent>/confluent-<version>/etc/my-connect-standalone.properties file and paste the copied configuration into it.
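
    The generated configuration typically resembles the following sketch; the actual values, including your cluster endpoint and the API keys and secrets created in the previous step, come from the Generate Config output:

      bootstrap.servers=<cluster_endpoint>:9092
      security.protocol=SASL_SSL
      sasl.mechanism=PLAIN
      sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<cluster_API_key>" password="<cluster_API_secret>";
      consumer.security.protocol=SASL_SSL
      consumer.sasl.mechanism=PLAIN
      consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<cluster_API_key>" password="<cluster_API_secret>";
      offset.storage.file.filename=/tmp/connect.offsets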

  6. Download the singlestore-singlestore-kafka-connector-<version>.zip file from the SingleStore Kafka Connector GitHub repository.

  7. Extract the downloaded .zip archive to the /<path_to_confluent>/confluent-<version>/plugins/ directory.

  8. Append plugin.path=/path/to/confluent-<version>/plugins/ to the my-connect-standalone.properties file.
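
    For example, steps 7 and 8 can be performed as follows; substitute your actual paths and version:

      unzip singlestore-singlestore-kafka-connector-<version>.zip -d /<path_to_confluent>/confluent-<version>/plugins/
      echo "plugin.path=/<path_to_confluent>/confluent-<version>/plugins/" >> /<path_to_confluent>/confluent-<version>/etc/my-connect-standalone.properties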

  9. Create a connection configuration file /<path_to_confluent>/confluent-<version>/etc/singlestore-sink.properties, and add the following to the file:

    name=singlestore-sink-connector
    connector.class=com.singlestore.kafka.SingleStoreSinkConnector
    topics=SingleStore-quickstart
    connection.ddlEndpoint=<hostname_or_IP_address_of_SingleStore_deployment>
    connection.database=<SingleStore_Database>
    connection.user=<SingleStore_User>
    connection.password=<Password>
    value.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable=true

    where:

    connection.ddlEndpoint, connection.dmlEndpoint: IP address or hostname of the SingleStore deployment.

    connection.database: Name of the SingleStore database to connect to.

    connection.user: Name of the SingleStore database user used to access the database.

    connection.password: Password for the SingleStore database user.

    topics: Comma-separated list of Kafka topics to ingest from.

    value.converter: Set to org.apache.kafka.connect.json.JsonConverter.

    value.converter.schemas.enable: Set to true.

    Refer to SingleStore Kafka Sink Connector Properties for more information.

  10. Start Kafka Connect. Run the following command in the Confluent directory:

    ./bin/connect-standalone ./etc/my-connect-standalone.properties ./etc/singlestore-sink.properties
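
    Once the worker is up, you can confirm that the connector is running by querying the Kafka Connect REST interface, which the standalone worker exposes on port 8083 by default:

      curl http://localhost:8083/connectors/singlestore-sink-connector/status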
  11. Wait a few minutes, and then log in to your SingleStore deployment and run the following command to verify that the data has been ingested.

    SELECT * FROM `SingleStore-quickstart`;
    +------+-------------+-------------------------+
    | Id   | Artist      | Song                    |
    +------+-------------+-------------------------+
    | 1    | Rick Astley | Never Gonna Give You Up |
    +------+-------------+-------------------------+
