Getting Started with Confluent Cloud (Custom Connect)

This guide shows how to use the SingleStore Kafka Sink connector ("the connector") as a custom connector to integrate with Confluent Cloud and connect to your SingleStore deployments.

SingleStore recommends having hands-on experience with Confluent Cloud and an understanding of its concepts. Refer to Custom Connector for Confluent Cloud Quick Start for related information.

Prerequisites

Configure the Connection

To connect to your SingleStore deployment from Confluent Cloud using the SingleStore Kafka Sink connector, perform the following tasks:

  1. Download the singlestore-singlestore-kafka-connector-<version>.zip file from the SingleStore Kafka Connector GitHub repository.

  2. Update the firewall configuration of your SingleStore deployment to allow access to Confluent Cloud.

    1. On the Cloud Portal, select <your_SingleStore_deployment> > Firewall.

    2. Under Inbound, select Edit.

    3. Either specify the inbound IP addresses to allow or select Allow access from anywhere to allow access from any IP address.

    4. Select Save.

  3. Launch a Confluent Cloud cluster. Refer to Quick Start for Confluent Cloud for more information.

  4. Create a topic.

    1. On the Confluent dashboard, select Topics > Add topic.

    2. On the New topic page, enter a name for the topic and the number of partitions.

    3. Select Create with defaults > Skip.

  5. Add a custom connector plugin.

    1. On the Confluent dashboard, select Connectors > Add plugin.

    2. On the Add Custom Connector Plugin page, enter or select the following information:

      1. Connector plugin name: Enter SingleStore Kafka Sink.

      2. Connector class: Enter com.singlestore.kafka.SingleStoreSinkConnector.

      3. Connector type: Select Sink.

      4. Connector archive: Select the SingleStore Kafka Sink connector archive downloaded earlier.

      5. Sensitive properties: Specify connection.password to mask the password.

    3. Agree to the disclaimer.

    4. Select Submit. The SingleStore Kafka Sink connector is now available under Connector Plugins.

  6. On the Connector Plugins page, select SingleStore Kafka Sink.

  7. On the Add SingleStore Kafka Sink connector page, enter the following information:

    1. Under Kafka Credentials, specify the method used to provide the connection credentials. Select one of the following:

      1. My account: Allows the connector to globally access everything that the user's account can access. With a user account, the connector uses an API key and secret to access the Kafka cluster. SingleStore does not recommend this option for production environments.

      2. Service account: Provides limited access to the connector by using a service account. SingleStore recommends this option for production environments.

      3. Use an existing API key: Allows access to the Kafka cluster via an API key and secret. SingleStore does not recommend this option for production environments.

    2. Select Continue.

    3. Under Configuration, enter the following connection configuration details as key-value pairs:

      Key                               Value

      connection.ddlEndpoint            IP address or hostname of the SingleStore deployment.
      connection.dmlEndpoint            IP address or hostname of the SingleStore deployment.
      connection.database               Name of the SingleStore database to connect to.
      connection.user                   Name of the SingleStore database user used to access the database.
      connection.password               Password for the SingleStore database user.
      topics                            Comma-separated list of Kafka topics to ingest from.
      value.converter                   Set to org.apache.kafka.connect.json.JsonConverter.
      value.converter.schemas.enable    Set to true.

      Refer to SingleStore Kafka Sink Connector Properties for more information.

    4. Select Continue.

    5. Under Networking, add the SingleStore deployment endpoint to the list of allowed endpoints. Specify the endpoint in the <hostname>:<port>:TCP format.

    6. Select Continue > Continue (Sizing step) > Continue (Review and launch step).
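  Taken together, the key-value pairs above amount to a configuration like the following sketch. The endpoint, database name, user, and password are placeholders to replace with your own values; the topic name matches the SingleStore-quickstart topic used in this guide.

    ```properties
    # Sketch of the connector configuration; <hostname>, example_db,
    # example_user, and the password are placeholders.
    connection.ddlEndpoint=<hostname>
    connection.dmlEndpoint=<hostname>
    connection.database=example_db
    connection.user=example_user
    connection.password=********
    topics=SingleStore-quickstart
    value.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable=true
    ```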

  8. Add data to the Kafka topic created earlier.

    1. Select Topics > <your_Kafka_Topic> > Messages > Produce new message. This example uses a Kafka topic named SingleStore-quickstart.

    2. On the Produce a new message dialog, add a message. In this example, the following are added:

      Key

      1

      Value

      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "1", "Artist": "Rick Astley", "Song": "Never Gonna Give You Up"}}

    3. Select Produce. The connector ingests the message into a SingleStore table named after the topic (SingleStore-quickstart in this example) in the specified database.
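  The message above can also be assembled programmatically. The following is a minimal Python sketch that builds the schema-plus-payload envelope that JsonConverter expects when value.converter.schemas.enable is set to true; the field names and values mirror the quickstart example.

    ```python
    import json

    # Envelope expected by JsonConverter with schemas.enable=true:
    # a "schema" block describing the fields plus a "payload" block
    # carrying the row values. These field names come from the
    # quickstart example; they are not required names.
    message = {
        "schema": {
            "type": "struct",
            "optional": False,
            "version": 1,
            "fields": [
                {"field": "Id", "type": "string", "optional": True},
                {"field": "Artist", "type": "string", "optional": True},
                {"field": "Song", "type": "string", "optional": True},
            ],
        },
        "payload": {
            "Id": "1",
            "Artist": "Rick Astley",
            "Song": "Never Gonna Give You Up",
        },
    }

    # Every field declared in the schema should have a matching payload key.
    schema_fields = {f["field"] for f in message["schema"]["fields"]}
    assert schema_fields == set(message["payload"])

    # Serialize to the JSON string to paste into the Produce a new message dialog.
    print(json.dumps(message))
    ```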

  9. Log in to your SingleStore deployment and run the following command to verify that the data has been ingested.

    SELECT * FROM `SingleStore-quickstart`;
    +------+-------------+-------------------------+
    | Id   | Artist      | Song                    |
    +------+-------------+-------------------------+
    | 1    | Rick Astley | Never Gonna Give You Up |
    +------+-------------+-------------------------+
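If the target table does not already exist, the connector creates one from the message schema. The result is roughly equivalent to the following sketch; the column types shown are an assumption inferred from the string-typed fields in the quickstart message.

    ```sql
    -- Approximate table definition derived from the message schema;
    -- actual column types depend on the connector's type mapping.
    CREATE TABLE IF NOT EXISTS `SingleStore-quickstart` (
      `Id` TEXT,
      `Artist` TEXT,
      `Song` TEXT
    );
    ```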

Last modified: August 3, 2025

