Important

The SingleStore 9.0 release candidate (RC) gives you the opportunity to preview, evaluate, and provide feedback on new and upcoming features before they become generally available. Until then, SingleStore 8.9 is recommended for production workloads; it can later be upgraded to SingleStore 9.0.

Getting Started with Apache Kafka (Self-Managed)

This guide shows how to install and use the Java-based SingleStore Kafka Sink connector ("the connector") with open-source Apache Kafka and connect to your SingleStore deployments.

SingleStore recommends having hands-on experience with Apache Kafka and an understanding of its concepts. Refer to Apache Kafka Quickstart for related information.

Prerequisites

Install the Connector

To install the SingleStore Kafka Sink connector, perform the following tasks:

  1. Download the singlestore-singlestore-kafka-connector-<version>.zip file from the SingleStore Kafka Connector GitHub repository.

  2. Extract the downloaded .zip archive to a directory in your local environment.

  3. Add the path of the directory to which the archive was extracted to Kafka Connect's plugin path by setting the plugin.path property in the connect-distributed.properties file. For example, if the archive is extracted to the /home/user/kafka/plugins/singlestore-singlestore-kafka-connector directory, set plugin.path to /home/user/kafka/plugins/.

  4. Restart the Kafka Connect process so that it discovers and loads the new plugin JARs.
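
The installation steps above can be sketched as a shell session. All paths here are example placeholders (your Connect worker config normally lives in the Kafka installation's config directory), and the unzip step is shown commented out because it depends on the downloaded archive:

```shell
# Example paths; substitute your actual Kafka Connect locations.
PLUGIN_DIR="$PWD/kafka/plugins"
CONFIG="connect-distributed.properties"

mkdir -p "$PLUGIN_DIR"
[ -f "$CONFIG" ] || touch "$CONFIG"   # placeholder config file for this sketch

# In a real setup, extract the downloaded archive into the plugin directory:
# unzip singlestore-singlestore-kafka-connector-<version>.zip -d "$PLUGIN_DIR"

# Set (or replace) plugin.path so Kafka Connect scans the plugin directory.
if grep -q '^plugin.path=' "$CONFIG"; then
  sed -i "s|^plugin.path=.*|plugin.path=$PLUGIN_DIR|" "$CONFIG"
else
  echo "plugin.path=$PLUGIN_DIR" >> "$CONFIG"
fi

grep '^plugin.path=' "$CONFIG"
```

After editing the worker config, restart the Connect process so the new plugin.path takes effect.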

Configure the Connection

To configure the connection to SingleStore, perform the following tasks:

  1. Run the following command to register the connector.

    Note: Specify the Kafka Connect URL and connection configuration of your SingleStore deployment before running this command.

    curl -i -X POST \
      -H "Accept:application/json" \
      -H "Content-Type:application/json" \
      <Kafka_Connect_URL>/connectors/ \
      -d '{
        "name": "singlestore-sink-connector",
        "config": {
          "connector.class": "com.singlestore.kafka.SingleStoreSinkConnector",
          "topics": "<topic_name>",
          "connection.ddlEndpoint": "<endpoint>",
          "connection.database": "<database>",
          "connection.user": "<username>",
          "connection.password": "<password>"
        }
      }'

    where:

    • topic_name: Name of the Kafka topic.

    • endpoint: Hostname or IP address of the SingleStore deployment.

    • database: Name of the SingleStore database to connect to.

    • username: Username of the SingleStore database user.

    • password: Password for the SingleStore database user.

    Refer to SingleStore Kafka Sink Connector Properties for more information.
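
Rather than inlining the JSON in the curl command, the connector config can be kept in a file and sanity-checked before it is POSTed. The values below (topic, endpoint, database, credentials) are hypothetical examples, and the registration call itself is shown commented out because it requires a running Kafka Connect worker:

```shell
# Hypothetical example values; replace with your own deployment details.
cat > singlestore-sink.json <<'EOF'
{
  "name": "singlestore-sink-connector",
  "config": {
    "connector.class": "com.singlestore.kafka.SingleStoreSinkConnector",
    "topics": "songs",
    "connection.ddlEndpoint": "svchost:3306",
    "connection.database": "kafka_demo",
    "connection.user": "admin",
    "connection.password": "secret"
  }
}
EOF

# Validate the JSON before registering the connector.
python3 -m json.tool singlestore-sink.json > /dev/null && echo "JSON OK"

# Then register it against a running Kafka Connect worker:
# curl -i -X POST -H "Content-Type:application/json" \
#   --data @singlestore-sink.json <Kafka_Connect_URL>/connectors/
```

Keeping the config in a file also makes it easy to update the connector later via a PUT to the /connectors/<name>/config endpoint.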

  2. Verify that the connector is running.

    curl -X GET <Kafka_Connect_URL>/connectors/singlestore-sink-connector/status

    If the connector is running, the output is similar to the following (output is formatted for readability):

    {
      "name": "singlestore-sink-connector",
      "connector": {
        "state": "RUNNING",
        "worker_id": "<endpoint>:<port>"
      },
      "tasks": [
        {
          "id": 0,
          "state": "RUNNING",
          "worker_id": "<endpoint>:<port>"
        }
      ],
      "type": "sink"
    }

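
The status check can be scripted by parsing the JSON that the status endpoint returns. In this sketch the response is stubbed into a variable so the snippet runs standalone; in practice you would pipe in the output of the curl command above (jq works equally well as python3 here):

```shell
# Stubbed status response (normally the output of:
#   curl -s <Kafka_Connect_URL>/connectors/singlestore-sink-connector/status)
STATUS='{"name":"singlestore-sink-connector","connector":{"state":"RUNNING","worker_id":"host:8083"},"tasks":[{"id":0,"state":"RUNNING","worker_id":"host:8083"}],"type":"sink"}'

# Report "healthy" only if the connector and every task are RUNNING.
echo "$STATUS" | python3 -c '
import json, sys
s = json.load(sys.stdin)
states = [s["connector"]["state"]] + [t["state"] for t in s["tasks"]]
print("healthy" if all(st == "RUNNING" for st in states) else "unhealthy")
'
```

A check like this is convenient in deployment scripts, since a connector can report RUNNING while one of its tasks has FAILED.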
  3. Add data to the topic.

    1. From the Kafka installation directory, run the console producer client to add events to the Kafka topic. For example,

      bin/kafka-console-producer.sh --topic <topic_name> --bootstrap-server <Bootstrap_Server_URL>

    2. Enter each line as a separate event in the console. In this example, the following events are added to a topic named songs:

      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "1", "Artist": "Rick Astley", "Song": "Never Gonna Give You Up"}}
      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "2", "Artist": "AC/DC", "Song": "Highway to hell"}}
      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "3", "Artist": "Nirvana", "Song": "Smells like teen spirit"}}
      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "4", "Artist": "Judy Garland", "Song": "Over the Rainbow"}}
      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "5", "Artist": "Bing Crosby", "Song": "White Christmas"}}
      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "6", "Artist": "Woody Guthrie", "Song": "This Land Is Your Land"}}
      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "7", "Artist": "Aretha Franklin", "Song": "Respect"}}
      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "8", "Artist": "Don McLean", "Song": "American Pie"}}
      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "9", "Artist": "The Andrews Sisters", "Song": "Boogie Woogie Bugle Boy"}}
      {"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "10", "Artist": "Billy Murray", "Song": "Take Me Out to the Ball Game"}}

      By default, each line represents a separate event that is written to the Kafka topic. Each event follows the schema/payload envelope format: schema describes the field names and types, and payload contains the values for a single row.

      Alternatively, you can redirect input from a file. For example,

      bin/kafka-console-producer.sh --topic <topic_name> --bootstrap-server <Bootstrap_Server_URL> < <filename.json>
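
Because the schema/payload envelope is repetitive, a file like the one redirected above can be generated rather than typed by hand. This sketch emits two of the example events, one JSON object per line, using the same field list as the songs example (the output filename songs.json is an arbitrary choice):

```shell
# Generate newline-delimited events in the schema/payload envelope shown above.
python3 - <<'EOF' > songs.json
import json

schema = {
    "type": "struct", "optional": False, "version": 1,
    "fields": [{"field": f, "type": "string", "optional": True}
               for f in ("Id", "Artist", "Song")],
}
rows = [
    ("1", "Rick Astley", "Never Gonna Give You Up"),
    ("2", "AC/DC", "Highway to hell"),
]
for id_, artist, song in rows:
    event = {"schema": schema,
             "payload": {"Id": id_, "Artist": artist, "Song": song}}
    print(json.dumps(event))
EOF

wc -l songs.json   # one event per line
# Feed the generated file to the console producer:
# bin/kafka-console-producer.sh --topic songs --bootstrap-server <Bootstrap_Server_URL> < songs.json
```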
  4. Stop the producer (for example, with Ctrl+C).

  5. After events are added to the topic, the connector automatically creates a table with the same name as the topic (songs in this example) in the specified SingleStore database and inserts the events as rows. For example:

    SELECT * FROM songs;
    +------+---------------------+------------------------------+
    | Id   | Artist              | Song                         |
    +------+---------------------+------------------------------+
    | 7    | Aretha Franklin     | Respect                      |
    | 6    | Woody Guthrie       | This Land Is Your Land       |
    | 9    | The Andrews Sisters | Boogie Woogie Bugle Boy      |
    | 10   | Billy Murray        | Take Me Out to the Ball Game |
    | 5    | Bing Crosby         | White Christmas              |
    | 3    | Nirvana             | Smells like teen spirit      |
    | 1    | Rick Astley         | Never Gonna Give You Up      |
    | 8    | Don McLean          | American Pie                 |
    | 2    | AC/DC               | Highway to hell              |
    | 4    | Judy Garland        | Over the Rainbow             |
    +------+---------------------+------------------------------+

Last modified: August 5, 2025
