Getting Started with Confluent Cloud (Custom Connect)
This guide shows how to use the SingleStore Kafka Sink connector ("the connector") as a custom connector to integrate with Confluent Cloud and connect to your SingleStore deployments.
SingleStore recommends having hands-on experience with Confluent Cloud and an understanding of its concepts.
Prerequisites
- An active SingleStore deployment.
- Access to a Confluent Cloud cluster.
Configure the Connection
To connect to your SingleStore deployment from Confluent Cloud using the SingleStore Kafka Sink connector, perform the following tasks:
1. Download the singlestore-singlestore-kafka-connector-<version>.zip file from the SingleStore Kafka Connector GitHub repository.
2. Update the firewall configuration of your SingleStore deployment to allow access from Confluent Cloud.
   - On the Cloud Portal, select <your_SingleStore_deployment> > Firewall.
   - Under Inbound, select Edit.
   - Either specify the inbound IP addresses to allow or select Allow access from anywhere to allow access from any IP address.
   - Select Save.
3. Launch a Confluent Cloud cluster. Refer to Quick Start for Confluent Cloud for more information.
4. Create a topic.
   - On the Confluent dashboard, select Topics > Add topic.
   - On the New topic page, enter a name for the topic and the number of partitions.
   - Select Create with defaults > Skip.
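   Alternatively, a topic can be created with the Confluent CLI. This sketch assumes the CLI is installed and authenticated against the target cluster, and uses the topic name SingleStore-quickstart that appears later in this guide:

      # Assumes an authenticated Confluent CLI session targeting your cluster
      confluent kafka topic create SingleStore-quickstart --partitions 1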
5. Add a custom connector plugin.
   - On the Confluent dashboard, select Connectors > Add plugin.
   - On the Add Custom Connector Plugin page, enter or select the following information:
     - Connector plugin name: Enter SingleStore Kafka Sink.
     - Connector class: Enter com.singlestore.kafka.SingleStoreSinkConnector.
     - Connector type: Select Sink.
     - Connector archive: Select the SingleStore Kafka Sink connector archive downloaded earlier.
     - Sensitive properties: Specify connection.password to mask the password.
   - Agree to the disclaimer.
   - Select Submit.

   The SingleStore Kafka Sink connector is now available under Connector Plugins.
6. On the Connector Plugins page, select SingleStore Kafka Sink.
7. On the Add SingleStore Kafka Sink connector page, enter the following information:
   - Under Kafka Credentials, specify the method used to provide the connection credentials. Select one of the following:
     - My account: Allows the connector to globally access everything that the user's account can access. With a user account, the connector uses an API key and secret to access the Kafka cluster. SingleStore does not recommend this option for production environments.
     - Service account: Provides limited access to the connector by using a service account. SingleStore recommends this option for production environments.
     - Use an existing API key: Allows access to the Kafka cluster via an API key and secret. SingleStore does not recommend this option for production environments.
   - Select Continue.
   - Under Configuration, enter the following connection configuration details as key-value pairs:
     - connection.ddlEndpoint and connection.dmlEndpoint: IP address or hostname of the SingleStore deployment.
     - connection.database: Name of the SingleStore database to connect to.
     - connection.user: Name of the SingleStore database user with which to access the database.
     - connection.password: Password for the SingleStore database user.
     - topics: List of Kafka topics to ingest from.
     - value.converter: Set to org.apache.kafka.connect.json.JsonConverter.
     - value.converter.schemas.enable: Set to true.

     Refer to SingleStore Kafka Sink Connector Properties for more information.
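     For reference, a completed configuration might look like the following sketch. Every value shown is a hypothetical placeholder (endpoints, database name, and user); substitute the details of your own deployment:

       # All values below are hypothetical placeholders
       connection.ddlEndpoint = svc-example-ddl.svc.singlestore.com:3306
       connection.dmlEndpoint = svc-example-dml.svc.singlestore.com:3306
       connection.database = example_db
       connection.user = example_user
       connection.password = <your_password>
       topics = SingleStore-quickstart
       value.converter = org.apache.kafka.connect.json.JsonConverter
       value.converter.schemas.enable = true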
   - Select Continue.
   - Under Networking, whitelist the SingleStore deployment endpoint. Specify the endpoint in the <hostname>:<port>:TCP format.
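     For example, with a hypothetical hostname and the default SingleStore port (3306), the entry would be:

       svc-example-ddl.svc.singlestore.com:3306:TCP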
   - Select Continue > Continue (Sizing step) > Continue (Review and launch step).
8. Add data to the Kafka topic created earlier.
   - Select Topics > <your_Kafka_Topic> > Messages > Produce new message. This example uses a Kafka topic named SingleStore-quickstart.
On the Produce a new message dialog, add a message.
In this example, the following are added: Key
1
Value
{"schema": {"type": "struct", "optional": false, "version": 1, "fields": [{ "field": "Id", "type": "string", "optional": true }, { "field": "Artist", "type": "string", "optional": true }, { "field": "Song", "type": "string", "optional": true }] }, "payload": { "Id": "1", "Artist": "Rick Astley", "Song": "Never Gonna Give You Up"}}
   - Select Produce.

     The data is added to a SingleStore table named SingleStore-quickstart in the specified database.
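     If the target table does not already exist, the connector creates it from the message schema. Based on the schema in this example, the table is roughly equivalent to the following sketch; the actual column types depend on the connector's type mapping:

       -- Sketch only: approximates the table the connector creates automatically
       CREATE TABLE `SingleStore-quickstart` (
         Id TEXT,
         Artist TEXT,
         Song TEXT
       );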
9. Log in to your SingleStore deployment and run the following command to verify that the data has been ingested:

   SELECT * FROM `SingleStore-quickstart`;

   +------+-------------+-------------------------+
   | Id   | Artist      | Song                    |
   +------+-------------+-------------------------+
   | 1    | Rick Astley | Never Gonna Give You Up |
   +------+-------------+-------------------------+