Connect with Confluent Cloud
This tutorial shows how to connect to your SingleStore databases from Confluent Cloud using the SingleStore Debezium connector.
For related information, refer to Custom Connector for Confluent Cloud.
Prerequisites
- Access to a Confluent Cloud cluster.
- An active SingleStore deployment with OBSERVE (CDC) queries enabled. Refer to Enable CDC for more information. Run the following command to enable CDC:
  SET GLOBAL enable_observe_queries = 1;
Connect Confluent Cloud to SingleStore
To connect to your SingleStore databases from Confluent Cloud using the SingleStore Debezium connector:
- Launch a Confluent Cloud cluster. Refer to the Quick Start for Confluent Cloud for installation instructions.
- Download the SingleStore Debezium connector .zip archive from GitHub.
- On Confluent Cloud, select Connectors > Add plugin.
- On the Add Custom Connector Plugin dialog, select or enter the following information:
  - In the Connector plugin name box, enter SingleStore Debezium.
  - In the Connector class box, enter com.singlestore.debezium.SingleStoreConnector.
  - Under Connector type, select Source.
  - Under Connector archive, upload the SingleStore Debezium connector archive.
  - Under Sensitive properties, add the following properties to ensure that the passwords are masked in the connector configuration:
    - database.password
    - database.ssl.keystore.password
    - database.ssl.truststore.password
  - Select Submit. The SingleStore Debezium connector is now available under Connector Plugins.
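The effect of registering sensitive properties can be illustrated with a short sketch: any key marked sensitive is masked wherever the connector configuration is displayed. The `mask_config` helper below is purely illustrative and is not part of the Confluent Cloud API; it only mimics the masking behavior described above.

```python
# Illustrative sketch of sensitive-property masking. The helper is
# hypothetical and only mimics how a UI would hide sensitive values.

SENSITIVE_PROPERTIES = {
    "database.password",
    "database.ssl.keystore.password",
    "database.ssl.truststore.password",
}

def mask_config(config: dict) -> dict:
    """Return a copy of config with sensitive values replaced by asterisks."""
    return {
        key: "****" if key in SENSITIVE_PROPERTIES else value
        for key, value in config.items()
    }

masked = mask_config({
    "database.hostname": "svc-example.singlestore.com",  # placeholder value
    "database.user": "s2user",                           # placeholder value
    "database.password": "secret",
})
print(masked["database.password"])  # -> ****
```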
- On the Connector Plugins page, select the SingleStore Debezium plugin.
- On the Add SingleStore Debezium connector dialog, enter or select the following:
  - Under Kafka cluster credentials, select one of the following:
    - My account: Allows the connector to globally access everything that the user's account can access. With a user account, the connector uses an API key and secret to access the Kafka cluster. SingleStore does not recommend this option for production environments.
    - Service account: Provides limited access to the connector by using a service account. SingleStore recommends this option for production environments.
    - Use an existing API key: Allows access to the Kafka cluster via an API key and secret. SingleStore does not recommend this option for production environments.
  - Select Continue.
  - Under Configuration, enter the following connection configuration details in key-value pairs:

    Key                  Value
    database.hostname    IP address or hostname of the SingleStore deployment.
    database.port        Port of the SingleStore deployment.
    database.user        Name of the SingleStore database user with which to access the database.
    database.password    Password for the SingleStore database user.
    database.dbname      Name of the SingleStore database to connect with.
    database.table       Name of the SingleStore table from which the connector captures changes.
    topic.prefix         Prefix for the Kafka topic.

    Refer to SingleStore Debezium Connector Properties for more information.
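As a concrete sketch, the key-value pairs from the table above might be assembled as follows. All values here are placeholders for illustration, not real endpoints or credentials, and the `REQUIRED_KEYS` check is an illustrative sanity check, not part of the connector itself.

```python
# Example connector configuration built from the keys in the table
# above. Every value is a placeholder for illustration only.

REQUIRED_KEYS = {
    "database.hostname", "database.port", "database.user",
    "database.password", "database.dbname", "database.table",
    "topic.prefix",
}

connector_config = {
    "database.hostname": "svc-example.singlestore.com",  # placeholder host
    "database.port": "3306",
    "database.user": "s2user",
    "database.password": "secret",
    "database.dbname": "inventory",   # placeholder database name
    "database.table": "orders",       # placeholder table name
    "topic.prefix": "singlestore",
}

# Illustrative sanity check: confirm every required property is set.
missing = REQUIRED_KEYS - connector_config.keys()
assert not missing, f"missing connection properties: {missing}"
```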
  - Configure other connector configuration details as required.
- Provision the connector. Once the connector completes provisioning, its status changes to Running. The connector starts streaming changes to the table into the Kafka topic. The Kafka topic name is in the <topic.prefix>.<database.dbname>.<database.table> format.
- Select <your_Kafka_cluster> > Topics > <your_topic> > Messages to view the change events for the table.
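The topic naming scheme above can be sketched as a small helper. The function name and the sample prefix, database, and table values are hypothetical; only the <topic.prefix>.<database.dbname>.<database.table> format comes from the connector's behavior.

```python
def topic_name(prefix: str, dbname: str, table: str) -> str:
    """Build the Kafka topic name using the
    <topic.prefix>.<database.dbname>.<database.table> scheme."""
    return f"{prefix}.{dbname}.{table}"

# Placeholder values matching the example configuration style above.
print(topic_name("singlestore", "inventory", "orders"))
# -> singlestore.inventory.orders
```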
Last modified: March 31, 2025