Important
The SingleStore 9.1 release candidate (RC) gives you the opportunity to preview, evaluate, and provide feedback on new and upcoming features prior to their general availability. Until then, SingleStore 9.0 is recommended for production workloads; it can later be upgraded to SingleStore 9.1.
Load Data from Estuary Flow
Integrate your SingleStore databases with Estuary Flow to build and manage real-time ETL and ELT data pipelines that capture and replicate database events to downstream applications.
You can configure SingleStore as a destination in Estuary Flow.
Configure the SingleStore Destination
Use the SingleStore variant of the Kafka-compatible Dekaf connector to configure SingleStore as a destination in Estuary Flow.
Prerequisites
- An Estuary Flow collection
- An active SingleStore deployment
Connect to SingleStore from Estuary Flow
To configure SingleStore as a destination:
- Log in to Estuary Flow.

- Create a new materialization:

  - On the Dashboard, select Destinations from the left navigation pane.
  - On the Destinations page, select NEW MATERIALIZATION.
  - On the Create Materialization page, search for and select the SingleStore connector from the list of connectors.
  - Enter or select the following information:
    - Name: Enter a name for the materialization.
    - Data Plane: Select a data plane to use.
    - Auth Token: Specify an authentication token. This token is used to authenticate Kafka consumers to this materialization task.
  - Link a capture (collection) to this task.
  - Select NEXT > TEST to test the materialization.
  - Select SAVE AND PUBLISH to publish this task.
- Log in to your SingleStore deployment.

- Create a table to store the ingested data. For example:

  CREATE TABLE estuaryFlow(id INT, created DATETIME(6), product_ids JSON);
- Create a pipeline to ingest the data from Estuary Flow. For example:

  CREATE PIPELINE estuaryIngest AS
  LOAD DATA KAFKA "dekaf.estuary-data.com:9092/<collection_name>"
  CONFIG '{"security.protocol": "SASL_SSL",
           "sasl.mechanism": "PLAIN",
           "sasl.username": "{<materialization_name>}",
           "broker.address.family": "v4",
           "schema.registry.username": "{<materialization_name>}",
           "fetch.wait.max.ms": "2000"}'
  CREDENTIALS '{"sasl.password": "<auth_token>",
                "schema.registry.password": "<auth_token>"}'
  INTO TABLE estuaryFlow
  FORMAT AVRO
  SCHEMA REGISTRY 'https://dekaf.estuary-data.com'
  ( id <- id,
    created <- created,
    product_ids <- product_ids );

  where:

  - collection_name: Name of the collection linked to the materialization task.
  - materialization_name: Name of the Estuary Flow materialization.
  - auth_token: Authentication token specified for the materialization.
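To make the placeholders concrete, here is the same pipeline with purely hypothetical values filled in. The collection name (acme/orders), materialization name (acme/singlestore-dest), and token (es_token_example_123) are illustrative only; substitute the actual values from your Estuary Flow dashboard.

```sql
-- Hypothetical example: all names and the token below are illustrative.
CREATE PIPELINE estuaryIngest AS
LOAD DATA KAFKA "dekaf.estuary-data.com:9092/acme/orders"
CONFIG '{"security.protocol": "SASL_SSL",
         "sasl.mechanism": "PLAIN",
         "sasl.username": "{acme/singlestore-dest}",
         "broker.address.family": "v4",
         "schema.registry.username": "{acme/singlestore-dest}",
         "fetch.wait.max.ms": "2000"}'
CREDENTIALS '{"sasl.password": "es_token_example_123",
              "schema.registry.password": "es_token_example_123"}'
INTO TABLE estuaryFlow
FORMAT AVRO
SCHEMA REGISTRY 'https://dekaf.estuary-data.com'
( id <- id,
  created <- created,
  product_ids <- product_ids );
```

Note that the same materialization name supplies both the Kafka SASL username and the schema registry username, and the same auth token supplies both passwords.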
- Start the pipeline. For example:

  START PIPELINE estuaryIngest;

  The pipeline starts ingesting data from the Estuary Flow collection into the SingleStore table. Run the SHOW PIPELINES command to view the status of the pipeline.
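After starting the pipeline, you can confirm that data is arriving with a few standard SingleStore queries; the following is a sketch, assuming the table name (estuaryFlow) and pipeline name (estuaryIngest) from the examples above.

```sql
-- Check pipeline state (should report Running once started).
SHOW PIPELINES;

-- Sample the ingested rows in the target table.
SELECT * FROM estuaryFlow LIMIT 10;

-- Count rows to confirm ingestion is progressing.
SELECT COUNT(*) FROM estuaryFlow;
```

Running the COUNT query a few seconds apart is a quick way to verify that new events from the Estuary Flow collection are still being ingested.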
For more information, including the full set of configuration properties, refer to SingleStore | Estuary Flow.
Last modified: October 10, 2025