Load Data from Estuary Flow
Warning
SingleStore 9.0 is a preview release that lets you evaluate and provide feedback on new and upcoming features prior to their general availability. SingleStore 8.9 remains recommended for production workloads and can later be upgraded to SingleStore 9.0.
You can stream data from Estuary Flow into SingleStore using pipelines, which connect over Estuary's Kafka-compatible Dekaf API.
To load data into SingleStore:
1. Log in to the Estuary Dashboard.
2. Generate a Refresh Token:
   1. On the left navigation pane, select Admin.
   2. On the CLI-API tab, under the Refresh Token section, select Generate Token.
   3. On the Generate Refresh Token dialog, enter a description for the token, and then select GENERATE TOKEN.
   4. Copy and securely store the token. This token is used in the CREDENTIALS clause of the CREATE PIPELINE statement to authenticate the connection. Note: The token is displayed only once.
3. Create a table in SingleStore to store the data to be ingested.
4. Create a pipeline to stream data from Estuary Flow.
Here's an example:
```sql
CREATE TABLE estExample (id INT, title VARCHAR(255));

CREATE PIPELINE estPipeline AS
LOAD DATA KAFKA "<endpoint/database>"
CONFIG '{"security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "{}",
    "broker.address.family": "v4",
    "schema.registry.username": "{}",
    "fetch.wait.max.ms": "2000"}'
CREDENTIALS '{"sasl.password": "<ESTUARY_ACCESS_TOKEN>",
    "schema.registry.password": "<ESTUARY_ACCESS_TOKEN>"}'
INTO TABLE estExample
FORMAT AVRO SCHEMA REGISTRY '<schema_registry>'
( id <- id, title <- title );

START PIPELINE estPipeline;
```
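After starting the pipeline, you can confirm that data is flowing. A minimal verification sketch, assuming the table and pipeline names from the example above (`estExample`, `estPipeline`):

```sql
-- List pipelines and their current state (should show estPipeline as Running)
SHOW PIPELINES;

-- Inspect any ingestion errors recorded for this pipeline
SELECT * FROM information_schema.PIPELINES_ERRORS
WHERE PIPELINE_NAME = 'estPipeline';

-- Confirm that rows are arriving in the target table
SELECT COUNT(*) FROM estExample;
```

If the pipeline reports errors, fix the underlying issue (for example, an expired token in the CREDENTIALS clause) and restart it with START PIPELINE.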
Last modified: December 19, 2024