Load Data from Estuary Flow
You can stream data from Estuary Flow into SingleStore through pipelines that connect to Estuary's Kafka-compatible Dekaf API.
To load data into SingleStore:
1. Log in to the Estuary Dashboard.
2. Generate a Refresh Token:
   - On the left navigation pane, select Admin.
   - On the CLI-API tab, under the Refresh Token section, select Generate Token.
   - On the Generate Refresh Token dialog, enter a description for the token, and then select GENERATE TOKEN.
   - Copy and securely store the token. This token is used in the CREDENTIALS clause of the CREATE PIPELINE statement to authenticate the connection. Note: The token is displayed only once.
3. Create a table in SingleStore to store the data to be ingested.
4. Create a pipeline to stream data from Estuary Flow.
Here's an example:
CREATE TABLE estExample (id INT, title VARCHAR(255));

CREATE PIPELINE estPipeline AS
LOAD DATA KAFKA "<endpoint/database>"
CONFIG '{"security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "{}",
    "broker.address.family": "v4",
    "schema.registry.username": "{}",
    "fetch.wait.max.ms": "2000"}'
CREDENTIALS '{"sasl.password": "<ESTUARY_ACCESS_TOKEN>",
    "schema.registry.password": "<ESTUARY_ACCESS_TOKEN>"}'
INTO TABLE estExample
FORMAT AVRO SCHEMA REGISTRY '<schema_registry>'
( id <- id, title <- title );

START PIPELINE estPipeline;
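The CONFIG and CREDENTIALS clauses each take a JSON string, so if you script pipeline creation it can help to build those strings with a JSON library rather than hand-quoting them. A minimal, hypothetical Python sketch of that approach is shown below; the endpoint, token, and schema-registry values are placeholders you would replace with your own, and the helper function itself is illustrative, not part of any SingleStore or Estuary API:

```python
import json


def build_pipeline_sql(endpoint, token, schema_registry,
                       table="estExample", pipeline="estPipeline"):
    """Assemble a CREATE PIPELINE statement for Estuary Flow's Dekaf API.

    All arguments are caller-supplied placeholders. The CONFIG keys mirror
    the example statement in the docs above.
    """
    config = {
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": "{}",
        "broker.address.family": "v4",
        "schema.registry.username": "{}",
        "fetch.wait.max.ms": "2000",
    }
    # The refresh token generated in the Estuary Dashboard is used for both
    # the Kafka SASL password and the schema-registry password.
    credentials = {
        "sasl.password": token,
        "schema.registry.password": token,
    }
    return (
        f'CREATE PIPELINE {pipeline} AS\n'
        f'LOAD DATA KAFKA "{endpoint}"\n'
        f"CONFIG '{json.dumps(config)}'\n"
        f"CREDENTIALS '{json.dumps(credentials)}'\n"
        f"INTO TABLE {table}\n"
        f"FORMAT AVRO SCHEMA REGISTRY '{schema_registry}'\n"
        f"( id <- id, title <- title );"
    )


# Example call with placeholder values (not real credentials or hosts).
sql = build_pipeline_sql("dekaf.example.com/mydb", "my-refresh-token",
                         "https://registry.example.com")
print(sql)
```

You would then run the generated statement, followed by START PIPELINE, through your SQL client of choice.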
Last modified: December 19, 2024