Important
The SingleStore 9.0 release candidate (RC) gives you the opportunity to preview, evaluate, and provide feedback on new and upcoming features prior to their general availability. In the interim, SingleStore 8.9 is recommended for production workloads; these deployments can later be upgraded to SingleStore 9.0.
Use Flow on Helios
Flow on Helios is a fully managed data migration and Change Data Capture (CDC) service within SingleStore Helios.
It includes two components:
- Ingest: Transfers schema and data for tables up to 10 GB, and supports ongoing CDC.
- XL Ingest: Handles larger tables by partitioning them and transferring the partitions in parallel.
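The 10 GB limit applies per table, so it helps to know table sizes on the source before choosing between Ingest and XL Ingest. The following queries are an illustrative sketch only (the schema name is a placeholder, and reported sizes are estimates):

    -- MySQL: approximate per-table size in GB (replace 'my_source_db' with your schema)
    SELECT table_name,
           ROUND((data_length + index_length) / POW(1024, 3), 2) AS size_gb
    FROM information_schema.tables
    WHERE table_schema = 'my_source_db'
    ORDER BY size_gb DESC;

    -- PostgreSQL: total relation size for user tables, largest first
    SELECT relname AS table_name,
           pg_size_pretty(pg_total_relation_size(relid)) AS total_size
    FROM pg_catalog.pg_statio_user_tables
    ORDER BY pg_total_relation_size(relid) DESC;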
Supported Source Databases
The following databases are supported as sources:
- MySQL
- PostgreSQL
- Oracle
- Microsoft SQL Server
- Snowflake
These databases can be self-hosted or hosted on:
- AWS RDS
- AWS Aurora
- Google Cloud SQL
- Azure SQL
Note: On-premises and self-hosted sources must be accessible over the internet and must allow inbound connections from Helios IP addresses.
For prerequisites and setup, refer to Source Configuration.
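Source Configuration lists the exact prerequisites for each source. As an illustrative, non-exhaustive sketch, CDC from MySQL and PostgreSQL typically depends on binary logging and logical WAL output, which you can verify with standard commands:

    -- MySQL: binary logging is typically required, in row-based format, for CDC
    SHOW VARIABLES LIKE 'log_bin';        -- commonly expected: ON
    SHOW VARIABLES LIKE 'binlog_format';  -- commonly expected: ROW

    -- PostgreSQL: logical decoding typically requires wal_level = logical
    SHOW wal_level;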
Destination Database
- SingleStore database hosted in Helios.
To configure, refer to Destination Configuration.
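Destination Configuration covers the full setup. As a minimal sketch, assuming you want a dedicated database on the selected workspace (the database name below is a placeholder), you can create it with standard SQL before starting Flow:

    -- Run against the SingleStore Helios workspace that will receive the data.
    CREATE DATABASE IF NOT EXISTS flow_destination_db;  -- placeholder name
    SHOW DATABASES;                                      -- confirm the database exists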
Use SingleStore Flow
1. Go to the Cloud Portal
   Navigate to the Load Data section.
2. Select the Source Database
   Choose from:
   - MySQL by Flow
   - PostgreSQL by Flow
   - Oracle by Flow
   - SQL Server by Flow
   - Snowflake by Flow
3. Select the Destination Database
   On the Select Destination page:
   - Choose the workspace and the destination database.
     Note: The workspace should be of the Standard type.
   - Under Connection details, choose an existing connection or create a new one, and enter the Connection name.
     Note: The Connection name must not start with a number and cannot contain special characters other than underscores; spaces are allowed. For example, flow_mysql_01 is valid, but 1_flow and flow-prod are not.
4. Open the Flow Dashboard
   The Dashboard provides a central interface to configure, manage, and monitor the overall status of an Ingest or XL Ingest instance. It displays information about the source, destination, tables, replication schedule, and a checklist. Click Open Dashboard to launch the dashboard.
5. Select the Ingestion Type
   Choose Ingest or XL Ingest from the Programs panel on the right.
6. Complete the Checklist
   Begin the data pipeline by following the steps in the Checklist:
   - Setup Source - Configure the source database. Refer to Source Configuration.
   - Setup Destination - Configure the destination database. Refer to Destination Configuration.
   - Setup Tables - Configure the tables to be ingested. Refer to Tables.
   - Setup Schedule - Define the replication schedule. Refer to Schedule.
7. Start the Pipeline
   After completing the Checklist, start the pipeline from the Operations tab. The job begins based on the defined schedule. Refer to Operations for detailed information.
8. Monitor and Manage the Pipeline
   Use the following tabs to track performance, troubleshoot issues, and adjust settings. A basic row-count validation check is sketched after this procedure.
   - Operations - Provides options to manage and troubleshoot the data pipeline and handle schema changes.
   - Settings - Contains configuration options for the Flow environment. Refer to Settings.
   - Reports - Used to generate reports and view table performance data, organized by day and month. Refer to Reports.
   - Logs - Monitors the progress of extract and load operations. Refer to Logs.
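Once the pipeline is running (steps 7 and 8), a simple sanity check is to compare row counts between the source and the destination. This is a hedged sketch; the database and table names are placeholders:

    -- On the source (for example, MySQL):
    SELECT COUNT(*) AS source_rows FROM my_source_db.orders;

    -- On the SingleStore Helios destination:
    SELECT COUNT(*) AS destination_rows FROM flow_destination_db.orders;

    -- With ongoing CDC, counts converge once pending changes are applied;
    -- a small transient difference is expected while changes are in flight.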
Remarks
For additional information and reference, refer to:
- SingleStore Ingest
- SingleStore XL Ingest
- Appendix
- SingleStore Operational FAQ
Last modified: September 25, 2025