Use Flow on Helios
Flow on Helios is a fully managed data migration and Change Data Capture (CDC) service within SingleStore Helios.
It includes two components:
- Ingest: Transfers schema and data for tables up to 10 GB, and supports ongoing CDC.
- XL Ingest: Handles larger tables by partitioning them and transferring the partitions in parallel.
Supported Source Databases
The following databases are supported as sources:
- MySQL
- PostgreSQL
- Oracle
- Microsoft SQL Server
- Snowflake
These databases can be self-hosted or hosted on:
- AWS RDS
- AWS Aurora
- Google Cloud SQL
- Azure SQL
Note: On-premises and self-hosted sources must be accessible over the internet and must allow inbound connections from Helios IP addresses.
For prerequisites and setup, refer to Source Database Setup.
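Because Helios must open an inbound connection to the source, it can be useful to sanity-check that the source is reachable from outside your network before configuring it. A minimal sketch using a plain TCP probe (the host and port below are placeholders, not Helios-specific values):

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: probe a hypothetical MySQL source on its default port.
# is_reachable("mysql.example.com", 3306)
```

A successful TCP probe does not prove the database accepts logins from Helios, only that the host and port are open; firewall rules and database grants still need to be configured per the source setup guide.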
Destination Database
- SingleStore database hosted in Helios.
To configure, refer to Destination Database Setup.
Use Flow on Helios
- Go to the Cloud Portal
  Navigate to the Load Data section.
- Select the Source Database
  Choose from:
  - MySQL by Flow
  - PostgreSQL by Flow
  - Oracle by Flow
  - SQL Server by Flow
  - Snowflake by Flow
- Select the Destination Database
  On the Select Destination page:
  - Choose the workspace and the destination database.
    Note: The workspace must be of the Standard type.
  - Under Connection details, choose an existing connection or create a new one, and enter the Connection name.
    Note: The Connection name must not start with a number and cannot contain special characters other than underscores. Spaces are allowed.
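The connection-name rule can be checked up front when scripting against the service. A minimal sketch (the regular expression is one interpretation of the stated rule, not an official validator; it assumes the name starts with a letter or underscore):

```python
import re

# Interpretation of the documented rule: must not start with a digit;
# only letters, digits, underscores, and spaces are allowed.
CONNECTION_NAME = re.compile(r"^[A-Za-z_][A-Za-z0-9_ ]*$")

def is_valid_connection_name(name: str) -> bool:
    """Return True if name satisfies the documented naming rule."""
    return CONNECTION_NAME.fullmatch(name) is not None
```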
- Open the Flow Dashboard
  The Dashboard provides a central interface to configure, manage, and monitor the overall status of an Ingest or XL Ingest instance. It displays information about the source, destination, tables, replication schedule, and a checklist. Click Open Dashboard to launch the dashboard.
- Select Ingestion Type
  Choose Ingest from the dropdown on the top right. Refer to SingleStore Ingest for more information. If the data to be ingested is larger than 10 GB, select XL Ingest from the dropdown instead. Refer to SingleStore XL Ingest for more information.
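When planning a migration in a script, the 10 GB threshold above can be captured in a small helper. A minimal sketch (the function name and byte-based threshold are illustrative assumptions):

```python
# Documented cutoff: tables larger than 10 GB should use XL Ingest.
TEN_GB = 10 * 1024**3

def ingestion_type(table_size_bytes: int) -> str:
    """Pick the Flow ingestion type for a table of the given size."""
    return "XL Ingest" if table_size_bytes > TEN_GB else "Ingest"
```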
- Complete the Checklist
  Begin the data pipeline by following the steps in the Checklist:
  - Setup Source: Configure the source database. Refer to Source Database Setup.
  - Setup Destination: Configure the destination database. Refer to Destination Database Setup.
  - Setup Tables: Configure the tables to be ingested. Refer to Tables for more information.
  - Setup Schedule: Define the replication schedule.
- Start the Pipeline
  After completing the Checklist, start the pipeline from the Operations tab. The job begins based on the defined schedule. Refer to Operations for more information.
- Monitor and Manage the Pipeline
  Use the following tabs to track performance, troubleshoot issues, and adjust settings:
  - Operations: Provides options to manage and troubleshoot the data pipeline, and to handle schema changes. Refer to Operations for more information.
  - Settings: Contains configuration options for the Flow environment. Refer to Settings for more information.
  - Reports: Used to generate reports and view table performance data, organized by day and month. Refer to Reports for more information.
  - Logs: Monitors the progress of extract and load operations. Refer to Logs for more information.
Last modified: October 15, 2025