Flow on Helios FAQ

Do I need a license key to use Flow on Helios?

No, a license key is not required. You are only billed for the time your Flow instance is running.

How does billing for Flow work, and how do I pay for it?

Flow on Helios uses a pay-per-usage model. You pay only for the time your Flow instance is running. Charges are deducted from your Helios credits, so no separate payment is needed.

Where can I view the logs?

On the Cloud Portal, go to Load Data, then select your Flow pipeline and click Details. When the dashboard opens, you will see the Logs tab on the right side of the toolbar.

Which IP addresses do I need to whitelist on the source database?

You must whitelist the outbound IP addresses of your SingleStore workspace in your source database’s network configuration. To find them, go to Deployments > Workspace > Firewall > Outbound.
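The exact allowlist syntax depends on your source database's platform, but most cloud firewalls (AWS security groups, GCP firewall rules, and so on) expect CIDR notation, so each outbound address becomes a single-host /32 rule. A minimal sketch, assuming placeholder addresses (copy the real ones from Deployments > Workspace > Firewall > Outbound):

```python
import ipaddress

# Placeholder values: replace with the real outbound IPs shown in
# Deployments > Workspace > Firewall > Outbound in the Cloud Portal.
outbound_ips = ["203.0.113.10", "203.0.113.11"]

# Validate each address and convert it to a single-host CIDR rule
# suitable for most cloud firewall allowlists.
rules = [f"{ipaddress.ip_address(ip)}/32" for ip in outbound_ips]
print(rules)  # ['203.0.113.10/32', '203.0.113.11/32']
```

Using `ipaddress.ip_address` here also catches typos early: a malformed address raises `ValueError` instead of silently producing a rule that never matches.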

Why am I getting the error: "Unable to connect to destination database"?

Please verify that your username and password are correct in the Destination Database configuration. To reset your SingleStore database password, go to Deployments > Connect > Select your app > Reset password.

If I reset the password for my SingleStore workspace, do I need to update it in all my Flow instances?

Yes. You must update all your Flow pipelines with the new password.

I am unable to view the destination database while creating a Flow pipeline, even though my workspace is active and a database is already attached. Why is that?

This issue may be due to firewall restrictions. Verify your firewall settings and add your current IP address to the list of allowed inbound IPs to ensure database access.

Can I connect to my source database using Flow via private links?

Yes. To connect, create an outbound private link in the Cloud Portal. Refer to Configure Outbound Connections for more information. 

How can I copy all the source tables to a SingleStore database of my choice?

On the Flow dashboard, go to the Destination Database configuration tab and select Advanced Options. Enter the database name in the Schema for all tables and Schema for staging tables fields.

Can I define a schema for tables before extraction?

Yes. There are two ways to do this:

Option 1: 

  1. Create all the tables using custom SQL queries.

  2. Go to the Destination Database configuration tab, select Advanced Options and select Truncate table instead of drop.

Option 2:

  1. Select the tables you want to move to SingleStore and enable Skip Initial Extract. 

  2. Go to Operations and do a Full Extract. This creates the tables in the SingleStore database.

  3. Once the tables are created, modify them to add shard or sort keys. 

  4. After modification, go to the Destination Database configuration tab, select Advanced Options and select Truncate table instead of drop.
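For step 3 of Option 2, note that shard and sort keys in SingleStore are declared in the table definition, so modifying a table generally means recreating it with the desired keys. The sketch below builds such a CREATE TABLE statement; the table and column names are hypothetical, not part of Flow:

```python
def create_table_ddl(table, columns, shard_key, sort_key):
    """Build a SingleStore CREATE TABLE statement with shard/sort keys."""
    cols = ",\n  ".join(f"{name} {ctype}" for name, ctype in columns)
    return (
        f"CREATE TABLE {table} (\n"
        f"  {cols},\n"
        f"  SHARD KEY ({shard_key}),\n"
        f"  SORT KEY ({sort_key})\n"
        ");"
    )

# Hypothetical example table.
ddl = create_table_ddl(
    "orders",
    [("order_id", "BIGINT"), ("customer_id", "BIGINT"), ("created_at", "DATETIME")],
    shard_key="customer_id",
    sort_key="created_at",
)
print(ddl)
```

Choosing the shard key to match your most common join or filter column keeps related rows on the same partition, which is the usual reason for adding keys before the full load.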

How do I copy just the schema of the source before migration?

Select the tables you want to move to SingleStore and enable Skip Initial Extract. Go to Operations and do a Full Extract. This creates the tables in the SingleStore database.

When a connection fails, the error message says "Connection string is invalid. Unable to parse". How can I identify the issue?

This error typically indicates that one of the configuration fields contains a space, or that the hostname, port, or database name is incorrect. Check these fields for formatting issues or incorrect values and try again.
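A quick way to catch these problems is to check each configuration value before saving it. The sketch below is illustrative only; the field names are not Flow's actual configuration keys:

```python
def find_config_issues(config):
    """Return a list of problems that commonly cause
    'Connection string is invalid. Unable to parse'."""
    issues = []
    for field, value in config.items():
        # Leading, trailing, or embedded spaces break connection-string parsing.
        if value != value.strip() or " " in value:
            issues.append(f"{field}: contains whitespace")
    port = config.get("port", "")
    if not port.isdigit() or not (0 < int(port) < 65536):
        issues.append("port: must be a number between 1 and 65535")
    return issues

# A hypothetical configuration with a trailing space in the hostname.
print(find_config_issues({"host": "db.example.com ", "port": "3306", "database": "sales"}))
# ['host: contains whitespace']
```

Trailing spaces are the most common culprit because they are invisible when values are pasted from a password manager or spreadsheet.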

Why didn't my scheduled pipeline trigger?

This can happen if the scheduler is turned off or misconfigured, for example, set to run at 00h 00m 00s. For file-based replication on the source (for example, MySQL or Oracle Log Miner), Ingest may be waiting for the next log file to be created.

I have a database with 1TB of data. I want to migrate all the data to SingleStore and enable CDC for new transactions to my source table. How do I proceed?

For this use case, you will need both Ingest and XL Ingest. Follow these steps for the migration:

  1. Identify tables greater than 5GB. 

  2. Select the tables from the list and select Skip Initial Extract.

  3. Go to Operations and do a Full Extract. This creates the selected tables in SingleStore without any data. 

  4. Verify that the tables are created in SingleStore.

  5. Select XL Ingest from the dropdown list on the top right of the dashboard. 

  6. Migrate the tables using XL Ingest. Refer to SingleStore XL Ingest for more information. 

  7. Once the migration is done, go to Ingest and select all the tables.

  8. Go to Operations and select Sync New Tables. This migrates smaller tables first and then starts the CDC for all the tables. 
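Step 1 above amounts to partitioning your tables by size at the 5 GB mark. A small sketch of that split, using made-up table names and sizes:

```python
XL_THRESHOLD_BYTES = 5 * 1024**3  # the 5 GB cutoff from step 1

# Hypothetical table sizes in bytes, e.g. gathered from the source catalog.
table_sizes = {
    "events": 800 * 1024**3,   # 800 GB -> XL Ingest
    "orders": 12 * 1024**3,    # 12 GB  -> XL Ingest
    "users": 2 * 1024**3,      # 2 GB   -> regular Ingest
}

xl_tables = sorted(t for t, size in table_sizes.items() if size > XL_THRESHOLD_BYTES)
small_tables = sorted(t for t, size in table_sizes.items() if size <= XL_THRESHOLD_BYTES)
print(xl_tables, small_tables)  # ['events', 'orders'] ['users']
```

The tables in the first group go through steps 2-6 (XL Ingest), while the rest are picked up automatically by Sync New Tables in step 8.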

I want to update my scheduler. How do I do it?

To update the scheduler, go to the Schedule tab, update the configuration, and click Apply. Note that updating the scheduler applies only to time-based replication (MySQL Continuous Log Miner, Oracle Log Miner, SQL Server), not to file-based replication.

Error messages appear at the top of the screen and disappear quickly, making them hard to notice. How can I see them more easily?

You can go to the Logs tab and view all the errors for your instance.

I configured firewall rules to allow IP access, but I still cannot connect Flow to my source database. What could be the issue?

Ensure that the correct outbound IP addresses from your SingleStore workspace have been added to your source database's network configuration. Also, make sure the network allows connections on the port your database is using.

I have successfully established a connection, but I'm encountering authentication errors. Why is the connection failing after setup?

Flow does not automatically update database passwords if they are changed after the connection is created. If the username or password for your source or destination database has been modified, you may receive authentication errors. To resolve this, update the credentials in Flow and re-test the connection.

Why does Flow create a new database when I have specified a destination database during pipeline creation?

By default, Flow creates a new database with the same name as the source database. To avoid this, go to the Destination Database configuration tab, select Advanced Options, and specify the database name in the Schema for all tables field.

How can I monitor the progress of data ingestion without accessing logs?

You can track the ingestion progress from the Flow dashboard. The extraction progress bar provides real-time information on how much data has been transferred and how much is still pending.

How can I load multiple source tables into one target table in Flow?

Currently, Flow does not support loading multiple source tables into a single target table.

How can I add a prefix to the database names migrated from source to SingleStore?

Go to the Destination Database configuration tab, select Advanced Options, and add a prefix of your choice in the Add Database Prefix field.

I see eff_dt and end_dt column errors in the logs. How can I resolve this?

These errors typically occur when Maintain History was enabled during the initial extract but later disabled. This leads to a schema mismatch between the source and the target (SingleStore) database. To resolve this, you can choose one of the following options:

  • Delete the eff_dt and end_dt columns from your SingleStore database.

  • Drop the table from your SingleStore database, select Redo Initial Extract for the table, then go to Operations and select Sync New Tables.
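For the first option, removing the two history columns is a plain ALTER TABLE on the SingleStore side. The sketch below builds one statement per column; the table name is hypothetical:

```python
HISTORY_COLUMNS = ("eff_dt", "end_dt")  # columns added by Maintain History

def drop_history_columns(table):
    """Build one ALTER TABLE statement per history column."""
    return [f"ALTER TABLE {table} DROP COLUMN {col};" for col in HISTORY_COLUMNS]

# Hypothetical table name.
for stmt in drop_history_columns("orders"):
    print(stmt)
# ALTER TABLE orders DROP COLUMN eff_dt;
# ALTER TABLE orders DROP COLUMN end_dt;
```

Run the generated statements against each affected table in your SingleStore database before resuming the pipeline.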

Is the ENUM data type supported in Flow's data type casting?

No, Flow does not currently support the ENUM data type in data type casting.

Last modified: October 16, 2025
