Moving Data Between Databases
There are several ways to move data between databases; which one fits best depends on whether the data lives in SingleStore Helios or in SingleStore Self-Managed.
- SELECT ... INTO OUTFILE, then LOAD DATA INFILE (see the first sketch after this list). Note that LOAD DATA must be given the same FIELDS and LINES clauses that were used to write the file; its defaults are tab-delimited.

      SELECT * FROM table_name_1
      INTO OUTFILE '/home/username/file_name.csv'
      FIELDS TERMINATED BY ','
      LINES TERMINATED BY '\n';

      LOAD DATA INFILE '/home/username/file_name.csv'
      INTO TABLE table_name_2
      FIELDS TERMINATED BY ','
      LINES TERMINATED BY '\n';
- mysqldump, with or without filters. The optional --where flag limits which rows are dumped; the output is plain SQL that can be replayed against the target database (see the second sketch after this list).

      mysqldump -uroot -p db_name table_name [--where='id<1000000']
- SELECT ... INTO, then CREATE PIPELINE ... AS LOAD DATA, then START PIPELINE. The following example uses S3, but the same pattern works for HDFS, GCS, KAFKA, FS, and AZURE as well (see the third sketch after this list).

      SELECT * FROM table_name_1
      INTO S3 bucket/target
      CONFIG configuration_json
      CREDENTIALS credentials_json;

      CREATE PIPELINE pipeline_name AS
      LOAD DATA S3 'bucket-name' | '<bucket-name/path>'
      [CONFIG '<configuration_json>']
      CREDENTIALS '<credentials_json>'
      INTO TABLE table_name_2;

      START PIPELINE pipeline_name;
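The first sketch expands the OUTFILE/INFILE example for data that may itself contain commas or newlines; the OPTIONALLY ENCLOSED BY '"' clause quotes such fields so they survive the round trip. This is a minimal sketch: the table names and file path are the placeholders from the example above, and table_name_2 is assumed to already exist with a compatible schema.

      -- Export with quoted fields so embedded commas and newlines survive.
      SELECT * FROM table_name_1
      INTO OUTFILE '/home/username/file_name.csv'
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      LINES TERMINATED BY '\n';

      -- Reload with the same FIELDS and LINES clauses used to write the file.
      LOAD DATA INFILE '/home/username/file_name.csv'
      INTO TABLE table_name_2
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      LINES TERMINATED BY '\n';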
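The second sketch shows, in abridged form, the kind of SQL a mysqldump file contains; the table and column definitions here are hypothetical, and real output also includes session-variable settings and other preamble. Restoring on the target is just a matter of replaying this file there.

      -- Abridged, hypothetical contents of a mysqldump file.
      DROP TABLE IF EXISTS `table_name`;
      CREATE TABLE `table_name` (
        `id` BIGINT NOT NULL,
        `payload` VARCHAR(255),
        PRIMARY KEY (`id`)
      );
      -- With --where='id<1000000', only the matching rows are dumped.
      INSERT INTO `table_name` VALUES (1,'first row'),(2,'second row');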
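The third sketch fills in the pipeline skeleton for a one-time S3 copy. The bucket path my-bucket/export/table_name_1, the pipeline name copy_table, the region, and the credential values are all hypothetical; replace them with your own.

      -- Export the source table to S3 (hypothetical bucket and credentials).
      SELECT * FROM table_name_1
      INTO S3 'my-bucket/export/table_name_1'
      CONFIG '{"region": "us-east-1"}'
      CREDENTIALS '{"aws_access_key_id": "YOUR_KEY_ID", "aws_secret_access_key": "YOUR_SECRET"}';

      -- Load the exported files into the target table.
      CREATE PIPELINE copy_table AS
      LOAD DATA S3 'my-bucket/export/table_name_1'
      CONFIG '{"region": "us-east-1"}'
      CREDENTIALS '{"aws_access_key_id": "YOUR_KEY_ID", "aws_secret_access_key": "YOUR_SECRET"}'
      INTO TABLE table_name_2;

      START PIPELINE copy_table;

Because a pipeline keeps watching its source for new files, stop and drop it (STOP PIPELINE copy_table; DROP PIPELINE copy_table;) once the one-time copy has finished.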