Moving Data Between Databases
There are several ways to move data between databases, depending on whether the data resides in SingleStore Helios or SingleStore Self-Managed.
- SELECT ... INTO OUTFILE, then LOAD DATA INFILE. Specify the same field and line terminators in both statements so the file is read back the same way it was written. See the first sketch after this list for a variant that skips a header row.

  SELECT * FROM table_name_1
  INTO OUTFILE '/home/username/file_name.csv'
  FIELDS TERMINATED BY ','
  LINES TERMINATED BY '\n';

  LOAD DATA INFILE '/home/username/file_name.csv'
  INTO TABLE table_name_2
  FIELDS TERMINATED BY ','
  LINES TERMINATED BY '\n';
- mysqldump, with or without filters. See the second sketch after this list for restoring the dump into the target database.

  mysqldump -uroot -p db_name table_name [--where='id<1000000']
- SELECT ... INTO, CREATE PIPELINE ... AS LOAD DATA, then START PIPELINE. The following example uses S3, but this also works for HDFS, GCS, KAFKA, FS, and AZURE. See the third sketch after this list for a filled-in S3 example.

  SELECT * FROM table_name_1
  INTO S3 bucket/target
  CONFIG configuration_json
  CREDENTIALS credentials_json;

  CREATE PIPELINE pipeline_name AS
  LOAD DATA S3 'bucket-name' | '<bucket-name/path>'
  [CONFIG '<configuration_json>']
  CREDENTIALS '<credentials_json>'
  INTO TABLE table_name_2;

  START PIPELINE pipeline_name;
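For the OUTFILE/INFILE method, LOAD DATA can also skip a leading header row and map file columns to specific table columns. A minimal sketch; the file path, table name, and column names are placeholders:

-- Skip one header line and load only the named columns, in file order
LOAD DATA INFILE '/home/username/file_name.csv'
INTO TABLE table_name_2
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, col_a, col_b);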
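For the mysqldump method, the dump can be replayed against the target through the standard mysql client, since SingleStore speaks the MySQL wire protocol. A minimal sketch; the host, user, database names, and file name are placeholders, and the generated CREATE TABLE statements may need adjusting for SingleStore (for example, to add a shard key) before loading:

# Dump a filtered table to a file
mysqldump -uroot -p db_name table_name --where='id<1000000' > table_dump.sql

# Replay the dump against the target database
mysql -uadmin -p -h target-host.example.com -P 3306 target_db < table_dump.sql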
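Filled in for S3, the pipeline method looks like the following. A minimal sketch; the bucket path, region, pipeline name, and credentials are placeholders, and the CONFIG/CREDENTIALS keys shown (region, aws_access_key_id, aws_secret_access_key) assume static AWS access keys:

-- Export the source table to object storage
SELECT * FROM table_name_1
INTO S3 'my-bucket/exports/table_name_1'
CONFIG '{"region": "us-east-1"}'
CREDENTIALS '{"aws_access_key_id": "<key_id>", "aws_secret_access_key": "<secret>"}';

-- Create a pipeline that reads the exported files into the target table
CREATE PIPELINE copy_table AS
LOAD DATA S3 'my-bucket/exports/table_name_1'
CONFIG '{"region": "us-east-1"}'
CREDENTIALS '{"aws_access_key_id": "<key_id>", "aws_secret_access_key": "<secret>"}'
INTO TABLE table_name_2;

-- Begin ingesting; SHOW PIPELINES reports the pipeline's state
START PIPELINE copy_table;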