Moving Data Between Databases

There are several ways to move data between databases, depending on whether the data is being accessed in SingleStore Helios or SingleStore Self-Managed:

  • SELECT ... INTO OUTFILE, then LOAD DATA INFILE. If values can contain the delimiter, see the quoted variant after this list.

    SELECT * FROM table_name_1 INTO OUTFILE '/home/username/file_name.csv'
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n';
    LOAD DATA INFILE '/home/username/file_name.csv'
    INTO TABLE table_name_2
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n';
  • mysqldump with or without filters. The resulting dump must then be replayed against the target database; see the sketch after this list.

    mysqldump -uroot -p db_name table_name [--where='id<1000000']
  • SELECT ... INTO, CREATE PIPELINE AS LOAD DATA, then START PIPELINE. The following example uses S3, but this also works with HDFS, GCS, KAFKA, FS, and AZURE; a filled-in example appears after this list.

    SELECT * FROM table_name_1
    INTO S3 '<bucket-name/path>'
    CONFIG '<configuration_json>'
    CREDENTIALS '<credentials_json>';
    CREATE PIPELINE pipeline_name AS LOAD DATA S3 '<bucket-name>' | '<bucket-name/path>'
    [CONFIG '<configuration_json>']
    CREDENTIALS '<credentials_json>'
    INTO TABLE table_name_2;
    START PIPELINE pipeline_name;
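
If exported values can themselves contain commas, the plain comma-delimited round trip above will misparse rows. A minimal variant of the first method (reusing the placeholder table and file names from the example) quotes each field on export and accepts quoted fields on import:

    SELECT * FROM table_name_1 INTO OUTFILE '/home/username/file_name.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n';
    LOAD DATA INFILE '/home/username/file_name.csv'
    INTO TABLE table_name_2
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n';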
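
mysqldump writes SQL statements to standard output, so a second step is needed to replay them against the target database. A sketch, assuming the target is reachable at the hypothetical host target-host on the default port:

    mysqldump -u root -p db_name table_name --where='id<1000000' > table_name.sql
    mysql -u root -p -h target-host -P 3306 db_name < table_name.sql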
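
To make the pipeline path concrete, here is a filled-in version of the template above. The bucket path, region, and credential values are hypothetical placeholders, and it assumes the default delimiters are used for both the export and the load:

    SELECT * FROM table_name_1
    INTO S3 'my-bucket/backup'
    CONFIG '{"region": "us-east-1"}'
    CREDENTIALS '{"aws_access_key_id": "<access_key_id>", "aws_secret_access_key": "<secret_access_key>"}';
    CREATE PIPELINE pipeline_name AS LOAD DATA S3 'my-bucket/backup'
    CONFIG '{"region": "us-east-1"}'
    CREDENTIALS '{"aws_access_key_id": "<access_key_id>", "aws_secret_access_key": "<secret_access_key>"}'
    INTO TABLE table_name_2;
    START PIPELINE pipeline_name;

Once the pipeline has loaded the files, it can be stopped with STOP PIPELINE pipeline_name and removed with DROP PIPELINE pipeline_name.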

Last modified: April 19, 2024
