Moving Data Between Databases

There are several ways to move data between databases, depending on whether the data is being accessed in SingleStoreDB Cloud or SingleStoreDB On-Premises.

  • SELECT ... INTO OUTFILE, then LOAD DATA INFILE. (A variant that also quotes field values appears after this list.)

    SELECT * FROM table_name_1 INTO OUTFILE '/home/username/file_name.csv'
            FIELDS TERMINATED BY ','
            LINES TERMINATED BY '\n';
    
    LOAD DATA INFILE '/home/username/file_name.csv'
            INTO TABLE table_name_2
            FIELDS TERMINATED BY ','
            LINES TERMINATED BY '\n';
    
  • mysqldump, with or without filters, then replay the dump on the target database.

    mysqldump -u root -p db_name table_name [--where='id<1000000'] > table_name.sql
    mysql -u root -p db_name < table_name.sql
  • SELECT ... INTO, then CREATE PIPELINE ... AS LOAD DATA, then START PIPELINE. The following example uses S3, but this will work for HDFS, GCS, KAFKA, FS, and AZURE as well. (A concrete end-to-end sketch follows this list.)

    SELECT * FROM table_name_1
    INTO S3 'bucket/target'
    CONFIG '<configuration_json>'
    CREDENTIALS '<credentials_json>';
    
    CREATE PIPELINE pipeline_name AS
    LOAD DATA S3 '<bucket-name>' | '<bucket-name/path>'
          [CONFIG '<configuration_json>']
          CREDENTIALS '<credentials_json>'
    INTO TABLE table_name_2;
    
    START PIPELINE pipeline_name;
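
Putting the pipeline method together, here is a minimal end-to-end sketch. The bucket path my-bucket/exports/table_name_1, the region us-east-1, the pipeline name copy_table, and the credential values are all hypothetical placeholders to replace with your own; field and line terminators are left at their defaults on both the export and the load side.

    -- Export the source table to S3
    SELECT * FROM table_name_1
    INTO S3 'my-bucket/exports/table_name_1'
    CONFIG '{"region": "us-east-1"}'
    CREDENTIALS '{"aws_access_key_id": "your_access_key_id", "aws_secret_access_key": "your_secret_access_key"}';

    -- Create a pipeline that loads the exported files into the target table
    CREATE PIPELINE copy_table AS
    LOAD DATA S3 'my-bucket/exports/table_name_1'
          CONFIG '{"region": "us-east-1"}'
          CREDENTIALS '{"aws_access_key_id": "your_access_key_id", "aws_secret_access_key": "your_secret_access_key"}'
    INTO TABLE table_name_2;

    -- Begin ingesting the files
    START PIPELINE copy_table;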
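
If values in table_name_1 can themselves contain the delimiter, the OUTFILE export and the INFILE load must also agree on quoting, not just on terminators. A minimal variant of the first method, using the same hypothetical file path:

    SELECT * FROM table_name_1 INTO OUTFILE '/home/username/file_name.csv'
            FIELDS TERMINATED BY ',' ENCLOSED BY '"'
            LINES TERMINATED BY '\n';

    LOAD DATA INFILE '/home/username/file_name.csv'
            INTO TABLE table_name_2
            FIELDS TERMINATED BY ',' ENCLOSED BY '"'
            LINES TERMINATED BY '\n';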