Load JSON Files with LOAD DATA

Data from JSON files can be loaded into SingleStore using the LOAD DATA command. 

JSON files can be loaded into SingleStore tables in two ways:

  • Fields in the JSON can be mapped to columns in a SingleStore table using FORMAT JSON and a subvalue_mapping.

    • SQL queries over the columns are used to query the JSON data.

  • Entire JSON values can be loaded into a single JSON type column in SingleStore using FORMAT JSON and the <- % subvalue_mapping. Using the JSON type provides:

    • Columnarized storage for efficient analytics.

    • JSON Functions to extract, search, and modify the data.

    • Full-text search and computed columns to optimize lookups.

    • Refer to Using JSON for an overview and examples.

For more information on the LOAD DATA command, refer to LOAD DATA. For examples of using LOAD DATA with JSON files, refer to Load JSON Files Examples.

Syntax

LOAD DATA [LOCAL] INFILE 'file_name'
[REPLACE | SKIP { CONSTRAINT | DUPLICATE KEY } ERRORS]
INTO TABLE tbl_name
FORMAT JSON
subvalue_mapping
[SET col_name = expr,...]
[WHERE expr,...]
[MAX_ERRORS number]
[ERRORS HANDLE string]
subvalue_mapping:
( {col_name | @variable_name} <- subvalue_path [DEFAULT literal_expr], ...)
subvalue_path:
{% | [%::]ident [::ident ...]}

Map JSON Fields to Columns

Use the FORMAT JSON syntax and a subvalue_mapping to map fields from JSON values to columns in SingleStore.

When this syntax is used, for each JSON value in the file file_name, the subvalues at the paths specified in the subvalue_mapping are extracted, and each extracted subvalue is assigned to either:

  • A column of a row in the SingleStore table tbl_name.

  • A variable that is then used in the SET clause to assign a value to a column in a row in the SingleStore table.

Additionally:

  • If a specified subvalue cannot be found in the input JSON, the engine assigns the literal from the DEFAULT clause instead.

  • If a column is not mentioned in the subvalue_mapping, all values of that column are set to NULL.

In the following example:

  • The values from json_subvalue_path1 and json_subvalue_path2 are assigned to the columns singlestore_col_name1 and singlestore_col_name2, respectively.

  • If no value is found for json_subvalue_path2, singlestore_col_name2 is set to the DEFAULT value 42.

  • The value from json_subvalue_path3 is assigned to the variable @varname, and the SET statement assigns the result of the expression on @varname (expr(@varname)) to singlestore_col_name3.

LOAD DATA INFILE 'filename'
INTO TABLE tbl_name
FORMAT JSON
(
singlestore_col_name1 <- json_subvalue_path1,
singlestore_col_name2 <- json_subvalue_path2 DEFAULT 42,
@varname <- json_subvalue_path3
)
SET singlestore_col_name3 = expr(@varname);
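
The following is a more concrete sketch of the same pattern. The table orders, the file /tmp/orders.json, and the JSON field names are hypothetical, and the SET clause uses SingleStore's :> cast operator only as one possible way to convert the extracted string.

CREATE TABLE orders (
  order_id BIGINT,
  customer_name TEXT,
  order_total DECIMAL(10,2)
);

-- /tmp/orders.json holds one JSON value per line, for example:
-- {"id": 1, "customer": {"name": "Ada"}, "total": "19.99"}
-- {"id": 2, "customer": {"name": "Bo"}}

LOAD DATA INFILE '/tmp/orders.json'
INTO TABLE orders
FORMAT JSON
(
  order_id <- id,
  customer_name <- customer::name,   -- nested subvalue_path
  @total <- total DEFAULT '0'        -- missing "total" falls back to the DEFAULT literal
)
SET order_total = @total :> DECIMAL(10,2);

The second input line has no total field, so @total receives the DEFAULT literal '0' and order_total is stored as 0.00.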

Load Entire JSON Value to a Single Column

Use FORMAT JSON and the <- % subvalue_mapping to load entire JSON values into a single SingleStore column.

In the following example, each JSON value in the file is loaded into the column singlestore_col_name of a row in the table tbl_name.

LOAD DATA INFILE 'filename'
INTO TABLE tbl_name
FORMAT JSON
(
singlestore_col_name <- %
);
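
As a hypothetical end-to-end sketch, the following loads whole documents into a JSON column and then queries a field with the ::$ extraction operator; the table events, the file /tmp/events.json, and the event_type field are illustrative.

CREATE TABLE events (
  payload JSON NOT NULL
);

-- Each JSON value in the file becomes one row in events.payload.
LOAD DATA INFILE '/tmp/events.json'
INTO TABLE events
FORMAT JSON
(
  payload <- %
);

-- Extract a field from the stored documents and aggregate over it.
SELECT payload::$event_type AS event_type, COUNT(*) AS cnt
FROM events
GROUP BY event_type;

Refer to Using JSON and JSON Functions for the full set of operators and functions available over JSON columns.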

Remarks

  • To specify the compression type of an input file, use the COMPRESSION clause. See LOAD DATA for more information.

  • The file named by file_name must consist of concatenated UTF-8 encoded JSON values, optionally separated by whitespace. Newline-delimited JSON is accepted, for example.

  • Non-standard JSON values like NaN, Infinity, and -Infinity must not occur in file_name.

  • If file_name ends in .gz or .lz4, it will be decompressed.

  • JSON LOAD DATA supports a subset of the error recovery options allowed by CSV LOAD DATA. Their behavior is as described under Load CSV Files with LOAD DATA.

  • Like CSV LOAD DATA, JSON LOAD DATA allows you to use globbing to load data from multiple files; see the sketch after this list.

  • Writing to multiple databases in a transaction is not supported.

  • The engine discards rows that do not match the WHERE clause.
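
Combining several of these remarks, the following is a hedged sketch of a multi-file load; the glob pattern, target table, and error handle name are illustrative.

LOAD DATA INFILE '/data/orders_*.json'   -- globbing: every matching file is loaded
INTO TABLE orders
FORMAT JSON
(
  order_id <- id,
  customer_name <- customer::name
)
WHERE order_id IS NOT NULL               -- rows that do not match are discarded
MAX_ERRORS 10                            -- cap the number of errors before the load is aborted
ERRORS HANDLE 'orders_json_load';        -- name the load so its errors can be inspected later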

Last modified: September 10, 2025
