Warning
SingleStore 9.0 gives you the opportunity to preview, evaluate, and provide feedback on new and upcoming features prior to their general availability. In the interim, SingleStore 8.9 is recommended for production workloads; it can later be upgraded to SingleStore 9.0.
Load JSON Files with LOAD DATA
Data from JSON files can be loaded using the LOAD DATA command.
Syntax
LOAD DATA [LOCAL] INFILE 'file_name'
    [REPLACE | SKIP { CONSTRAINT | DUPLICATE KEY } ERRORS]
    INTO TABLE tbl_name
    FORMAT JSON
    subvalue_mapping
    [SET col_name = expr, ...]
    [WHERE expr, ...]
    [MAX_ERRORS number]
    [ERRORS HANDLE string]

subvalue_mapping:
    ( {col_name | @variable_name} <- subvalue_path [DEFAULT literal_expr], ...)

subvalue_path:
    {% | [%::]ident [::ident ...]}
Semantics
Extracts specified subvalues from each JSON value in file_name and assigns them to specified columns of a new row in tbl_name, or to variables used for a column assignment in a SET clause. If a specified subvalue cannot be found in an input JSON value, the literal in its DEFAULT clause is assigned instead. Rows that do not match the WHERE clause are discarded.
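For example, a nested field can be extracted with a multi-part subvalue_path, and a DEFAULT clause can supply a value when a field is absent. The table, file, and column names below are illustrative, not part of the syntax:

```sql
-- orders.json contains concatenated JSON values, optionally
-- separated by whitespace, e.g.:
--   {"id": 1, "customer": {"name": "Alice"}, "total": 20.5}
--   {"id": 2, "customer": {"name": "Bob"}}

CREATE TABLE orders (id BIGINT, customer_name TEXT, total DOUBLE);

LOAD DATA INFILE 'orders.json'
INTO TABLE orders
FORMAT JSON
(
    id            <- %::id,
    customer_name <- %::customer::name,
    @total        <- %::total DEFAULT 0
)
SET total = @total
WHERE id IS NOT NULL;
```

Here `%::customer::name` extracts the nested `name` field, and the DEFAULT clause assigns 0 for the second record, where `total` is missing.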
To specify the compression type of an input file, use the COMPRESSION clause.
The file named by file_name must consist of concatenated UTF-8-encoded JSON values, optionally separated by whitespace.
Non-standard JSON values such as NaN, Infinity, and -Infinity must not occur in file_name.
If file_name ends in .gz or .lz4, it will be decompressed.
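As a sketch (the file and table names are illustrative), a gzip-compressed JSON file can therefore be loaded directly, with no extra clause:

```sql
-- The .gz suffix triggers automatic decompression.
LOAD DATA INFILE 'orders.json.gz'
INTO TABLE orders
FORMAT JSON
( id <- %::id );
```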
JSON LOAD DATA supports a subset of the error recovery options allowed by CSV LOAD DATA.
Like CSV LOAD DATA, JSON LOAD DATA allows you to use globbing to load data from multiple files.
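For instance, a wildcard in the file path loads every matching file (the path and table names here are illustrative):

```sql
-- The * wildcard expands to all matching files on the file system.
LOAD DATA INFILE '/data/orders/*.json'
INTO TABLE orders
FORMAT JSON
( id <- %::id );
```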
Writing to multiple databases in a transaction is not supported.
LOAD DATA from an AWS S3 Source
JSON files that are stored in an AWS S3 bucket can be loaded via a LOAD DATA query without a pipeline.
LOAD DATA S3 '<bucket name>'
CONFIG '{"region" : "<region_name>"}'
CREDENTIALS '{"aws_access_key_id" : "<key_id>",
              "aws_secret_access_key": "<access_key>"}'
INTO TABLE <table_name>;
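For JSON files, the same S3 form can be combined with FORMAT JSON and a subvalue mapping. A sketch, with placeholder bucket, region, credentials, and table names:

```sql
LOAD DATA S3 'my-bucket/orders.json'
CONFIG '{"region" : "us-east-1"}'
CREDENTIALS '{"aws_access_key_id" : "<key_id>",
              "aws_secret_access_key": "<access_key>"}'
INTO TABLE orders
FORMAT JSON
( id <- %::id );
```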
Last modified: July 25, 2025