information_schema.PIPELINES Table

The PIPELINES table stores high-level information about the pipelines that have been created in the cluster. Each row represents a single pipeline. The columns in this table are described below.
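Pipelines can be inspected with an ordinary SELECT against this table; for example (a sketch — the output depends on the pipelines in your cluster):

```sql
SELECT * FROM information_schema.PIPELINES;
```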

information_schema.PIPELINES Table Schema

Column Name

Description


DATABASE_NAME

The name of the database associated with the pipeline.


PIPELINE_NAME

The name of the pipeline.


PIPELINE_ID

The unique ID of the pipeline.


CONFIG_JSON

The pipeline's configuration in JSON format. This JSON is read-only, and it is automatically generated when your pipeline configuration changes. The JSON schema for this column is described in the CONFIG_JSON Schema Definition section below, and is intended for use in a Web-based application (such as MemSQL Ops with SingleStore DB versions earlier than 7.5).


STATE

The current state of the pipeline. Possible values are Running, Error, and Stopped.

Running: The pipeline is currently running. If the pipelines_stop_on_error variable is set to ON, the pipeline has not encountered any errors during extraction, transformation, or loading.

Error: The pipeline encountered an error and is currently stopped. When a pipeline is in the Error state, it must be manually started. If the pipelines_stop_on_error variable is set to OFF, a pipeline cannot enter the Error state; it will remain in the Running state until it is manually stopped, and any errors that occur will be written to the information_schema.PIPELINES_ERRORS table.

Stopped: The pipeline is currently stopped. A pipeline can only enter the Stopped state through manual intervention.


SKIPPED_BATCH_PARTITIONS

The total number of batches that have been skipped by the pipeline. Batches may be skipped when the maximum number of batch retries is reached, which is set using the pipelines_max_retries_per_batch_partition variable.
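The STATE and skipped-batch columns are what you would typically monitor. The following query is a sketch of one way to surface pipelines that need attention:

```sql
-- List pipelines that have stopped on an error or have skipped batches.
SELECT DATABASE_NAME, PIPELINE_NAME, STATE, SKIPPED_BATCH_PARTITIONS
FROM information_schema.PIPELINES
WHERE STATE = 'Error'
   OR SKIPPED_BATCH_PARTITIONS > 0;
```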


The CONFIG_JSON column in the information_schema.PIPELINES table contains a fixed set of read-only JSON key/value pairs. Some of these JSON values can also be seen by executing the SHOW CREATE PIPELINE <pipeline-name> statement. Each key/value pair is described below.

Example CONFIG_JSON for Kafka Pipelines
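For a Kafka pipeline, CONFIG_JSON takes the same general shape as the S3 example below, with the connection string pointing at a Kafka topic instead of a bucket. The sketch below is illustrative only — the pipeline name and broker/topic address are hypothetical, not output from a real cluster:

```json
{
    "name": "my-kafka-pipeline",
    "source_type": "KAFKA",
    "connection_string": "127.0.0.1/my-topic",
    "batch_interval": 0,
    "transform": null,
    "load_error_policy": null,
    "dup_key_policy": null,
    "table": "my_table_name",
    "fields_terminated_by": ",",
    "fields_enclosed_by": "",
    "fields_escaped_by": "\\",
    "lines_terminated_by": "\n",
    "lines_starting_by": "",
    "extended_null": false,
    "column_list": null,
    "on_duplicate_key_update": null
}
```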


Example CONFIG_JSON for S3 Pipelines

    {
        "name": "my-s3-pipeline",
        "source_type": "S3",
        "connection_string": "my-s3-bucket-name",
        "config": "{\"region\": \"us-west-1\"}",
        "credentials": "<CREDENTIALS REDACTED>",
        "batch_interval": 2500,
        "max_partitions_per_batch": -1,
        "transform": null,
        "load_error_policy": null,
        "dup_key_policy": null,
        "table": "my_table_name",
        "fields_terminated_by": ",",
        "fields_enclosed_by": "",
        "fields_escaped_by": "\\",
        "lines_terminated_by": "\n",
        "lines_starting_by": "",
        "extended_null": false,
        "column_list": null,
        "on_duplicate_key_update": null
    }
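Note that the config key holds JSON that is itself encoded as a string (hence the escaped quotes in the example above), so reaching a nested field such as the region takes two decoding passes. A minimal Python sketch, reusing values from the S3 example:

```python
import json

# Decode a CONFIG_JSON value like the S3 example above.
# The "config" key holds JSON encoded as a string, so a second
# json.loads() pass is needed to reach nested fields.
config_json = r'''
{
    "name": "my-s3-pipeline",
    "source_type": "S3",
    "connection_string": "my-s3-bucket-name",
    "config": "{\"region\": \"us-west-1\"}",
    "batch_interval": 2500
}
'''

pipeline = json.loads(config_json)      # first pass: the outer document
inner = json.loads(pipeline["config"])  # second pass: the nested string

print(pipeline["source_type"])  # S3
print(inner["region"])          # us-west-1
```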

CONFIG_JSON Schema Definition

Key Name

Value Description


name

The name of the pipeline.


source_type

The data source type for the pipeline.


connection_string

The name of the S3 bucket or the bucket's object with an optional prefix.


config

The configuration information provided when creating an S3 pipeline, namely the region where the source bucket is hosted.


credentials

Either the Kafka topic URL for the pipeline or <CREDENTIALS REDACTED> for an S3 pipeline.


batch_interval

The time duration in milliseconds between batch extraction operations.


transform

The transform's URI, executable entry point, and arguments.


load_error_policy

The load error policy for the pipeline. For example, if IGNORE or SKIP ... ERRORS was specified during pipeline creation, it will appear as a JSON key/value pair like so: {"load_error_policy": "skip_all_errors"}


dup_key_policy

The duplicate key policy that indicates how a row should be inserted if it contains a duplicate key value.


table

The name of the table in which to insert data.


fields_terminated_by

The character that terminates a field.


fields_enclosed_by

The character that encloses a field.


fields_escaped_by

The character that escapes a field.


lines_terminated_by

The character that terminates a line.


lines_starting_by

The string prefix for a line.


extended_null

Specifies whether the non-quoted and case-insensitive string null will be loaded as a null type.


column_list

The column list to load data into.


on_duplicate_key_update

Specifies whether duplicate keys will be updated or ignored.


running

Specifies whether the pipeline is currently running.