Debugging Pipeline Errors

Create a Pipeline Using FORCE to Collect More Logs

When the CREATE PIPELINE statement fails, more information can be found with the SHOW WARNINGS command.
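For example, run SHOW WARNINGS immediately after the failed statement:

SHOW WARNINGS;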

If SHOW WARNINGS does not provide enough information, the pipeline can be created using the FORCE option and the errors fixed later. This creates the pipeline regardless of any connection error and stores the errors in information_schema.pipelines_errors.

CREATE PIPELINE <pipeline_name> FORCE AS LOAD DATA <data_source>
INTO TABLE <table_name>;
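As a minimal sketch, assuming a hypothetical S3 source (bucket my-bucket) and an existing table my_table, a forced pipeline might look like this:

CREATE PIPELINE my_pipeline FORCE
AS LOAD DATA S3 'my-bucket/data/'
CONFIG '{"region": "us-east-1"}'
CREDENTIALS '{"aws_access_key_id": "<key>", "aws_secret_access_key": "<secret>"}'
INTO TABLE my_table;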

Use SELECT * to see the stored pipeline errors:

SELECT * FROM information_schema.pipelines_errors;
| DATABASE_NAME | PIPELINE_NAME | ERROR_UNIX_TIMESTAMP | ERROR_TYPE | ERROR_CODE | ERROR_MESSAGE |
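To narrow the output to the most recent errors for a single pipeline (my_pipeline is a hypothetical name), filter and order by the columns shown above:

SELECT PIPELINE_NAME, ERROR_TYPE, ERROR_CODE, ERROR_MESSAGE
FROM information_schema.pipelines_errors
WHERE PIPELINE_NAME = 'my_pipeline'
ORDER BY ERROR_UNIX_TIMESTAMP DESC;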

Enable Debug Logging to Show More Information About Failures

Some data sources support debug logging, which provides additional information along with the errors.

Set the following engine variable and run the following command, in the order they appear (a combined example follows this list):

  • SET GLOBAL pipelines_extractor_debug_logging = ON - this engine variable enables debug logging for pipelines where available.

  • FLUSH EXTRACTOR POOLS - flushes the extractor pools to remove any cached extractors.
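Together, the two steps look like this:

SET GLOBAL pipelines_extractor_debug_logging = ON;
FLUSH EXTRACTOR POOLS;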

With debug logging enabled, the error messages in information_schema.pipelines_errors will include more context about the cause of the failure.

Debug Running Pipelines

There are six tables in the information_schema database that relate to pipelines (an example query follows the list):

  • PIPELINES - stores high-level information about each pipeline.

  • PIPELINES_BATCHES_SUMMARY - contains high-level information about individual batches as they are loaded into the database.

  • PIPELINES_BATCHES - contains detailed, low-level information about individual batches as they are loaded into a database.

  • PIPELINES_ERRORS - contains detailed information about errors that occurred during extraction, transformation, or loading. Each row represents a single error event.

  • PIPELINES_CURSORS - contains information about a pipeline’s offset ranges.

  • PIPELINES_FILES - stores information about files that have been extracted from a file system-like data source, such as Amazon S3.
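For example, to inspect the batches for a single pipeline (my_pipeline is a hypothetical name, and the query assumes PIPELINES_BATCHES_SUMMARY exposes a PIPELINE_NAME column like PIPELINES_ERRORS does):

SELECT * FROM information_schema.PIPELINES_BATCHES_SUMMARY
WHERE PIPELINE_NAME = 'my_pipeline';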
