SQL Pushdown Debugging

Use df.explain() to show which parts of a query are pushed down to SingleStore. This is also useful for debugging if you encounter an issue. If you pass the argument true (df.explain(true)), the output includes more information, such as the plans before and after each optimization pass.
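As a minimal sketch (the database, table, and filter here are hypothetical, and `spark` is assumed to be an existing SparkSession configured with the connector):

```scala
// Read a table through the SingleStore connector (names are illustrative)
val df = spark.read
  .format("singlestore")
  .load("mydb.mytable")
  .filter("price > 100")

// Prints the physical plan; pushed-down operations appear inside the
// SingleStore scan node rather than as separate Spark operators
df.explain()

// Prints the parsed, analyzed, optimized, and physical plans,
// showing the query before and after optimization passes
df.explain(true)
```

Reading the physical plan tells you whether a filter or projection was pushed into the generated SQL or is being executed by Spark after the data is fetched.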


In addition, the singlestore-spark-connector outputs a lot of helpful information when the TRACE log level is enabled for the com.singlestore.spark package. You can enable this in your log4j configuration. Make sure not to leave it enabled, since it generates a huge amount of tracing output.
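Assuming a log4j 1.x properties file (the default in many Spark distributions, typically conf/log4j.properties), the entry would look like this; if your deployment uses log4j2, configure the equivalent logger there instead:

```
# Enable verbose connector tracing (remove after debugging)
log4j.logger.com.singlestore.spark=TRACE
```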

Last modified: February 10, 2022
