Running Notebooks from Another Notebook with Fusion SQL

SingleStore Notebooks support running one notebook from another using Fusion SQL, either within the same session or in a new session. This is useful when you want to:

  • Avoid code duplication by keeping shared logic, such as reusable functions or environment setup, in a separate notebook and running it from other notebooks.

  • Run code in parallel by executing notebooks in separate sessions.

  1. Create the Sample Notebook

    You can create a sample notebook in one of two ways:

    • On the Cloud Portal, select Create New > New Notebook > Shared.

    • Run the following Python code in a notebook:

      import nbformat as nbf

      nb = nbf.v4.new_notebook()
      cell = nbf.v4.new_code_cell("""# This is a code cell
      if 'sample_var' not in globals():
          sample_var = 'sample value'
      print('Sample Notebook has been executed!')""")
      cell.metadata = {
          "language": "python"
      }
      nb.cells.append(cell)

      # Save the notebook to a file
      with open('sample_notebook.ipynb', 'w') as f:
          nbf.write(nb, f)
      print("Notebook 'sample_notebook.ipynb' created successfully in the local filesystem.")
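    An .ipynb file is plain JSON, so you can also build an equivalent notebook with only the standard library and round-trip it to confirm the file is well-formed. This is an illustrative sketch, not part of the steps above; the temporary path and raw-JSON approach are assumptions:

    ```python
    import json
    import os
    import tempfile

    # A .ipynb file is JSON; this mirrors the single-cell notebook built with
    # nbformat above, written by hand (notebook format version 4.5).
    nb = {
        "nbformat": 4,
        "nbformat_minor": 5,
        "metadata": {},
        "cells": [{
            "cell_type": "code",
            "metadata": {"language": "python"},
            "execution_count": None,
            "outputs": [],
            "source": "if 'sample_var' not in globals():\n"
                      "    sample_var = 'sample value'\n"
                      "print('Sample Notebook has been executed!')",
        }],
    }

    # Write it out and read it back to confirm the structure survives.
    path = os.path.join(tempfile.mkdtemp(), 'sample_notebook.ipynb')
    with open(path, 'w') as f:
        json.dump(nb, f)

    with open(path) as f:
        loaded = json.load(f)
    assert loaded["cells"][0]["cell_type"] == "code"
    print("Wrote a valid notebook to", path)
    ```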
  2. Upload the Notebook to Shared Notebooks in the Data Studio

    This step generates a unique notebook name by appending a timestamp to the notebook filename to avoid naming conflicts.

    Skip this step if you already created a Shared Notebook via the UI in the first step.

    import time
    sample_notebook_name='Sample Notebook {}.ipynb'.format(int(time.time() * 1_000_000))
    %sql UPLOAD SHARED FILE TO '{{ sample_notebook_name }}' FROM 'sample_notebook.ipynb';
    print("Notebook '{}' has been created in the Data Studio shared files.".format(sample_notebook_name))
  3. Run the Notebook in the Current Session

    Use %run_shared to execute the sample notebook in the current session. Confirm that the sample_var variable set in the sample notebook is accessible in the current session.

    if 'sample_var' in globals():
        del sample_var
    %run_shared {{ sample_notebook_name }}
    print("The value of 'sample_var' is '{}'.\n".format(sample_var))

    Note: If you created the shared notebook via the UI, replace sample_notebook_name in the code above with the actual notebook name in single quotes:

    %run_shared {{ 'Sample Notebook.ipynb' }}
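    Conceptually, running a shared notebook in the current session means executing its code cells in the caller's namespace, which is why sample_var becomes visible afterward. A rough stand-alone model of that behavior (illustrative only; run_notebook_cells is a hypothetical helper, and the real magic also handles templating, outputs, and errors):

    ```python
    # Hypothetical helper: execute each code cell of a notebook (as parsed
    # JSON) in the given namespace, the way cells run in a session.
    def run_notebook_cells(nb_json, namespace):
        for cell in nb_json.get('cells', []):
            if cell.get('cell_type') == 'code':
                src = cell['source']
                exec(''.join(src) if isinstance(src, list) else src, namespace)

    # A minimal notebook equivalent to the sample notebook's cell.
    nb_json = {"cells": [{
        "cell_type": "code",
        "source": "if 'sample_var' not in globals():\n"
                  "    sample_var = 'sample value'\n",
    }]}

    ns = {}
    run_notebook_cells(nb_json, ns)
    print("After running the cells, sample_var =", ns['sample_var'])
    ```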
  4. Run the Sample Notebook in a New Session

    You can also run the sample notebook in a new session as a job, which lets you execute multiple notebooks in parallel with RUN JOB USING NOTEBOOK.

    Refer to RUN JOB USING NOTEBOOK for syntax and more information.

    job_ids = []
    for x in range(2):
        print("Running job for {}...".format(x))
        job_res = %sql RUN JOB USING NOTEBOOK '{{ sample_notebook_name }}' WITH PARAMETERS {"sample_var": "{{x}}"}
        job_ids.append(job_res[0].JobID)
    print(f'Waiting for jobs to complete... {job_ids}')
    success = %sql WAIT ON JOBS {{ job_ids }} WITH TIMEOUT 60 MINUTES
    print(f'All jobs completed with success: {bool(success[0].Success)}')

    Note: If you created the shared notebook via the UI, replace sample_notebook_name in the code with your notebook name in single quotes (without the surrounding {{ }}):

    %sql RUN JOB USING NOTEBOOK 'Sample Notebook.ipynb' WITH PARAMETERS {"sample_var": "{{x}}"}
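    The guard in the sample notebook (if 'sample_var' not in globals()) is what makes WITH PARAMETERS work: a parameter injected before the notebook runs is left untouched, while an unparameterized run falls back to the default. A local simulation of that interaction (a sketch; the real injection is performed by the job runner, not by this code):

    ```python
    # Source of the sample notebook's guarded cell.
    cell_source = (
        "if 'sample_var' not in globals():\n"
        "    sample_var = 'sample value'\n"
    )

    # No parameter injected: the guard assigns the default value.
    ns_default = {}
    exec(cell_source, ns_default)

    # Parameter injected before the cell runs (as WITH PARAMETERS does):
    # the guard sees the existing variable and leaves it alone.
    ns_param = {'sample_var': '0'}
    exec(cell_source, ns_param)

    print('default run ->', ns_default['sample_var'])
    print('parameterized run ->', ns_param['sample_var'])
    ```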
  5. View Job Executions

    Use SHOW JOB EXECUTIONS to inspect job runs.

    for job_id in job_ids:
        execs = %sql SHOW JOB EXECUTIONS FOR {{ job_id }} FROM 1 TO 1
        print(execs)
  6. Delete the Jobs

    Use DROP JOBS to delete the jobs.

    for job_id in job_ids:
        print(f"Dropping job '{job_id}'...")
        %sql DROP JOBS {{ job_id }}

    Note: You can also view, inspect, and delete jobs from the Jobs section in the left navigation on the Cloud Portal. Refer to Scheduling Notebooks with SingleStore Job Service for more information.

  7. Delete the Sample Notebook

    Delete the sample notebook using DROP SHARED FILE, or via the Cloud Portal by selecting Delete from the Actions column for your notebook.

    %%sql
    DROP SHARED FILE '{{ sample_notebook_name }}'
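    DROP SHARED FILE removes only the copy uploaded to Data Studio; the local sample_notebook.ipynb written in the first step remains on disk. If you created it, you can remove it as well (a small optional cleanup, not part of the original steps):

    ```python
    import os

    # Remove the local notebook file from step 1, if it exists.
    if os.path.exists('sample_notebook.ipynb'):
        os.remove('sample_notebook.ipynb')
    print('Local copy removed:', not os.path.exists('sample_notebook.ipynb'))
    ```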

Last modified: July 10, 2025

