Jupyter

Warning

Users are encouraged to use Jupyter primarily for testing or prototyping code, rather than for running large, resource-intensive workloads. By default, Jupyter imposes limits on cache, memory, and other settings, which can slow down execution or cause failures.

While these settings can be adjusted at launch, it’s often simpler and more efficient to run Python scripts directly using the Python interpreter, especially for heavier tasks.

There are multiple ways to launch Jupyter on the cluster. At the moment, all of them require basic familiarity with conda environments or, alternatively, with Python’s venv module (see Python’s venv documentation).

Launching Jupyter through a batch job

For the convenience of users, the cluster already comes with pre-defined conda environments where Jupyter is installed: pytorch and tensorflow. Users can use any of these environments or create their own.

1. (Optional) Create a jupyter environment

Log in to the cluster and create a Jupyter environment, adding any additional packages you need, as follows:

module load miniforge3/25.3.1-gcc-11.4.1
conda create -n jupyter python numpy pandas notebook

2. Create the submission script

Log in to the cluster and create the following sbatch script:

#!/bin/bash
#SBATCH --job-name="jupyter"
#SBATCH --time=08:00:00
#SBATCH --nodes=1
#SBATCH --partition=gpu1h100
#SBATCH --gpus-per-node=1
#SBATCH --output=/home/%u/jupyter.%j.out
#SBATCH --error=/home/%u/jupyter.%j.err

module load miniforge3/25.3.1-gcc-11.4.1

# CHANGE THIS TO THE CONDA ENVIRONMENT YOU SEE FIT
conda activate jupyter

# This line prints a random, available port for jupyter to use
PORT=`comm -23 <(seq 1024 65535 | sort) <(ss -Htan | awk '{print $4}' | cut -d':' -f2 | sort -u) | shuf | head -n 1`
PASS=`openssl rand -base64 15`
HASHED_PASS=`python -c "from jupyter_server.auth import passwd; print(passwd('$PASS'))"`

cat 1>&2 <<END
1. SSH tunnel from your workstation using the following command:

   ssh -N -L ${PORT}:`hostname`:${PORT} ${SLURM_JOB_USER}@zurada.rc.louisville.edu

   and point your web browser to http://localhost:${PORT}

2. Log in to Jupyter using the following password: ${PASS}

When done using Jupyter, terminate the job by issuing the following command on the login node:

      scancel -f ${SLURM_JOB_ID}
END

jupyter notebook --no-browser --port=$PORT \
    --ServerApp.ip=0.0.0.0 \
    --PasswordIdentityProvider.hashed_password="$HASHED_PASS"
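The PORT one-liner above builds the list of ports from 1024 to 65535, removes those already in use according to ss, and picks one of the remaining ports at random. A simpler alternative technique (a sketch, not what the script above does) is to let the operating system assign a free ephemeral port:

```python
import socket

# Bind to port 0: the OS assigns an unused ephemeral port.
# Note: the port is released when the socket closes, so there is a small
# window in which another process could grab it before Jupyter starts.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.bind(("", 0))
    port = s.getsockname()[1]

print(port)  # pass this value to: jupyter notebook --port=...
```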

3. Connect to Jupyter from your web browser

The script will print, to the standard error file (the one specified by the #SBATCH --error option in the sbatch script), instructions on how to connect to the Jupyter web instance. Example output:

1. SSH tunnel from your workstation using the following command:

   ssh -N -L 10178:gpusm01:10178 user@zurada.rc.louisville.edu

   and point your web browser to http://localhost:10178

2. Log in to Jupyter using the following password: Oc/ieDDJ0yulxiRP96Wt

When done using Jupyter, terminate the job by issuing the following command on the login node:

      scancel -f 177

Note

10178 is a randomly picked port and Oc/ieDDJ0yulxiRP96Wt is a randomly generated password. Users will therefore be given a new port and password with each job submission.

Launching Jupyter through an interactive job

For the convenience of users, the cluster already comes with pre-defined conda environments where Jupyter is installed: pytorch and tensorflow. Users can use any of these environments or create their own.

1. (Optional) Create a jupyter environment

Log in to the cluster and create a Jupyter environment, adding any additional packages you need, as follows:

module load miniforge3/25.3.1-gcc-11.4.1
conda create -n jupyter python numpy pandas notebook

2. Submit an interactive job

Here is an example, but users should change the parameters as they see fit:

srun --partition=gpu1h100 --job-name jupyter --time=5:00:00 --nodes=1 --pty /bin/bash -i

3. Manually launch Jupyter

When you land on the assigned compute node, take note of its hostname (you can use the hostname command), as you will need it in the following steps. Then start a Jupyter server as follows:

module load miniforge3/25.3.1-gcc-11.4.1
# CHANGE THIS TO THE CONDA ENVIRONMENT YOU SEE FIT
conda activate jupyter
PORT=`comm -23 <(seq 1024 65535 | sort) <(ss -Htan | awk '{print $4}' | cut -d':' -f2 | sort -u) | shuf | head -n 1`
PASS=`openssl rand -base64 15`
HASHED_PASS=`python -c "from jupyter_server.auth import passwd; print(passwd('$PASS'))"`
echo "THIS IS YOUR PASSWORD: ${PASS}"
echo "THIS IS THE PORT JUPYTER IS RUNNING ON: ${PORT}"
jupyter notebook --no-browser --port=$PORT \
    --ServerApp.ip=0.0.0.0 \
    --PasswordIdentityProvider.hashed_password="$HASHED_PASS"
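For reference, openssl rand -base64 15 draws 15 random bytes and base64-encodes them, which always yields a 20-character password. An equivalent sketch in Python:

```python
import base64
import os

# 15 random bytes encode to exactly 20 base64 characters (no '=' padding),
# mirroring `openssl rand -base64 15`.
password = base64.b64encode(os.urandom(15)).decode()
print(password)
```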

4. Access Jupyter from your workstation

From your personal workstation, create an SSH tunnel, substituting the port and hostname from the previous steps and your cluster username:

ssh -N -L <port>:<hostname>:<port> <username>@zurada.rc.louisville.edu

For example, if you landed on the server gpusm02 in step 2 and Jupyter is using port 7070, you would run: ssh -N -L 7070:gpusm02:7070 username@zurada.rc.louisville.edu.

Access Jupyter through your (personal) workstation’s web browser by entering localhost:<port> in the navigation bar. Following the example above, you would use localhost:7070. Then enter the password printed in step 3.

Transitioning from Jupyter to Python Script

Jupyter Notebooks are great for interactive development and prototyping, but for production or large-scale execution, Python scripts are often more efficient and maintainable. This guide outlines the steps to convert your notebook into a Python script.

1. Export the Notebook

In Jupyter:

  • Go to the top menu: File > Download as > Python (.py)

  • This will generate a .py file containing all your code, with markdown cells converted to comments.
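Alternatively, the same export can be done from the command line with jupyter nbconvert --to script your_notebook.ipynb. Under the hood, a notebook is just JSON; the following sketch shows the essence of the conversion, with an inline dict standing in for json.load on a real .ipynb file:

```python
import json  # in a real script: notebook = json.load(open("your_notebook.ipynb"))

# Inline stand-in for a loaded notebook, for illustration only.
notebook = {
    "cells": [
        {"cell_type": "markdown", "source": ["# Analysis\n"]},
        {"cell_type": "code", "source": ["x = 1\n", "print(x)\n"]},
    ]
}

lines = []
for cell in notebook["cells"]:
    if cell["cell_type"] == "markdown":
        # Markdown cells become comments, as in the menu export
        lines.extend("# " + line.rstrip("\n") for line in cell["source"])
    else:
        lines.extend(line.rstrip("\n") for line in cell["source"])

script = "\n".join(lines)
print(script)
```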

2. Clean Up the Script

  • Open the exported .py file in a text editor or IDE.

  • Remove or convert the comments generated from markdown cells (lines starting with #) as needed.

  • Delete any unused cells or outputs.

  • Consolidate code into functions or a main block for better structure:

    def main():
        # your code here
        ...

    if __name__ == "__main__":
        main()

3. Replace Notebook-Specific Features

  • Remove magic commands like %matplotlib inline.

  • Replace interactive display functions with standard Python equivalents.

  • Save plots and images to disk using plt.savefig("filename.png") instead of displaying them with plt.show().
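For example, a plot that a notebook would render inline can be written to disk instead (a sketch, assuming matplotlib is installed in your environment):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend: no display is needed on a compute node
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])
fig.savefig("figure.png")  # instead of plt.show()
plt.close(fig)
```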

4. Handle File Paths and Inputs

  • Use relative or absolute paths for reading/writing files.

  • Consider using argparse to handle command-line arguments for flexibility.
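A minimal argparse sketch (the flag names --input and --epochs are placeholders; rename them to match your notebook’s inputs):

```python
import argparse

parser = argparse.ArgumentParser(description="Run the converted notebook as a script")
parser.add_argument("--input", default="data.csv", help="path to the input data file")
parser.add_argument("--epochs", type=int, default=10, help="number of training epochs")

# In a real script you would call parser.parse_args() with no arguments;
# an explicit list is used here so the example is self-contained.
args = parser.parse_args(["--input", "results.csv", "--epochs", "5"])
print(args.input, args.epochs)
```

This lets you run the same script against different inputs, e.g. python your_script.py --input other.csv --epochs 20, without editing the code.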

5. Test the Script

  • Run the script from the terminal:

    python your_script.py
    
  • Check for errors and ensure outputs match expectations.

By following these steps, you can turn your interactive notebook into a reusable Python script suitable for larger workflows or automated execution.