This Docker container provides a pre-configured environment for running PyRIT (Python Risk Identification Tool for generative AI) with JupyterLab integration. PyRIT and all of its dependencies come pre-installed, and the container runs in CPU mode by default with optional GPU support.
- Pre-installed PyRIT with all dependencies
- JupyterLab integration for interactive usage
- CPU mode enabled by default for broad compatibility
- Option to enable GPU support (requires NVIDIA drivers and container toolkit)
- Automatic documentation cloning from the PyRIT repository when `CLONE_DOCS=true`
- Based on Microsoft Azure ML Python 3.12 inference image
.
├── Dockerfile # Container build configuration
├── README.md # This documentation file
├── requirements.txt # Python packages
├── docker-compose.yaml # Docker Compose configuration
├── .env_container_settings_example # Container's example env file (rename to .env.container.settings)
└── start.sh # Container startup script
# Build and start the container in detached mode
docker-compose up -d
# View logs
docker-compose logs -f
# Stop the container
docker-compose down
Once the container is running, open your browser and navigate to:
http://localhost:8888
By default, JupyterLab is configured to run without a password or token.
- CLONE_DOCS: When set to `true` (the default), the container automatically clones the PyRIT repository and copies the documentation files to the notebooks directory. To disable this behavior, set `CLONE_DOCS=false` in your environment or in the `.env.container.settings` file.
- ENABLE_GPU: Set to `true` to enable GPU support (requires NVIDIA drivers and the NVIDIA Container Toolkit). The container defaults to CPU-only mode.
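For reference, a settings file using the defaults described above would look like the sketch below (the provided example file may differ slightly):

```
# .env.container.settings -- container behavior settings (defaults shown)
CLONE_DOCS=true    # Clone PyRIT docs into the notebooks directory on startup
ENABLE_GPU=false   # Set to true only if NVIDIA drivers and the container toolkit are installed
```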
The container expects a .env file and optionally a .env.local file (in the parent directory) to provide secret keys and configuration values, plus a .env.container.settings file for container-specific options. If these files do not exist, create them by copying the provided example files:
cp ../.env.example ../.env
cp ../.env.local_example ../.env.local
cp .env_container_settings_example .env.container.settings
These files are pulled into the container automatically via the env_file entries in docker-compose.yaml; if any of them are missing, you may see errors indicating that required environment files were not found.
- Notebooks: Place your Jupyter notebooks in the `notebooks/` directory. They will be available automatically in JupyterLab.
- Data: Place your datasets or other files in the `data/` directory. Access them from your notebooks at `/app/data/`.
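For example, you can list the mounted data files from a notebook cell (assuming you have already placed files in `data/` on the host):

```python
from pathlib import Path

# The host's ./data directory is mounted at /app/data inside the container
data_dir = Path("/app/data")
for path in sorted(data_dir.iterdir()):
    print(path.name)
```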
Ensure your `notebooks/`, `data/`, and `../assets/` directories have the correct permissions to allow container access:
chmod -R 777 notebooks/ data/ ../assets
To correctly map your local notebooks and data directories into the container, use the following Docker Compose configuration:
services:
pyrit:
build:
context: .
dockerfile: Dockerfile
image: pyrit:latest
container_name: pyrit-jupyter
ports:
- "8888:8888"
volumes:
- ./notebooks:/app/notebooks
- ./data:/app/data
- ../assets:/app/assets
env_file:
- ../.env
- ../.env.local
- .env.container.settings
restart: unless-stopped
healthcheck:
test: ["CMD-SHELL", "curl -sf http://localhost:8888 || exit 1"]
interval: 30s
timeout: 10s
retries: 3
start_period: 40s
volumes:
notebooks:
data:
Edit the `docker-compose.yaml` file to change port mappings, environment variables, or volume mounts as needed.
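For example, to serve JupyterLab on host port 9999 instead of 8888 (an arbitrary choice for illustration), change only the `ports` mapping in the service definition:

```yaml
    ports:
      - "9999:8888"   # host port 9999 -> container port 8888 (JupyterLab)
```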
Start a new notebook in JupyterLab and try the following:
import pyrit
print(pyrit.__version__)
# Example PyRIT usage:
# [Insert your PyRIT usage examples here]
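As a rough starting point, a minimal prompt-sending sketch might look like the following. This assumes a recent PyRIT release and OpenAI credentials in your `.env` file; class and parameter names have changed between PyRIT versions, so check the PyRIT documentation for the release you have installed.

```python
# Minimal sketch only -- the names below (initialize_pyrit, OpenAIChatTarget,
# PromptSendingOrchestrator) reflect recent PyRIT releases and may differ in yours.
from pyrit.common import IN_MEMORY, initialize_pyrit
from pyrit.orchestrator import PromptSendingOrchestrator
from pyrit.prompt_target import OpenAIChatTarget

initialize_pyrit(memory_db_type=IN_MEMORY)

target = OpenAIChatTarget()  # reads endpoint/key settings from the environment
orchestrator = PromptSendingOrchestrator(objective_target=target)

# Jupyter cells support top-level await
results = await orchestrator.send_prompts_async(prompt_list=["Describe what PyRIT is used for."])
for result in results:
    print(result)
```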
To enable GPU support:
- Edit `.env.container.settings` and add or modify the following:

  ENABLE_GPU=true # Enable GPU support

- Restart the container:

  docker-compose down
  docker-compose up -d
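Once the container is back up, you can verify from a notebook that the GPU is actually visible. This uses the bundled PyTorch build noted under the technical details below:

```python
import torch

# Reports True only when NVIDIA drivers, the container toolkit, and ENABLE_GPU=true are all in place
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```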
If you cannot access JupyterLab, check the container logs:
docker-compose logs pyrit
If you encounter permission issues with the notebooks or data directories, adjust the permissions:
chmod -R 777 notebooks/ data/ ../assets/
- Base Image: mcr.microsoft.com/azureml/minimal-py312-inference:latest
- Python: 3.12
- PyTorch: Latest version with CUDA support
- PyRIT: Installed from PyPI (latest version)
You can further customize the container by:
- Modifying the `Dockerfile` to add additional system or Python dependencies.
- Adding your own notebooks to the `/app/notebooks` directory.
- Changing startup options in the `start.sh` script.
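For instance, extra Python packages could be installed by adding a line like the following near the existing dependency installation step in the `Dockerfile` (the package names here are placeholders):

```dockerfile
# Hypothetical example: install additional Python dependencies
RUN pip install --no-cache-dir pandas matplotlib
```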
The JupyterLab instance is configured to run without authentication by default for ease of use. For production deployments, consider adding authentication or running behind a secured proxy.
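One lightweight option is to require a token. Exactly where to set this depends on how `start.sh` launches JupyterLab, but a sketch of the launch line might look like this (the `JUPYTER_TOKEN` variable name is just an example; pass it in via your env files rather than hard-coding it):

```bash
# Refuse to start unless a token is provided
jupyter lab --ip=0.0.0.0 --port=8888 --no-browser \
  --ServerApp.token="${JUPYTER_TOKEN:?JUPYTER_TOKEN must be set}"
```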