Download and install the latest CUDA Toolkit from the official NVIDIA CUDA Downloads page. After installation, verify it:
nvcc --version
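If you script your setup checks, a small Python sketch (an illustration, not part of any official tooling) can parse the CUDA release reported by `nvcc --version`:

```python
import re

def parse_cuda_release(nvcc_output: str) -> str:
    """Extract the CUDA release (e.g. '12.8') from `nvcc --version` output."""
    match = re.search(r"release (\d+\.\d+)", nvcc_output)
    if match is None:
        raise ValueError("no CUDA release found in nvcc output")
    return match.group(1)

# Example: feed it the captured output of `nvcc --version`.
sample = "Cuda compilation tools, release 12.8, V12.8.61"
print(parse_cuda_release(sample))  # 12.8
```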
Download from the official Visual Studio Build Tools page. During installation, select the following workloads:
- Desktop development with C++
- C++ tools for Linux development
Download Git from https://git-scm.com/downloads/win and follow the installation steps.
Conda helps manage Python environments. You can install either Anaconda or Miniconda from the official site.
There are various ways to install ComfyUI; for example, I used the ComfyUI CLI. Once Python is installed, you can install ComfyUI via the CLI:
pip install comfy-cli
comfy install
To launch ComfyUI:
comfy launch
To ensure correct installation, you need to find the Python interpreter used by ComfyUI. Launch ComfyUI and look for this line in the log:
** Python executable: G:\ComfyuI\python\python.exe
Then verify the Python version and installed PyTorch version:
"G:\ComfyuI\python\python.exe" --version
"G:\ComfyuI\python\python.exe" -m pip show torch
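`pip show torch` reports a version string such as `2.6.0+cu124`, where the suffix names the CUDA build. A hypothetical helper for checking that suffix programmatically:

```python
def torch_cuda_tag(version: str):
    """Return the CUDA tag from a PyTorch version string
    (e.g. '2.6.0+cu124' -> 'cu124'), or None for CPU-only builds."""
    if "+" not in version:
        return None
    local = version.split("+", 1)[1]
    return local if local.startswith("cu") else None

print(torch_cuda_tag("2.6.0+cu124"))  # cu124
print(torch_cuda_tag("2.6.0+cpu"))    # None
```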
Install the PyTorch build appropriate for your setup:
- For most users:
"G:\ComfyuI\python\python.exe" -m pip install torch==2.6 torchvision==0.21 torchaudio==2.6
- For RTX 50-series GPUs (requires PyTorch ≥2.7 with CUDA 12.8):
"G:\ComfyuI\python\python.exe" -m pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu128
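The choice between the two commands can be written as a small decision function; the compute-capability cutoff (12.x for Blackwell/RTX 50-series) is an assumption for illustration:

```python
def pick_torch_install(cc_major: int) -> str:
    """Return pip arguments for a GPU with the given CUDA compute-capability
    major version. Blackwell (RTX 50-series) reports 12.x and needs the
    CUDA 12.8 nightly wheels; earlier GPUs can use the stable 2.6 wheels.
    The cutoff value is an assumption, not an official rule."""
    if cc_major >= 12:
        return ("pip install --pre torch torchvision torchaudio "
                "--index-url https://download.pytorch.org/whl/nightly/cu128")
    return "pip install torch==2.6 torchvision==0.21 torchaudio==2.6"

print(pick_torch_install(12))
print(pick_torch_install(8))
```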
You can install a prebuilt Nunchaku wheel, picking the one that matches your Python and PyTorch versions.
Example (for Python 3.10 + PyTorch 2.6):
"G:\ComfyuI\python\python.exe" -m pip install https://huggingface.co/mit-han-lab/nunchaku/resolve/main/nunchaku-0.2.0+torch2.6-cp310-cp310-win_amd64.whl
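The wheel filename encodes the Nunchaku version, the PyTorch version, and the Python ABI tag. A sketch that reconstructs the name; the pattern is inferred from the single example above and may differ for other releases:

```python
def nunchaku_wheel_name(nunchaku_ver: str, torch_ver: str, py_tag: str) -> str:
    """Build the expected Windows wheel filename, following the naming
    pattern of the example above (an assumption for other releases)."""
    return (f"nunchaku-{nunchaku_ver}+torch{torch_ver}-"
            f"{py_tag}-{py_tag}-win_amd64.whl")

print(nunchaku_wheel_name("0.2.0", "2.6", "cp310"))
# nunchaku-0.2.0+torch2.6-cp310-cp310-win_amd64.whl
```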
To verify the installation:
"G:\ComfyuI\python\python.exe" -c "import nunchaku"
You can also run a test (requires a Hugging Face token for downloading the models):
huggingface-cli login
"G:\ComfyuI\python\python.exe" -m nunchaku.test
Please use CMD instead of PowerShell for building.
Step 1: Install Build Tools
"G:\ComfyuI\python\python.exe" -m pip install ninja setuptools wheel build
Step 2: Clone the Repository
git clone https://github.com/mit-han-lab/nunchaku.git
cd nunchaku
git submodule init
git submodule update
Step 3: Set Up Visual Studio Environment
Locate the VsDevCmd.bat script on your system. Example path:
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\Common7\Tools\VsDevCmd.bat
Then run:
"C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\Common7\Tools\VsDevCmd.bat" -startdir=none -arch=x64 -host_arch=x64
set DISTUTILS_USE_SDK=1
Step 4: Build Nunchaku
"G:\ComfyuI\python\python.exe" setup.py develop
Verify with:
"G:\ComfyuI\python\python.exe" -c "import nunchaku"
You can also run a test (requires a Hugging Face token for downloading the models):
huggingface-cli login
"G:\ComfyuI\python\python.exe" -m nunchaku.test
(Optional) Step 5: Building a Wheel for Portable Python
If building directly with the portable Python fails, you can first build the wheel in a working Conda environment, then install the resulting .whl file with your portable Python:
set NUNCHAKU_INSTALL_MODE=ALL
python -m build --wheel --no-isolation
Then install the wheel produced in the dist folder (replace <wheel-file> with the generated filename):
"G:\ComfyuI\python\python.exe" -m pip install dist\<wheel-file>.whl
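To find the wheel that `python -m build` just wrote to the dist folder, a small helper (hypothetical, not part of Nunchaku) can pick the newest one:

```python
from pathlib import Path

def newest_wheel(dist_dir: str) -> Path:
    """Return the most recently modified .whl file in dist_dir."""
    wheels = sorted(Path(dist_dir).glob("*.whl"),
                    key=lambda p: p.stat().st_mtime)
    if not wheels:
        raise FileNotFoundError(f"no wheels found in {dist_dir}")
    return wheels[-1]
```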
Clone the ComfyUI-Nunchaku plugin into the custom_nodes folder:
cd ComfyUI/custom_nodes
git clone https://github.com/mit-han-lab/ComfyUI-nunchaku.git
Alternatively, install it using ComfyUI-Manager or comfy-cli.
Standard FLUX.1-dev Models
Start by downloading the standard FLUX.1-dev text encoders and VAE. You can also optionally download the original BF16 FLUX.1-dev model. Example commands:
huggingface-cli download comfyanonymous/flux_text_encoders clip_l.safetensors --local-dir models/text_encoders
huggingface-cli download comfyanonymous/flux_text_encoders t5xxl_fp16.safetensors --local-dir models/text_encoders
huggingface-cli download black-forest-labs/FLUX.1-schnell ae.safetensors --local-dir models/vae
huggingface-cli download black-forest-labs/FLUX.1-dev flux1-dev.safetensors --local-dir models/diffusion_models
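For scripting, the same four downloads can be kept as data and rendered into commands; the repository and file names below are copied from the commands above:

```python
# (repo_id, filename, destination) triples for the standard FLUX.1-dev files.
FLUX_FILES = [
    ("comfyanonymous/flux_text_encoders", "clip_l.safetensors", "models/text_encoders"),
    ("comfyanonymous/flux_text_encoders", "t5xxl_fp16.safetensors", "models/text_encoders"),
    ("black-forest-labs/FLUX.1-schnell", "ae.safetensors", "models/vae"),
    ("black-forest-labs/FLUX.1-dev", "flux1-dev.safetensors", "models/diffusion_models"),
]

def download_commands(files):
    """Render each triple as the equivalent huggingface-cli command line."""
    return [f"huggingface-cli download {repo} {name} --local-dir {dest}"
            for repo, name, dest in files]

for cmd in download_commands(FLUX_FILES):
    print(cmd)
```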
SVDQuant 4-bit FLUX.1-dev Models
Next, download the SVDQuant 4-bit models:
- For 50-series GPUs, use the FP4 model.
- For other GPUs, use the INT4 model.
Make sure to place the entire downloaded folder into models/diffusion_models. For example:
huggingface-cli download mit-han-lab/svdq-int4-flux.1-dev --local-dir models/diffusion_models/svdq-int4-flux.1-dev
(Optional): Download Sample LoRAs
You can test with sample LoRAs such as FLUX.1-Turbo and Ghibsky. Place these files in the models/loras directory:
huggingface-cli download alimama-creative/FLUX.1-Turbo-Alpha diffusion_pytorch_model.safetensors --local-dir models/loras
huggingface-cli download aleksa-codes/flux-ghibsky-illustration lora.safetensors --local-dir models/loras
To use the official workflows, download them from the ComfyUI-nunchaku repository and place them in your ComfyUI/user/default/workflows directory. For example:
# From the root of your ComfyUI folder
cp -r custom_nodes/ComfyUI-nunchaku/workflows user/default/workflows/nunchaku_examples
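The `cp -r` command assumes a Unix-style shell; on Windows CMD the same copy can be done from Python with `shutil` (paths mirror the command above):

```python
import shutil
from pathlib import Path

def install_workflows(comfyui_root: str) -> Path:
    """Copy the plugin's bundled example workflows into the user
    workflows folder, mirroring the cp -r command above."""
    root = Path(comfyui_root)
    src = root / "custom_nodes" / "ComfyUI-nunchaku" / "workflows"
    dst = root / "user" / "default" / "workflows" / "nunchaku_examples"
    shutil.copytree(src, dst, dirs_exist_ok=True)
    return dst
```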
You can now launch ComfyUI and try running the example workflows.
If you encounter issues, refer to our: