Description
Hi @brjathu,
I'm trying to run the PHALP tracker demo on a video. The code fails while loading the pre-trained HMR weights, reporting a corrupted or unreadable archive. I see the same error in related projects such as 4D-Humans and TokenHMR when they install PHALP via pip install git+https://github.com/brjathu/PHALP.git.
Reproduction Steps
1. Create and activate a clean conda environment (e.g. Python 3.10).
2. Install PHALP from GitHub:
   pip install git+https://github.com/brjathu/PHALP.git
3. Clone and enter the 4D-Humans demo folder (or TokenHMR, the PHALP examples, etc.) and install its requirements:
   git clone https://github.com/brjathu/4D-Humans.git
   cd 4D-Humans
   conda activate 4D-humans
   pip install -r requirements.txt
4. Run the tracking demo on the provided video:
   python track.py video.source="example_data/videos/gymnasts.mp4"
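For what it's worth, the failure can be isolated outside the demo by calling torch.load directly on the checkpoint. This is just a minimal sketch; the path below is a placeholder for whatever cfg.hmr.hmar_path resolves to on my machine:

    import os
    import torch

    # Placeholder path: substitute whatever cfg.hmr.hmar_path actually points to.
    ckpt_path = os.path.expanduser("~/.cache/phalp/hmar_weights.pth")

    # Same call that fails inside phalp/models/hmar/hmar.py (load_weights).
    checkpoint = torch.load(ckpt_path)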
Observed Behavior
The script crashes with the following error:
RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory
Full trace:
[07/19 05:38:12] INFO No OpenGL_accelerate module loaded: No module named 'OpenGL_accelerate'
Error executing job with overrides: ['video.source=example_data/videos/gymnasts.mp4']
Traceback (most recent call last):
...
File ".../phalp/models/hmar/hmar.py", line 47, in load_weights
checkpoint_file = torch.load(path)
...
RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory
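As far as I understand, "failed finding central directory" means PyTorch's zip reader cannot locate the archive's end-of-central-directory record. Since torch.save has written checkpoints as zip archives by default since PyTorch 1.6, this usually points to a truncated or corrupted file, or to something that is not a zip at all (e.g. an HTML error page saved under a .pth name). A quick integrity check along these lines (hypothetical path again) confirms the diagnosis without going through PHALP:

    import os
    import zipfile

    # Hypothetical path: substitute the real checkpoint location.
    ckpt_path = os.path.expanduser("~/.cache/phalp/hmar_weights.pth")

    print("size (bytes):", os.path.getsize(ckpt_path))

    # A valid torch.save checkpoint (PyTorch >= 1.6) is a zip archive;
    # is_zipfile looks for the end-of-central-directory record, so a
    # truncated download fails this check.
    print("valid zip:", zipfile.is_zipfile(ckpt_path))

    # First bytes: a zip starts with b'PK'; b'<htm' or b'<!DO' suggests an
    # HTML error page was saved in place of the weights.
    with open(ckpt_path, "rb") as f:
        print("magic:", f.read(4))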
Expected Behavior
The demo should download or access the correct HMR (.pth/.zip) weights and load them without error, then proceed to track keypoints/meshes on the input video.
Troubleshooting Attempts
Verified that cfg.hmr.hmar_path points to a valid file (I printed it before loading).
Manually downloaded the checkpoint archive and re-supplied it to hmr.hmar_path. Still the same error.
Checked file size and permissions; the file is non-zero and readable.
Upgraded/downgraded PyTorch; no change.
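In case the cached download itself is the problem, the next thing I plan to try is deleting the cached weights and letting the demo re-fetch them on the next run. The cache directory below is an assumption (I believe PHALP caches under ~/.cache/phalp, but the location may differ on other setups):

    import os
    import shutil

    # Assumed cache location; adjust if PHALP stores its weights elsewhere.
    cache_dir = os.path.expanduser("~/.cache/phalp")

    if os.path.isdir(cache_dir):
        # Removing the cache forces a fresh checkpoint download on the next run.
        shutil.rmtree(cache_dir)
        print(f"removed {cache_dir}; re-run track.py to trigger a re-download")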
Any guidance on how to resolve this archive-loading error would be greatly appreciated!
Best regards,
Peilun