Load models directly from IPFS for Hugging Face Transformers.
- 🌐 Direct integration with local IPFS nodes (preferred method)
- 🔄 Automatic fallback to IPFS gateways when a local node isn't available
- 🔍 Simple `ipfs://CID` URI format for easy model sharing
- ⚡ Zero configuration required - works automatically once installed
- 🧩 Compatible with any version of Transformers
```bash
pip install transformers-ipfs
transformers-ipfs activate
```
Once installed and activated, the `transformers_ipfs` integration is loaded automatically whenever you use Python.
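As a quick sanity check, you can confirm the package is visible to the interpreter you intend to use. Note this only verifies installation, not activation (use `transformers-ipfs status` for that):

```python
# Check that the transformers_ipfs module is importable from this interpreter.
import importlib.util

print(importlib.util.find_spec("transformers_ipfs") is not None)  # expect: True
```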
Use the Transformers library with IPFS model URIs:
```python
from transformers import AutoModel, AutoTokenizer

# Load a model directly from IPFS
tokenizer = AutoTokenizer.from_pretrained("ipfs://bafybeichqdarufyutqc7yd43k77fkxbmeuhhetbihd3g32ghcqvijp6fxi")
# Equivalent Hugging Face Hub model: "riturajpandey739/gpt2-sentiment-analysis-tweets"
model = AutoModel.from_pretrained("ipfs://bafybeichqdarufyutqc7yd43k77fkxbmeuhhetbihd3g32ghcqvijp6fxi")
```
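Since the CID above resolves to a GPT-2 based checkpoint, a quick forward pass confirms the download worked. This is a minimal sketch that assumes PyTorch is installed; the input sentence is arbitrary:

```python
# Smoke test for the objects loaded above (requires PyTorch).
inputs = tokenizer("IPFS makes model sharing simple.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768]) for a GPT-2 base model
```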
The `transformers_ipfs` package prioritizes connectivity in the following order (a simplified sketch of this logic follows the list):
- Local IPFS Node (Recommended): If you have an IPFS daemon running locally (`ipfs daemon`), the package will automatically detect and use it. This method:
  - Is much faster for repeated downloads
  - Loads complex model directories with multiple files more reliably
  - Contributes to the IPFS network by providing content to others
- IPFS Gateway (Fallback): If a local node isn't available, the package will fall back to public gateways. This method:
  - Works without installing IPFS
  - May be less reliable for complex model directories
  - Is more prone to interrupted downloads
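For illustration, here is a minimal sketch of this local-node-first strategy. It is not the package's actual implementation; it assumes the Kubo daemon's default API port (5001) and local gateway port (8080), and uses ipfs.io as one example public gateway:

```python
# Sketch of local-node-first resolution: try the local IPFS daemon,
# fall back to a public gateway if it isn't running.
import urllib.request
import urllib.error

LOCAL_API = "http://127.0.0.1:5001/api/v0/version"
PUBLIC_GATEWAY = "https://ipfs.io/ipfs/"

def local_node_available(timeout: float = 1.0) -> bool:
    """Return True if a local IPFS daemon answers on its default API port."""
    try:
        # Kubo's HTTP API expects POST requests.
        req = urllib.request.Request(LOCAL_API, method="POST")
        with urllib.request.urlopen(req, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False

def resolve(cid: str) -> str:
    """Pick a base URL for a CID: local gateway first, public gateway as fallback."""
    if local_node_available():
        return f"http://127.0.0.1:8080/ipfs/{cid}"  # local gateway
    return f"{PUBLIC_GATEWAY}{cid}"
```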
```bash
# Activate the auto-loading
transformers-ipfs activate

# Check if the integration is active
transformers-ipfs status

# Test the integration
transformers-ipfs test

# Deactivate the integration
transformers-ipfs deactivate
```
- Python 3.7+
- transformers
License: MIT