Repositories list (89 repositories)
- [ICML2025] SpargeAttention: A training-free sparse attention that accelerates any model inference.
- A toolkit for processing raw embodied data into standardized formats and converting between embodied dataset schemas.
- SLA: Beyond Sparsity in Diffusion Transformers via Fine-Tunable Sparse–Linear Attention
- Causal-Forcing: Official codebase for "Causal Forcing: Autoregressive Diffusion Distillation Done Right for High-Quality Real-Time Interactive Video Generation"
- TurboDiffusion: 100–200× Acceleration for Video Diffusion Models
- CausalForcing.github.io
- Official implementation for "RIFLEx: A Free Lunch for Length Extrapolation in Video Diffusion Transformers" (ICML 2025) and UltraViCo (ICLR 2026)
- i-DODE
- SageAttention: [ICLR2025, ICML2025, NeurIPS2025 Spotlight] Quantized Attention achieves a 2–5× speedup over FlashAttention without losing end-to-end metrics across la…
- MLA-Trust: A toolbox for benchmarking the trustworthiness of Multimodal LLM Agents across truthfulness, controllability, safety, and privacy dimensions through 34 interactive task…
- Motus: Official code of Motus: A Unified Latent Action World Model
- vidar-robotwin
- vidar
- UltraViCo.github.io
- tianshou
- UniCardio
- FrameBridge
- DiffusionBridge
- SparseDM
- EmbodiedActiveDefense
- oddefense
- Adaptive-Sparse-Trainer
- MMTrustEval: A toolbox for benchmarking the trustworthiness of multimodal large language models (MultiTrust, NeurIPS 2024 Track Datasets and Benchmarks)
- TetraJet-MXFP4Training
- DDO
- GFT
- cond-image-leakage

All repositories in this list are public.