Commit 4f28705

Update README.md
1 parent 01ba716 commit 4f28705

File tree

1 file changed: +11 −0 lines changed

1 file changed

+11
-0
lines changed

README.md

Lines changed: 11 additions & 0 deletions
@@ -29,6 +29,7 @@ Masked Image Modeling (MIM) with Vector Quantization (VQ) has achieved great suc
To push the limits of this paradigm, we propose MergeVQ, which incorporates token merging into VQ-based autoregressive generative models to bridge visual generation and representation learning in a unified architecture. During pre-training, MergeVQ decouples top-k semantics from the latent space with a token merge module placed after the self-attention blocks in the encoder, enabling subsequent Look-up Free Quantization (LFQ) and global alignment, and then recovers fine-grained details through cross-attention in the decoder for reconstruction. For second-stage generation, we introduce MergeAR, which performs KV-cache compression for efficient raster-order prediction.
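The token-merging idea can be illustrated with a minimal NumPy sketch. This is an illustration only, not the authors' implementation: the `merge_tokens` helper, the scoring, and the averaging strategy are all assumptions made for clarity. It keeps the top-k tokens ranked by an importance score, averages each remaining token into its most similar kept token, and records the assignment map that a decoder could later use to recover spatial detail.

```python
import numpy as np

def merge_tokens(tokens, scores, k):
    """Keep top-k tokens by score; merge the rest into their nearest kept token.

    tokens: (N, D) token features; scores: (N,) importance scores.
    Returns merged (k, D) tokens and an assignment map (N,) for recovery.
    Illustrative sketch only; MergeVQ's actual module differs in detail.
    """
    keep = np.argsort(scores)[::-1][:k]           # indices of the top-k tokens
    drop = np.setdiff1d(np.arange(len(tokens)), keep)
    merged = tokens[keep].copy()
    counts = np.ones(k)
    # cosine similarity between dropped and kept tokens
    a = tokens[drop] / np.linalg.norm(tokens[drop], axis=1, keepdims=True)
    b = tokens[keep] / np.linalg.norm(tokens[keep], axis=1, keepdims=True)
    nearest = (a @ b.T).argmax(axis=1)            # closest kept token for each dropped one
    assignment = np.empty(len(tokens), dtype=int)
    assignment[keep] = np.arange(k)
    assignment[drop] = nearest
    for d, n in zip(drop, nearest):               # running average into the kept slot
        merged[n] = (merged[n] * counts[n] + tokens[d]) / (counts[n] + 1)
        counts[n] += 1
    return merged, assignment

rng = np.random.default_rng(0)
toks = rng.normal(size=(16, 8))
merged, assign = merge_tokens(toks, rng.random(16), k=4)
print(merged.shape, assign.shape)  # (4, 8) (16,)
```

The assignment map is the key design point: because every source token maps to exactly one merged token, the decoder can un-merge (broadcast merged features back to their source positions) before refining details with cross-attention.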
Experiments on ImageNet verify that MergeVQ as an AR generative model achieves competitive performance in both representation learning and image generation tasks while maintaining favorable token efficiency and inference speed.

Hugging Face: [https://huggingface.co/papers/2504.00999](https://huggingface.co/papers/2504.00999) (upvotes welcome ⬆️)
## Catalog

We plan to release implementations of MergeVQ in the next few months (before CVPR 2025 takes place). Please watch this repository for the latest release, and feel free to open issues for discussion!
@@ -43,6 +44,16 @@ We plan to release implementations of MergeVQ in a few months (before CVPR2025 t
  booktitle={Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2025}
}

@misc{li2025mergevqunifiedframeworkvisual,
  title={MergeVQ: A Unified Framework for Visual Generation and Representation with Disentangled Token Merging and Quantization},
  author={Siyuan Li and Luyuan Zhang and Zedong Wang and Juanxi Tian and Cheng Tan and Zicheng Liu and Chang Yu and Qingsong Xie and Haonan Lu and Haoqian Wang and Zhen Lei},
  year={2025},
  eprint={2504.00999},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2504.00999},
}
```

<p align="right">(<a href="#top">back to top</a>)</p>
