Commit 1911d49
update gpt-oss-0.12 changes (#293)
* fix: bump the v0 MoE DP implementation up to v1
- remove DP padding support in the v1 worker
- add validation for DP implementation constraints in the v1 worker
- apply a token mask to the custom MoE kernel router logits (see the sketch below)
- update default environment variables:
- VLLM_RBLN_DP_IMPL: "dummy_prefill" -> "padded_decode"
- VLLM_RBLN_USE_MOE_TOKENS_MASK: False -> True
- fix DP metadata handling in the forward context
- add an is_prefills field to RBLNFlashAttentionMetadata
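A minimal sketch of the token-mask idea, assuming a boolean mask over the flattened token batch; `apply_tokens_mask` and the shapes are illustrative, not the actual kernel API:

```python
import torch

def apply_tokens_mask(router_logits: torch.Tensor,
                      tokens_mask: torch.Tensor) -> torch.Tensor:
    # router_logits: [num_tokens, num_experts]
    # tokens_mask:   [num_tokens], True for real tokens, False for padding
    # Zero the routing scores of padded tokens; their MoE outputs are
    # discarded downstream, so this only keeps them from skewing routing.
    return router_logits * tokens_mask.unsqueeze(-1).to(router_logits.dtype)
```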
* fix: mxfp4 kernel for model parallel
+ add an expert_map to handle vLLM model parallelism (see the sketch below)
Signed-off-by: wonsub kim <subang0@rebellions.ai>
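In vLLM's fused MoE layers, an expert_map translates global expert IDs to local ones, with -1 marking experts owned by other ranks. A sketch under the assumption of an even expert split (`build_expert_map` is a hypothetical helper):

```python
import torch

def build_expert_map(global_num_experts: int, ep_size: int,
                     ep_rank: int) -> torch.Tensor:
    # -1 marks experts owned by other ranks; local experts get
    # contiguous local IDs. Assumes experts divide evenly across ranks.
    num_local = global_num_experts // ep_size
    expert_map = torch.full((global_num_experts,), -1, dtype=torch.int32)
    start = ep_rank * num_local
    expert_map[start:start + num_local] = torch.arange(num_local,
                                                       dtype=torch.int32)
    return expert_map
```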
* modify expert_map position
Signed-off-by: wonsub kim <subang0@rebellions.ai>
* fix gpt_oss tensor parallel all_reduce
+ the all_reduce was missing in the gpt_oss MLPBlock under tensor parallelism
Signed-off-by: wonsub kim <subang0@rebellions.ai>
* disable shared fused MoE overlap for RBLN
Signed-off-by: wonsub kim <subang0@rebellions.ai>
* use the reference torch implementation for gpt-oss ops
* apply VLLM_RBLN_USE_MOE_TOKENS_MASK to the mxfp4 MoE path
* adjust available DRAM size based on target arch
+ change the available DRAM size per architecture (see the sketch below):
- ATOM - 16 GB
- REBEL - 140 GB
Signed-off-by: wonsub kim <subang0@rebellions.ai>
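A sketch of the per-architecture lookup implied above; the dict and its name are hypothetical, the sizes are from this commit:

```python
# Per-architecture available DRAM, per the commit notes.
AVAILABLE_DRAM_BYTES = {
    "ATOM": 16 * 1024**3,    # 16 GB
    "REBEL": 140 * 1024**3,  # 140 GB
}

def available_dram(arch: str) -> int:
    return AVAILABLE_DRAM_BYTES[arch]
```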
* fix v1 dp online serving
refactor: improve intermediate-tensor management and dummy-run logic
- add prepare_dummy_run and dummy_run methods for v1 dp online serving
- remove unused sync_and_slice_intermediate_tensors method
- separate intermediate_tensors into prefill_intermediate_tensors
and decode_intermediate_tensors
- improve RBLNWorker device environment initialization
- add support for Ray backend
- add local_world_size calculation
- improve device environment variable setup logic
- decouple RBLN_DEVICES from VLLM_RBLN_TP_SIZE
- change LOCAL_RANK to rank in init_worker_distributed_environment
* add additional params to the data_parallel.py script
+ add the necessary parameters (see the sketch below):
--max-model-len, --block-size, --num-hidden-layers, --decode-batch
Signed-off-by: wonsub kim <subang0@rebellions.ai>
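A minimal argparse sketch wiring up the listed flags; the defaults are illustrative assumptions, not the script's actual values:

```python
import argparse

parser = argparse.ArgumentParser(description="DP example script (sketch)")
# Parameters named in the commit; defaults here are illustrative only.
parser.add_argument("--max-model-len", type=int, default=4096)
parser.add_argument("--block-size", type=int, default=16)
parser.add_argument("--num-hidden-layers", type=int, default=None)
parser.add_argument("--decode-batch", type=int, default=None)
args = parser.parse_args()
```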
* fix calculation of maximum num blocks
+ consider sliding window attention
- DO NOT count sliding window attention blocks,
since they share KV cache blocks with full attention
+ calculate max num blocks under the assumption that
all layers use full attention
- when calculating available memory, count full attention layers,
not sliding window attention layers (see the sketch below)
Signed-off-by: wonsub kim <subang0@rebellions.ai>
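One possible reading of the counting rule above, assuming a uniform per-layer, per-block KV cache cost; the helper and parameter names are illustrative:

```python
def calc_max_num_blocks(available_mem_bytes: int,
                        num_full_attn_layers: int,
                        block_bytes_per_layer: int) -> int:
    # Sliding-window layers share KV cache blocks with full-attention
    # layers, so only full-attention layers contribute per-block cost.
    bytes_per_block = num_full_attn_layers * block_bytes_per_layer
    return available_mem_bytes // bytes_per_block
```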
* fix: port v0.12 scheduler code
* fix: limit decode batch size to (max_num_seqs // pp_size)
* tmp: pad decode inputs to max_num_seqs // pp_size (see the sketch below)
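A sketch of the cap-and-pad step for a 1-D decode input; the zero pad value and helper name are assumptions:

```python
import torch

def pad_decode_inputs(input_ids: torch.Tensor, max_num_seqs: int,
                      pp_size: int) -> torch.Tensor:
    # Cap the decode batch per pipeline stage, then pad up to the cap
    # so the compiled graph sees a fixed batch shape.
    limit = max_num_seqs // pp_size
    num_pad = limit - input_ids.shape[0]
    if num_pad > 0:
        input_ids = torch.cat([input_ids, input_ids.new_zeros(num_pad)])
    return input_ids
```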
* add: simple offline benchmark script
* fix DPMetadata for tokens mask
- remove unused attn_metadata parameter from RBLNDPMetadata.make()
- remove is_prefills field and related logic from DP metadata
- fix get_tokens_mask() for the non-DP case (see the sketch below)
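A sketch of what the non-DP tokens mask could look like: with no cross-rank padding, the mask just marks the locally valid prefix (the signature is hypothetical):

```python
import torch

def get_tokens_mask(num_valid_tokens: int,
                    num_total_tokens: int) -> torch.Tensor:
    # Non-DP case: no padding from other DP ranks, so the mask simply
    # marks the locally valid prefix of the token batch.
    mask = torch.zeros(num_total_tokens, dtype=torch.bool)
    mask[:num_valid_tokens] = True
    return mask
```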
* fix dp with pp dummy run logic
- refactor dummy run execution with DummyRunState and prepare_dummy_run
- update batch size calculation to account for pipeline parallel size
- add batch_pad parameter to attention metadata builder for PP support
* fix max_num_blocks calculation
+ consider the following when calculating max_num_blocks (see the sketch below)
- account for the gpt-oss-20b scale merge in the dequantized version
- account for SWA (sliding window attention) blocks being shared with full attention
- account for the word_embedding params when calculating kernel size,
since they are not loaded onto the device
Signed-off-by: wonsub kim <subang0@rebellions.ai>
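A sketch of the word_embedding part of the accounting, under my reading that those params stay off-device and are therefore excluded from the on-device kernel footprint; all names are illustrative:

```python
def usable_kv_cache_bytes(device_dram_bytes: int, kernel_bytes: int,
                          word_embedding_bytes: int) -> int:
    # word_embedding params are not loaded onto the device (per the
    # commit), so exclude them from the on-device kernel footprint.
    on_device_kernel = kernel_bytes - word_embedding_bytes
    return device_dram_bytes - on_device_kernel
```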
* add optimized batch attention kernel
+ the batch_attention kernel is an optimized version of the flash attention kernel for large batches
- the batch attention kernel takes the original sequence index
- during compiler lowering, the original sequence index is lowered into the following items (see the sketch below):
- seq_idx - cache target block index
- seq_offset - cache target block offset
- dyn_batch - valid batch count for each partition
Signed-off-by: wonsub kim <subang0@rebellions.ai>
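A sketch of the lowering described above; the real transformation happens in the compiler, and the -1 pad sentinel, partition layout, and helper name are assumptions:

```python
import torch

def lower_batch_attention_inputs(seq_indices: torch.Tensor,
                                 block_size: int, partition_size: int):
    # seq_idx: cache target block index
    seq_idx = seq_indices // block_size
    # seq_offset: cache target block offset
    seq_offset = seq_indices % block_size
    # dyn_batch: valid batch count for each partition; assumes the batch
    # is already padded (pad value -1) to a multiple of partition_size.
    dyn_batch = (seq_indices >= 0).view(-1, partition_size).sum(dim=1)
    return seq_idx, seq_offset, dyn_batch
```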
* resolve conflict between bucketing and dp
+ replace max_batch_size with the decode_batch_bucket size
+ disable batch bucketing by default
- change the bucket limit
Signed-off-by: wonsub kim <subang0@rebellions.ai>
* fix num_runtimes
+ fix up num_runtimes (see the sketch below):
- ATOM: num_runtimes = 2 * VLLM_RBLN_TP_SIZE
- REBEL: num_runtimes = 2 * 4 (quad chiplet)
Signed-off-by: wonsub kim <subang0@rebellions.ai>
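A sketch mirroring the constants above; the helper name is hypothetical:

```python
def compute_num_runtimes(arch: str, tp_size: int) -> int:
    if arch == "ATOM":
        return 2 * tp_size  # 2 * VLLM_RBLN_TP_SIZE
    if arch == "REBEL":
        return 2 * 4        # quad chiplet
    raise ValueError(f"unknown arch: {arch}")
```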
* pad seq_idx for batch attention
+ seq_idx SHOULD be padded when num_reqs < decode_batch size (see the sketch below)
Signed-off-by: wonsub kim <subang0@rebellions.ai>
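A sketch of the padding step; the -1 sentinel matches the pad assumption in the batch-attention sketch above and is itself an assumption:

```python
import torch

def pad_seq_idx(seq_idx: torch.Tensor, decode_batch: int) -> torch.Tensor:
    # Pad with -1 (assumed sentinel) up to the decode_batch size.
    num_pad = decode_batch - seq_idx.shape[0]
    if num_pad > 0:
        seq_idx = torch.cat([seq_idx, seq_idx.new_full((num_pad,), -1)])
    return seq_idx
```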
* fix batched decode function call
* remove unused code
* fix up RBLN_METRICS
+ DO NOT count model warm-up (prefill & decode batch buckets)
Signed-off-by: wonsub kim <subang0@rebellions.ai>
* fix typo
Signed-off-by: wonsub kim <subang0@rebellions.ai>
* add specialized MoE decode optimization for DP
- implement a specialized decode path that uses optimized padding when
all requests are in the decode stage (see the sketch below)
- add the VLLM_RBLN_SPECIALIZE_MOE_DECODE environment variable to enable
specialized handling of decode-only batches in MoE models
- refactor RBLNDPMetadata.max_pads_across_dp from int to torch.Tensor
to differentiate specialized decode from normal decode
- add a num_padded_tokens parameter to RBLNDPMetadata.make() and
_set_forward_context()
- add the specialized decode path to batch bucketing
---------
Signed-off-by: wonsub kim <subang0@rebellions.ai>
Co-authored-by: Youngkyu Choi <youngkyu.choi@rebellions.ai>
Co-authored-by: Jaehwang Jung <jaehwang.jung@rebellions.ai>
Co-authored-by: Huijong JEONG <huijong.jeong@squeezebits.com>
Co-authored-by: JaehunRyu <jaehun.ryu@rebellions.ai>
1 parent 9cc2ca5 · commit 1911d49
13 files changed
Lines changed: 1501 additions & 422 deletions
File tree
- examples/experimental
- vllm_rbln
  - model_executor/layers
    - fused_moe
    - quantization
  - models
  - v1
    - attention/backends
    - worker
  - worker