Conversation
Force-pushed from 0270368 to 6fca7af
    def _build_output(
        all_data: Union[torch.Tensor, list[torch.Tensor]],
    ) -> PoolerOutput:
        """Wrap tensor data into vLLM's PoolerOutput format."""
        all_outputs = [PoolingSequenceGroupOutput(data) for data in all_data]
Was PoolingSequenceGroupOutput removed from vLLM?
The PoolingSequenceGroupOutput class has been removed.
Each pooler class now handles the hidden_states tensor directly, eliminating the need for the _build_output function.
v0.10.2: https://github.com/vllm-project/vllm/blob/v0.10.2/vllm/model_executor/layers/pooler.py
v0.13.0: https://github.com/vllm-project/vllm/blob/v0.13.0/vllm/model_executor/layers/pooler.py
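For illustration, a minimal sketch of the before/after pattern. The classes and function names here are hypothetical stand-ins for the removed vLLM wrapper, using plain lists in place of torch tensors so the sketch stays dependency-free:

```python
# Hypothetical stand-in for the removed vLLM wrapper class; plain lists
# substitute for torch tensors to keep the sketch self-contained.
class PoolingSequenceGroupOutput:
    def __init__(self, data):
        self.data = data

# Old pattern (pre-v0.13.0 style): wrap every pooled result before returning.
def _build_output(all_data):
    return [PoolingSequenceGroupOutput(d) for d in all_data]

# New pattern (v0.13.0 style): the pooler returns the pooled data directly,
# with no intermediate wrapper objects to construct or unwrap.
def pool_direct(all_data):
    return list(all_data)

wrapped = _build_output([[0.1, 0.2], [0.3]])
direct = pool_direct([[0.1, 0.2], [0.3]])
```

The removal saves one allocation per sequence group and lets callers consume the tensors without unwrapping a `.data` attribute.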
     if image_embeds is not None:
-        if not isinstance(image_embeds, (torch.Tensor, list)):
+        if not isinstance(image_embeds, torch.Tensor | list):
Are we deprecating py3.9?
vLLM has not supported Python 3.9 since version 0.11.1.
https://github.com/vllm-project/vllm/blob/v0.11.1/pyproject.toml
We are dropping support for Python 3.9, and the other SDKs are being updated accordingly.
Force-pushed from 1448f6e to 4952138
    block_size: int = 16,
    max_model_len: Optional[int] = None,
    async_scheduling: bool = False,
    is_torch_compile: bool = False,
Great. We'll manage the tests for the torch.compile path separately, without relying on the optimum-path code, so you don't need to cover the torch.compile side in these tests!
rebel-jaehwang left a comment:
I have reviewed files under v1/.
Force-pushed from 9fbd809 to 5f60ee7
🚀 Summary of Changes
📌 Related Issues / Tickets
✅ Type of Change
feature / model / core / bug-fix / perf / refactor / docs / other (please describe)
🧪 How to Test
📸 Screenshots / Logs (if applicable)
📋 Checklist
💬 Notes