Thanks for maintaining these neat implementations!

It would be great to support transformers 4.48, since the previous versions have a security issue: https://github.com/nvidia-holoscan/holohub/security/dependabot/35

Currently llm-awq supports 4.46 (see line 18 in 52d3c26), and running it against 4.48 fails with:
stderr | File "/workspace/llm-awq/tinychat/models/llama.py", line 146, in __init__
stderr | self.rotary_emb = LlamaRotaryEmbedding(
stderr | TypeError: LlamaRotaryEmbedding.__init__() got an unexpected keyword argument 'max_position_embeddings'
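The error looks like it comes from the LlamaRotaryEmbedding constructor dropping its old keyword arguments in newer transformers releases. Below is only a rough sketch of a version guard that might bridge the two APIs; it assumes transformers >= 4.48 builds the rotary embedding from a LlamaConfig, and the build_rotary_emb helper name is made up for illustration, not part of the codebase.

```python
# Rough sketch, not the project's actual fix. Assumptions: transformers >= 4.48
# constructs LlamaRotaryEmbedding from a LlamaConfig (rotary dim, rope_theta and
# max_position_embeddings are read off the config), while older releases still
# accept the explicit dim/max_position_embeddings/base keyword arguments.
import transformers
from packaging import version
from transformers import LlamaConfig
from transformers.models.llama.modeling_llama import LlamaRotaryEmbedding


def build_rotary_emb(dim: int, max_position_embeddings: int, base: float, device=None):
    """Construct LlamaRotaryEmbedding in a way that works across transformers versions."""
    if version.parse(transformers.__version__) >= version.parse("4.48.0"):
        # Newer API: rope parameters come from the model config instead of kwargs.
        config = LlamaConfig(
            head_dim=dim,  # assumed to determine the rotary dimension
            max_position_embeddings=max_position_embeddings,
            rope_theta=base,
        )
        return LlamaRotaryEmbedding(config=config, device=device)
    # Older API (e.g. 4.46): explicit keyword arguments, as tinychat/models/llama.py uses today.
    return LlamaRotaryEmbedding(
        dim,
        max_position_embeddings=max_position_embeddings,
        base=base,
        device=device,
    )
```

If bumping the minimum supported transformers version to 4.48 is acceptable, the guard could of course be dropped and only the config-based call kept.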