
Remove support for extras (flash attention, iQ quants) #284

Merged: 1 commit merged into release/v2.4.0 from general/remove_LlamaLib_extras on Dec 2, 2024

Conversation

amakropoulos (Collaborator)

No description provided.

amakropoulos merged commit c4c7488 into release/v2.4.0 on Dec 2, 2024
1 check passed
amakropoulos deleted the general/remove_LlamaLib_extras branch on December 2, 2024 at 15:33
Labels: none
Projects: none