Conversation

@rom1504 rom1504 (Contributor) commented Jan 21, 2024

I created the all_clip module in order to have a single place that supports all kinds of CLIP models. It is already used in clip-retrieval, and I propose to use it here too.

@rom1504 rom1504 requested a review from mehdidc January 21, 2024 22:04
@rom1504 rom1504 (Contributor, Author) commented Jan 21, 2024

What do you think @mehdidc?

I created the all_clip module in order to have a single place that supports all kinds of CLIP models.
It is already used in clip-retrieval, and I propose to use it here too.
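For context, the loading interface being proposed looks roughly like this; this is a sketch from memory of the all_clip README, so the exact signature and accepted keyword arguments are assumptions rather than a reference:

```python
# Rough illustration of the all_clip entry point, not code from this PR.
# A single loader string selects both the implementation and the checkpoint.
from all_clip import load_clip

model, preprocess, tokenizer = load_clip("open_clip:ViT-B-32/laion2b_s34b_b79k")
```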
@mehdidc mehdidc (Collaborator) commented Jan 22, 2024

@rom1504 I think this is super cool! It makes a lot of sense to have a separate package for that.

Currently, there are two ways to specify which model to use: either by providing --model, --pretrained, and --model_type separately, or by using --pretrained_model, where the model and the pretrained weights are given together, separated by a comma (https://github.com/LAION-AI/CLIP_benchmark/blob/main/clip_benchmark/cli.py#L128). If --pretrained_model is used, it takes precedence over --model / --pretrained.

Maybe, when --pretrained_model is provided, we could additionally detect and support the all_clip string format (e.g., open_clip:ViT-B-32/laion2b_s34b_b79k), to make it easy to use the same string format that all_clip uses. What do you think?
