[OpenVINO] Support openbmb/MiniCPM-o-2_6 for image-text-to-text task #1454
Changes from 28 commits
@@ -116,6 +116,7 @@
 "minicpm": "katuni4ka/tiny-random-minicpm",
 "minicpm3": "katuni4ka/tiny-random-minicpm3",
 "minicpmv": "katuni4ka/tiny-random-minicpmv-2_6",
+"minicpmo": "rkazants/tiny-random-MiniCPM-o-2_6",
Review thread on the "minicpmo" entry:

Comment: this model will slow down our ci greatly, it is 400MB 🫨

Reply: It is the minimal size I managed to reach.

Comment: should be reduced as well

Reply: I reduced it to 144MB. The minimal hidden_size for the llm part is 128: https://huggingface.co/rkazants/tiny-random-MiniCPM-o-2_6/blob/main/modeling_minicpmo.py#L209. @IlyasMoutawwakil, @echarlaix, I propose to do further reduction in follow-up PR(s) if there are any ideas. My other colleagues are anticipating this PR, so let us not block the merge over the tiny model's size. We know the implemented logic is passing the tests in GHA.

Comment: completely agree with @IlyasMoutawwakil's comment, we should be super careful with our tiny random models' size so as not to slow down the CI. Could you expand on the constraints on the different models' parameters @rkazants? For example, in https://huggingface.co/rkazants/tiny-random-MiniCPM-o-2_6/blob/main/config.json#L20 I see …

Comment: Also, if the PR really needs to be merged asap I'm ok with keeping this model, but I would like a follow-up PR that changes it to a smaller model, or, if that cannot be done due to modeling constraints, more information on what the constraints are / why it cannot be done. Would that sound reasonable @rkazants?

Reply: Discussed offline with @echarlaix to proceed with the merge.
"mistral": "echarlaix/tiny-random-mistral", | ||
"mistral-nemo": "katuni4ka/tiny-random-mistral-nemo", | ||
"mixtral": "TitanML/tiny-mixtral", | ||
|
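The mapping above pairs each architecture with a tiny randomly-initialized checkpoint whose only job is to exercise the export path quickly in CI, which is why checkpoint size is scrutinized so closely in the thread below. The size is dominated by how parameter count scales with hidden_size; the estimator below is an illustrative back-of-the-envelope sketch (names and formulas are assumptions for a gated-MLP decoder, not code from this repository):

```python
def estimate_llm_params(hidden_size: int, num_layers: int,
                        vocab_size: int, intermediate_size: int) -> int:
    """Rough parameter count for a gated-MLP decoder-only LM
    (illustrative; ignores norms, biases, and attention head layout)."""
    # input embeddings plus an untied LM head
    embeddings = 2 * vocab_size * hidden_size
    # per layer: 4 attention projections (q, k, v, o) ...
    attention = 4 * hidden_size * hidden_size
    # ... plus 3 gated-MLP projections (gate, up, down)
    mlp = 3 * hidden_size * intermediate_size
    return embeddings + num_layers * (attention + mlp)

# The attention term is quadratic in hidden_size, so shrinking it from
# 512 to 128 cuts that term by 16x -- which is why hidden_size is the
# first knob to turn when slimming a tiny-random test model.
big = estimate_llm_params(512, 2, 32000, 1024)
small = estimate_llm_params(128, 2, 32000, 256)
print(small < big)  # True
```

At float32 precision the on-disk size is roughly 4 bytes per parameter, so halving the parameter count roughly halves the checkpoint.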
@@ -327,6 +328,12 @@
 "clip": {"model": 130},
 "mamba": {"model": 386},
 "falcon-mamba": {"model": 194},
+"minicpmo": {
+    "lm_model": 16,
+    "text_embeddings_model": 1,
+    "vision_embeddings_model": 8,
+    "resampler_model": 6,
+},
 }
 TEST_IMAGE_URL = "http://images.cocodataset.org/val2017/000000039769.jpg"