Add Stable Diffusion 3.5 Medium support #4072

Open

zlaazlaa wants to merge 9 commits into alibaba:master from zlaazlaa:support_sd35

Conversation

@zlaazlaa
Contributor

No description provided.

Imported from https://github.com/google/sentencepiece at tag v0.2.1
Signed-off-by: zlaazlaa <2889827787@qq.com>
@zlaazlaa
Contributor Author

Successfully ran on Linux with an NVIDIA GeForce RTX 3090 Ti, and on a Qualcomm Snapdragon 8 Elite Gen5 24G (CPU, 8-bit quantization).

@wangzhaode
Copy link
Collaborator

Do not introduce SentencePiece as a third-party library. It is preferable to use MNN's internal implementation for the tokenizer.

@wangzhaode wangzhaode self-assigned this Dec 22, 2025
@zlaazlaa
Contributor Author

Do not introduce SentencePiece as a third-party library. It is preferable to use MNN's internal implementation for the tokenizer.

Hi, SentencePiece was previously introduced because the internal MNN tokenizer lacked Unigram model support.
I have now added Unigram support internally, eliminating the need for the external dependency.
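For context, a Unigram tokenizer segments text by choosing the split with the highest total log-probability under a per-token score table, typically via Viterbi decoding. The sketch below is a minimal illustration of that idea, not MNN's actual implementation; the vocabulary and scores are hypothetical:

```python
import math

# Hypothetical unigram vocabulary: token -> log-probability.
vocab = {
    "h": -5.0, "e": -5.0, "l": -5.0, "o": -5.0,
    "he": -3.0, "hell": -2.5, "hello": -1.0, "lo": -3.5,
}

def unigram_tokenize(text, vocab):
    """Viterbi decoding: best[i] = (score, backpointer) for text[:i]."""
    n = len(text)
    best = [(-math.inf, 0)] * (n + 1)
    best[0] = (0.0, 0)
    for end in range(1, n + 1):
        for start in range(end):
            piece = text[start:end]
            if piece in vocab:
                score = best[start][0] + vocab[piece]
                if score > best[end][0]:
                    best[end] = (score, start)
    # Walk the backpointers to recover the best segmentation.
    tokens, i = [], n
    while i > 0:
        start = best[i][1]
        tokens.append(text[start:i])
        i = start
    return tokens[::-1]

print(unigram_tokenize("hello", vocab))  # -> ['hello']
```

Here the single token "hello" (score -1.0) beats any multi-piece split such as "hell" + "o" (-7.5), which is exactly the behavior that distinguishes Unigram from greedy longest-match schemes.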

@wangzhaode
Collaborator

I'm working on an optimized refactor of the MNN tokenizer. I recommend replacing the current approach with MNN's internal implementation once it's available. Would it be possible to wait for the tokenizer update to be released before merging this PR?

@zlaazlaa
Contributor Author

I'm working on an optimized refactor of the MNN tokenizer. I recommend replacing the current approach with MNN's internal implementation once it's available. Would it be possible to wait for the tokenizer update to be released before merging this PR?

That's great. Will the optimized refactor merge the two current tokenizers?

./transformers/llm/engine/src/tokenizer.cpp
./transformers/diffusion/engine/src/tokenizer.cpp

@wangzhaode
Collaborator

I'm working on an optimized refactor of the MNN tokenizer. I recommend replacing the current approach with MNN's internal implementation once it's available. Would it be possible to wait for the tokenizer update to be released before merging this PR?

That's great. Will the optimized refactor merge the two current tokenizers?

./transformers/llm/engine/src/tokenizer.cpp
./transformers/diffusion/engine/src/tokenizer.cpp

Yes.
