
Make Microsoft.ML.OnnxRuntimeGenAI.Tokenizer a Microsoft.ML.Tokenizers.Tokenizer #970

Open · wants to merge 1 commit into main

Conversation

stephentoub (Member)

This enables an ONNX Runtime GenAI tokenizer instance to be used anywhere a Microsoft.ML.Tokenizers tokenizer is accepted. If we'd prefer, rather than having Tokenizer be a base class for the ONNX Runtime one, we could instead expose some sort of public Microsoft.ML.Tokenizers.Tokenizer AsTokenizer() conversion method that returns a wrapper object (though that's a bit confusing given the type names are the same, just in different namespaces).

cc: @luisquintanilla, @tarekgh
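For concreteness, the AsTokenizer() alternative could be sketched as below. This is a hypothetical illustration only: the Tokenizer base class and the OgaTokenizer shape here are self-contained stand-ins defined in the snippet itself, not the real Microsoft.ML.Tokenizers or ONNX Runtime GenAI APIs, whose surfaces differ.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in for Microsoft.ML.Tokenizers.Tokenizer (the real abstract surface differs).
public abstract class Tokenizer
{
    public abstract IReadOnlyList<int> EncodeToIds(string text);
    public abstract string Decode(IEnumerable<int> ids);
}

// Stand-in for the ONNX Runtime GenAI tokenizer; Encode/Decode here are
// placeholders (trivial char-code mapping), not the native implementation.
public sealed class OgaTokenizer
{
    public int[] Encode(string text) => text.Select(c => (int)c).ToArray();
    public string Decode(IEnumerable<int> ids) =>
        new string(ids.Select(i => (char)i).ToArray());
}

// The wrapper idea: adapt the GenAI tokenizer to the common base class
// so it can be passed anywhere a Tokenizer is accepted.
public sealed class OgaTokenizerAdapter : Tokenizer
{
    private readonly OgaTokenizer _inner;
    public OgaTokenizerAdapter(OgaTokenizer inner) => _inner = inner;

    public override IReadOnlyList<int> EncodeToIds(string text) => _inner.Encode(text);
    public override string Decode(IEnumerable<int> ids) => _inner.Decode(ids);
}

public static class OgaTokenizerExtensions
{
    // The proposed conversion method: expose the adapter via AsTokenizer().
    public static Tokenizer AsTokenizer(this OgaTokenizer t) => new OgaTokenizerAdapter(t);
}

public static class Demo
{
    public static void Main()
    {
        Tokenizer t = new OgaTokenizer().AsTokenizer();
        IReadOnlyList<int> ids = t.EncodeToIds("hi");
        Console.WriteLine(string.Join(",", ids)); // prints "104,105"
        Console.WriteLine(t.Decode(ids));         // prints "hi"
    }
}
```

Either shape (base class or wrapper) gives callers the same polymorphic use; the base-class route in this PR avoids the namespace collision the wrapper's return type would have.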

Quoted context from the interop changes:

    while (*(byte*)(nativeUtf8 + len) != 0) ++len;

    if (len == 0)
    int byteCount = Encoding.UTF8.GetByteCount(pStr, str.Length);
A reviewer (Member) commented on the line

    int byteCount = Encoding.UTF8.GetByteCount(pStr, str.Length);

Can we use GetMaxByteCount with ArrayPool instead, just to avoid processing the text twice?
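The suggestion refers to Encoding.UTF8.GetMaxByteCount, which returns a worst-case byte count in O(1) instead of scanning the whole string the way GetByteCount does. A minimal self-contained sketch of the rent-and-encode pattern (names and the copy-out step are illustrative; real interop code would hand the rented buffer to the native call directly):

```csharp
using System;
using System.Buffers;
using System.Text;

public static class Utf8Interop
{
    // Encodes a string to UTF-8 using a pooled buffer sized by the O(1)
    // worst-case GetMaxByteCount, avoiding the extra O(n) pass that an
    // exact GetByteCount would cost before encoding.
    public static byte[] ToUtf8(string str)
    {
        int maxBytes = Encoding.UTF8.GetMaxByteCount(str.Length);
        byte[] rented = ArrayPool<byte>.Shared.Rent(maxBytes);
        try
        {
            int written = Encoding.UTF8.GetBytes(str, 0, str.Length, rented, 0);
            // Copy out the exact-length result for this demo; interop code
            // would instead pass (rented, written) to the native function.
            byte[] result = new byte[written];
            Array.Copy(rented, result, written);
            return result;
        }
        finally
        {
            ArrayPool<byte>.Shared.Return(rented);
        }
    }

    public static void Main()
    {
        Console.WriteLine(Utf8Interop.ToUtf8("héllo").Length); // prints "6": 'é' takes two bytes
    }
}
```

The trade-off: GetMaxByteCount over-reports (for UTF-8 it is roughly 3× the char count), but the buffer comes from the pool, so the overallocation costs little compared to walking the text twice.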

@tarekgh (Member) left a comment

LGTM!

@azchohfi (Contributor)

@stephentoub is this unblocked now that #987 merged?

Diff in the project file:

    @@ -121,4 +121,8 @@
        <PackageReference Include="System.Memory" Version="4.5.5" />
      </ItemGroup>

      <ItemGroup>
        <PackageReference Include="Microsoft.ML.Tokenizers" Version="0.22.0-preview.24378.1" />
A reviewer (Member) commented:

Update package version

Suggested change:

    - <PackageReference Include="Microsoft.ML.Tokenizers" Version="0.22.0-preview.24378.1" />
    + <PackageReference Include="Microsoft.ML.Tokenizers" Version="2.0.0-preview.1.25125.4" />

@baijumeswani (Collaborator)

Closing and reopening the PR to trigger all the pipelines

@baijumeswani baijumeswani reopened this Mar 5, 2025
@luisquintanilla (Member) commented Mar 13, 2025

@baijumeswani looks like the builds are failing because of issues with the feed. Can you please help unblock?

@tarekgh (Member) commented Mar 21, 2025

@stephentoub looks like you need to resolve the conflicts.

5 participants