Clarification: Context length #521

@JonasWeinert

Description

https://huggingface.co/google/gemma-3n-E4B-it-litert-lm?utm_source=chatgpt.com

The Hugging Face model card states a context length of 32k, while the README of this repo states that the supported context is 4096.

Hence my question: is it not possible to run inference with more than 4096 tokens here, or is the README description outdated?
