[Questions] Context length extension. #1327

Open
@iPRET

Description

Hi!

I have a couple of questions about increasing the context length during training.

  1. Does the framework support increasing the context length during training when using rotary position embeddings?
  2. If it does, does it support position embedding interpolation? Or just extrapolation?

Thanks!
Ingus
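
For reference, the distinction between the two approaches in question 2 can be sketched numerically. Below is a minimal, framework-agnostic illustration of rotary position embedding (RoPE) angles with position interpolation: new positions are rescaled into the originally trained range instead of extrapolating beyond it. The function names and the specific lengths (2048 trained, 8192 target) are illustrative assumptions, not something this framework necessarily exposes.

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0):
    # Standard RoPE: angle[p, i] = p / base^(2i/dim)
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    return np.outer(positions, inv_freq)

def interpolated_positions(seq_len, trained_len):
    # Position interpolation: squeeze seq_len positions into the
    # [0, trained_len) range the model was trained on.
    scale = trained_len / seq_len
    return np.arange(seq_len) * scale

# Illustrative lengths (assumed, not from the framework):
trained_len, new_len, head_dim = 2048, 8192, 64

pos = interpolated_positions(new_len, trained_len)
angles = rope_angles(pos, head_dim)

# With interpolation, no position exceeds the trained range;
# plain extrapolation would instead feed positions up to new_len - 1.
assert pos.max() < trained_len
```

Extrapolation, by contrast, would simply use `np.arange(new_len)` directly, producing rotation angles the model never saw during training.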
