MoE layers combined with LoRA (MoLoRA) #570

@Joao-L-S-Almeida

Description

Is your feature request related to a problem? Please describe.
It isn't a problem, but a proposal for an extension. Currently, TerraTorch supports updating the backbone during the fine-tuning stage using the LoRA algorithm. There is an MoE variant of LoRA called MoLoRA (arXiv:2309.05444), which extends it to multiple experts.

Describe the solution you'd like
A MoLoRA algorithm available for finetuning tasks in TerraTorch.
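As a rough illustration of the idea (not TerraTorch's actual API), MoLoRA replaces a single LoRA adapter with several low-rank expert pairs combined by a soft router, while the base weights stay frozen. A minimal PyTorch sketch, with all names (`MoLoRALinear`, `num_experts`, `rank`) being hypothetical:

```python
import torch
import torch.nn as nn

class MoLoRALinear(nn.Module):
    """Hypothetical sketch: a frozen base linear layer plus a soft
    mixture of LoRA experts, following the idea in arXiv:2309.05444."""

    def __init__(self, in_features, out_features, num_experts=4, rank=8, alpha=16.0):
        super().__init__()
        # Pretrained weight: frozen during fine-tuning
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)
        self.base.bias.requires_grad_(False)
        # Router produces per-token soft gates over the experts
        self.router = nn.Linear(in_features, num_experts)
        # Each expert is a rank-r LoRA pair (A: in -> r, B: r -> out);
        # B starts at zero so training begins from the base model
        self.lora_A = nn.Parameter(torch.randn(num_experts, in_features, rank) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(num_experts, rank, out_features))
        self.scaling = alpha / rank

    def forward(self, x):
        gates = torch.softmax(self.router(x), dim=-1)  # (..., num_experts)
        # Per-expert low-rank update: (..., num_experts, out_features)
        delta = torch.einsum("...i,eir,ero->...eo", x, self.lora_A, self.lora_B)
        # Gate-weighted combination of the expert updates
        update = torch.einsum("...e,...eo->...o", gates, delta)
        return self.base(x) + self.scaling * update
```

Because only the router and the A/B expert matrices are trainable, the parameter cost scales with `num_experts * rank`, staying far below full fine-tuning.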

Describe alternatives you've considered (optional)
The MoV algorithm (derived from (IA)^3), also described in the aforementioned paper, is a possible alternative.
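For comparison, MoV applies the same routing idea to (IA)^3-style rescaling vectors instead of low-rank matrices, so each expert is just one learned vector. A minimal sketch, assuming a hypothetical `MoVScaler` module applied to a frozen activation:

```python
import torch
import torch.nn as nn

class MoVScaler(nn.Module):
    """Hypothetical sketch of MoV: a soft mixture of (IA)^3-style
    scaling vectors, gated per token by a learned router."""

    def __init__(self, dim, num_experts=4):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        # One scaling vector per expert, initialized to ones
        # so the module starts as an identity
        self.vectors = nn.Parameter(torch.ones(num_experts, dim))

    def forward(self, h):
        gates = torch.softmax(self.router(h), dim=-1)  # (..., num_experts)
        scale = gates @ self.vectors                   # (..., dim)
        return h * scale
```

MoV is even cheaper than MoLoRA in parameter count (one vector per expert rather than two low-rank matrices), at the cost of a less expressive update.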
