
[dpo] Add LoRA-DPO support to Levanter #4637

Open

ahmeda14960 wants to merge 3 commits into main from dpo-lora-clean

Commits

Commits on Apr 10, 2026