
Does torchtune support multi-node training? #2018

Closed
@tginart

Description


Does torchtune support multi-node training? For example, in a SLURM environment?

If so, would it be possible to get an example config?
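
For reference, here is the rough shape of the SLURM batch script I have in mind. This is only a sketch, not a verified setup: it assumes that `tune run` forwards torchrun-style rendezvous flags (`--nnodes`, `--rdzv_backend`, `--rdzv_endpoint`) to the underlying launcher, and the recipe and config names (`full_finetune_distributed`, `llama3/8B_full`) are just illustrative.

```bash
#!/bin/bash
#SBATCH --job-name=torchtune-multinode
#SBATCH --nodes=2                 # two nodes
#SBATCH --ntasks-per-node=1       # one launcher process per node
#SBATCH --gpus-per-node=8

# Unverified sketch: assumes `tune run` passes rendezvous flags
# through to torchrun. Use the first allocated node as the
# rendezvous host.
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=29500

# With --ntasks-per-node=1, srun starts one launcher per node;
# the launchers rendezvous with each other via c10d.
srun tune run \
    --nnodes "$SLURM_NNODES" \
    --nproc_per_node 8 \
    --rdzv_backend c10d \
    --rdzv_endpoint "$MASTER_ADDR:$MASTER_PORT" \
    full_finetune_distributed \
    --config llama3/8B_full
```

If torchtune already ships an official multi-node recipe or SLURM example, a pointer to that would be even better.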

Metadata


Labels: discussion, distributed
