It is not clear whether an MoE model from Hugging Face, say Mixtral 8x7B, can be converted into dMoE layers so that expert parallelism can be exploited. Is it possible to start from such an architecture, or must the model be defined from scratch (as in "testing_moe.yaml")?
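For reference, here is a minimal sketch of what the weight-extraction side of such a conversion might look like. It assumes the Hugging Face Mixtral module layout (`block_sparse_moe.gate` for the router and per-expert `w1`/`w3`/`w2` SwiGLU projections); the target key names (`router.weight`, `experts.w1`/`v1`/`w2`) are placeholders I made up, since the actual names and expected shapes depend on the dMoE implementation being targeted:

```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-v0.1", torch_dtype=torch.bfloat16
)

state = {}
for i, layer in enumerate(model.model.layers):
    moe = layer.block_sparse_moe

    # Router: a single Linear(hidden_size -> num_experts) in the HF layout.
    state[f"layers.{i}.router.weight"] = moe.gate.weight

    # Stack the per-expert weights into one tensor per projection so they
    # could later be sharded across expert-parallel ranks.
    # Mixtral's SwiGLU uses w1 (gate proj), w3 (up proj), w2 (down proj).
    # NOTE: the "v1" name for the up projection is an assumption, not a
    # documented dMoE parameter name.
    state[f"layers.{i}.experts.w1"] = torch.stack([e.w1.weight for e in moe.experts])
    state[f"layers.{i}.experts.v1"] = torch.stack([e.w3.weight for e in moe.experts])
    state[f"layers.{i}.experts.w2"] = torch.stack([e.w2.weight for e in moe.experts])

torch.save(state, "mixtral_moe_weights.pt")
```

Even if the weights can be remapped like this, it is unclear whether the dMoE layers expect this grouped layout or something else, which is the core of my question.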
Thank you.