Replies: 1 comment 1 reply
Kohya_ss and RunPod will do best; 8 GB of VRAM isn't enough for (fast) SDXL LoRA training. It took me 42 hours for an SDXL LoRA with 3,800 total steps on a 3080 with 10 GB of VRAM, because it started offloading to system RAM. EDIT: this is why I stopped training locally and moved to a friend of mine with a 4090, which only took about 1 h 20 min to train it with my config, afaik.
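To put those two runs in perspective, here is a quick back-of-the-envelope comparison of seconds per training step. The only inputs are the figures quoted above (42 h on a 3080 10 GB that spilled into system RAM, roughly 1 h 20 min on a 4090, both at 3,800 steps); everything else is just arithmetic, not a benchmark:

```python
# Rough per-step timing comparison for the two runs mentioned above
# (numbers taken from the reply; illustrative arithmetic only).

TOTAL_STEPS = 3800

runs = {
    "RTX 3080 10GB (offloading to system RAM)": 42 * 3600,  # ~42 h in seconds
    "RTX 4090": 1 * 3600 + 20 * 60,                          # ~1 h 20 min in seconds
}

per_step = {name: secs / TOTAL_STEPS for name, secs in runs.items()}

for name, secs in per_step.items():
    print(f"{name}: ~{secs:.1f} s/step")

slowdown = per_step["RTX 3080 10GB (offloading to system RAM)"] / per_step["RTX 4090"]
print(f"Slowdown once VRAM runs out: ~{slowdown:.0f}x")
```

That roughly 30x gap is dominated by the spill-over into system RAM the reply describes, which is the practical argument for renting a larger GPU rather than training slowly on an 8 GB card.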
Hi
So I have been using Fooocus for a while now and really want to try making my own LoRA. I have been looking at YouTube and Google and there seem to be a million and one ways, so why not ask this community what you think is the best/easiest way to start? I'm happy to spend some funds if need be. I have an RTX 3070 Ti; I'm not sure if that is enough to do it locally, or if I should rent something like RunPod with the Stable Diffusion Kohya_ss ComfyUI Ultimate setup. Any ideas on how to start would be great.
Thank you
J