Bad training loss but good generated images when using the train_dreambooth_lora_flux.py script #12088
Hi, there's a difference between plotting the loss per step and per epoch; depending on how big your dataset is, this makes a big difference. In my understanding, the way we show the loss is the correct one (it is the raw data). Since the training depends on sampled noise, we get some variation, but if you plot the loss per epoch you will probably see a much better curve; overall the values should still go down.

Do you have access to the modelscope loss function? Do they smooth the curve? Are you also training a DreamBooth LoRA with the same hyperparameters?

I think this article provides helpful insights on the loss curve. It seems to me they're doing something similar to what that article points out in the EveryDream2 trainer.
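To illustrate the step-vs-epoch point, here is a minimal sketch (not taken from the training script) that plots the same raw step losses three ways: raw, EMA-smoothed, and averaged per epoch. The `step_losses` values are synthetic stand-ins for whatever you actually logged, and `steps_per_epoch` is an assumed placeholder for your dataset size divided by the batch size.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
steps_per_epoch = 50          # assumption: dataset size / batch size
num_epochs = 20
num_steps = steps_per_epoch * num_epochs

# Stand-in for the raw per-step loss: a slow downward trend buried in the
# noise that comes from sampling different timesteps/noise at every step.
trend = np.linspace(1.0, 0.6, num_steps)
step_losses = trend + rng.normal(0.0, 0.15, num_steps)

# Per-epoch mean: one point per epoch, much easier to read.
epoch_losses = step_losses.reshape(num_epochs, steps_per_epoch).mean(axis=1)

def ema(values, alpha=0.98):
    """Exponential moving average, similar to the smoothing many logging UIs apply."""
    out, prev = [], values[0]
    for v in values:
        prev = alpha * prev + (1 - alpha) * v
        out.append(prev)
    return np.array(out)

plt.plot(step_losses, alpha=0.3, label="raw step loss")
plt.plot(ema(step_losses), label="EMA-smoothed")
plt.plot(np.arange(num_epochs) * steps_per_epoch + steps_per_epoch / 2,
         epoch_losses, marker="o", label="per-epoch mean")
plt.xlabel("step")
plt.ylabel("loss")
plt.legend()
plt.show()
```

The raw curve looks flat and noisy even though the underlying trend goes down; the epoch means and the smoothed curve make that trend visible, which is why a "bad-looking" step loss can still correspond to good generated images.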