WGAN - discriminator and generator updates inconsistency #11500
-
Hi. When training a WGAN, we update the discriminator several times for each update of the generator (a typical choice is 5 to 1). In PL we control this by setting the "frequency" parameter within the `configure_optimizers` function.

Now, if the number of batches per epoch is not divisible by the sum of the frequencies (6 in this case), the generator ends up being trained less often than the intended ratio. If, for example, there are 11 batches in our dataset, the discriminator is updated 10 times and the generator only once per epoch, because the optimizer cycle is reset at the beginning of each epoch.

Is there a workaround for this? The most useful solution would be the ability to carry the optimizer step count across epochs. Thanks for any suggestion.
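To make the imbalance concrete, here is a small pure-Python sketch (not PyTorch Lightning itself) that simulates how a frequency-based optimizer cycle behaves when the cycle position is reset at every epoch. The function name and structure are illustrative only; the frequencies `[5, 1]` and the 11-batch dataset match the scenario described above.

```python
def updates_per_epoch(n_batches, freqs):
    """Simulate per-batch optimizer selection with frequencies `freqs`,
    where the cycle position resets at the start of the epoch."""
    cycle = sum(freqs)
    counts = [0] * len(freqs)
    for step in range(n_batches):
        pos = step % cycle  # position within the frequency cycle
        acc = 0
        for i, f in enumerate(freqs):
            acc += f
            if pos < acc:
                counts[i] += 1  # this batch updates optimizer i
                break
    return counts

# 11 batches with a 5:1 discriminator/generator schedule:
print(updates_per_epoch(11, [5, 1]))  # -> [10, 1]
```

With 11 batches and a cycle length of 6, the leftover 5 batches all fall on the discriminator's slots again, so the realized ratio is 10:1 instead of 5:1.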
Replies: 1 comment 1 reply
-
Solved using the `limit_train_batches` parameter in `Trainer`.
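The idea behind this fix, as I understand it, is to cap the number of training batches per epoch to a multiple of the frequency sum, so the 5:1 cycle always completes. In PyTorch Lightning this would look like `Trainer(limit_train_batches=6)`; the arithmetic below (dataset size and frequencies taken from the question) shows why the capped count preserves the intended ratio.

```python
FREQS = (5, 1)        # discriminator : generator update frequencies
CYCLE = sum(FREQS)    # 6
n_batches = 11        # batches in the dataset, as in the question

# Largest multiple of the cycle length that fits in the dataset.
# This is the value one would pass as Trainer(limit_train_batches=...).
limited = (n_batches // CYCLE) * CYCLE

disc_updates = (limited // CYCLE) * FREQS[0]
gen_updates = (limited // CYCLE) * FREQS[1]
print(limited, disc_updates, gen_updates)  # -> 6 5 1
```

The trade-off is that the remaining `n_batches % CYCLE` batches are skipped each epoch, but the discriminator/generator ratio stays exactly 5:1.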