How can I optimize the generator every five batches and optimize the discriminator every batch in a GAN? #11626
Unanswered
Luciennnnnnn asked this question in code help: CV
Replies: 1 comment · 1 reply
-
The solution listed in the docs seems wasteful: even when there is no need to update the discriminator, `training_step` / `loss.backward()` are still called. I know I can use manual optimization, but how can I achieve this schedule with automatic optimization?
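For context, this is roughly the pattern the docs suggest, adapted to the schedule in the title. It is a sketch that assumes the Lightning 1.x `optimizer_step` hook signature and that optimizer index 0 is the generator; the `else` branch is the wasted work being asked about:

```python
import pytorch_lightning as pl


class GAN(pl.LightningModule):
    # training_step / configure_optimizers elided; assume optimizer_idx 0
    # is the generator and optimizer_idx 1 is the discriminator

    def optimizer_step(
        self,
        epoch,
        batch_idx,
        optimizer,
        optimizer_idx,
        optimizer_closure,
        on_tpu=False,
        using_native_amp=False,
        using_lbfgs=False,
    ):
        if optimizer_idx == 0:  # generator: step only every 5th batch
            if (batch_idx + 1) % 5 == 0:
                # `optimizer.step` executes the closure (which runs
                # `training_step` + `backward`) and updates the parameters
                optimizer.step(closure=optimizer_closure)
            else:
                # the closure must still be called, so `training_step` and
                # `loss.backward()` run even though no parameters are updated
                optimizer_closure()

        if optimizer_idx == 1:  # discriminator: step every batch
            optimizer.step(closure=optimizer_closure)
```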
-
@LuoXin-s Apparently, users are required to call the closure function (which calls forward, zero_grad, and backward) inside `optimizer_step`. Maybe we should make the closure call optional in the hook, e.g., change …
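For comparison, here is a minimal manual-optimization sketch of the same schedule, which avoids the wasted generator forward/backward entirely. `compute_g_loss` / `compute_d_loss` are hypothetical helpers, and the Lightning 1.x manual-optimization API is assumed:

```python
import torch
import pytorch_lightning as pl


class GAN(pl.LightningModule):
    def __init__(self, generator, discriminator):
        super().__init__()
        self.automatic_optimization = False  # opt out of automatic optimization
        self.generator = generator
        self.discriminator = discriminator

    def training_step(self, batch, batch_idx):
        g_opt, d_opt = self.optimizers()

        # discriminator: updated every batch
        d_loss = self.compute_d_loss(batch)  # hypothetical loss helper
        d_opt.zero_grad()
        self.manual_backward(d_loss)
        d_opt.step()

        # generator: updated only every 5th batch, so its forward/backward
        # simply never runs on skipped batches
        if (batch_idx + 1) % 5 == 0:
            g_loss = self.compute_g_loss(batch)  # hypothetical loss helper
            g_opt.zero_grad()
            self.manual_backward(g_loss)
            g_opt.step()

    def configure_optimizers(self):
        g_opt = torch.optim.Adam(self.generator.parameters(), lr=2e-4)
        d_opt = torch.optim.Adam(self.discriminator.parameters(), lr=2e-4)
        return g_opt, d_opt
```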