Using multiple optimizers #14791
Unanswered
shim94kr asked this question in code help: CV
Hello! I have some questions about using multiple optimizers.
This page, https://pytorch-lightning.readthedocs.io/en/stable/common/optimization.html, seems to say that, when using multiple optimizers, a backward pass is done for each optimizer, as shown by the following code.
But I noticed that the forward pass is done only once, and PyTorch allows only one backward pass per forward pass (unless the graph is retained), so in that case the backward pass could only run once.
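(That snippet did not carry over into this post. For context, the multi-optimizer pattern that page documents looks roughly like the following, with `configure_optimizers` returning one optimizer per sub-network and `training_step` receiving an `optimizer_idx`. The class and sub-network names here are placeholders, not the docs' exact example:)

```python
import torch
import pytorch_lightning as pl


class TwoOptimizerModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        # placeholder sub-networks, one per optimizer
        self.generator = torch.nn.Linear(10, 10)
        self.discriminator = torch.nn.Linear(10, 1)

    def training_step(self, batch, batch_idx, optimizer_idx):
        if optimizer_idx == 0:
            # loss for the first optimizer (placeholder computation)
            return self.discriminator(self.generator(batch)).mean()
        if optimizer_idx == 1:
            # loss for the second optimizer (placeholder computation)
            return -self.discriminator(self.generator(batch).detach()).mean()

    def configure_optimizers(self):
        opt_g = torch.optim.Adam(self.generator.parameters(), lr=1e-3)
        opt_d = torch.optim.Adam(self.discriminator.parameters(), lr=1e-3)
        return opt_g, opt_d
```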
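(For reference, here is a minimal standalone PyTorch snippet of the limitation I mean: calling backward twice on the same graph raises a RuntimeError unless `retain_graph=True` is passed on the earlier call.)

```python
import torch

x = torch.randn(4, requires_grad=True)
loss = (x ** 2).sum()              # one forward pass builds one graph
loss.backward(retain_graph=True)   # keep the graph alive for a second backward
loss.backward()                    # works only because the graph was retained above
# Without retain_graph=True on the first call, the second backward would raise:
# RuntimeError: Trying to backward through the graph a second time ...
```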
Then how are the backward passes actually done when using multiple optimizers?
If the backward pass really were done multiple times, as the code suggests, I would have the following question.
Do the optimizers update the model sequentially, so that an optimizer later in the sequence sees the parameters already updated by the previous one, but all of them reuse the same loss? If so, I would rather run the forward pass multiple times and update the model after each one, so that the loss stays in sync with the current model parameters (see the sketch below).
I think I am confused about how multiple optimizers work. I would appreciate it if someone could answer these questions and explain the general process of using multiple optimizers.
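(Running a fresh forward pass before each optimizer's update is what manual optimization allows. A minimal sketch inside a LightningModule, assuming `configure_optimizers` returns two optimizers and with hypothetical `compute_loss_a` / `compute_loss_b` helpers:)

```python
import pytorch_lightning as pl


class ManualOptModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False   # take control of the optimization loop

    def training_step(self, batch, batch_idx):
        opt_a, opt_b = self.optimizers()

        # fresh forward pass for the first optimizer
        loss_a = self.compute_loss_a(batch)   # hypothetical helper
        opt_a.zero_grad()
        self.manual_backward(loss_a)
        opt_a.step()

        # fresh forward pass for the second optimizer, now seeing the updated parameters
        loss_b = self.compute_loss_b(batch)   # hypothetical helper
        opt_b.zero_grad()
        self.manual_backward(loss_b)
        opt_b.step()
```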
Thank you!
Replies: 1 comment

I found that the forward pass is done multiple times, once for each optimizer... Sorry for the confusion.
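(In other words, with automatic optimization and multiple optimizers, `training_step` is invoked once per optimizer per batch, so each optimizer gets its own forward pass, its own loss, and its own single backward pass, and later optimizers see the parameters already updated by earlier ones. A minimal standalone PyTorch sketch of that control flow, using toy placeholder networks `net_a` / `net_b`, not Lightning's actual source:)

```python
import torch

# toy model split into two parameter groups, each with its own optimizer
net_a = torch.nn.Linear(4, 4)
net_b = torch.nn.Linear(4, 1)
optimizers = [
    torch.optim.Adam(net_a.parameters(), lr=1e-3),
    torch.optim.Adam(net_b.parameters(), lr=1e-3),
]

batch = torch.randn(8, 4)
for optimizer_idx, optimizer in enumerate(optimizers):
    # a fresh forward pass builds a fresh graph for each optimizer
    loss = net_b(net_a(batch)).mean()
    optimizer.zero_grad()
    loss.backward()   # exactly one backward per forward, so no retain_graph is needed
    optimizer.step()  # the second optimizer already sees net_a's updated parameters
```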