fix test compatibility as AdamW became a subclass of Adam #20574
base: master
Conversation
So this is for PT 2.6+, correct?
Codecov Report: All modified and coverable lines are covered by tests ✅
Additional details and impacted files:

@@           Coverage Diff           @@
##           master   #20574   +/-   ##
=========================================
- Coverage      88%      79%      -9%
=========================================
  Files         267      264       -3
  Lines       23380    23325      -55
=========================================
- Hits        20481    18366    -2115
- Misses       2899     4959    +2060
No, PyTorch 2.6 did not include the commit.
This is looking at post-2.6 but it's good with the current releases
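The version gating discussed in this thread could be sketched as follows. This is a hypothetical helper, not Lightning's actual utility, and the 2.7 threshold is an assumption based on the comment that the change is post-2.6:

```python
# Hypothetical helper (not Lightning's actual code): compare a
# "major.minor" version string against a minimum version tuple.
def version_at_least(ver: str, minimum: tuple) -> bool:
    major, minor = (int(p) for p in ver.split(".")[:2])
    return (major, minor) >= minimum

# The AdamW-subclasses-Adam change is not in PyTorch 2.6; it lands in a
# later release, so a guard would use a post-2.6 threshold (assumed 2.7).
assert not version_at_least("2.6.0", (2, 7))
assert version_at_least("2.7.0", (2, 7))
```

A guard like this would let the test assert the old behavior on releases that predate the upstream commit and the new behavior afterwards.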
Hi @lantiga, is there a next step for me to get this PR merged? Thanks!
What does this PR do?
Since pytorch/pytorch#143710 was merged, making `AdamW` a subclass of `Adam`, the test `test_multiple_optimizers_basefinetuning` fails because it now adds the `param_group`s of the `AdamW` optimizer on top of those of `Adam`. This PR fixes the test by enforcing a strict check that the optimizer's type is exactly `Adam`, excluding subclasses.
Before submitting
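The subclassing issue the PR description refers to can be sketched with dummy classes (this mirrors the new PyTorch hierarchy, but is not the actual Lightning test code):

```python
# Dummy classes mirroring the post-#143710 PyTorch hierarchy,
# in which AdamW subclasses Adam.
class Adam:
    pass

class AdamW(Adam):
    pass

adam, adamw = Adam(), AdamW()

# isinstance now matches both optimizers, so a test counting
# Adam instances via isinstance over-counts:
assert isinstance(adamw, Adam)

# An exact type check excludes subclasses, restoring the
# test's original intent:
assert type(adam) is Adam
assert type(adamw) is not Adam
```

This is why the fix swaps the subclass-aware `isinstance` check for an exact `type(...) is Adam` comparison.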
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines.
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--20574.org.readthedocs.build/en/20574/