Fix torch.jit.ScriptModule.zero_grad. #1478
TorchSharp 0.105.0 doesn't have `torch.jit.ScriptModule.zero_grad`; it falls back incorrectly into `torch.nn.Module.zero_grad` and then terminates silently. Most probably this is because `JITModule` is not compatible with `NNModule` in LibTorchSharp. And, as reported in pytorch/pytorch#27144, libtorch itself also doesn't have `torch::jit::Module::zero_grad`. As a workaround, manually loop over the parameters and zero out their gradients, the way an optimizer does.
Note: no `RELEASENOTES.md` update ATM. The `foreach` loop in `ScriptModule.zero_grad` in `src/TorchSharp/JIT/ScriptModule.cs` is actually needed; it mirrors what `Module.zero_grad` in `src/TorchSharp/NN/Module.cs` does.