Support multiple timm models as backbones with forward_intermediates method #30307
Comments
Hi @ioangatop, thanks for raising this! Sounds like a good idea to me. Would you like to open a PR to add these changes?
@amyeroberts thank you for the fast response. Will open a PR :)
@ioangatop discussed this with @amyeroberts ... the quickest / safest approach right now would be to check the feature-extractor-wrapped model for a return_layers attribute. In forward, if return_layers / all_layers is None, we should probably emit some sort of error when output_hidden_states is set to True; I'm not 100% sure if the preferred approach to that is asserts or exceptions. This would support the new FeatureGetter and also existing feature wrappers (like Fx or Hooks) other than the default, which also wouldn't work with the current approach. It's debatable whether output_hidden_states makes sense as implemented, but that can be addressed later, maybe providing better support with the new API for supported models...
@rwightman indeed, with a quick read (although I'm not that familiar with the TimmBackbone internals) I guess it has to be something like (?):

```python
def __init__(self, config, **kwargs):
    ...
    # These are used to control the output of the model when called. If
    # output_hidden_states is True, then return_layers is modified to include
    # all layers. Wrappers without a return_layers attribute get None here.
    self._backbone.return_layers = getattr(self._backbone, "return_layers", None)
    self._return_layers = self._backbone.return_layers
    self._all_layers = {layer["module"]: str(i) for i, layer in enumerate(self._backbone.feature_info.info)}
    ...

def forward(self, pixel_values, output_hidden_states=None, **kwargs):
    ...
    if output_hidden_states and self._return_layers is None:
        raise ValueError("Cannot output hidden states for timm backbones")
    ...
```

Also for this, I guess we have to install a recent timm version.

PS I haven't started the PR yet, but if someone has a clearer idea of how to do it fast, I would be happy to review it.
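For context, a minimal sketch of why the getattr guard above is needed: in recent timm (an assumption about the installed version), ViT-style models are wrapped by FeatureGetterNet, which does not expose return_layers, while the default CNN wrapper does:

```python
import timm

# CNN-style models are wrapped in FeatureListNet, which keeps a
# return_layers mapping of module names to output indices.
cnn_features = timm.create_model("resnet18", pretrained=False, features_only=True)
print(getattr(cnn_features, "return_layers", None))  # a dict

# ViT-style models use the newer FeatureGetterNet (built on
# forward_intermediates), which has no return_layers attribute.
vit_features = timm.create_model("vit_small_patch16_224", pretrained=False, features_only=True)
print(getattr(vit_features, "return_layers", None))  # None
```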
Feature request
Currently, TimmBackbone requires that the timm model has a return_layers attribute, which I'm not sure any timm model implements:

transformers/src/transformers/models/timm_backbone/modeling_timm_backbone.py
Lines 80 to 83 in c15aad0
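The referenced lines set up the return_layers bookkeeping in TimmBackbone.__init__, roughly as follows (paraphrased from the snippet quoted elsewhere in this thread; the exact code at c15aad0 may differ):

```python
# Paraphrased from TimmBackbone.__init__: bookkeeping used to control the
# output of the wrapped timm model when hidden states are requested.
self._return_layers = self._backbone.return_layers
self._all_layers = {layer["module"]: str(i) for i, layer in enumerate(self._backbone.feature_info.info)}
```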
Lately, timm implemented a forward_intermediates method on almost all networks for feature extraction (issue and PR).
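For illustration, a minimal sketch of the forward_intermediates API (the model name is illustrative, and this assumes a timm version that ships the method):

```python
import timm
import torch

model = timm.create_model("vit_small_patch16_224", pretrained=False)
x = torch.randn(1, 3, 224, 224)

# By default, returns the final features together with a list of
# intermediate feature maps.
final, intermediates = model.forward_intermediates(x)

# Or return only selected intermediates.
intermediates = model.forward_intermediates(x, indices=[-2, -1], intermediates_only=True)
```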
Motivation

Thus, with small changes, we will be able to use these models directly as backbones.
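A hypothetical sketch of the intended usage (the model name, out_indices, and keyword arguments are assumptions for illustration, not taken from the original issue):

```python
import torch
from transformers import AutoBackbone

# Hypothetical: load a timm ViT directly as a transformers backbone.
model = AutoBackbone.from_pretrained(
    "vit_small_patch16_224",
    use_timm_backbone=True,
    use_pretrained_backbone=False,
    out_indices=[-2, -1],
)
outputs = model(torch.randn(1, 3, 224, 224))
feature_maps = outputs.feature_maps
```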
Currently, such a call fails with an error.
Your contribution
Review and/or implement the PR.