Description
Feature request
From my understanding of the current implementation, the `modules_to_save` wrappers are limited to deep-copying one specific layer of the model (reference: line 192 in commit 859fd88).
Motivation
This feature would be particularly useful for the final classifier layer. I currently have a model with multiple LoRAs attached for classification tasks, but the classifier layers do not all have the same size. As a result, I need to maintain several base models, grouping LoRAs with the same classifier size onto the same base model. Since the core model is identical across all of them, it should be possible to share a single base model, especially because the classifier layers are trained from scratch anyway.

A potential solution could be an additional option that lets users specify the module class (or a constructor) for the `modules_to_save` classifier layer, instead of simply copying the existing layer.
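To make the proposal concrete, here is a minimal sketch of what such an option could look like. Note that `ModulesToSaveWrapper` is simplified here and the `module_factory` parameter is entirely hypothetical, not part of the current PEFT API:

```python
import copy

import torch.nn as nn


class ModulesToSaveWrapper(nn.Module):
    """Simplified stand-in for PEFT's wrapper.

    Instead of always deep-copying the original module, it can optionally
    build a fresh module from a user-supplied factory (hypothetical option).
    """

    def __init__(self, original_module, adapter_name, module_factory=None):
        super().__init__()
        self.original_module = original_module
        if module_factory is not None:
            # Build a freshly initialized module (e.g. a classifier head
            # with a different output size) instead of copying the original.
            new_module = module_factory()
        else:
            # Current behavior: copy the existing layer.
            new_module = copy.deepcopy(original_module)
        self.modules_to_save = nn.ModuleDict({adapter_name: new_module})


# Usage sketch: two adapters sharing one base model, but with
# differently sized classifier heads (sizes here are illustrative).
base_classifier = nn.Linear(768, 2)

# Default behavior: copies the existing 2-class head.
wrapper_a = ModulesToSaveWrapper(base_classifier, "task_a")

# Proposed option: a fresh 5-class head, trained from scratch.
wrapper_b = ModulesToSaveWrapper(
    base_classifier, "task_b", module_factory=lambda: nn.Linear(768, 5)
)
```

This would let every adapter live on the same base model regardless of its classifier size, since the head is constructed per adapter rather than copied from the base.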
Your contribution
I'm happy to explore possible solutions and potentially contribute a PR if this is considered a valuable addition to the library.