You can change the shapes of the params you pass to the model. However, Flax has a safety check that verifies hyperparameters like the number of features are consistent with the shapes of the params, so you also need to pass the correct number of features to `Dense`:
I'd like to take a large network and select a smaller sub-network out of it by keeping only certain units. Is there a way to accomplish this in Flax while avoiding `ScopeParamShapeError`? Here's a quick code example that builds a small model and then attempts to remove a single unit from the hidden layer:
All of the shapes of the `sub_params` arrays check out: mathematically they should compute the subnetwork just as desired. However, Flax gives me an error.

I know that I could zero out weights to achieve something similar, but I'm interested in benchmarking the actual wall-clock difference between the large and small networks.
Relevant code appears to be here: `flax/flax/core/scope.py`, lines 607 to 624 at commit bbbbdcc.
How can I reuse the same network structure but adjust the shapes of intermediate arrays with Flax?