| Submodule | Maintainers | Contact Info |
|:----------|:------------|:-------------|
| gelu | @AakashKumarNain | [email protected] |
| maxout | @fsx950223 | [email protected] |
| normalizations | @smokrow | [email protected] |
| opticalflow | @fsx950223 | [email protected] |
| poincare | @rahulunair | [email protected] |
| sparsemax | @AndreasMadsen | [email protected] |
| wrappers | @seanpmorgan | [email protected] |
| Submodule | Layer | Reference |
|:----------|:------|:----------|
| gelu | GeLU | https://arxiv.org/abs/1606.08415 |
| maxout | Maxout | https://arxiv.org/abs/1302.4389 |
| normalizations | GroupNormalization | https://arxiv.org/abs/1803.08494 |
| normalizations | InstanceNormalization | https://arxiv.org/abs/1607.08022 |
| opticalflow | CorrelationCost | https://arxiv.org/abs/1504.06852 |
| poincare | PoincareNormalize | https://arxiv.org/abs/1705.08039 |
| sparsemax | Sparsemax | https://arxiv.org/abs/1602.02068 |
| wrappers | WeightNormalization | https://arxiv.org/abs/1602.07868 |
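Of the layers above, Sparsemax has a short closed-form definition that is easy to sketch outside of TensorFlow. The following is a minimal NumPy version of the projection from the referenced paper (https://arxiv.org/abs/1602.02068); the function name and 1-D input shape are illustrative, not the addon's actual API:

```python
import numpy as np

def sparsemax(z):
    """Project a 1-D score vector z onto the probability simplex.

    Sketch of the closed-form projection from Martins & Astudillo:
    sort the scores, find the support size k(z), compute the
    threshold tau, then clip. Unlike softmax, the result can
    contain exact zeros.
    """
    z = np.asarray(z, dtype=np.float64)
    z_sorted = np.sort(z)[::-1]              # scores in descending order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, z.size + 1)
    support = 1 + k * z_sorted > cumsum      # which entries stay nonzero
    k_z = k[support][-1]                     # support size k(z)
    tau = (cumsum[k_z - 1] - 1) / k_z        # threshold tau(z)
    return np.maximum(z - tau, 0.0)
```

For example, `sparsemax([2.0, 1.0, 0.1])` puts all mass on the first entry (`[1.0, 0.0, 0.0]`), while softmax would spread probability over all three.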
In order to conform with the current API standard, all layers must:

- Inherit from either `keras.layers.Layer` or its subclasses.
- Register as a keras global object so it can be serialized properly: `@tf.keras.utils.register_keras_serializable(package='Addons')`
- Add the addon to the `py_library` in this sub-package's BUILD file.
- Include simple unittests that demonstrate the layer is behaving as expected.
- When applicable, run all unittests with TensorFlow's `@run_in_graph_and_eager_modes` (for test methods) or `run_all_in_graph_and_eager_modes` (for `TestCase` subclasses) decorator.
- Run `layer_test` on the layer.
- Add a `py_test` to this sub-package's BUILD file.
- Update the table of contents in this sub-package's README.