TLU.py

from core.leras import nn
tf = nn.tf

class TLU(nn.LayerBase):
    """
    TensorFlow implementation of the Thresholded Linear Unit (TLU) from
    Filter Response Normalization Layer: Eliminating Batch Dependence in the Training of Deep Neural Networks
    https://arxiv.org/pdf/1911.09737.pdf
    """
    def __init__(self, in_ch, dtype=None, **kwargs):
        self.in_ch = in_ch

        if dtype is None:
            dtype = nn.floatx
        self.dtype = dtype

        super().__init__(**kwargs)

    def build_weights(self):
        # One learnable threshold tau per channel, initialized to zero.
        self.tau = tf.get_variable("tau", (self.in_ch,), dtype=self.dtype, initializer=tf.initializers.zeros())

    def get_weights(self):
        return [self.tau]

    def forward(self, x):
        # Broadcast tau across the spatial dimensions according to the data format.
        if nn.data_format == "NHWC":
            shape = (1, 1, 1, self.in_ch)
        else:
            shape = (1, self.in_ch, 1, 1)

        tau = tf.reshape(self.tau, shape)

        # TLU: y = max(x, tau), a per-channel learnable activation threshold.
        return tf.math.maximum(x, tau)

nn.TLU = TLU
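
The layer implements the per-channel learnable threshold y = max(x, tau) that the FRN paper pairs with filter response normalization. As a point of reference only, here is a minimal standalone sketch of the same operation in plain TensorFlow; the input shape and tau values are made up for illustration and are not part of the repository code.

    # Illustrative sketch of the TLU op outside of leras (assumed NHWC layout).
    import numpy as np
    import tensorflow as tf

    x = tf.constant(np.random.randn(2, 4, 4, 3), dtype=tf.float32)  # NHWC batch with 3 channels
    tau = tf.constant([0.0, -1.0, 0.5], dtype=tf.float32)           # hypothetical per-channel thresholds
    y = tf.math.maximum(x, tf.reshape(tau, (1, 1, 1, 3)))           # clamp each channel at its threshold
    print(y.shape)                                                  # (2, 4, 4, 3)

In the library itself the class is registered as nn.TLU, so models would construct it with the channel count of the preceding layer and apply it to that layer's output, typically right after an FRN normalization as described in the linked paper.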