
#604 fix master installation

Merged
Ghost merged 1 commit into Deci-AI:master from deci-ai:feature/SG-000_fix_master_inastallation
import torch
from torch import nn

from super_gradients.training.utils.quantization.selective_quantization_utils import SelectiveQuantizer


def non_default_calibrators_example():
    class MyModel(nn.Module):
        def __init__(self) -> None:
            super().__init__()
            self.conv1 = nn.Conv2d(3, 8, kernel_size=3, padding=1)

        def forward(self, x):
            return self.conv1(x)

    module = MyModel()

    # Initialize the quantization utility with different calibrators, and quantize the module
    q_util = SelectiveQuantizer(
        default_quant_modules_calib_method_weights="percentile",
        default_quant_modules_calib_method_inputs="entropy",
        default_per_channel_quant_weights=False,
        default_learn_amax=False,
    )
    q_util.quantize_module(module)

    print(module)  # You should expect to see QuantConv2d, with Histogram calibrators

    x = torch.rand(1, 3, 32, 32)
    with torch.no_grad():
        y = module(x)
        torch.testing.assert_close(y.size(), (1, 8, 32, 32))


if __name__ == "__main__":
    non_default_calibrators_example()
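For contrast, the same module can be quantized with the quantizer's built-in defaults by constructing SelectiveQuantizer with no arguments. The sketch below is a minimal variation on the example above, assuming only that the constructor's keyword arguments all have default values; the function name default_calibrators_example is chosen here for illustration and is not part of the library. Printing the module afterwards shows which default calibrators were attached.

import torch
from torch import nn

from super_gradients.training.utils.quantization.selective_quantization_utils import SelectiveQuantizer


def default_calibrators_example():
    # Same toy model as above: a single 3 -> 8 channel convolution
    class MyModel(nn.Module):
        def __init__(self) -> None:
            super().__init__()
            self.conv1 = nn.Conv2d(3, 8, kernel_size=3, padding=1)

        def forward(self, x):
            return self.conv1(x)

    module = MyModel()

    # No keyword arguments: rely on the quantizer's default calibration settings
    # (assumption: all constructor arguments have defaults, as suggested by the example above)
    q_util = SelectiveQuantizer()
    q_util.quantize_module(module)

    # Inspect the wrapped module to see which calibrators the defaults produce
    print(module)

    x = torch.rand(1, 3, 32, 32)
    with torch.no_grad():
        y = module(x)
        torch.testing.assert_close(y.size(), (1, 8, 32, 32))


if __name__ == "__main__":
    default_calibrators_example()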