Awesome Knowledge Distillation
Updated May 30, 2021
We also need to benchmark the Lottery Ticket pruning algorithm and the quantization algorithms. The models used for these benchmarks would be the student networks discussed in #105 (ResNet18, MobileNet v2, Quantization v2).
Pruning (benchmark at 40%, 50%, and 60% pruned weights); see the first sketch after this list.
Quantization; see the second sketch after this list.
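A minimal sketch of what the pruning benchmark could look like, assuming a PyTorch setup: it uses `torch.nn.utils.prune` for global magnitude pruning and rewinds the surviving weights to their initial values, which is the core of the Lottery Ticket procedure. The ResNet18 student configuration, `num_classes=100`, and the commented `train`/`retrain`/`evaluate` hooks are assumptions for illustration, not code from this repository.

```python
import copy
import torch
import torch.nn.utils.prune as prune
from torchvision.models import resnet18

def magnitude_prune(model, amount):
    """Globally remove the smallest-magnitude conv/linear weights."""
    params = [(m, "weight") for m in model.modules()
              if isinstance(m, (torch.nn.Conv2d, torch.nn.Linear))]
    prune.global_unstructured(params,
                              pruning_method=prune.L1Unstructured,
                              amount=amount)

init_model = resnet18(num_classes=100)               # assumed student network
init_state = copy.deepcopy(init_model.state_dict())  # kept for weight rewinding

for amount in (0.4, 0.5, 0.6):  # the 40/50/60% sparsity targets
    model = copy.deepcopy(init_model)
    # train(model)  # hypothetical dense-training pass before pruning
    magnitude_prune(model, amount)
    # Lottery Ticket rewind: prune reparametrizes `weight` as
    # weight_orig * weight_mask, so restoring weight_orig from the saved
    # initial state keeps the learned mask while resetting surviving weights.
    for name, module in model.named_modules():
        if hasattr(module, "weight_orig"):
            module.weight_orig.data.copy_(init_state[name + ".weight"])
    # retrain(model); evaluate(model)  # hypothetical benchmark hooks
```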
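And a hedged sketch of the quantization side, using PyTorch's post-training dynamic quantization, which stores Linear-layer weights as int8 and quantizes activations on the fly. Static quantization of the conv layers would instead go through the quantization-ready models in `torchvision.models.quantization`. The model choice and the `evaluate` hook are again assumptions.

```python
import torch
from torchvision.models import resnet18

float_model = resnet18(num_classes=100).eval()  # assumed student network

# Post-training dynamic quantization: the lowest-effort eager-mode PTQ path.
# Only the Linear layers are converted; conv layers stay in float32.
quantized = torch.quantization.quantize_dynamic(
    float_model, {torch.nn.Linear}, dtype=torch.qint8
)

# evaluate(quantized)  # hypothetical benchmark hook: accuracy, latency, size
```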