What Are The Conditions For Effective Teachers In Knowledge Distillation? 27/09/2022 Knowledge Distillation
Model Compression For Unconditional GANs 19/11/2021 GAN (Generative Adversarial Network)
A Smoothing Method For Robust Overfitting Suppression 15/06/2021 Adversarial Perturbation
ReLabel, A Method For Relabeling ImageNet With Multiple Labels Using Localized Labels! 13/05/2021 Image Recognition
Can We Protect The Privacy Of Deep Learning Models? 17/02/2021 Deep Learning