[MLP-Mixer] The Day MLP Surpasses CNN And Transformer 10/06/2021 Image Recognition
Transformer Without Attention: gMLP Takes The Stage! 07/06/2021 Transformer
Meet DialoGPT, A Powerful Chatbot From Microsoft 02/06/2021 Natural Language Processing
Searching For Network Structures Robust To Adversarial Examples 31/05/2021 NAS
Time Series Anomaly Detection SOTA Survey 26/05/2021 Survey
Are We Closer To Artificial General Intelligence (AGI) Than We Think? 25/05/2021 Neural Network
Don't Need A Big Model For Pruning? Automatically Find Sparse Networks! 21/05/2021 Pruning
You Can Tell GANs What Kind Of Image To Produce Using Text Input! 18/05/2021 GAN (Generative Adversarial Network)
Super Accelerated Translation Of High Resolution Images: ASAP-Net 17/05/2021 Image2image
A Way To Create A GPT-3 Equivalent For Vision Transformers? 12/05/2021 Transformer
Are Transformers Meant For Computer Vision? 06/05/2021 Transformer
A Survey Of Traffic Prediction Modeling Methods Using Deep Learning 29/04/2021 Survey
Modal-independent Transformer: Perceiver Model 26/04/2021 Transformer
ResNet Training And Scaling Strategies For SOTA Performance! 23/04/2021 Deep Learning
Transformers As Universal Computation Engines: Language-pretrained Transformers Help On Non-linguistic Tasks! 20/04/2021 Transformer
How To Prevent Overfitting In Adversarial Training 16/04/2021 Adversarial Perturbation
A New Self-Supervised Learning Algorithm From Facebook AI: Barlow Twins 12/04/2021 Self-supervised Learning