2020 Transformer Top Stories Ranked!
With 2020 coming to a close, here's a ranking of the Transformer-related articles that got the most attention this year!
Here at AI-SCHOLAR, we knew the Transformer wave was definitely coming in January 2020. That's when it all started, with this article: "Will this be the breakthrough for massively scaling Transformers? The highly efficient Reformer is here!"
When this article was selected as a featured paper in our community, our writers sensed that it was likely to become a hot topic. Then End-to-End Object Detection with Transformers, published on May 26, was also covered soon after its release, and the Transformer began attracting a lot of attention. Since then, Transformers have been applied to a variety of fields, from images to audio.
Here is our ranking of articles on the Transformer, a technology that may become a common foundation for AI in the future.
No.1 End-to-End Object Detection with Transformers
It is no exaggeration to say that this paper brought the Transformer wide recognition. The essence of the paper is that it made object detection a truly end-to-end process, but it also sparked a movement to adopt the Transformer in other fields. I think that is where the Transformer era arrived in a big way, and in fact the first successful applications of the Transformer are now emerging.
No.2 Convolution vs. Transformer! On to the next stage! The new image recognition model Vision Transformer
This article covers a hot topic: applying the Transformer to images. It succeeded in improving accuracy by using a Transformer in place of convolution, which has long been the standard choice for images. However, it is not yet well understood exactly why replacing convolution, the conventional approach for images, with a Transformer leads to this improvement in accuracy.
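To make the "Transformer instead of convolution" idea concrete, here is a minimal sketch of the patch-embedding step that Vision Transformer uses to turn an image into a sequence of tokens for a standard Transformer encoder. This is our own illustrative code, not the paper's implementation; the function name, shapes, and random weights are all assumptions for the example.

```python
import numpy as np

def image_to_patch_embeddings(image, patch_size, w_proj):
    """Split an image into non-overlapping patches, flatten each patch,
    and linearly project it into an embedding, ViT-style."""
    h, w, c = image.shape
    p = patch_size
    # (h, w, c) -> (h/p, p, w/p, p, c) -> (h/p, w/p, p, p, c)
    patches = image.reshape(h // p, p, w // p, p, c).transpose(0, 2, 1, 3, 4)
    patches = patches.reshape(-1, p * p * c)   # one row per patch
    return patches @ w_proj                    # (num_patches, d_model)

# Toy example: a 32x32 RGB image, 8x8 patches, 64-dim token embeddings.
rng = np.random.default_rng(0)
img = rng.random((32, 32, 3))
w_proj = rng.random((8 * 8 * 3, 64))
tokens = image_to_patch_embeddings(img, 8, w_proj)
print(tokens.shape)  # (16, 64): 16 patch tokens, each a 64-dim vector
```

The resulting token sequence plays the same role that word embeddings play in NLP; ViT then prepends a class token and adds positional embeddings before the encoder, which this sketch omits.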
No.3 Conformer: Transformer applied to speech recognition! Transformer x CNN by Google is amazing!
This article covers the application of the Transformer to speech. It shows that fusing convolution into the Transformer can improve accuracy even further. In fact, Conformer achieved SOTA, demonstrating a substantial gain in accuracy.
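The core idea of "fusing" the two is the ordering of modules inside each block. Below is a schematic sketch of that macro-structure only, with toy stand-ins for each module; the real Conformer uses multi-head self-attention with relative positional encoding, a gated depthwise convolution module, and layer normalization, none of which are reproduced here. All function names and internals are our own illustrative assumptions.

```python
import numpy as np

def feed_forward(x, scale=0.5):
    # Stand-in for a half-step feed-forward module with a residual connection.
    return x + scale * np.tanh(x)

def self_attention(x):
    # Single-head scaled dot-product self-attention with a residual connection:
    # captures global context across all frames.
    scores = x @ x.T / np.sqrt(x.shape[-1])
    weights = np.exp(scores) / np.exp(scores).sum(-1, keepdims=True)
    return x + weights @ x

def conv_module(x, kernel=3):
    # Stand-in for the convolution module: a local moving average over
    # neighboring frames, with a residual connection.
    pad = np.pad(x, ((kernel // 2, kernel // 2), (0, 0)), mode="edge")
    smoothed = np.stack([pad[i:i + len(x)] for i in range(kernel)]).mean(0)
    return x + smoothed

def conformer_block(x):
    x = feed_forward(x)     # half-step FFN
    x = self_attention(x)   # global context (Transformer part)
    x = conv_module(x)      # local context (CNN part)
    x = feed_forward(x)     # half-step FFN
    return x

frames = np.random.default_rng(0).random((10, 16))  # 10 speech frames, 16 dims
out = conformer_block(frames)
print(out.shape)  # (10, 16): same shape in and out, so blocks can be stacked
```

The point of the sandwich ordering is that self-attention supplies global context while convolution supplies local, fine-grained context, which is the combination the article credits for the accuracy gain.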
Transformer research has tended to involve very large-scale models, with the major drawback that only well-resourced research institutions could pursue it. For this reason, a lot of research is going into making the Transformer more efficient. There are also survey papers covering such efficient Transformers.
In closing
What did you think? These were the three Transformer articles to watch in 2020. I think many people came to recognize the momentum of the Transformer in 2020! Still, these applied technologies are brand new, and their future development will be important.
I think 2020 was a year of technology competition. For companies that spent 2020 preparing their data collection infrastructure, operations, algorithm selection, security, and edge deployment, I think the main focus in 2021 will be putting these technologies to work in society.
We look forward to working with you in 2021.