
How Far Has Research on BERT, Google's Signature Natural Language Processing Technology, Come? The Frontiers of BERTology

Natural Language Processing

3 main points
✔️ Introduces the survey paper on BERT, a central player in natural language processing
✔️ Comprehensively describes research directions and open issues related to BERT in two parts
✔️ This part covers the studies that look at what BERT captures

A Primer in BERTology: What we know about how BERT works
written by Anna Rogers, Olga Kovaleva, Anna Rumshisky
(Submitted on 27 Feb 2020)

Comments: Published on arXiv
Subjects: Computation and Language (cs.CL)
Paper  
Official Code

Introduction

BERT, a transformer-based pre-trained language model, has been indispensable to recent progress in natural language processing technology. At the time of its release in 2018, it achieved the highest accuracy on GLUE, a benchmark for measuring overall natural language understanding, surprising the field by improving on OpenAI GPT's score by about 7 points. Since then, BERT's general-purpose, high language processing ability has made it a standard baseline and a central component of work on a wide range of tasks. In this article, we present the extensive research on BERT and look at the challenges it faces.
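The survey itself contains no code, but to ground the discussion, here is a minimal sketch of how BERT is typically used as a pre-trained encoder. It assumes the Hugging Face transformers library and the bert-base-uncased checkpoint, neither of which is prescribed by the paper:

```python
# A minimal sketch (not from the survey) of using BERT as a pre-trained
# encoder, assuming the Hugging Face transformers library and the
# bert-base-uncased checkpoint.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence; the tokenizer adds the special [CLS] and [SEP] tokens.
inputs = tokenizer("BERT is a pre-trained transformer.", return_tensors="pt")

# Run the encoder without computing gradients to get contextual embeddings.
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token: shape (batch, seq_len, hidden_size).
print(outputs.last_hidden_state.shape)
```

Downstream tasks such as those in GLUE typically attach a small classification head on top of these contextual vectors and fine-tune the whole model.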
