
Even Just Learning Music And Source Code Can Lead To Understanding Natural Language!



3 main points
✔️ Pre-training on music or source code alone can improve performance on natural language.
✔️ Even artificial languages with simple structures can be useful for transfer learning.
✔️ Grammatical structure can be learned without learning vocabulary.

Learning Music Helps You Read: Using Transfer to Study Linguistic Structure in Language Models
written by 
Isabel Papadimitriou, Dan Jurafsky
(Submitted on 30 Apr 2020 (v1), last revised 30 Oct 2020 (this version, v3))
Comments: Accepted at EMNLP 2020
Subjects: Computation and Language (cs.CL)

Introduction

Understanding how neural networks learn and represent the syntactic structure of natural language has been an important challenge in natural language processing. Existing research has approached this by inspecting a model's internal activations directly to reveal syntactic processing, by feeding the model syntactically complex inputs and observing its behavior, and so on.

This paper takes a new approach: it measures a language model's awareness of structure by examining how effective pre-training on another language (or on non-linguistic data such as music or source code) is as an inductive bias before training on the target natural language.
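As a rough illustration of this transfer setup, the sketch below pre-trains an LSTM language model on a non-linguistic "L1" corpus (e.g., tokenized music or source code), then freezes the LSTM, trains only fresh word embeddings and an output layer on the natural-language "L2", and finally measures perplexity on held-out L2 text. This is a hypothetical sketch: the model class, training loop, vocabulary sizes, and hyperparameters are illustrative assumptions, not the authors' actual code.

```python
# Hypothetical sketch of the cross-domain transfer experiment (PyTorch).
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hidden_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        x = self.embedding(tokens)
        out, _ = self.lstm(x)
        return self.head(out)

def train_lm(model, batches, params, epochs=1, lr=1e-3):
    """Train only the given parameters with a next-token prediction loss."""
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for tokens in batches:  # tokens: (batch, seq_len) LongTensor
            logits = model(tokens[:, :-1])
            loss = loss_fn(logits.reshape(-1, logits.size(-1)),
                           tokens[:, 1:].reshape(-1))
            opt.zero_grad()
            loss.backward()
            opt.step()

# 1) Pre-train on a non-linguistic "L1" corpus (music or code tokens).
l1_vocab, l2_vocab = 5000, 10000          # placeholder vocabulary sizes
model = LSTMLanguageModel(l1_vocab)
# train_lm(model, l1_batches, model.parameters())

# 2) Swap in fresh lexical layers for the natural-language "L2" vocabulary,
#    freeze the LSTM, and train only the new embeddings and output head.
model.embedding = nn.Embedding(l2_vocab, model.embedding.embedding_dim)
model.head = nn.Linear(model.lstm.hidden_size, l2_vocab)
for p in model.lstm.parameters():
    p.requires_grad = False
lexical_params = list(model.embedding.parameters()) + list(model.head.parameters())
# train_lm(model, l2_batches, lexical_params)

# 3) Evaluate perplexity on held-out L2 text and compare against models
#    pre-trained on random or unstructured data to quantify the transferred bias.
```

Freezing the recurrent weights and retraining only the lexical layers is one way to separate structural knowledge from vocabulary, which is the intuition behind the claim that grammatical structure can transfer without learning any words.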
