Even Learning Just Music and Source Code Can Lead to Understanding Natural Language!
3 main points
✔️ Pre-training on music or source code alone can improve accuracy on natural language.
✔️ Even artificial languages with simple structures can be useful for transfer learning.
✔️ Grammatical structures can be learned without learning vocabulary.
Learning Music Helps You Read: Using Transfer to Study Linguistic Structure in Language Models
written by Isabel Papadimitriou, Dan Jurafsky
(Submitted on 30 Apr 2020 (v1), last revised 30 Oct 2020 (this version, v3))
Comments: Accepted at EMNLP 2020
Subjects: Computation and Language (cs.CL)
Introduction
Understanding how neural networks learn and represent the syntactic structure of natural language has been an important challenge in natural language processing. Existing research has probed syntactic processing by directly inspecting a model's internal activations, or by feeding the model syntactically complex inputs and analyzing its behavior.
This paper takes a new approach: it measures the structural knowledge of a language model by examining how effective pre-training on one language is as an inductive bias for subsequently learning another language.
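The setup described above can be sketched roughly as follows. This is a minimal, hypothetical illustration (not the authors' code), assuming the common recipe of pre-training a small LSTM language model on a source corpus such as music or source code, then freezing its recurrent weights and learning only fresh embeddings for the target language, so that target-language performance reflects the structure captured in the frozen weights rather than vocabulary knowledge:

```python
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    """Tiny next-token LSTM language model (illustrative sizes)."""
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        hidden, _ = self.lstm(self.embed(tokens))
        return self.out(hidden)

VOCAB = 100
loss_fn = nn.CrossEntropyLoss()

# 1) Pre-train on the source domain (random placeholder data here,
#    standing in for tokenized music or source code).
model = LSTMLanguageModel(vocab_size=VOCAB)
src = torch.randint(0, VOCAB, (8, 20))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
logits = model(src[:, :-1])
loss = loss_fn(logits.reshape(-1, VOCAB), src[:, 1:].reshape(-1))
loss.backward()
opt.step()

# 2) Transfer: freeze the recurrent weights, re-initialize the
#    embeddings for the target-language vocabulary, and train only
#    the unfrozen parameters on natural-language text.
for p in model.lstm.parameters():
    p.requires_grad = False
model.embed = nn.Embedding(VOCAB, 32)  # fresh target embeddings
opt2 = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3)
```

Target-language perplexity under this frozen model is then compared across different pre-training corpora: the better a source language's structure transfers, the lower the perplexity, which is how the paper quantifies shared grammatical structure independently of vocabulary.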