
Informatics


Applying the language acquisition model to the solution of small language processing tasks

https://doi.org/10.37661/1816-0301-2022-19-1-96-110

Abstract

This paper addresses the problem of building a computer model of a small language. The relevance of this task stems from the following considerations: the need to eliminate the information inequality between speakers of different languages; the need for new tools for studying poorly understood languages, as well as for innovative approaches to language modeling in low-resource settings; and the problem of supporting and developing small languages.

At the stage of describing the problem situation, solving the problem of small natural language processing involves three main objectives: to justify language modeling under resource scarcity as a distinct task in the field of natural language processing, to review the literature on the topic, and to develop the concept of a language acquisition model that requires a relatively small number of available resources. Computer modeling techniques based on neural networks, semi-supervised learning, and reinforcement learning were employed.

The paper reviews the literature on modeling how a child learns the vocabulary, morphology, and grammar of its native language. Based on the current understanding of language acquisition and on existing computer models of this process, an architecture is proposed for a small-language processing system that is trained by simulating ontogenesis. The main components of the system and the principles of their interaction are highlighted. The system is built around a module based on modern dialogue language models and trained on some resource-rich language (e.g., English). During training, an intermediate layer is used that represents utterances in some abstract form, for example, in the symbols of formal semantics. The mapping between the formal representation of utterances and their translation into the target low-resource language is learned by simulating the child's acquisition of the vocabulary and grammar of the language. A separate component represents the non-linguistic context in which language learning takes place.
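The component interaction described above can be sketched in code. This is a minimal illustrative sketch only: all class and method names, and the toy positional word-alignment scheme, are assumptions made for the example; the paper does not specify an implementation, and a real system would use a trained dialogue model and a formal-semantic representation.

```python
# Hypothetical sketch of the proposed pipeline: a dialogue module maps an
# utterance in a resource-rich language to an abstract semantic form, and
# an acquisition module learns to express that form in the target
# low-resource language from observed examples.

class DialogueModule:
    """Stands in for a dialogue language model trained on a
    resource-rich language (e.g., English)."""

    def to_semantics(self, utterance: str) -> tuple:
        # Toy stand-in for the intermediate layer: a real system would
        # emit utterances in the symbols of formal semantics.
        tokens = utterance.lower().split()
        return ("PRED", tokens[0], tuple(tokens[1:]))


class AcquisitionModule:
    """Learns the mapping from abstract semantic forms to the target
    low-resource language from paired observations, imitating a child's
    acquisition of vocabulary and grammar."""

    def __init__(self) -> None:
        self.lexicon: dict[str, str] = {}

    def observe(self, semantics: tuple, target_utterance: str) -> None:
        # Naive cross-situational learning: align each semantic symbol
        # with a word of the target utterance, position by position.
        _, predicate, arguments = semantics
        for symbol, word in zip((predicate, *arguments),
                                target_utterance.split()):
            self.lexicon[symbol] = word

    def generate(self, semantics: tuple) -> str:
        # Unknown symbols surface as "?", marking vocabulary gaps the
        # learner has not yet acquired.
        _, predicate, arguments = semantics
        return " ".join(self.lexicon.get(s, "?")
                        for s in (predicate, *arguments))
```

In use, the dialogue module supplies the semantic form, and the acquisition module is trained on pairs of that form with target-language utterances, so that generation in the low-resource language improves as more pairs are observed.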

This article explores the problem of modeling small languages and gives a detailed substantiation of its relevance: the social significance of the problem is noted, and the benefits for linguistics, ethnography, ethnology, and cultural anthropology are shown. Approaches developed for large languages are shown to be ineffective under resource scarcity. A model of language learning through ontogenesis simulation is proposed, grounded both in results from computer modeling and in psycholinguistic data.

About the Author

Dz. I. Kachkou
Belarusian State University
Belarus

Dzmitry I. Kachkou is a Postgraduate Student at the Department of Multiprocessor Systems and Networks, Faculty of Applied Mathematics and Informatics, Belarusian State University.

Nezavisimosti av., 4, Minsk, 220030.





For citations:


Kachkou D.I. Applying the language acquisition model to the solution of small language processing tasks. Informatics. 2022;19(1):96-110. (In Russ.) https://doi.org/10.37661/1816-0301-2022-19-1-96-110



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 1816-0301 (Print)
ISSN 2617-6963 (Online)