Philology and Journalism

March 1, 2024; Paris, France: VI International Scientific and Practical Conference «DÉBATS SCIENTIFIQUES ET ORIENTATIONS PROSPECTIVES DU DÉVELOPPEMENT SCIENTIFIQUE»


SOURCE BASE FOR THE STUDY OF NEURAL NETWORK MODELING OF LINGUISTIC UNITS: PROBLEM STATEMENT


DOI
https://doi.org/10.36074/logos-01.03.2024.065
Published
11.03.2024

Abstract

Today, digitalization is developing rapidly but unevenly, which is reflected above all in how particular innovative practices are adopted: such practices now function alongside traditional means of acquiring, processing, analyzing, and generating data (including textual data). In this context, neural network modeling should be positioned as one manifestation of these processes, intended to invigorate modern linguistic research. Such research is focused primarily on natural language processing, which in turn strongly influences language analysis in general, from syntactic parsing to machine translation and text generation [1].

References

  1. Gurnee, W., Horsley, T., Guo, Z. C., Kheirkhah, T. R., Sun, Q., Hathaway, W., ... & Bertsimas, D. (2024). Universal Neurons in GPT2 Language Models. arXiv preprint arXiv:2401.12181. URL: https://goo.su/hT2WOW (accessed 26.02.2024).
  2. Qin, Z., Yang, S., & Zhong, Y. (2024). Hierarchically gated recurrent neural network for sequence modeling. Advances in Neural Information Processing Systems, 36. URL: https://goo.su/sDxSpkd (accessed 26.02.2024).