Abstract and keywords
Abstract (English):
The article presents a comparative analysis of neural network architectures for processing text data. It examines the characteristics of various models (RNNs, CNNs, transformers) and their training algorithms, as well as their effectiveness at extracting semantic information from large text corpora. The main focus is on identifying the advantages and disadvantages of each architecture in order to optimize text analysis tasks.

Keywords:
neural networks, neural network architectures, text data analysis
