The influence of the learning rate (η) on the training of a multilayer neural network is studied. A program implementing the multilayer neural network was written in Python. The learning rate is treated as a constant, and the optimal value at which the best training is achieved is determined. To analyze the impact of the learning rate, a logistic map describing the training process is used. It is shown that the training-error function exhibits bifurcation processes that lead to a chaotic state at η > 0.8. The optimal learning rate is identified as the value at which the doubling of the number of local minima first appears; for a three-layer neural network with 4 neurons in each layer it is η = 0.62. Increasing the number of hidden layers (3 to 30) and the number of neurons in each layer (4 to 150) does not radically change the bifurcation diagram of the logistic map (x_n, η) and, hence, the optimal value of the learning rate.
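The bifurcation behaviour described above can be reproduced in outline by iterating a logistic map over a range of η values and plotting the attractor reached for each one. The sketch below is a minimal illustration only: it assumes the standard form x_{n+1} = 4η·x_n(1 − x_n) with η ∈ [0, 1], whereas the map actually derived from the training-error function in the paper, its exact scaling, and the resulting threshold values (η = 0.62, η > 0.8) are not reproduced here.

```python
# Minimal sketch (assumption): bifurcation diagram of x_{n+1} = 4*eta*x_n*(1 - x_n).
# The 4*eta scaling, the eta range, and the iteration counts are illustrative choices,
# not the map derived from the network's training-error function in the paper.
import numpy as np
import matplotlib.pyplot as plt

etas = np.linspace(0.5, 1.0, 1000)   # learning-rate values to scan
x = 0.5 * np.ones_like(etas)         # one common initial condition per eta

n_transient = 500                    # iterations discarded so each orbit settles onto its attractor
n_plot = 100                         # iterations kept for plotting

for _ in range(n_transient):
    x = 4.0 * etas * x * (1.0 - x)

points_eta, points_x = [], []
for _ in range(n_plot):
    x = 4.0 * etas * x * (1.0 - x)
    points_eta.append(etas)
    points_x.append(x)

plt.plot(np.concatenate(points_eta), np.concatenate(points_x), ",k", alpha=0.3)
plt.xlabel("eta")
plt.ylabel("x_n")
plt.title("Bifurcation diagram (x_n, eta): period doubling followed by chaos")
plt.show()
```

Discarding the transient iterations is what makes the plotted points trace the attractor itself (fixed point, 2-cycle, 4-cycle, ..., chaotic band) rather than the approach to it, so the period-doubling cascade and the onset of chaos are visible directly in the (x_n, η) plane.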