The article addresses the problem of excessive traffic on base station cells. To reduce the impact of this problem on the quality of service delivered by mobile network operators, it proposes applying artificial intelligence (AI) technology to analyze and predict the network load. AI is well suited to wireless environments, which generate large volumes of data from which recurring patterns can be extracted. The article proposes a machine-learning model and a neural-network architecture for forecasting the load on 5G cells.
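The abstract does not specify the proposed architecture in detail. As a minimal sketch of the kind of forecaster described, one might train a recurrent (LSTM) network on a sliding window of past per-cell load measurements to predict the load one step ahead; the window length, layer sizes, and the synthetic daily-cycle traffic below are illustrative assumptions, not the authors' design.

```python
# Sketch: LSTM forecaster for per-cell 5G load (assumed setup, not the
# paper's exact model). Predicts next-hour load from the previous 24 hours.
import numpy as np
import torch
import torch.nn as nn

WINDOW = 24  # hours of history per prediction (assumption)

class CellLoadForecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):             # x: (batch, WINDOW, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # load estimate for the next hour

# Synthetic stand-in for measured cell traffic: daily cycle plus noise.
t = np.arange(2000, dtype=np.float32)
load = (0.5 + 0.4 * np.sin(2 * np.pi * t / 24)
        + 0.05 * np.random.randn(t.size)).astype(np.float32)

# Build (24-hour window -> next value) training pairs.
X = np.stack([load[i:i + WINDOW] for i in range(load.size - WINDOW)])[..., None]
y = load[WINDOW:][:, None]
X, y = torch.from_numpy(X), torch.from_numpy(y)

model = CellLoadForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
print(f"final training MSE: {loss.item():.4f}")
```

In practice the forecast would be fed by real per-cell traffic counters, and the predicted load could trigger capacity or handover adjustments before congestion occurs.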