Encryption of text messages using multilayer neural networks

Institute of Technical Engineering, State Higher School of Technology and Economics in Jarosław
Department of Sensory and Semiconductor Electronics, Ivan Franko National University of Lviv
Department of Radio Physics and Computer Technologies, Ivan Franko National University of Lviv

The article considers an algorithm for encrypting and decrypting text messages using multilayer neural networks (MLNN). The algorithm involves three steps: training a neural network on training pairs formed from the basic set of characters found in the text; encrypting the message using the weight coefficients of the hidden layers; and decrypting it using the weight coefficients of the output layer. The conditions necessary for successful encryption and decryption with this algorithm are formulated, and its limitations are emphasized. The MLNN architecture and training algorithm are described. The results of experimental research performed with the NeuralNet program are given: training the MLNN with the BP (Sequential), BP (Batch), Rprop, and QuickProp methods, and an example of encrypting and decrypting a text message.
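The three steps above can be sketched in code. The following is a minimal illustration, not the authors' NeuralNet implementation: it trains a one-hidden-layer network with sequential backpropagation (BP) to reproduce each character of the basic set, then uses the hidden-layer activations as the ciphertext and the output-layer weights as the decryption key. All names, the alphabet, the hidden-layer size, and the learning-rate setting are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Basic set of characters found in the text to be transmitted.
ALPHABET = sorted(set("hello world"))
N = len(ALPHABET)
H = 16  # hidden-layer size (assumed)

def one_hot(c):
    v = np.zeros(N)
    v[ALPHABET.index(c)] = 1.0
    return v

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Step 1: train the MLNN on training pairs (character -> same character)
# using plain sequential backpropagation with squared error.
W1 = rng.normal(0.0, 0.5, (H, N)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (N, H)); b2 = np.zeros(N)
lr = 1.0
for epoch in range(2000):
    for c in ALPHABET:
        x = t = one_hot(c)
        h = sigmoid(W1 @ x + b1)          # hidden-layer activations
        y = sigmoid(W2 @ h + b2)          # output-layer activations
        dy = (y - t) * y * (1.0 - y)      # output delta
        dh = (W2.T @ dy) * h * (1.0 - h)  # hidden delta
        W2 -= lr * np.outer(dy, h); b2 -= lr * dy
        W1 -= lr * np.outer(dh, x); b1 -= lr * dh

# Step 2: the sender encrypts with the hidden-layer weights;
# the ciphertext is the sequence of hidden-layer activation vectors.
def encrypt(msg):
    return [sigmoid(W1 @ one_hot(c) + b1) for c in msg]

# Step 3: the receiver, holding only the output-layer weights (W2, b2),
# decrypts by propagating each vector through the output layer.
def decrypt(cipher):
    return "".join(ALPHABET[int(np.argmax(sigmoid(W2 @ h + b2)))]
                   for h in cipher)

print(decrypt(encrypt("hello world")))
```

Note the limitation this sketch shares with the algorithm: only characters present in the training set can be encrypted, so sender and receiver must agree on the basic character set in advance.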
