Distorted character recognition by a non-fully connected single-layer dipole neural network

2022, pp. 199–207
1, 2, 4, 5 Lviv Polytechnic National University, Lviv, Ukraine
3 Ivan Franko Drogobych State Pedagogical University

This paper solves two problems: the first concerns the recognition of distorted symbolic images by a non-fully connected single-layer dipole neural network, and the second the optimization of computing resources during such recognition. In particular, the architecture of a non-fully connected single-layer network with dipole neurons is proposed. The incomplete connectivity of the synaptic links is based on the fact that a dipole neuron interacts significantly only with its immediate environment. Synaptic connections are therefore taken into account only between nearest-neighboring dipole neurons, because the synaptic tensor $\lambda_{ij}$ between the $i$-th and $j$-th dipole neurons is inversely proportional to the distance $r_{ij}$ between them, so that $\lambda_{i,j+1} \ll \lambda_{ij}$. An algorithm for recognizing distorted input symbolic images with the non-fully connected dipole neural network has been developed and implemented in the Matlab environment. It is shown that recognizing input symbolic images with the non-fully connected dipole network requires $n(n+1)/4$ times less computation time than a fully connected neural network, where $n$ is the number of pixels per row and per column used to encode the input images. Numerical experiments have shown that the computation time needed to recognize $0.4n^2$ distorted characters, each described by a $5 \times 5$ matrix, is 7.5 times shorter than that of a fully connected neural network.
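As a rough illustration of the connectivity idea described in the abstract, the following Python sketch builds a Hopfield-style associative memory whose Hebbian synaptic matrix is kept only between nearest-neighbor pixels of an $n \times n$ grid. This is not the authors' Matlab implementation or their dipole-neuron dynamics: the stored 5×5 letter "T", the three flipped pixels, and the synchronous sign-update recall rule are illustrative assumptions. The last line only checks the arithmetic of the quoted speed-up, $n(n+1)/4 = 7.5$ for $n = 5$, consistent with the reported 7.5-fold reduction.

```python
import numpy as np


def nearest_neighbor_mask(n):
    """Boolean mask of the synaptic links kept in the sparse network:
    only horizontally/vertically adjacent pixels of the n x n grid interact."""
    N = n * n
    mask = np.zeros((N, N), dtype=bool)
    for r in range(n):
        for c in range(n):
            i = r * n + c
            for dr, dc in ((0, 1), (1, 0), (0, -1), (-1, 0)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < n and 0 <= cc < n:
                    mask[i, rr * n + cc] = True
    return mask


def train(patterns, mask):
    """Hebbian weights restricted to nearest-neighbor links (zero elsewhere)."""
    W = np.zeros(mask.shape)
    for p in patterns:
        W += np.outer(p, p)
    W[~mask] = 0.0  # drop every non-neighbor synapse, including self-coupling
    return W / len(patterns)


def recall(W, x, steps=10):
    """Synchronous sign-update relaxation toward a stored pattern."""
    s = x.copy()
    for _ in range(steps):
        s_next = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_next, s):
            break
        s = s_next
    return s


if __name__ == "__main__":
    n = 5
    # Bipolar 5x5 template of the letter "T" (+1 = ink, -1 = background); illustrative.
    T = np.array([[ 1,  1,  1,  1,  1],
                  [-1, -1,  1, -1, -1],
                  [-1, -1,  1, -1, -1],
                  [-1, -1,  1, -1, -1],
                  [-1, -1,  1, -1, -1]]).ravel()

    W = train(T[None, :], nearest_neighbor_mask(n))

    distorted = T.copy()
    distorted[[0, 7, 18]] *= -1  # flip three pixels to distort the character
    print("restored correctly:", np.array_equal(recall(W, distorted), T))

    # Resource ratio quoted in the paper: n(n+1)/4 = 7.5 for n = 5.
    print("n(n+1)/4 =", n * (n + 1) / 4)
```

In this sketch the paper's distance-dependent synaptic tensor $\lambda_{ij} \propto 1/r_{ij}$ is replaced by a hard nearest-neighbor cutoff, which is the simplest reading of the condition $\lambda_{i,j+1} \ll \lambda_{ij}$; for this particular pattern and distortion the sparse network restores the template in a single update.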
