Model Analysis of a Fast Analogue Neural Circuit for Identification of the Largest-Value Signal Set

Tymoshchuk P.

Lviv Polytechnic National University, Department of Computer-Aided Design Systems

An analysis of a continuous-time model of a high-speed analogue K-winners-take-all (KWTA) neural circuit is presented. The circuit is capable of identifying the K largest of N distinct inputs of unknown finite values, where 1 ≤ K < N. The model is described by a state equation with a discontinuous right-hand side and by an output equation. The existence and uniqueness of the steady states, the convergence of the state-variable trajectories, and the convergence time to the KWTA operation are analyzed. A comparison of the model with close analogs is given. According to the obtained results, the model converges to the KWTA operation faster than comparable analogs.
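The abstract does not give the state equation itself, but the KWTA operation it describes can be illustrated with a minimal sketch. Assuming a single-state-variable dynamics of the form dx/dt = α(Σᵢ H(aᵢ − x) − K), where H is the Heaviside step function (the hypothetical gain α and the Euler step below are illustrative choices, not the paper's parameters), the discontinuous right-hand side drives the threshold x until exactly K binary outputs remain active:

```python
import numpy as np

def kwta(inputs, K, alpha=50.0, dt=1e-3, steps=5000):
    """Illustrative KWTA sketch (not the paper's exact model):
    a single state variable x is driven by the discontinuous
    residual sum(H(a_i - x)) - K, so that at steady state
    exactly K of the N binary outputs are active."""
    a = np.asarray(inputs, dtype=float)
    x = a.min()  # start below all inputs so every output fires initially
    for _ in range(steps):
        y = (a > x).astype(int)          # Heaviside step outputs
        x += dt * alpha * (y.sum() - K)  # raise x while there are too many winners
    return (a > x).astype(int)

print(kwta([0.9, 0.1, 0.5, 0.7, 0.3], K=2))  # -> [1 0 0 1 0]
```

At the steady state the residual vanishes, which mirrors the abstract's claim that the state-variable trajectories converge to the KWTA operation; convergence speed here is governed by the assumed gain α.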
