The general problem of constructing logical recognition and classification trees is considered. The object of the study is logical classification trees; the subject is current methods and algorithms for their construction. The aim of the work is to create a simple and effective method for building recognition models based on classification trees for training samples of discrete data, in which the synthesized tree structure is composed of elementary features. A general method for constructing logical classification trees is proposed: for a given initial training sample, it builds a tree structure from a set of elementary features, each of which is evaluated at every step of model construction. The core idea of the method is to approximate an initial sample of arbitrary size with a set of elementary features; when the current vertex of the logical tree is formed, the most informative, highest-quality elementary features are selected from the original set. This approach significantly reduces the size and complexity of the resulting classification tree, that is, the total number of branches and tiers of the structure, and improves the quality of its subsequent analysis. The proposed method makes it possible to build tree-like recognition models for a wide class of problems in artificial intelligence. The method developed and presented in this paper was implemented in software and investigated on the problem of classifying geological data. The experiments carried out confirmed the operability of the proposed mathematical support and show that it can be applied to a wide range of practical recognition and classification problems. Prospects for further research include a depth-restricted variant of the logical classification tree method, in which a stopping criterion based on the depth of the structure terminates tree construction, optimization of its software implementation, and experimental studies of the method on a wider range of practical tasks.
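The paper does not give pseudocode for the construction procedure; the following minimal sketch only illustrates the greedy, vertex-by-vertex scheme described above, under assumptions not taken from the paper: binary elementary features, informativeness measured by information gain (entropy), and majority-class leaves. The function names (build_tree, information_gain) and the toy data are hypothetical.

```python
# Illustrative sketch of greedy logical-tree construction (assumptions noted above).
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(samples, labels, feature):
    """Gain from splitting on one binary elementary feature (assumed criterion)."""
    yes = [l for s, l in zip(samples, labels) if s[feature] == 1]
    no = [l for s, l in zip(samples, labels) if s[feature] == 0]
    if not yes or not no:
        return 0.0
    n = len(labels)
    remainder = (len(yes) / n) * entropy(yes) + (len(no) / n) * entropy(no)
    return entropy(labels) - remainder

def build_tree(samples, labels, features):
    """Recursively pick the most informative remaining feature for each vertex."""
    if len(set(labels)) == 1 or not features:
        return {"leaf": Counter(labels).most_common(1)[0][0]}  # majority class
    best = max(features, key=lambda f: information_gain(samples, labels, f))
    rest = [f for f in features if f != best]
    split = {0: ([], []), 1: ([], [])}
    for s, l in zip(samples, labels):
        split[s[best]][0].append(s)
        split[s[best]][1].append(l)
    children = {}
    for value, (sub_s, sub_l) in split.items():
        children[value] = ({"leaf": Counter(labels).most_common(1)[0][0]}
                           if not sub_l else build_tree(sub_s, sub_l, rest))
    return {"feature": best, "children": children}

# Toy usage: four discrete objects described by two elementary features.
X = [{"f1": 1, "f2": 0}, {"f1": 1, "f2": 1}, {"f1": 0, "f2": 0}, {"f1": 0, "f2": 1}]
y = ["A", "A", "B", "B"]
print(build_tree(X, y, ["f1", "f2"]))
```

Because each vertex keeps only the single most informative elementary feature, the depth and branching of the resulting tree stay small relative to exhaustive constructions, which is the reduction in structural complexity the method aims at.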