
Decision Tree Incremental Learning Algorithm Oriented Intelligence Data

Volume 14, Number 5, May 2018, pp. 849-856
DOI: 10.23940/ijpe.18.05.p3.849856

Hongbin Wang, Ci Chu, Xiaodong Xie, Nianbin Wang, and Jing Sun

College of Computer Science and Technology, Harbin Engineering University, Harbin, 150001, China

(Submitted on January 29, 2018; Revised on March 12, 2018; Accepted on April 23, 2018)

Abstract:

Decision trees are among the most popular classification methods because they are easy to understand. However, the decision trees constructed by existing methods are usually too large and complicated, which limits their practicability in some applications. In this paper, an improved hybrid classifier algorithm, HCS, is proposed by combining NOLCDT with the IID5R algorithm. HCS consists of two phases: building an initial decision tree and incremental learning. The initial decision tree is constructed with the NOLCDT algorithm, and incremental learning is then performed with IID5R. NOLCDT selects the candidate attribute with the largest information gain and divides the node into only two branches, which avoids generating too many branches and thus prevents the decision tree from becoming too complex. NOLCDT also improves the selection of the next node to split: it computes the corresponding splitting measure for all candidate splits and always selects the candidate node with the largest information gain as the next node to split, so that each split yields the greatest information gain. In addition, IID5R, an improved algorithm based on ID5R, is proposed; it evaluates the quality of the classification attributes and estimates the minimum number of training steps for which the selected attributes are guaranteed to remain the best choice. HCS combines the advantages of decision trees and incremental learning: it is easy to understand and well suited to incremental learning. A comparative experiment between traditional decision tree algorithms and HCS on UCI data sets shows that HCS handles the incremental learning problem well: the resulting decision tree is simpler and therefore easier to understand, and the incremental phase consumes less time.
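The binary-split idea described above (choose the attribute split with the largest information gain, producing exactly two branches instead of one branch per attribute value) can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' NOLCDT implementation; the function names and the threshold-based two-way split are assumptions for the sake of the example.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def binary_split_gain(values, labels, threshold):
    """Information gain of a two-way split of one attribute:
    value <= threshold goes left, value > threshold goes right."""
    left = [y for x, y in zip(values, labels) if x <= threshold]
    right = [y for x, y in zip(values, labels) if x > threshold]
    n = len(labels)
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

def best_binary_split(values, labels):
    """Pick the threshold with the largest information gain, so the node
    is always divided into exactly two branches (hypothetical helper)."""
    candidates = sorted(set(values))[:-1]  # every point between distinct values
    return max(candidates, key=lambda t: binary_split_gain(values, labels, t))
```

A node-selection loop in the spirit of the paper would then compute `binary_split_gain` for every open leaf and expand the leaf whose best split has the largest gain, which is what keeps each expansion maximally informative.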

 

References: 11

      1. K. Adhatrao, A. Gaykar, A. Dhawan, R. Jha, V. Honrao, "Predicting Students' Performance Using ID3 and C4.5 Classification Algorithms," International Journal of Data Mining & Knowledge Management Process, vol. 3, no. 5, 2013
      2. J. M. Cherry, "The Saccharomyces Genome Database: Advanced Searching Methods and Data Mining," Cold Spring Harb Protoc, vol. 2015, no. 12, pp. pdb.prot088906, 2015
      3. H. L. Du, Y. Zhang, "Incremental Support Vector Machine Algorithm on Dynamic Cost," Journal of Shangluo University, 2017
      4. D. Kalles, T. Morris, "Efficient Incremental Induction of Decision Trees," Machine Learning, vol. 24, no. 3, pp. 231-242, 1996
      5. H. Liu, H. Motoda, "Feature Selection for Knowledge Discovery and Data Mining," Springer International, 1998
      6. Z. Q. J. Lu, "The Elements of Statistical Learning: Data Mining, Inference, and Prediction," Mathematical Intelligencer, vol. 27, no. 2, pp. 83-85, 2005
      7. C. J. Mantas, J. Abellán, "Credal-C4.5: Decision Tree Based on Imprecise Probabilities to Classify Noisy Data," Expert Systems with Applications, vol. 41, no. 10, pp. 4625-4637, 2014
      8. J. R. Quinlan, "Induction of Decision Trees," Machine Learning, vol. 1, no. 1, pp. 81-106, 1986
      9. L. Rutkowski, M. Jaworski, L. Pietruczuk, P. Duda, "The CART Decision Tree for Mining Data Streams," Information Sciences, vol. 266, no. 5, pp. 1-15, 2014
      10. C. W. Tsai, C. F. Lai, M. C. Chiang, L. T. Yang, "Data Mining for Internet of Things: A Survey," IEEE Communications Surveys & Tutorials, vol. 16, no. 1, pp. 77-97, 2014
      11. C. C. Wu, Y. L. Chen, Y. H. Liu, X. Y. Yang, "Decision Tree Induction with a Constrained Number of Leaf Nodes," Applied Intelligence, vol. 45, no. 3, pp. 1-13, 2016
