Bayesian Network Model for Learning Arithmetic Concepts

Volume 15, Number 3, March 2019, pp. 939-948
DOI: 10.23940/ijpe.19.03.p23.9399481

Yali Lv (a,b), Tong Jing (a), Yuhua Qian (b), Jiye Liang (b), Jianai Wu (a), and Junzhong Miao (a)

(a) School of Information Management, Shanxi University of Finance and Economics, Taiyuan, 030006, China
(b) Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, Shanxi University, Taiyuan, 030006, China


(Submitted on November 14, 2018; Revised on December 15, 2018; Accepted on January 13, 2019)

Abstract:

An object usually belongs to multiple concepts; some of these concepts can be judged directly, while others must be inferred indirectly. To learn arithmetic concepts from sets of positive integers, we propose an arithmetic concept Bayesian network (ACBN) model that takes advantage of Bayesian networks. Specifically, we first present the ACBN model to represent arithmetic concept knowledge and the direct relationships among concepts, and we design an ACBN model learning algorithm based on domain knowledge. Furthermore, to infer some arithmetic concepts indirectly, we design a method for learning evidence concepts based on the idea of k-nearest neighbors, and we then propose an inference algorithm for the ACBN model. Finally, the experimental results demonstrate that the ACBN model can effectively learn common, everyday arithmetic concepts.
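To make the idea of inferring one arithmetic concept from directly judged ones concrete, here is a minimal Bayesian-network sketch. The network below is an illustrative toy, not the paper's learned ACBN: the nodes (Even, Div3, Div6), their structure, and the priors are assumptions chosen for the example. Div6 is a deterministic child of its parents Even and Div3, so observing the directly judged concepts lets us infer divisibility by 6 indirectly.

```python
from itertools import product

# parents[x] lists the parents of node x; cpt[x] maps a tuple of parent
# values to P(x = True | parents). Structure and priors are illustrative.
parents = {"Even": [], "Div3": [], "Div6": ["Even", "Div3"]}
cpt = {
    "Even": {(): 0.5},                 # prior: half of integers are even
    "Div3": {(): 1 / 3},               # prior: a third are divisible by 3
    "Div6": {(e, d): 1.0 if (e and d) else 0.0
             for e, d in product([True, False], repeat=2)},
}

def joint(assign):
    """P(full assignment) as the product of local CPT entries."""
    p = 1.0
    for x, px in parents.items():
        pt = cpt[x][tuple(assign[q] for q in px)]
        p *= pt if assign[x] else 1.0 - pt
    return p

def query(x, evidence):
    """P(x = True | evidence) by brute-force enumeration."""
    nodes = list(parents)
    num = den = 0.0
    for vals in product([True, False], repeat=len(nodes)):
        a = dict(zip(nodes, vals))
        if any(a[k] != v for k, v in evidence.items()):
            continue
        p = joint(a)
        den += p
        if a[x]:
            num += p
    return num / den

# The directly judged (evidence) concepts determine the indirect one:
print(query("Div6", {"Even": True, "Div3": True}))  # -> 1.0
print(query("Div6", {"Even": True}))                # -> 1/3, i.e. P(Div3)
```

Enumeration is exponential in the number of nodes, which is fine for a toy network; the paper's inference algorithm for the full ACBN is what makes this practical at scale.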

 


               
