
A Mongolian Language Model based on Recurrent Neural Networks

Volume 14, Number 7, July 2018, pp. 1580-1589
DOI: 10.23940/ijpe.18.07.p22.15801589

Zhiqiang Ma, Li Zhang, Rui Yang, and Tuya Li

College of Data Science and Application, Inner Mongolia University of Technology, Hohhot, 010080, China

(Submitted on April 9, 2018; Revised on May 21, 2018; Accepted on June 23, 2018)

Abstract:

To address the data sparsity and long-range dependence problems encountered when training N-Gram Mongolian language models, a Mongolian Language Model based on Recurrent Neural Networks (MLMRNN) is proposed. A Mongolian classified word vector is designed and used as the input word vector of MLMRNN during pre-training, and a Skip-Gram word vector carrying context information is used at the input layer, so that the input encodes not only semantic information but also rich contextual information; this effectively mitigates data sparsity and long-range dependence. Finally, a training algorithm for MLMRNN is designed, and perplexity is adopted as the evaluation index: the perplexities of the N-Gram model, RNNLM, and MLMRNN are measured on the training set and the test set, respectively. The experimental results show that MLMRNN achieves lower perplexity than the other language models, i.e., the performance of the language model is improved.
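The abstract describes the architecture only at a high level; the sketch below (Python, not the authors' code) illustrates the core idea it states: pretrained Skip-Gram-style word vectors serve as the input layer of a simple recurrent language model, and the model is scored with perplexity, i.e., the exponential of the average negative log-probability assigned to each next word. All dimensions, weight matrices, and the toy corpus are illustrative placeholders, not values from the paper.

# Minimal sketch (not the authors' implementation): an Elman-style RNN language
# model whose input layer is a lookup into pretrained word vectors (standing in
# for the Skip-Gram vectors described in the abstract), evaluated by perplexity.
import numpy as np

rng = np.random.default_rng(0)

V, D, H = 1000, 100, 128           # vocabulary size, embedding size, hidden size (illustrative)
E = rng.normal(0, 0.1, (V, D))     # stand-in for pretrained Skip-Gram word vectors
W_xh = rng.normal(0, 0.1, (D, H))  # input-to-hidden weights
W_hh = rng.normal(0, 0.1, (H, H))  # hidden-to-hidden (recurrent) weights
W_hy = rng.normal(0, 0.1, (H, V))  # hidden-to-output weights

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def sentence_log_prob(token_ids):
    """Sum of log P(w_t | w_<t) under the RNN LM; the lookup E[w] plays the
    role of the (frozen) pretrained input word vector."""
    h = np.zeros(H)
    log_p = 0.0
    for prev, nxt in zip(token_ids[:-1], token_ids[1:]):
        x = E[prev]                       # pretrained input word vector
        h = np.tanh(x @ W_xh + h @ W_hh)  # recurrent hidden state
        p = softmax(h @ W_hy)             # distribution over the next word
        log_p += np.log(p[nxt] + 1e-12)
    return log_p

def perplexity(corpus):
    """PPL = exp(-(1/N) * sum_t log P(w_t | history)), N = number of predictions."""
    total_log_p, n_tokens = 0.0, 0
    for sent in corpus:
        total_log_p += sentence_log_prob(sent)
        n_tokens += len(sent) - 1
    return float(np.exp(-total_log_p / n_tokens))

# Toy usage: two "sentences" of word ids drawn from the vocabulary.
toy_corpus = [rng.integers(0, V, size=12).tolist() for _ in range(2)]
print("perplexity:", perplexity(toy_corpus))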

 

References: 23

          1. X. Ai, “Research on a Mongolian Language Model based on Speech Recognition,” Hohhot: Inner Mongolia University, 2007
          2. H. Bourlard and N. Morgan, “Continuous Speech Recognition by Connectionist Statistical Methods,” IEEE Transactions on Neural Networks, vol. 4, no. 6, pp. 893-909, 1993
          3. P. F. Brown, P. V. Desouza, and R. L. Mercer, “Class-based N-gram Models of Natural Language,” Computational Linguistics, vol. 18, no. 4, pp. 467-479, 1992
          4. Y. Bengio, R. Ducharme, P. Vincent, and C. Janvin, “A Neural Probabilistic Language Model,” Journal of Machine Learning Research, vol. 3, pp. 1137-1155, 2003
          5. A. Coates and A. Y. Ng, “Learning Feature Representations with K-Means,” Lecture Notes in Computer Science, vol. 7700, pp. 561-580, 2012
          6. S. F. Chen and J. Goodman, “An Empirical Study of Smoothing Techniques for Language Modeling,” in Proceedings of the Meeting of the Association for Computational Linguistics, Santa Cruz, California, USA, pp. 359-393, June 24-27, 1996
          7. H. X. Hou, Q. Liu and Z. W. Liu, “Skip-N Mongolian Statistical Language Model,” Journal of Inner Mongolia University (Natural Science Edition), vol. 39, no. 2, pp. 220-224, 2008
          8. R. Kuhn and R. De Mori, “A Cache-Based Natural Language Model for Speech Recognition,” IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 12, no. 6, pp. 219-228, 1990
          9. R. Kneser and H. Ney, “Improved Backing-off for N-gram Language Modeling,” in Proceedings of the International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 181-184, 1995
          10. H. Li, D. Qu and W. L. Zhang, “Recurrent Neural Network Language Model with Global Word Vector Features,” Signal Processing, vol. 32, no. 6, pp. 715-723, 2016
          11. Y. X. Li, J. Q. Zhang and D. Pan, “A Study of Speech,” Journal of Integrative Plant Biology, vol. 51, no. 9, pp. 1936-1944, 2014
          12. T. Mikolov, “Statistical Language Models based on Neural Networks,” Ph.D. thesis, Brno University of Technology, 2012
          13. T. Mikolov, M. Karafiát and L. Burget, “Recurrent Neural Network based Language Model,” in Proceedings of Interspeech, pp. 1045-1048, 2010
          14. W. D. Mulder, S. Bethard and M. F. Moens, “A Survey on the Application of Recurrent Neural Networks to Statistical Language Modeling,” Computer Speech & Language, vol. 30, no. 1, pp. 61-98, 2015
          15. Z. Q. Ma, Z. G. Zhang and R. Yan, “N-Gram Based Language Identification for Mongolian Text,” Journal of Chinese Information Processing, vol. 30, no. 1, pp. 133-139, 2016
          16. I. Sutskever, “Training Recurrent Neural Networks,” Ph.D. thesis, University of Toronto, Toronto, 2013
          17. A. Vinciarelli, S. Bengio and H. Bunke, “Offline Recognition of Unconstrained Handwritten Texts Using HMMs and Statistical Language Models,” IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 26, no. 6, pp. 709-720, 2004
          18. L. Wang, J. A. Yang and L. Chen, “Recurrent Neural Network based Chinese Language Modeling Method,” Acoustic Technology, vol. 34, no. 5, pp. 431-436, 2015
          19. P. Xu and F. Jelinek, “Random Forests in Language Modeling,” in Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2004), Barcelona, Spain, pp. 325-332, July 25-26, 2004
          20. Y. K. Xing and S. P. Ma, “A Survey on Statistical Language Models,” Computer Science, vol. 30, no. 9, pp. 22-26, 2003
          21. C. X. Zhai, “Statistical Language Models for Information Retrieval,” Now Publishers Inc., 2008
          22. J. Zhang, D. Qu and Z. Li, “Recurrent Neural Network Language Model based on Word Vector Features,” Pattern Recognition and Artificial Intelligence, vol. 28, no. 4, pp. 299-305, 2015
          23. L. Zhou, “Exploration of the Working Principle and Application of Word2vec,” Library and Information Guide, no. 2, pp. 145-148, 2015
