International Journal of Physical Sciences

  • Abbreviation: Int. J. Phys. Sci.
  • Language: English
  • ISSN: 1992-1950
  • DOI: 10.5897/IJPS
  • Start Year: 2006
  • Published Articles: 2572

Full Length Research Paper

Persian handwritten digits recognition: A divide and conquer approach based on mixture of MLP experts

Mohammad Masoud Javidi1*, Reza Ebrahimpour2 and Fatemeh Sharifizadeh3
  1Department of Computer Science, Shahid Bahonar University of Kerman, Kerman, Iran. 2Brain and Intelligent Systems Research Laboratory, Department of Electrical and Computer Engineering, Shahid Rajaee Teacher Training University, P. O. Box: 16785-163, Tehran, Iran. 3Department of Computer Science, Shahid Bahonar University of Kerman, Iran.
Email: [email protected]

  •  Accepted: 12 October 2011
  •  Published: 23 November 2011



Many machine learning techniques have been applied to Persian handwritten digit recognition. The mixture of experts (MOE) is one of the most popular combining methods and has great potential to improve performance in machine learning. In MOE, during a competitive learning process, a gating network supervises the division of the input space among the experts, and each expert becomes specialized on its subspace. In the standard model, simple linear networks form the building blocks of the MOE architecture. In this study, however, owing to the complexity of Persian handwritten digit classification, multilayer perceptrons (MLPs) are used as both the gating and the expert networks. We call this architecture the mixture of multilayer perceptrons experts (MOME). Comparative evaluation is carried out on two real-world datasets: the SRU dataset of Persian numerals and a very large dataset of Persian handwritten digits (HODA). Experiments are conducted to evaluate the performance of MOME under various appraisal criteria, and the classification capabilities of several neural network ensembles are compared with those of MOME. Our experimental results show a significant improvement in the recognition rate of the investigated method, MOME, in all practical tests.
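The forward pass of such an architecture can be sketched as follows. This is a minimal illustrative NumPy implementation, not the authors' code: all layer sizes, initializations, and the single-hidden-layer MLP shape are assumptions chosen for clarity. The gating MLP produces a softmax weight per expert, and the mixture output is the gate-weighted sum of the expert outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    # One-hidden-layer MLP with tanh hidden units (illustrative shape).
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def make_mlp(n_in, n_hidden, n_out):
    # Small random initialization; values are placeholders, not tuned.
    return (rng.normal(0, 0.1, (n_in, n_hidden)), np.zeros(n_hidden),
            rng.normal(0, 0.1, (n_hidden, n_out)), np.zeros(n_out))

class MixtureOfMLPExperts:
    """Sketch of a mixture of MLP experts: the gating MLP assigns a
    softmax weight to each expert, and the mixture output is the
    gate-weighted sum of the experts' class-probability outputs."""

    def __init__(self, n_in, n_hidden, n_classes, n_experts):
        self.experts = [make_mlp(n_in, n_hidden, n_classes)
                        for _ in range(n_experts)]
        # Gating network: same MLP shape, one output unit per expert.
        self.gate = make_mlp(n_in, n_hidden, n_experts)

    def forward(self, x):
        g = softmax(mlp_forward(x, *self.gate))        # (N, n_experts)
        outs = np.stack([softmax(mlp_forward(x, *e))   # (N, n_experts, C)
                         for e in self.experts], axis=1)
        return (g[..., None] * outs).sum(axis=1)       # (N, n_classes)

# Usage with dummy inputs standing in for digit feature vectors.
moe = MixtureOfMLPExperts(n_in=64, n_hidden=32, n_classes=10, n_experts=4)
x = rng.normal(size=(5, 64))
probs = moe.forward(x)
print(probs.shape)  # (5, 10)
```

Because both the gate weights and each expert's output are softmax-normalized, each row of the mixture output is itself a valid probability distribution over the ten digit classes. Training (e.g. the competitive EM-style updates of standard MOE) is omitted here.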


Key words: Mixture of experts, handwritten digit recognition, combining classifiers, mixture of multilayer perceptrons experts.