Full Length Research Paper
Abstract
Many machine learning techniques have been applied to Persian handwritten digit recognition. The mixture of experts (MOE) is one of the most popular combining methods and has great potential to improve performance in machine learning. In MOE, through a competitive learning process, the gating network divides the input space among the experts, and each expert becomes specialized on its own subspace. In the standard model, simple linear networks form the building blocks of the MOE architecture. In this study, however, owing to the complexity of Persian handwritten digit classification, multilayer perceptrons (MLPs) are used as both the gating and expert networks. We call this architecture the mixture of multilayer perceptrons experts (MOME). Comparative evaluation is carried out on two real-world datasets: the SRU Persian numerals dataset and a very large dataset of Persian handwritten digits (HODA). Experiments are conducted to evaluate the performance of MOME under various appraisal criteria, and the classification capabilities of several neural network ensembles are compared with MOME. The experimental results show a significant improvement in the recognition rate of the investigated method, MOME, in all practical tests.
Key words: Mixture of experts, handwritten digit recognition, combining classifiers, mixture of multilayer perceptrons experts.
Copyright © 2025 Author(s) retain the copyright of this article.
This article is published under the terms of the Creative Commons Attribution License 4.0
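The abstract describes a gating network that softly divides the input space among expert networks, with MLPs serving as both gate and experts. The following is a minimal sketch of such a mixture of MLP experts in PyTorch; the layer sizes, number of experts, and softmax gating shown here are illustrative assumptions, not the exact configuration reported in this paper.

```python
# A hedged sketch of a mixture of MLP experts (MOME-style architecture).
# All hyperparameters below (hidden size, 4 experts, 8x8 inputs) are
# illustrative assumptions, not the settings used in the paper.
import torch
import torch.nn as nn


class MLP(nn.Module):
    """One-hidden-layer perceptron used as either an expert or the gate."""
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.Sigmoid(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):
        return self.net(x)


class MixtureOfMLPExperts(nn.Module):
    """The gating MLP softly assigns each input to the expert MLPs;
    the output is the gate-weighted sum of the expert outputs."""
    def __init__(self, in_dim=64, hidden_dim=50, n_classes=10, n_experts=4):
        super().__init__()
        self.experts = nn.ModuleList(
            MLP(in_dim, hidden_dim, n_classes) for _ in range(n_experts)
        )
        self.gate = MLP(in_dim, hidden_dim, n_experts)

    def forward(self, x):
        gate_weights = torch.softmax(self.gate(x), dim=-1)              # (B, E)
        expert_outs = torch.stack([e(x) for e in self.experts], dim=1)  # (B, E, C)
        return (gate_weights.unsqueeze(-1) * expert_outs).sum(dim=1)    # (B, C)


# Usage: classify a batch of digit images flattened to 64-dimensional vectors.
model = MixtureOfMLPExperts()
logits = model(torch.randn(32, 64))   # batch of 32 samples
print(logits.shape)                   # torch.Size([32, 10])
```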