Myanmar Warning Board Recognition System

  • Kyi Pyar Zaw, University of Computer Studies, Mandalay, Myanmar
  • Zin Mar Kyu, University of Computer Studies, Mandalay, Myanmar


In any country, warning text is displayed on signboards or wall posters that everybody must follow. This paper presents Myanmar character recognition from various warning-text signboards using block-based pixel count and eight-direction chain code. Character recognition is the process of converting a printed, typewritten, or handwritten text image into an editable and searchable text file. In this system, the characters on the warning signboard images are recognized using hybrid eight-direction chain code features and 16-block-based pixel count features. Character recognition basically comprises three steps: character segmentation, feature extraction, and classification. In the segmentation step, a horizontal cropping method is used for line segmentation, while a vertical cropping method and bounding boxes are used for connected-component character segmentation. In the classification step, performance accuracy is measured in two ways, with a KNN (K-Nearest Neighbour) classifier and with a feature-based template matching approach, on 150 warning text signboard images.
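The three feature and classification ingredients named in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the direction-numbering convention, the grid size, and all function names are assumptions, and the paper's actual segmentation and template-matching details are not reproduced here.

```python
import numpy as np

# 8-direction chain code offsets (dy, dx), numbered 0..7 counter-clockwise
# starting from East. This numbering convention is an assumption; the
# paper's convention may differ.
DIRS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def block_pixel_counts(img, grid=4):
    """Divide a binary character image into grid x grid blocks
    (16 blocks by default) and count foreground pixels per block."""
    h, w = img.shape
    feats = []
    for i in range(grid):
        for j in range(grid):
            block = img[i * h // grid:(i + 1) * h // grid,
                        j * w // grid:(j + 1) * w // grid]
            feats.append(int(block.sum()))
    return feats

def chain_code(boundary):
    """Encode an ordered list of adjacent boundary pixel coordinates
    as 8-direction chain codes between consecutive points."""
    return [DIRS.index((y1 - y0, x1 - x0))
            for (y0, x0), (y1, x1) in zip(boundary, boundary[1:])]

def knn_classify(feat, train_feats, train_labels, k=3):
    """Classify a feature vector by majority vote among its k nearest
    training samples (Euclidean distance)."""
    dists = [np.linalg.norm(np.asarray(feat) - np.asarray(t))
             for t in train_feats]
    votes = [train_labels[i] for i in np.argsort(dists)[:k]]
    return max(set(votes), key=votes.count)
```

For example, a 4x4 binary image whose top-left 2x2 corner is foreground yields block counts `[4, 0, 0, 0]` on a 2x2 grid, and a boundary walk East then South produces the chain codes `[0, 6]` under the convention above.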




[1] AL-Hashemi and Alsharari, “Instant Arabic Translation System for Signboard Images Based on Printed Character Recognition”, International Journal of Machine Learning and Computing, vol. 3, No. 4, August 2013.
[2] E.E. Phyu, Z.C. Aye, E.P. Khaing and Y. Thein, “Recognition of Myanmar Handwritten Compound Words based on MICR”, The 29th Asian Conference on Remote Sensing, 2008.
[3] Emmanuel, Rosemol, and Jilu George. "Automatic detection and recognition of Malayalam text from natural scene images." IOSR Journal of VLSI and Signal Processing 3.2 (2013): 55-61.
[4] H.P.P. Win, P.T.T. Khine and K.N.N. Tun, “Bilingual OCR System for Myanmar and English Scripts with Simultaneous Recognition”, International Journal of Scientific & Engineering Research, Volume 2, Issue 10, October 2011, ISSN 2229-5518.
[5] K.P. Zaw and Z.M. Kyu, “Segmentation Method for Myanmar Character Recognition using Block based Pixel Count and Aspect Ratio”, 27th International Conference on Computer Theory and Application (ICCTA), October 2017.
[6] M. Sayed and S.A. Angadi, “Mobile Application for Reading Display Boards having Kannada Text”, International Journal of Recent Trends in Engineering and Research, Vol. 02, Iss. 08; August- 2016.
[7] S.A. Angadi, and M. M. Kodabagi. "A robust segmentation technique for line, word and character extraction from Kannada text in low resolution display board images." Signal and Image Processing (ICSIP), 2014 Fifth International Conference on. IEEE, 2014.
[8] S. B. Ahmed, S. Naz, M. I. Razzak, & R. Yousaf. 2017. Deep Learning based Isolated Arabic Scene Character Recognition. arXiv preprint arXiv:1704.06821.
[9] T. Swe and P. Tin, 2006, “Recognition and Translation of Myanmar Printed Text based on Hopfield Neural Network”, IEEE.
[10] Y.Thein and S.S.S. Yee, 2010, “High Accuracy Myanmar Handwritten Character Recognition using Hybrid Approach through the MICR and Neural Network”, International Journal of Computer Science Issues, Vol. 7, Issue-6, November -2010.
[11] S. Sharma and C. Roxanne, 2007, Extraction of Text Region from Natural Images, Master Project Report Book.
[12] N. A. A. Htwe, T. T. Soe and M. M. Sein, "Automatic Extraction of Payee's Name and Legal Amount from Myanmar Bank Cheque by using Hidden Markov Model", Proceedings of the Fifth International Conference on Computer Application, Myanmar, February 8-9, 2007, pp. 411-414.
How to Cite
PYAR ZAW, Kyi; MAR KYU, Zin. Myanmar Warning Board Recognition System. International Journal of Research and Engineering, [S.l.], v. 5, n. 8, p. 480-485, sep. 2018. ISSN 2348-7860. Available at: <>. Date accessed: 17 jan. 2021. doi: