A Normalization Methods for Backpropagation: A Comparative Study

Authors

  • Adel S. Eesa, University of Zakho
  • Wahab Kh. Arabo, University of Zakho

DOI:

https://doi.org/10.25271/2017.5.4.381

Keywords:

Normalization, Neural network, Back propagation

Abstract

Neural Networks (NN) have been used by many researchers to solve problems in several domains, including classification and pattern recognition, and Backpropagation (BP) is one of the most well-known artificial neural network models. Constructing effective NN applications depends on characteristics such as the network topology, the learning parameters, and the normalization approach used for the input and output vectors. The input and output vectors for BP need to be normalized properly in order to achieve the best performance of the network. This paper applies several normalization methods to a number of UCI datasets and compares them to find the normalization method that works best with BP. Norm, Decimal scaling, Mean-Mad, Median-Mad, Min-Max, and Z-score normalization are considered in this study. The comparative study shows that Mean-Mad and Median-Mad perform better than all the remaining methods, while the worst results are produced by the Norm method.
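For reference, below is a minimal NumPy sketch of the six normalization methods named in the abstract, written from their common textbook definitions; the function names, the example column, and the exact formulas (for instance, whether any scaling constant is applied to the MAD-based methods) are illustrative assumptions and may differ from the variants used in the paper.

import numpy as np

def min_max(x):
    # Rescale values into [0, 1] using the column minimum and maximum.
    return (x - x.min()) / (x.max() - x.min())

def z_score(x):
    # Center on the mean and scale by the standard deviation.
    return (x - x.mean()) / x.std()

def decimal_scaling(x):
    # Divide by 10^j, the smallest power of ten that brings max(|x|) below 1.
    j = np.floor(np.log10(np.abs(x).max())) + 1
    return x / 10 ** j

def norm(x):
    # Divide each value by the Euclidean (L2) norm of the column.
    return x / np.linalg.norm(x)

def mean_mad(x):
    # Center on the mean and scale by the mean absolute deviation.
    mad = np.abs(x - x.mean()).mean()
    return (x - x.mean()) / mad

def median_mad(x):
    # Center on the median and scale by the median absolute deviation.
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    return (x - med) / mad

# Illustrative usage: normalize one attribute column before BP training.
col = np.array([2.0, 4.0, 10.0, 6.0, 8.0])
print(min_max(col))     # values mapped into [0, 1]
print(median_mad(col))  # centered on the median, scaled by the MAD

In practice each method would be applied to every attribute (column) of a UCI dataset, and to the target vector where appropriate, before the vectors are presented to the BP network.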

Author Biographies

Adel S. Eesa, University of Zakho

Dept. of Computer Science, Faculty of Science, University of Zakho, Kurdistan Region - Iraq (adel.eesa@uoz.edu.krd).

Wahab Kh. Arabo, University of Zakho

Dept. of Computer Science, Faculty of Science, University of Zakho, Kurdistan Region - Iraq.

Published

2017-12-30

How to Cite

Eesa, A. S., & Arabo, W. K. (2017). A Normalization Methods for Backpropagation: A Comparative Study. Science Journal of University of Zakho, 5(4), 319–323. https://doi.org/10.25271/2017.5.4.381

Issue

Vol. 5, No. 4 (2017)

Section

Science Journal of University of Zakho