International Journal of Reliable Information and Assurance

Volume 2, No. 2, 2014, pp. 13-24
http://dx.doi.org/10.21742/ijria.2014.2.2.02

On the Significance of Non-Extensivity for the Dynamic Growth of Hidden-Layer Neurons



    Chang Young Jung1, Seba Susan2 and Mayank Dwivedi3
    1Graduate School of Advanced Imaging & Multimedia, Chung-Ang University, 84 Heukseok-ro, Dongjak-gu, Seoul, Korea
    2,3Department of Information Technology, Delhi Technological University, New Delhi, India

    Abstract

    In this paper we extend our work on the dynamic growth of hidden-layer neurons using the weighted sum of non-extensive entropies [25] by investigating the significance of the non-extensivity of the entropy measure used. The dynamic neural network in the original work grows the number of hidden-layer neurons based on an increase in the entropy of the weights during training. We compare the performance of the non-extensive entropy with Gaussian information gain proposed by Susan and Hanmandlu, used in the original paper, with that of the non-extensive entropies of Pal and Pal and of Tsallis, and also with the extensive Shannon entropy. Experiments on benchmark datasets from the UCI repository further validate the correctness of our approach and establish the non-linearity of the non-extensive Susan and Hanmandlu entropy as best suited for this purpose.
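    To make the growth rule concrete, the sketch below illustrates the entropy-driven trigger described above: hidden-layer weight magnitudes are mapped to a probability distribution, an entropy of the weights is computed each epoch, and a neuron is added when the entropy rises. The weight-to-probability normalization, the threshold tau, and the use of Tsallis entropy as the running example are illustrative assumptions; the Susan and Hanmandlu non-extensive entropy with Gaussian information gain used in [25] is not reproduced here.

```python
# A minimal sketch of the entropy-driven growth trigger, under the
# assumptions stated above (normalization scheme, threshold `tau`, and
# Tsallis entropy as the running example are not from the original paper).
import numpy as np

def to_distribution(weights):
    """Normalize hidden-layer weight magnitudes into a probability distribution."""
    mags = np.abs(np.asarray(weights)).ravel()
    return mags / mags.sum()

def shannon_entropy(p):
    """Extensive Shannon entropy: H = -sum(p_i log p_i)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def tsallis_entropy(p, q=2.0):
    """Non-extensive Tsallis entropy: S_q = (1 - sum(p_i^q)) / (q - 1)."""
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def pal_pal_entropy(p):
    """Pal and Pal exponential (non-extensive) entropy: H = sum(p_i e^(1 - p_i))."""
    return float(np.sum(p * np.exp(1.0 - p)))

def should_grow(prev_entropy, curr_entropy, tau=1e-3):
    """Fire the growth rule when the entropy of the weights rises by more than tau."""
    return (curr_entropy - prev_entropy) > tau

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_before = rng.normal(size=(10, 5))  # hidden-layer weights at epoch t
    w_after = rng.normal(size=(10, 5))   # hidden-layer weights at epoch t+1
    h0 = tsallis_entropy(to_distribution(w_before))
    h1 = tsallis_entropy(to_distribution(w_after))
    if should_grow(h0, h1):
        print("entropy increased: add a hidden-layer neuron")
```

    Any of the entropy functions above can be swapped into the trigger, which is the substance of the comparison reported in the abstract: the same growth rule is driven by extensive (Shannon) and non-extensive (Tsallis, Pal and Pal, Susan and Hanmandlu) measures.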

