Please use this identifier to cite or link to this item: https://idr.l2.nitk.ac.in/jspui/handle/123456789/7740
Title: Discovery of optimal neurons and hidden layers in feed-forward Neural Network
Authors: Thomas, L.
Manoj Kumar, M.V.
Annappa, B.
Issue Date: 2016
Citation: 2016 IEEE International Conference on Emerging Technologies and Innovative Business Practices for the Transformation of Societies (EmergiTech 2016), 2016, pp. 286-291
Abstract: Identifying the number of neurons in each hidden layer, and the number of hidden layers themselves, in a multi-layered Artificial Neural Network (ANN) is a challenge that depends on the input data. A new hypothesis is proposed for organizing the synapses from neuron x to neuron y, identifying how many neurons should fire between the hidden layers. With this hypothesis, an effective number of neurons in a multi-layered ANN can be identified, and a self-organizing neural network model, referred to as a cognitron, is developed. The conventional brain-inspired model is a three-layered perceptron; the proposed model instead organizes an optimal number of layers to identify an effective model. The results show that the proposed model constructs a neural model directly by identifying the optimal weights of each neuron and the number of neurons in each dynamically identified hidden layer. The optimized model self-organizes with a different range of neurons in each hidden layer by comparing performance in terms of computational time and error at each iteration, and an efficient number of neurons is organized using gradient descent. The proposed model can thus train large models for classification tasks by inserting the optimal layers and neurons. © 2016 IEEE.
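
Illustrative sketch: the abstract describes the approach only at a high level, so the following Python snippet is a rough, hypothetical illustration of the underlying idea of comparing candidate hidden-layer configurations by validation error and training time under gradient-descent training. It uses scikit-learn's MLPClassifier on synthetic data; the candidate grid, dataset, and selection rule are assumptions made for illustration, not the authors' actual algorithm.

# Hypothetical sketch (not the paper's implementation): search over candidate
# hidden-layer configurations of a feed-forward network, train each with
# gradient-based optimization, and keep the architecture with the lowest
# validation error, breaking ties by shorter training time.
import time

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic classification data stands in for the paper's unspecified inputs.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25,
                                                  random_state=0)

# Assumed candidate grid: one to three hidden layers with varying neuron counts.
candidates = [(16,), (64,), (32, 16), (64, 32), (64, 32, 16)]

best = None
for layers in candidates:
    model = MLPClassifier(hidden_layer_sizes=layers, solver="sgd",
                          learning_rate_init=0.01, max_iter=300,
                          random_state=0)
    start = time.perf_counter()
    model.fit(X_train, y_train)              # gradient-descent training
    elapsed = time.perf_counter() - start
    error = 1.0 - model.score(X_val, y_val)  # validation error
    print(f"layers={layers} error={error:.3f} time={elapsed:.2f}s")
    # Select on error first, then on training time.
    if best is None or (error, elapsed) < (best[0], best[1]):
        best = (error, elapsed, layers)

print("selected hidden layers:", best[2])

In this sketch the architecture search is an exhaustive comparison over a fixed grid; the paper's model is described as inserting layers and neurons dynamically, which this simplified example does not attempt to reproduce.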
URI: http://idr.nitk.ac.in/jspui/handle/123456789/7740
Appears in Collections: 2. Conference Papers

Files in This Item:
There are no files associated with this item.
