TY - GEN
T1 - Generalization capability of artificial neural network incorporated with pruning method
AU - Urolagin, Siddhaling
AU - Prema, K. V.
AU - Reddy, N. V. Subba
PY - 2012/4/16
Y1 - 2012/4/16
N2 - In any real-world application, the performance of an Artificial Neural Network (ANN) depends mostly upon its generalization capability. Generalization of an ANN is its ability to handle unseen data. The generalization capability of the network is largely determined by the system complexity and the training of the network. Poor generalization is observed when the network is over-trained or when the system complexity (or degrees of freedom) is large relative to the training data. A smaller network that can fit the data will have good generalization ability. Network parameter pruning is one of the promising methods to reduce the degrees of freedom of a network and hence improve its generalization. In recent years, various pruning methods have been developed and found effective in real-world applications. It is therefore important to estimate the improvement in generalization and the rate of improvement as pruning is incorporated into the network. In this research, a method is developed to evaluate generalization capability and the rate of convergence towards generalization. Using the proposed method, experiments have been conducted to evaluate a Multi-Layer Perceptron neural network with pruning incorporated, applied to handwritten numeral recognition.
AB - In any real-world application, the performance of an Artificial Neural Network (ANN) depends mostly upon its generalization capability. Generalization of an ANN is its ability to handle unseen data. The generalization capability of the network is largely determined by the system complexity and the training of the network. Poor generalization is observed when the network is over-trained or when the system complexity (or degrees of freedom) is large relative to the training data. A smaller network that can fit the data will have good generalization ability. Network parameter pruning is one of the promising methods to reduce the degrees of freedom of a network and hence improve its generalization. In recent years, various pruning methods have been developed and found effective in real-world applications. It is therefore important to estimate the improvement in generalization and the rate of improvement as pruning is incorporated into the network. In this research, a method is developed to evaluate generalization capability and the rate of convergence towards generalization. Using the proposed method, experiments have been conducted to evaluate a Multi-Layer Perceptron neural network with pruning incorporated, applied to handwritten numeral recognition.
UR - http://www.scopus.com/inward/record.url?scp=84859625584&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84859625584&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-29280-4_19
DO - 10.1007/978-3-642-29280-4_19
M3 - Conference contribution
AN - SCOPUS:84859625584
SN - 9783642292798
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 171
EP - 178
BT - Advanced Computing, Networking and Security - International Conference, ADCONS 2011, Revised Selected Papers
T2 - International Conference on Advanced Computing, Networking and Security, ADCONS 2011
Y2 - 16 December 2011 through 18 December 2011
ER -