Joint Optimization of Structure and Parameters of Neural Networks Using Hybrid Gravitational Search Algorithms in Classification and Function Approximation Tasks

Authors

Abstract

Determining the
optimum number of nodes, the number of hidden layers, and the synaptic connection
weights of an artificial neural network (ANN) plays an important role in the
performance of this soft computing model. Several methods have been proposed
for weight updating (training) and structure selection in ANNs. For example,
error back-propagation (EBP) is a traditional method for updating the weights of
multi-layer networks. In this study, the gravitational search algorithm (GSA), a
modern swarm intelligence optimization method, is used for this purpose.
Specifically, GSA and its binary version (BGSA) are used concurrently, as a hybrid
method, to optimize the weights and the number of hidden-layer nodes of an ANN,
respectively. The performance of the proposed method is compared with that of other
intelligent and traditional methods, such as particle swarm optimization (PSO),
a PSO-BPSO hybrid, and EBP, on classification and function approximation tasks.
For classification, performance is evaluated on the Iris, Breast Cancer, and Glass
datasets. For the function approximation task, performance is evaluated
on a prosody predictor used for natural speech synthesis. To reduce
the number of inputs to the prosody predictor, a hybrid of a genetic algorithm and
an ant colony optimization algorithm is used for feature selection. Simulation results show that the proposed
method can offer competitive classification and prediction accuracies while
using fewer hidden neurons (a 25-68 percent reduction) than the
other investigated algorithms.

Keywords