2.8.10. Topology Optimization by Growing Neural Network
Algorithms

An interesting
algorithm for building non-uniform, optimized neural network topologies was initially
proposed by Vinod et al. [125]. The algorithm starts
with a feedforward backpropagation neural network that has no hidden neurons
and no links (an "empty network"), and grows the network by
adding one neuron at a time. Each new neuron is connected to one output neuron and
to two other neurons, and these links are selected on the basis of the maximum
estimated error decrease on the calibration data. Neuron insertion
stops once a prescribed error has been reached. It was demonstrated that
each growing step improves the calibration error, and that
the algorithm is able to approximate complex continuous functions (such as a sine
wave) using very small networks.
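The greedy growth loop described above can be sketched as follows. This is a minimal illustration, not the method of [125]: the candidate-pool size, the tanh activation, the random weight scale, and the stopping threshold are assumed values, and a least-squares fit of the output weight on the current residual stands in for the original algorithm's error-decrease estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Calibration data: approximate a sine wave (illustrative task).
X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()

# Units whose outputs can feed a new neuron: start with a bias and the input.
units = [np.ones(len(X)), X.ravel()]

pred = np.zeros(len(X))   # output of the current "empty network"
target_err = 1e-3         # prescribed stopping error (assumed value)

for step in range(30):
    residual = y - pred
    if np.mean(residual ** 2) < target_err:
        break
    best = None
    # Try every pair of existing units as the two incoming links.
    for i in range(len(units)):
        for j in range(i + 1, len(units)):
            # Small pool of random candidate neurons per pair; the output
            # weight is fitted by least squares on the residual, so the
            # calibration error cannot increase when the neuron is added.
            for _ in range(20):
                a, b, c = rng.normal(size=3) * 2.0
                h = np.tanh(a * units[i] + b * units[j] + c)
                w = h @ residual / (h @ h + 1e-12)
                new_err = np.mean((residual - w * h) ** 2)
                if best is None or new_err < best[0]:
                    best = (new_err, h, w)
    # Insert the winning neuron (maximum estimated error decrease).
    _, h, w = best
    units.append(h)
    pred = pred + w * h

final_mse = np.mean((y - pred) ** 2)
print(f"grew {len(units) - 2} neurons, final MSE = {final_mse:.5f}")
```

Because each neuron's output weight is fitted to the current residual, every growing step lowers (or at worst preserves) the calibration error, mirroring the monotone improvement property noted above.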