Evolving neural networks using matrix grammars

Roberts, Ian and Hunter, Andrew (1999) Evolving neural networks using matrix grammars. Project Report. University of Sunderland, Sunderland. (Unpublished)

Documents
Evolving_Neural_Networks_Using_Matrix_Grammars_-_AH.pdf (PDF, 3MB) - Whole Document
Restricted to Repository staff only

Abstract

Methods of evolving Neural Networks using Matrix Grammars are described. Because these methods generate network architectures structurally, reusing symbols to describe sub-sections of the architecture, they tend to produce well-structured networks and are suitable for similarly well-structured problems. Methods which generate the architecture only, and methods which also generate weights, are described. Evolution is combined with backpropagation training. The techniques are compared with previously published work and show several distinct advantages. The main advantage of all the methods is the ability to overcome the Genetic Algorithm scaling problem. The inclusion of weights gives better convergence. The Matrix Grammars presented here separate the evolution of weights and architecture more fully than previous methods do, widening the search space. The suitability of the techniques for more substantial problems is discussed. We also show how large improvements can be achieved by progressive evolution: the pretraining of the population on related, simpler problems.
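
The report itself is not reproduced in this record, but the reuse idea the abstract describes can be illustrated. Below is a minimal Python sketch, not the authors' grammar: the symbols, rule groupings, and layer-width decoding are hypothetical, and for simplicity each matrix rewrites every occurrence of its left-hand symbol in parallel (a strict matrix grammar applies each production in a matrix once). It shows how grouping productions into matrices and reusing a non-terminal yields a repeated sub-section of a network architecture.

# Illustrative sketch only; not the grammar from the report.
# A matrix grammar groups productions into "matrices" (fixed sequences)
# that are applied together. Here each matrix is a list of (lhs, rhs)
# productions, applied left to right.

MATRICES = [
    # M1: expand the start symbol into an input block, a reusable
    # hidden block H (used twice), and an output block.
    [("S", ["In", "H", "H", "Out"])],
    # M2: every H expands to the same two-layer sub-architecture,
    # so the genotype describes it once and reuses it.
    [("H", ["h8", "h4"])],
]

# Hypothetical decoding: terminal symbol -> layer width.
TERMINALS = {"In": 16, "h8": 8, "h4": 4, "Out": 2}


def derive(start="S"):
    """Apply each matrix in order, rewriting all occurrences of each lhs."""
    string = [start]
    for matrix in MATRICES:
        for lhs, rhs in matrix:
            out = []
            for sym in string:
                out.extend(rhs if sym == lhs else [sym])
            string = out
    return string


def decode(symbols):
    """Map the terminal string to a feed-forward layer-size list."""
    return [TERMINALS[s] for s in symbols]


if __name__ == "__main__":
    sentence = derive()
    print(sentence)          # ['In', 'h8', 'h4', 'h8', 'h4', 'Out']
    print(decode(sentence))  # [16, 8, 4, 8, 4, 2]

Because H appears twice but is defined once, a change to the H rule alters both copies of the sub-architecture at once; this is the structural reuse the abstract credits with producing well-structured networks.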

Item Type: Paper or Report (Project Report)
Keywords: Neural networks, Matrix grammars, Network architectures
Subjects: G Mathematical and Computer Sciences > G400 Computer Science
Divisions: College of Science > School of Computer Science
ID Code: 3387
Deposited By: Tammie Farley
Deposited On: 26 Sep 2010 16:44
Last Modified: 13 Mar 2013 08:47
