New approach to neural networks using partially ordered sets: new article

Tuesday 18 March 2008

Dr Mike Shields and Dr Matthew Casey have just had an article on the theory of generic combinations of neural networks published in the highly regarded journal Neurocomputing. The article brings together concepts from theoretical computing (partially ordered sets and state transitions) with neural networks to develop a framework in which the properties of multiple network systems can be formalised.

Abstract

"Multiple neural network systems have become popular techniques for tackling complex tasks, often giving improved
performance compared to single network systems. For example, modular systems can provide improvements in generalisation through task decomposition, whereas multiple classifier and regressor systems typically improve generalisation through the ensemble combination of redundant networks. Whilst there has been significant focus on understanding the theoretical properties of some of these multi-net systems, particularly ensemble systems, there has been little theoretical work on understanding the properties of the generic combination of networks, important in developing more complex systems, perhaps even those a step closer to their biological counterparts. In this paper we provide a formal framework in which the generic combination of neural networks can be described, and in which the properties of the system can be rigorously analyzed. We achieve this by describing multi-net systems in terms of partially ordered sets and state transition systems. By way of example, we explore an abstract version of learning applied to a generic multi-net system that can combine an arbitrary number of networks in sequence and in parallel. By using the framework we show with a constructive proof that, under specific conditions, if it is possible to train the generic system, then training can be achieved by the abstract technique described."
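To give a flavour of the idea, the toy Python sketch below is an illustrative interpretation only, not the authors' formalism: the names (Net, MultiNet, stages) and the averaging rule are assumptions made for the example. It models a multi-net system as component networks combined in sequence and in parallel, with sequential composition inducing a partial order on the components (parallel networks are incomparable).

```python
# Illustrative toy sketch, NOT the paper's formal framework: assumed names
# and structure, showing how sequence/parallel combination of networks can
# induce a partial order on the component nets.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Net:
    """A single component network, reduced here to a name and a function."""
    name: str
    fn: Callable[[float], float]

@dataclass
class MultiNet:
    """A generic combination of networks.

    `stages` is a list of parallel groups executed in sequence: every net in
    stage i precedes every net in stage i+1, giving a partial order in which
    nets within the same stage are incomparable (they run in parallel).
    """
    stages: List[List[Net]] = field(default_factory=list)

    def precedes(self, a: Net, b: Net) -> bool:
        """Partial-order relation: a precedes b iff a's stage comes earlier."""
        idx = {n.name: i for i, group in enumerate(self.stages) for n in group}
        return idx[a.name] < idx[b.name]

    def run(self, x: float) -> float:
        """Run stages in sequence; parallel outputs are averaged (ensemble-style)."""
        for group in self.stages:
            x = sum(net.fn(x) for net in group) / len(group)
        return x

# Usage: two parallel nets feeding one downstream net.
a = Net("a", lambda x: x + 1.0)
b = Net("b", lambda x: 2.0 * x)
c = Net("c", lambda x: x ** 2)
system = MultiNet(stages=[[a, b], [c]])
print(system.run(3.0))        # ((4 + 6) / 2) ** 2 = 25.0
print(system.precedes(a, c))  # True: a precedes c in the partial order
print(system.precedes(a, b))  # False: a and b are parallel, hence incomparable
```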

A pre-print of the article can be found here.