Document Type

Technical Report

Publication Date

1989-06-01

Filename

WUCS-89-26.pdf

DOI

10.7936/K7W37TQB

Technical Report Number

WUCS-89-26

Abstract

Backpropagation as a learning rule is often confined to multilayer Perceptrons or layered feedforward networks, in which there are no lateral connections among units of the same layer and no connections bypassing intermediate layers. We prove algebraically that these restrictions are not necessary, i.e., backpropagation is applicable to any acyclic neural network. Our proof is based on a new formulation of backpropagation called an Acyclic Neural Network (ANN). In an ANN, a net is defined as a partially ordered set of processing units, where every unit may receive an input value and/or a correction (teaching) value. Therefore, there is no need to distinguish between input, hidden, and output units.
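
The following is a minimal sketch, not taken from the report, of how such a formulation can be realized: units of an arbitrary DAG are processed in a topological order of the partial order for the forward pass and in reverse order for the backward pass, so no input/hidden/output distinction is needed. Sigmoid units, a sum-of-squares error, and the per-edge update rule are illustrative assumptions; all names are hypothetical.

    import math

    # Backpropagation over an arbitrary acyclic network (a DAG of
    # sigmoid units), assuming sum-of-squares error. Any unit may
    # receive an external input and/or a teaching (target) value.

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def backprop_step(units, preds, w, x_ext, targets, lr=0.1):
        """One forward/backward pass over the DAG.

        units   -- unit ids in a topological order of the partial order
        preds   -- preds[j]: list of predecessor unit ids of j
        w       -- weights keyed by directed edges (i, j)
        x_ext   -- optional external input per unit (default 0.0)
        targets -- optional teaching value per unit (absent = none)
        """
        # Forward pass: activation = sigmoid(external input plus the
        # weighted sum of predecessor activations).
        a = {}
        for j in units:
            net = x_ext.get(j, 0.0) + sum(w[(i, j)] * a[i] for i in preds[j])
            a[j] = sigmoid(net)

        # Backward pass in reverse topological order: each delta combines
        # the unit's own error (if a teaching value is given) with error
        # propagated back from its successors.
        succs = {j: [] for j in units}
        for (i, j) in w:
            succs[i].append(j)
        delta = {}
        for j in reversed(units):
            err = (targets[j] - a[j]) if targets.get(j) is not None else 0.0
            err += sum(w[(j, k)] * delta[k] for k in succs[j])
            delta[j] = err * a[j] * (1.0 - a[j])  # sigmoid derivative

        # Weight update along every edge of the DAG.
        for (i, j) in list(w):
            w[(i, j)] += lr * delta[j] * a[i]
        return a, w

    # Example: a three-unit DAG whose edge (0, 2) bypasses unit 1, a
    # connection disallowed in strictly layered networks.
    units = [0, 1, 2]
    preds = {0: [], 1: [0], 2: [0, 1]}
    w = {(0, 1): 0.5, (1, 2): -0.3, (0, 2): 0.8}
    a, w = backprop_step(units, preds, w, x_ext={0: 1.0}, targets={2: 1.0})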

Comments

Permanent URL: http://dx.doi.org/10.7936/K7W37TQB
