Two concurrent implementations of the method of conjugate gradients for training Elman networks are discussed. The parallelism is obtained in the computation of the error gradient, so the method applies to any gradient-descent training technique for this class of network. Experimental results were obtained on a Sun SPARCcenter 2000 multiprocessor, a shared-memory machine well suited to coarse-grained distributed computation, although the concurrency could be extended to other architectures as well.
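The key observation above is that the error gradient of a recurrent network over a training set is a sum of per-sequence gradients, so the sequences can be partitioned across processors and the partial gradients summed. The sketch below illustrates this for a simple Elman-style network (hidden state h_t = tanh(Wx·x_t + Wh·h_{t-1}), linear output, squared error). It is only a minimal illustration of the idea, not the report's implementation: the function names are invented here, and Python threads stand in for the SPARCcenter's processors.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def seq_loss_and_grads(Wx, Wh, Wo, xs, ys):
    """Forward pass and backpropagation through time for ONE sequence.
    Returns (loss, dWx, dWh, dWo) for that sequence alone."""
    T = len(xs)
    hs = [np.zeros(Wh.shape[0])]          # hs[0] is the zero initial state
    outs = []
    for t in range(T):                    # forward: Elman recurrence
        h = np.tanh(Wx @ xs[t] + Wh @ hs[-1])
        hs.append(h)
        outs.append(Wo @ h)
    loss = 0.5 * sum(np.sum((o - y) ** 2) for o, y in zip(outs, ys))

    gWx = np.zeros_like(Wx); gWh = np.zeros_like(Wh); gWo = np.zeros_like(Wo)
    dh_next = np.zeros(Wh.shape[0])
    for t in reversed(range(T)):          # backward: BPTT
        dout = outs[t] - ys[t]
        gWo += np.outer(dout, hs[t + 1])
        dh = Wo.T @ dout + dh_next        # gradient reaching h_t
        dpre = dh * (1.0 - hs[t + 1] ** 2)  # through tanh
        gWx += np.outer(dpre, xs[t])
        gWh += np.outer(dpre, hs[t])
        dh_next = Wh.T @ dpre             # carried back to h_{t-1}
    return loss, gWx, gWh, gWo

def batch_grads(params, sequences, n_workers=2):
    """Partition the training sequences among n_workers 'processors',
    compute partial gradients concurrently, and sum them. The summed
    gradient is identical to the serial one, so any gradient-based
    optimizer (conjugate gradients included) can consume it unchanged."""
    chunks = [sequences[i::n_workers] for i in range(n_workers)]

    def chunk_grads(chunk):
        total = None
        for xs, ys in chunk:
            res = seq_loss_and_grads(*params, xs, ys)
            total = res if total is None else tuple(a + b for a, b in zip(total, res))
        return total

    with ThreadPoolExecutor(n_workers) as ex:
        partials = [p for p in ex.map(chunk_grads, chunks) if p is not None]
    return tuple(sum(parts) for parts in zip(*partials))
```

Because only the gradient accumulation is distributed, the optimizer itself (here, whatever line search the conjugate-gradient method runs) remains a single serial driver, which matches the coarse-grained, shared-memory setting described above.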
McCann, Peter J. and Kalman, Barry L., "Strategies for the Parallel Training of Simple Recurrent Neural Networks," Report Number: WUCS-94-15 (1994). All Computer Science and Engineering Research.
Permanent URL: http://dx.doi.org/10.7936/K7QZ2865