
Target Propagation in Recurrent Neural Networks

Research output: Contribution to journal › Article

Original language: English
Pages (from-to): 1
Number of pages: 33
Journal: JOURNAL OF MACHINE LEARNING RESEARCH
Volume: 21
Issue number: 7
Publication status: Published - Feb 2020


Abstract

Recurrent Neural Networks have been widely used to process sequence data, but have long been criticized for their biological implausibility and training difficulties related to vanishing and exploding gradients. This paper presents a novel algorithm for training recurrent networks, target propagation through time (TPTT), that outperforms standard backpropagation through time (BPTT) on four out of the five problems used for testing. The proposed algorithm is initially tested and compared to BPTT on four synthetic time lag tasks, and its performance is also measured using the sequential MNIST data set. In addition, as TPTT uses target propagation, it allows for discrete nonlinearities and could potentially mitigate the credit assignment problem in more complex recurrent architectures.
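The abstract describes TPTT only at a high level, so the following is a minimal, illustrative sketch (in NumPy) of the general idea behind target propagation in a recurrent network: targets for the hidden states are propagated backwards through a learned approximate inverse of the state transition, and each time step is then updated locally against its own target. This is a hedged sketch, not the paper's TPTT algorithm; the specific update rules, the handling of the inverse model, and all names used here (forward_step, inverse_step, the learning rates) are assumptions made for illustration.

# Illustrative sketch of difference-target-propagation-style training for a
# vanilla RNN. Local targets replace gradients carried through every time
# step. Details differ from the paper's TPTT; names are illustrative only.

import numpy as np

rng = np.random.default_rng(0)

# Dimensions (arbitrary choices for the sketch)
n_in, n_hid, n_out, T = 4, 16, 2, 10

# Forward transition parameters: h_t = tanh(W h_{t-1} + U x_t + b)
W = rng.normal(0, 0.1, (n_hid, n_hid))
U = rng.normal(0, 0.1, (n_hid, n_in))
b = np.zeros(n_hid)

# Output layer: y = softmax(V h_T + c)
V = rng.normal(0, 0.1, (n_out, n_hid))
c = np.zeros(n_out)

# Learned approximate inverse of the transition: h_{t-1} ≈ tanh(R h_t + S x_t + d)
R = rng.normal(0, 0.1, (n_hid, n_hid))
S = rng.normal(0, 0.1, (n_hid, n_in))
d = np.zeros(n_hid)

def forward_step(h_prev, x):
    return np.tanh(W @ h_prev + U @ x + b)

def inverse_step(h, x):
    return np.tanh(R @ h + S @ x + d)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

lr_f, lr_out = 0.01, 0.01

# One illustrative training step on a random sequence with a random label.
x_seq = rng.normal(size=(T, n_in))
label = rng.integers(n_out)

# Forward pass: record the hidden trajectory.
h = [np.zeros(n_hid)]
for t in range(T):
    h.append(forward_step(h[t], x_seq[t]))

# Output loss gives a first target for the last hidden state via a small
# gradient step (the output layer itself is trained with ordinary gradients).
p = softmax(V @ h[T] + c)
grad_logits = p.copy()
grad_logits[label] -= 1.0
h_hat = [None] * (T + 1)
h_hat[T] = h[T] - lr_out * (V.T @ grad_logits)

# Backward pass: propagate targets with the difference correction
#   h_hat_{t-1} = h_{t-1} + g(h_hat_t, x_{t-1}) - g(h_t, x_{t-1})
for t in range(T, 0, -1):
    h_hat[t - 1] = (h[t - 1]
                    + inverse_step(h_hat[t], x_seq[t - 1])
                    - inverse_step(h[t], x_seq[t - 1]))

# Local updates: move each forward step toward its target. No gradient flows
# across time steps; each step only sees its own local squared error.
for t in range(1, T + 1):
    a = W @ h[t - 1] + U @ x_seq[t - 1] + b
    err = (h[t] - h_hat[t]) * (1 - np.tanh(a) ** 2)  # d/da of 0.5*||tanh(a) - target||^2
    W -= lr_f * np.outer(err, h[t - 1])
    U -= lr_f * np.outer(err, x_seq[t - 1])
    b -= lr_f * err

# Output layer update (plain gradient step).
V -= lr_out * np.outer(grad_logits, h[T])
c -= lr_out * grad_logits

# The inverse model (R, S, d) would also be trained, e.g. to reconstruct
# h_{t-1} from h_t on noise-corrupted states; that step is omitted here.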

