Original Articles

Emergent latent symbol systems in recurrent neural networks

Derek Monner & James A. Reggia
Pages 193-225 | Received 15 Oct 2012, Accepted 18 Apr 2013, Published online: 13 May 2013

