Original Articles

Coexistence of multiple continuous attractors for lower-ordered neural networks

Pages 2462-2473 | Received 17 Sep 2019, Accepted 02 Dec 2019, Published online: 30 Dec 2019

