Editorial

Special issue on “Neural Network Simulation”

Pages 129-130 | Published online: 09 Nov 2012

Simulation of neural models is increasingly used in neuroscience, at various levels of complexity. As a result, over the last 30 years, simulation tools have grown from sets of custom, model-specific scripts into a number of full-fledged neural simulators. This transformation of computational neuroscience has come with a number of scientific challenges: How to simulate large-scale models? How to cope with the complexity and variety of biological models? How to share models between simulators? How to take advantage of new computing architectures?

In this special issue, leading experts in the field address these challenges. A major question, now that there are several simulators, each with its own user community, is how to communicate models between these different communities. This will become an even more critical issue as models grow more complex and more heavily driven by experimental data. Crook et al. describe a set of good practices for sharing and describing models. Brette suggests that simulator languages should be designed to expose models in their mathematical form, an already existing universal language. Davison proposes that, as models become more and more complex, computational neuroscience should develop into an open, collaborative effort.

Another critical question in this field is how to make simulations faster. A number of teams are currently investigating the potential of graphics processing units (GPUs), inexpensive accelerator hardware present in most modern computers. Modern GPUs contain hundreds of cores that can be used for the parallel simulation of neural networks, an effort reviewed by Brette and Goodman. Slazynski and Bohte present specific algorithms to simulate the popular Spiking Response Model and similar models on GPUs. Dinkelbach et al. provide a performance comparison between different GPU implementations and a conventional architecture for rate-based neural networks.

Finally, two new neural simulators are described in this issue. DANA is a simulator for abstract neural models described in discrete time. Nexa is a simulator for large-scale models of mostly rate-based neural networks. Chessa et al. present strategies to accelerate the simulation of rate-based models of vision.

Together, the papers in this issue attest to the increasing availability of simulation technology for computational neuroscientists. Arguably, the race is now open to apply these tools to the most relevant neuroscience questions. It will be interesting to see how these new simulation technologies contribute to deepening our understanding of brain function and, conversely, how these real-life benchmarks feed back into the further evolution of the simulation frameworks.
