Computational neuroscience
Computational neuroscience (also known as theoretical neuroscience or mathematical neuroscience) is a branch of neuroscience that employs mathematical models, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.[1][2][3][4]

Computational neuroscience focuses on the description of biologically plausible neurons (and neural systems), together with their physiology and dynamics. This sets it apart from disciplines that do not aim at biological realism, such as connectionism, machine learning, artificial neural networks, artificial intelligence and computational learning theory.[5][6][7][8][9][10][11][12]

Strictly speaking, computational neuroscience is a sub-field of theoretical neuroscience that employs computational simulations to validate and solve the mathematical models. However, since the biologically plausible mathematical models formulated in neuroscience are in most cases too complex to be solved analytically, the two terms are essentially synonymous and are used interchangeably.[1] The term mathematical neuroscience is also used sometimes, to stress the quantitative nature of the field.[13]

The mathematical models formulated in computational neuroscience are useful since they capture the essential features of the biological system at multiple spatial-temporal scales, from membrane currents, proteins, and chemical coupling to network oscillations, columnar and topographic architecture, and learning and memory. Furthermore, these computational models frame hypotheses that can be directly tested by biological or psychological experiments.

History

The term 'computational neuroscience' was introduced by Eric L. Schwartz, who organized a conference, held in 1985 in Carmel, California, at the request of the Systems Development Foundation to provide a summary of the current status of a field which until that point was referred to by a variety of names, such as neural modeling, brain theory and neural networks. The proceedings of this definitional meeting were published in 1990 as the book Computational Neuroscience.[14] The first open international meeting focused on Computational Neuroscience was organized by James M. Bower and John Miller in San Francisco, California in 1989 and has continued each year since as the annual CNS meeting.[15] The first graduate educational program in computational neuroscience was organized as the Computational and Neural Systems Ph.D. program at the California Institute of Technology in 1985.

The early historical roots of the field can be traced to the work of people such as Louis Lapicque, Hodgkin & Huxley, Hubel & Wiesel, and David Marr. Lapicque introduced the integrate-and-fire model of the neuron in a seminal article published in 1907.[16] Because of its simplicity, this model is still popular in artificial neural network studies today (see a recent review[17]).
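The integrate-and-fire neuron sums its input current and emits a spike whenever the membrane potential crosses a threshold, after which the potential is reset. A minimal leaky integrate-and-fire sketch in Python (the parameter values and units here are illustrative, not taken from Lapicque's paper):

```python
import numpy as np

def simulate_lif(i_ext, dt=0.1, tau_m=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-65.0, r_m=10.0):
    """Leaky integrate-and-fire: tau_m * dV/dt = -(V - V_rest) + R_m * I."""
    v = v_rest
    vs, spike_times = [], []
    for step, i in enumerate(i_ext):
        v += dt * (-(v - v_rest) + r_m * i) / tau_m
        if v >= v_thresh:              # threshold crossing: emit a spike ...
            spike_times.append(step * dt)
            v = v_reset                # ... and reset the membrane potential
        vs.append(v)
    return np.array(vs), spike_times

# A constant suprathreshold drive for 100 ms produces regular spiking.
trace, spike_times = simulate_lif(np.full(1000, 2.0))
```

Because the model replaces the spike-generation mechanism with a simple threshold-and-reset rule, a simulation like this runs orders of magnitude faster than a conductance-based model.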

About 40 years later, Hodgkin & Huxley developed the voltage clamp and created the first biophysical model of the action potential. Hubel & Wiesel discovered that neurons in the primary visual cortex, the first cortical area to process information coming from the retina, have oriented receptive fields and are organized in columns.[18] David Marr's work focused on the interactions between neurons, suggesting computational approaches to the study of how functional groups of neurons within the hippocampus and neocortex interact, store, process, and transmit information. Computational modeling of biophysically realistic neurons and dendrites began with the work of Wilfrid Rall, with the first multicompartmental model using cable theory.

Major topics

Research in computational neuroscience can be roughly categorized into several lines of inquiry. Most computational neuroscientists collaborate closely with experimentalists in analyzing novel data and synthesizing new models of biological phenomena.

Single-neuron modeling

{{main|Biological neuron models}}

Even single neurons have complex biophysical characteristics and can perform computations (e.g.[19]). Hodgkin and Huxley's original model employed only two voltage-sensitive currents (voltage-sensitive ion channels are glycoprotein molecules which extend through the lipid bilayer, allowing ions to traverse the axolemma under certain conditions): the fast-acting sodium current and the delayed-rectifier potassium current. Though successful in predicting the timing and qualitative features of the action potential, the model nevertheless failed to predict a number of important features such as adaptation and shunting. Scientists now believe that there is a wide variety of voltage-sensitive currents, and the implications of the differing dynamics, modulations, and sensitivities of these currents are an important topic of computational neuroscience.[20]
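The Hodgkin–Huxley model couples the membrane potential to these two currents plus a leak, with each channel gated by variables (m, h, n) that relax toward voltage-dependent equilibria. As an illustration, a forward-Euler sketch in Python using the standard modern parameterization (resting potential near −65 mV, conductances in mS/cm²); the drive value and step size are illustrative choices, not prescribed by the original paper:

```python
import math

def hh_step(v, m, h, n, i_ext, dt):
    """One forward-Euler step of the Hodgkin-Huxley equations."""
    # gating-variable rate functions (modern voltage convention, in mV)
    a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
    a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
    # membrane currents: fast sodium, delayed-rectifier potassium, leak
    i_na = 120.0 * m ** 3 * h * (v - 50.0)
    i_k = 36.0 * n ** 4 * (v + 77.0)
    i_leak = 0.3 * (v + 54.387)
    v_new = v + dt * (i_ext - i_na - i_k - i_leak)   # C_m = 1 uF/cm^2
    m += dt * (a_m * (1.0 - m) - b_m * m)
    h += dt * (a_h * (1.0 - h) - b_h * h)
    n += dt * (a_n * (1.0 - n) - b_n * n)
    return v_new, m, h, n

def simulate_hh(i_ext=10.0, t_max=50.0, dt=0.01):
    """Simulate t_max ms of a space-clamped membrane under constant drive."""
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # near-resting initial conditions
    voltages = []
    for _ in range(int(t_max / dt)):
        v, m, h, n = hh_step(v, m, h, n, i_ext, dt)
        voltages.append(v)
    return voltages

voltages = simulate_hh()   # suprathreshold drive yields repetitive spiking
```

Even this four-variable model is stiff enough that production simulators use adaptive or implicit integrators rather than the fixed-step Euler scheme sketched here.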

The computational functions of complex dendrites are also under intense investigation. There is a large body of literature regarding how different currents interact with geometric properties of neurons.[21]

Some models also track biochemical pathways at very small scales, such as dendritic spines or synaptic clefts.

There are many software packages, such as GENESIS and NEURON, that allow rapid and systematic in silico modeling of realistic neurons. Blue Brain, a project founded by Henry Markram from the École Polytechnique Fédérale de Lausanne, aims to construct a biophysically detailed simulation of a cortical column on the Blue Gene supercomputer.

Modeling the richness of biophysical properties on the single-neuron scale can supply mechanisms that serve as the building blocks for network dynamics.[22] However, detailed neuron descriptions are computationally expensive, and this expense can handicap the pursuit of realistic network investigations, where many neurons need to be simulated. As a result, researchers who study large neural circuits typically represent each neuron and synapse with an artificially simple model, ignoring much of the biological detail. Hence there is a drive to produce simplified neuron models that retain significant biological fidelity at a low computational overhead. Algorithms have been developed to produce faithful, faster-running, simplified surrogate neuron models from computationally expensive, detailed neuron models.[23]

Development, axonal patterning, and guidance

Computational neuroscience aims to address a wide array of questions. How do axons and dendrites form during development? How do axons know where to target and how to reach these targets? How do neurons migrate to the proper position in the central and peripheral nervous systems? How do synapses form? We know from molecular biology that distinct parts of the nervous system release distinct chemical cues, from growth factors to hormones, that modulate and influence the growth and development of functional connections between neurons.

Theoretical investigations into the formation and patterning of synaptic connections and morphology are still nascent. One hypothesis that has recently garnered attention is the minimal wiring hypothesis, which postulates that the formation of axons and dendrites effectively minimizes resource allocation while maintaining maximal information storage.[24]

Sensory processing

Early models of sensory processing understood within a theoretical framework are credited to Horace Barlow. Somewhat similar to the minimal wiring hypothesis described in the preceding section, Barlow understood the processing of the early sensory systems to be a form of efficient coding, where the neurons encoded information which minimized the number of spikes. Experimental and computational work have since supported this hypothesis in one form or another.

Current research in sensory processing is divided between biophysical modelling of different subsystems and more theoretical modelling of perception. Current models of perception suggest that the brain performs some form of Bayesian inference, integrating different sources of sensory information to generate our perception of the physical world.[25][26]
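A standard toy version of such Bayesian integration combines two independent Gaussian cues, say a visual and a haptic estimate of an object's size: the optimal estimate is a reliability-weighted average, and its variance is lower than that of either cue alone. A sketch (the specific numbers are illustrative):

```python
def fuse_gaussian_cues(mu_a, var_a, mu_b, var_b):
    """Bayes-optimal fusion of two independent Gaussian estimates."""
    precision_a, precision_b = 1.0 / var_a, 1.0 / var_b
    w_a = precision_a / (precision_a + precision_b)   # reliability weight
    mu = w_a * mu_a + (1.0 - w_a) * mu_b              # weighted average
    var = 1.0 / (precision_a + precision_b)           # fused variance shrinks
    return mu, var

# Equally reliable visual and haptic size estimates of 10 and 14 units:
mu, var = fuse_gaussian_cues(10.0, 4.0, 14.0, 4.0)
# the fused estimate lands halfway (12.0) with half the variance (2.0)
```

Experiments such as Ernst and Bülthoff's visuo-haptic studies[26] found human judgments close to this reliability-weighted prediction, which is the main empirical support for the Bayesian view described above.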

Memory and synaptic plasticity

{{main|Synaptic plasticity}}

Early models of memory are primarily based on the postulates of Hebbian learning. Biologically relevant models such as the Hopfield network have been developed to address the properties of the associative (also known as "content-addressable") style of memory that occurs in biological systems. These attempts focus primarily on the formation of medium- and long-term memory, localized in the hippocampus. Models of working memory, relying on theories of network oscillations and persistent activity, have been built to capture some features of the prefrontal cortex in context-related memory.[27] Additional models look at the close relationship between the basal ganglia and the prefrontal cortex and how that contributes to working memory.[28]
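The Hopfield network makes content-addressable memory concrete: patterns are stored in the weights by a Hebbian outer-product rule, and a corrupted cue is driven back to the nearest stored pattern. A minimal sketch (an illustrative toy, not a biologically calibrated model):

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product learning; self-connections zeroed."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)
    return w / n

def recall(w, state, max_steps=10):
    """Synchronous +/-1 updates until the state stops changing."""
    for _ in range(max_steps):
        new = np.where(w @ state >= 0.0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
w = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]            # corrupt one unit of the cue
restored = recall(w, noisy)     # converges back to the stored pattern
```

Retrieval here is a descent on an energy function, which is why a partial or noisy cue suffices to recover the whole pattern, the defining property of associative memory.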

One of the major problems in neurophysiological memory is how it is maintained and changed across multiple time scales. Unstable synapses are easy to train but also prone to stochastic disruption. Stable synapses forget less easily, but they are also harder to consolidate. One recent computational hypothesis involves cascades of plasticity that allow synapses to function at multiple time scales.[29] Stereochemically detailed models of the acetylcholine receptor-based synapse, using the Monte Carlo method and working at the time scale of microseconds, have been built.[30] It is likely that computational tools will contribute greatly to our understanding of how synapses function and change in relation to external stimuli in the coming decades.

Behaviors of networks

Biological neurons are connected to each other in a complex, recurrent fashion. These connections, unlike those of most artificial neural networks, are sparse and usually specific. It is not known how information is transmitted through such sparsely connected networks, although specific areas of the brain, such as the visual cortex, are understood in some detail.[31] It is also unknown what the computational functions of these specific connectivity patterns are, if any.

The interactions of neurons in a small network can often be reduced to simple models such as the Ising model. The statistical mechanics of such simple systems are well-characterized theoretically. Recent evidence suggests that the dynamics of arbitrary neuronal networks can be reduced to pairwise interactions.[32] It is not known, however, whether such descriptive dynamics impart any important computational function. With the emergence of two-photon microscopy and calcium imaging, we now have powerful experimental methods with which to test new theories regarding neuronal networks.

In some cases the complex interactions between inhibitory and excitatory neurons can be simplified using mean-field theory, which gives rise to the population model of neural networks.[33] While many neurotheorists prefer such models with reduced complexity, others argue that uncovering structural–functional relations depends on including as much neuronal and network structure as possible. Models of this type are typically built in large simulation platforms like GENESIS or NEURON. There have been some attempts to provide unified methods that bridge and integrate these levels of complexity.[34]
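The classic population model of this kind is the Wilson–Cowan system,[33] which tracks the mean firing rates of coupled excitatory and inhibitory pools rather than individual spikes. A forward-Euler sketch in Python (the coupling weights, drive, and time step are illustrative choices, not fitted values):

```python
import math

def sigmoid(x):
    """Saturating population activation function."""
    return 1.0 / (1.0 + math.exp(-x))

def simulate_wilson_cowan(t_max=100.0, dt=0.1, drive=1.25):
    """Mean rates of coupled excitatory (e) and inhibitory (i) populations."""
    e, i = 0.1, 0.05
    w_ee, w_ei, w_ie, w_ii = 16.0, 12.0, 15.0, 3.0   # illustrative couplings
    e_trace = []
    for _ in range(int(t_max / dt)):
        de = -e + sigmoid(w_ee * e - w_ei * i + drive)  # recurrent excitation
        di = -i + sigmoid(w_ie * e - w_ii * i)          # feedback inhibition
        e += dt * de
        i += dt * di
        e_trace.append(e)
    return e_trace

e_trace = simulate_wilson_cowan()   # rates stay bounded in (0, 1)
```

Two coupled rate equations replace thousands of spiking neurons, which is what makes mean-field descriptions attractive for studying phenomena such as network oscillations; with suitable parameters this system exhibits sustained excitatory–inhibitory limit cycles.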

Cognition, discrimination, and learning

Computational modeling of higher cognitive functions has begun only recently. Experimental data come primarily from single-unit recording in primates. The frontal lobe and parietal lobe function as integrators of information from multiple sensory modalities. There are some tentative ideas regarding how simple mutually inhibitory functional circuits in these areas may carry out biologically relevant computation.[35]

The brain seems to be able to discriminate and adapt particularly well in certain contexts. For instance, human beings seem to have an enormous capacity for memorizing and recognizing faces. One of the key goals of computational neuroscience is to dissect how biological systems carry out these complex computations efficiently and potentially replicate these processes in building intelligent machines.

The brain's large-scale organizational principles are illuminated by many fields, including biology, psychology, and clinical practice. Integrative neuroscience attempts to consolidate these observations through unified descriptive models and databases of behavioral measures and recordings. These are the bases for some quantitative modeling of large-scale brain activity.[36]

The Computational Representational Understanding of Mind (CRUM) is another attempt at modeling human cognition through simulated processes like acquired rule-based systems and the manipulation of visual representations in decision making.

Consciousness

One of the ultimate goals of psychology/neuroscience is to be able to explain the everyday experience of conscious life. Francis Crick and Christof Koch made some attempts to formulate a consistent framework for future work in neural correlates of consciousness (NCC), though much of the work in this field remains speculative.[37]

Computational clinical neuroscience

Computational Clinical Neuroscience is a field that brings together experts in neuroscience, neurology, psychiatry, decision sciences and computational modeling to quantitatively define and investigate problems in neurological and psychiatric diseases, and to train scientists and clinicians who wish to apply these models to diagnosis and treatment.[38][39]

See also

  • Action potential
  • Biological neuron models
  • Bayesian Brain
  • Brain simulation
  • Computational anatomy
  • Connectomics
  • Electrophysiology
  • Goldman equation
  • Hodgkin–Huxley model
  • Information theory
  • Mathematical model
  • Nonlinear dynamics
  • Neural coding
  • Neural decoding
  • Neural oscillation
  • Neurocomputational speech processing
  • Neuroinformatics
  • Neuroplasticity
  • Neurophysiology
  • Systems neuroscience
  • Theoretical Biology
  • Theta model

Notes and references

1. ^{{Cite book|title=Fundamentals of Computational Neuroscience|last=Trappenberg|first=Thomas P.|publisher=Oxford University Press Inc.|year=2002|isbn=978-0-19-851582-1|location=United States|pages=1}}
2. ^What is computational neuroscience? Patricia S. Churchland, Christof Koch, Terrence J. Sejnowski. in Computational Neuroscience pp.46-55. Edited by Eric L. Schwartz. 1993. MIT Press {{cite web |url=http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=7195 |title=Archived copy |accessdate=2009-06-11 |deadurl=yes |archiveurl=https://web.archive.org/web/20110604124206/http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=7195 |archivedate=2011-06-04 |df= }}
3. ^{{Cite web|url=https://mitpress.mit.edu/books/theoretical-neuroscience|title=Theoretical Neuroscience|last=Press|first=The MIT|website=The MIT Press|language=en|access-date=2018-05-24}}
4. ^{{ cite book | author1= Gerstner, W. | author2 = Kistler, W. | author3 = Naud, R. | author4 = Paninski, L.| title = Neuronal Dynamics | publisher = Cambridge University Press | location = Cambridge, UK | year = 2014 | isbn = 9781107447615}}
5. ^{{Cite web|url=http://www.encephalos.gr/48-1-01e.htm|title=Encephalos Journal|website=www.encephalos.gr|access-date=2018-02-20}}
6. ^{{Cite journal|last=Zorzi|first=Marco|last2=Testolin|first2=Alberto|last3=Stoianov|first3=Ivilin P.|date=2013-08-20|title=Modeling language and cognition with deep unsupervised learning: a tutorial overview|journal=Frontiers in Psychology|volume=4|pages=515|doi=10.3389/fpsyg.2013.00515|issn=1664-1078|pmc=3747356|pmid=23970869}}
7. ^{{Cite web|url=https://pdfs.semanticscholar.org/e953/59bc80e624a963a3d8c943e3b2898a397ef7.pdf|title=Organismically-inspired robotics: homeostatic adaptation and teleology beyond the closed sensorimotor loop|last=|first=|date=|website=|access-date=}}
8. ^{{Cite book|url=https://books.google.com/books?id=7pPv0STSos8C&pg=PA63&lpg=PA63&dq=%22biological+realism%22#v=onepage&q=%22biological%20realism%22&f=false|title=Connectionism in Perspective|last=Pfeifer|first=R.|last2=Schreter|first2=Z.|last3=Fogelman-Soulié|first3=F.|last4=Steels|first4=L.|date=1989-08-23|publisher=Elsevier|isbn=9780444598769|location=|pages=|language=en}}
9. ^{{Cite web|url=http://scholarworks.sjsu.edu/cgi/viewcontent.cgi?article=1015&context=comparativephilosophy|title=ANALYTIC AND CONTINENTAL PHILOSOPHY|last=|first=|date=|website=|access-date=}}
10. ^{{Cite journal|last=Shai|first=Adam|last2=Larkum|first2=Matthew Evan|date=2017-12-05|title=Branching into brains|journal=eLife|language=en|volume=6|doi=10.7554/eLife.33066|pmid=29205152|pmc=5716658|issn=2050-084X}}
11. ^{{Cite journal|date=2012-02-22|title=Turing centenary: Is the brain a good model for machine intelligence?|journal=Nature|language=En|volume=482|issue=7386|pages=462–463|doi=10.1038/482462a|pmid=22358812|issn=0028-0836|last1=Brooks|first1=R.|last2=Hassabis|first2=D.|last3=Bray|first3=D.|last4=Shashua|first4=A.}}
12. ^{{Cite book|url=https://books.google.com/?id=uV9TZzOITMwC&pg=PA17&lpg=PA17&dq=%22biological%20plausibility%22#v=onepage&q=%22biological%20plausibility%22&f=false|title=Neural Network Perspectives on Cognition and Adaptive Robotics|last=Browne|first=A.|date=1997-01-01|publisher=CRC Press|isbn=9780750304559|language=en}}
13. ^{{Cite journal|last=Gutkin|first=Boris|last2=Pinto|first2=David|last3=Ermentrout|first3=Bard|date=March 2003|title=Mathematical neuroscience: from neurons to circuits to systems|journal=Journal of Physiology, Paris|volume=97|issue=2–3|pages=209–219|doi=10.1016/j.jphysparis.2003.09.005|issn=0928-4257|pmid=14766142|citeseerx=10.1.1.5.9572}}
14. ^{{cite book |author=Schwartz, Eric |title=Computational neuroscience |publisher=MIT Press |location=Cambridge, Mass |year=1990 |pages= |isbn=978-0-262-19291-0 }}
15. ^{{cite book |author=Bower, James M. |title=20 years of Computational neuroscience |publisher=Springer |location=Berlin, Germany |year=2013 |isbn=978-1461414230}}
16. ^{{cite journal |author=Lapicque L |title= Recherches quantitatives sur l'excitation électrique des nerfs traitée comme une polarisation |journal=J. Physiol. Pathol. Gen. |volume=9 |pages=620–635 |year=1907}}
17. ^{{cite journal |vauthors=Brunel N, Van Rossum MC |title= Lapicque's 1907 paper: from frogs to integrate-and-fire |journal=Biol. Cybern. |volume=97 |pages=337–339 |year=2007 |pmid=17968583 |doi=10.1007/s00422-007-0190-0 |issue=5–6}}
18. ^{{cite journal |vauthors=Hubel DH, Wiesel TN |title=Receptive fields, binocular interaction and functional architecture in the cat's visual cortex |journal=J. Physiol. |volume=160 |pages=106–54 |year=1962 |pmid=14449617 |pmc=1359523 |doi= 10.1113/jphysiol.1962.sp006837|url=http://www.jphysiol.org/cgi/pmidlookup?view=long&pmid=14449617 |issue=1}}
19. ^{{cite journal |author=Forrest MD |title=Intracellular Calcium Dynamics Permit a Purkinje Neuron Model to Perform Toggle and Gain Computations Upon its Inputs. |journal=Frontiers in Computational Neuroscience |volume=8 |pages=86 |year=2014 | doi=10.3389/fncom.2014.00086 |pmid=25191262 |pmc=4138505}}
20. ^{{cite book |author1=Wu, Samuel Miao-sin |author2=Johnston, Daniel |title=Foundations of cellular neurophysiology |publisher=MIT Press |location=Cambridge, Mass |year=1995 |isbn=978-0-262-10053-3 }}
21. ^{{cite book |author=Koch, Christof |title=Biophysics of computation: information processing in single neurons |publisher=Oxford University Press |location=Oxford [Oxfordshire] |year=1999 |isbn=978-0-19-510491-2 }}
22. ^{{cite journal|author=Forrest MD|year=2014|title=Intracellular Calcium Dynamics Permit a Purkinje Neuron Model to Perform Toggle and Gain Computations Upon its Inputs.|journal=Frontiers in Computational Neuroscience|volume=8|pages=86|doi=10.3389/fncom.2014.00086|pmc=4138505|pmid=25191262}}
23. ^{{cite journal |author=Forrest MD |title=Simulation of alcohol action upon a detailed Purkinje neuron model and a simpler surrogate model that runs >400 times faster |journal= BMC Neuroscience | volume=16 |issue=27 |pages=27 | date=April 2015 |doi=10.1186/s12868-015-0162-6 |pmid=25928094 |pmc=4417229 }}
24. ^{{cite journal |vauthors=Chklovskii DB, Mel BW, Svoboda K |title=Cortical rewiring and information storage |journal=Nature |volume=431 |issue=7010 |pages=782–8 |date=October 2004|pmid=15483599 |doi=10.1038/nature03012 |bibcode = 2004Natur.431..782C }} Review article.
25. ^{{cite journal|last1=Weiss|first1=Yair|last2=Simoncelli|first2=Eero P.|last3=Adelson|first3=Edward H.|title=Motion illusions as optimal percepts|journal=Nature Neuroscience|date=20 May 2002|volume=5|issue=6|pages=598–604|doi=10.1038/nn0602-858|pmid=12021763}}
26. ^{{cite journal|last1=Ernst|first1=Marc O.|last2=Bülthoff|first2=Heinrich H.|title=Merging the senses into a robust percept|journal=Trends in Cognitive Sciences|date=April 2004|volume=8|issue=4|pages=162–169|doi=10.1016/j.tics.2004.02.002|pmid=15050512|citeseerx=10.1.1.299.4638}}
27. ^{{cite journal |vauthors=Durstewitz D, Seamans JK, Sejnowski TJ |title=Neurocomputational models of working memory |journal=Nat. Neurosci. |volume=3 |issue=Suppl |pages=1184–91 |year=2000 |pmid=11127836 |doi=10.1038/81460 |url=}}
28. ^{{Cite journal|url=https://link.springer.com/content/pdf/10.3758/CABN.1.2.137.pdf|title=Interactions between frontal cortex and basal ganglia in working memory: A computational model|journal=Cognitive, Affective, & Behavioral Neuroscience|volume=1|issue=2|pages=137–160|doi=10.3758/cabn.1.2.137|archive-url=|archive-date=|dead-url=|access-date=2018-12-06|year=2001|last1=Frank|first1=M. J.|last2=Loughry|first2=B.|last3=O'Reilly|first3=R. C.}}
29. ^{{cite journal |vauthors=Fusi S, Drew PJ, Abbott LF |title=Cascade models of synaptically stored memories |journal=Neuron |volume=45 |issue=4 |pages=599–611 |year=2005 |pmid=15721245 |doi=10.1016/j.neuron.2005.02.001 }}
30. ^{{cite journal |vauthors=Coggan JS, Bartol TM, Esquenazi E |title=Evidence for ectopic neurotransmission at a neuronal synapse |journal=Science |volume=309 |issue=5733 |pages=446–51 |year=2005 |pmid=16020730 |pmc=2915764 |doi=10.1126/science.1108239 |bibcode = 2005Sci...309..446C |display-authors=etal}}
31. ^{{Cite journal|last=Olshausen|first=Bruno A.|last2=Field|first2=David J.|date=1997-12-01|title=Sparse coding with an overcomplete basis set: A strategy employed by V1?|journal=Vision Research|volume=37|issue=23|pages=3311–3325|doi=10.1016/S0042-6989(97)00169-7}}
32. ^{{cite journal |vauthors=Schneidman E, Berry MJ, Segev R, Bialek W |title=Weak pairwise correlations imply strongly correlated network states in a neural population |journal=Nature |volume=440 |issue=7087 |pages=1007–12 |year=2006 |pmid=16625187 |pmc=1785327 |doi=10.1038/nature04701 |bibcode=2006Natur.440.1007S|arxiv = q-bio/0512013 }}
33. ^{{cite journal |author1=Wilson, H. R. |author2=Cowan, J.D. |title=A mathematical theory of the functional dynamics of cortical and thalamic nervous tissue |journal=Kybernetik |volume=13 |issue=2 |pages=55–80 |year=1973 |doi= 10.1007/BF00288786|pmid=4767470 }}
34. ^{{cite book |author1=Anderson, Charles H. |author2=Eliasmith, Chris |title=Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems (Computational Neuroscience) |publisher=The MIT Press |location=Cambridge, Mass |year=2004 |pages= |isbn=978-0-262-55060-4 }}
35. ^{{cite journal |vauthors=Machens CK, Romo R, Brody CD |title=Flexible control of mutual inhibition: a neural model of two-interval discrimination |journal=Science |volume=307 |issue=5712 |pages=1121–4 |year=2005 |pmid=15718474 |doi=10.1126/science.1104171 |bibcode = 2005Sci...307.1121M |citeseerx=10.1.1.523.4396 }}
36. ^{{cite journal |vauthors=Robinson PA, Rennie CJ, Rowe DL, O'Connor SC, Gordon E | title=Multiscale brain modelling | journal=Philosophical Transactions of the Royal Society B | volume=360 | issue=1457|pages=1043–1050|year=2005|doi=10.1098/rstb.2005.1638 |pmid=16087447 |pmc=1854922 }}
37. ^{{cite journal |vauthors=Crick F, Koch C |title=A framework for consciousness |journal=Nat. Neurosci. |volume=6 |issue=2 |pages=119–26 |year=2003 |pmid=12555104 |doi=10.1038/nn0203-119|url= https://zenodo.org/record/852680 }}
38. ^{{cite journal |author=Adaszewski S, Dukart J, Kherif F, Frackowiak R, Draganski B; Alzheimer's Disease Neuroimaging Initiative|title=How early can we predict Alzheimer's disease using computational anatomy?|journal=Neurobiol Aging |volume=34 |issue=12 |pages=2815–26 |year=2013 |doi=10.1016/j.neurobiolaging.2013.06.015 |pmid=23890839}}
39. ^{{cite journal |vauthors=Friston KJ, Stephan KE, Montague R, Dolan RJ |title=Computational psychiatry: the brain as a phantastic organ |journal=Lancet Psychiatry |volume=1 |issue=2 |pages=148–58 |year=2014 |doi=10.1016/S2215-0366(14)70275-5 |pmid=26360579 }}

Bibliography

  • {{cite journal |author=Chklovskii DB |title=Synaptic connectivity and neuronal morphology: two sides of the same coin |journal=Neuron |volume=43 |issue=5 |pages=609–17 |year=2004 |pmid=15339643 |doi=10.1016/j.neuron.2004.08.012 }}
  • {{cite book |author1=Sejnowski, Terrence J. |author2=Churchland, Patricia Smith |title=The computational brain |publisher=MIT Press |location=Cambridge, Mass |year=1992 |isbn=978-0-262-03188-2 }}
  • {{ cite book | author1= Gerstner, W. | author2 = Kistler, W. | author3 = Naud, R. | author4 = Paninski, L.| title = Neuronal Dynamics | publisher = Cambridge University Press | location = Cambridge, UK | year = 2014 | isbn = 9781107447615}}
  • {{cite book |author1=Abbott, L. F. |author2=Dayan, Peter |title=Theoretical neuroscience: computational and mathematical modeling of neural systems |publisher=MIT Press |location=Cambridge, Mass |year=2001 |isbn=978-0-262-04199-7 }}
  • {{cite book |author1=Eliasmith, Chris |author2=Anderson, Charles H. |title=Neural engineering: Representation, computation, and dynamics in neurobiological systems |publisher=MIT Press |location=Cambridge, Mass |year=2003 |isbn=978-0-262-05071-5 }}
  • {{cite journal |vauthors=Hodgkin AL, Huxley AF |title=A quantitative description of membrane current and its application to conduction and excitation in nerve |journal=J. Physiol. |volume=117 |issue=4 |pages=500–44 |date=28 August 1952|pmid=12991237 |pmc=1392413 |url=http://www.jphysiol.org/cgi/pmidlookup?view=long&pmid=12991237 |doi=10.1113/jphysiol.1952.sp004764}}
  • {{cite book |author=William Bialek; Rieke, Fred; David Warland; Rob de Ruyter van Steveninck |title=Spikes: exploring the neural code |publisher=MIT |location=Cambridge, Mass |year=1999 |isbn=978-0-262-68108-7 }}
  • {{cite book |author=Schutter, Erik de |title=Computational neuroscience: realistic modeling for experimentalists |publisher=CRC |location=Boca Raton |year=2001 |isbn=978-0-8493-2068-2 }}
  • {{cite book |author1=Sejnowski, Terrence J. |author2=Hemmen, J. L. van |title=23 problems in systems neuroscience |publisher=Oxford University Press |location=Oxford [Oxfordshire] |year=2006 |isbn=978-0-19-514822-0 }}
  • {{cite book |author1=Michael A. Arbib |author2=Shun-ichi Amari |author3=Prudence H. Arbib | title=The Handbook of Brain Theory and Neural Networks|publisher=The MIT Press|location=Cambridge, Massachusetts |year=2002 |isbn=978-0-262-01197-6}}

External links

{{Commonscat}}

Journals

  • [https://www.springer.com/mathematics/journal/13408 Journal of Mathematical Neuroscience]
  • [https://www.springer.com/10827 Journal of Computational Neuroscience]
  • Neural Computation
  • Cognitive Neurodynamics
  • Frontiers in Computational Neuroscience
  • PLoS Computational Biology
  • Frontiers in Neuroinformatics

Software

  • BRIAN, a Python based simulator
  • Budapest Reference Connectome, web based 3D visualization tool to browse connections in the human brain
  • Emergent, neural simulation software.
  • GENESIS, a general neural simulation system.

Conferences

  • Computational and Systems Neuroscience (COSYNE) – a computational neuroscience meeting with a systems neuroscience focus.
  • Annual Computational Neuroscience Meeting (CNS) – a yearly computational neuroscience meeting.
  • Neural Information Processing Systems (NIPS) – a leading annual conference covering other machine learning topics as well.
  • [https://web.archive.org/web/20070309063503/http://www.iccn2007.org/ International Conference on Cognitive Neurodynamics (ICCN)] – a yearly conference.
  • UK Mathematical Neurosciences Meeting – a new yearly conference, focused on mathematical aspects.
  • [https://web.archive.org/web/20071012155706/http://www.neurocomp.fr/index.php?page=welcome The NeuroComp Conference] – a yearly computational neuroscience conference (France).
  • [https://web.archive.org/web/20110429094455/http://www.nncn.de/Aktuelles-en/bernsteinsymposium/Symposium/view?set_language=en Bernstein Conference on Computational Neuroscience (BCCN)] – a yearly conference in Germany, organized by the Bernstein Network for Computational Neuroscience.
  • AREADNE Conferences – a biennial meeting that includes theoretical and experimental results, held in even years in Santorini, Greece.

Websites

  • Encyclopedia of Computational Neuroscience, part of Scholarpedia, an online expert curated encyclopedia on computational neuroscience and dynamical systems
{{Neuroscience}}{{Nervous system}}{{Portal bar|Neuroscience}}{{DEFAULTSORT:Computational Neuroscience}}

