Liquid state machine

Contents

  1. Universal function approximation

  2. See also

  3. Libraries

  4. References

A liquid state machine (LSM) is a particular kind of spiking neural network. An LSM consists of a large collection of units (called nodes, or neurons). Each node receives time-varying input from external sources (the inputs) as well as from other nodes. Nodes are randomly connected to each other. The recurrent nature of the connections turns the time-varying input into a spatio-temporal pattern of activations in the network nodes. The spatio-temporal patterns of activation are read out by linear discriminant units.
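
The sketch below (Python with NumPy, not from the source) illustrates one way this structure can be realized: leaky integrate-and-fire neurons with randomly drawn input and recurrent weights are driven by a time-varying signal, and their spikes form the spatio-temporal pattern that a readout would see. All sizes, constants, and names (n_res, W_rec, tau, v_thresh) are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sizes and leaky integrate-and-fire constants (not from the source).
    n_in, n_res, n_steps = 1, 100, 200
    dt, tau, v_thresh, v_reset = 1.0, 20.0, 1.0, 0.0

    # Random input weights and sparse recurrent weights: the "soup" of connections.
    W_in = rng.normal(0.0, 0.5, size=(n_res, n_in))
    W_rec = rng.normal(0.0, 0.1, size=(n_res, n_res)) * (rng.random((n_res, n_res)) < 0.1)

    def run_reservoir(u):
        """Drive the leaky integrate-and-fire population with the time-varying
        input u (shape n_steps x n_in) and return the spike pattern over time."""
        v = np.zeros(n_res)                   # membrane potentials
        spikes = np.zeros((n_steps, n_res))   # the spatio-temporal "liquid state"
        for t in range(n_steps):
            prev = spikes[t - 1] if t > 0 else np.zeros(n_res)
            i_t = W_in @ u[t] + W_rec @ prev  # external plus recurrent drive
            v += dt / tau * (-v + i_t)        # leaky integration
            fired = v >= v_thresh
            spikes[t, fired] = 1.0
            v[fired] = v_reset                # reset neurons that spiked
        return spikes

    # Example: a sine-wave input becomes a spatio-temporal spike pattern.
    u = np.sin(np.linspace(0, 4 * np.pi, n_steps)).reshape(-1, 1)
    liquid_state = run_reservoir(u)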

The soup of recurrently connected nodes will end up computing a large variety of nonlinear functions on the input. Given a large enough variety of such nonlinear functions, it is theoretically possible to obtain linear combinations (using the readout units) that perform whatever mathematical operation is needed for a given task, such as speech recognition or computer vision.
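
Continuing the sketch above, one possible linear readout is a set of weights fitted by ridge regression on the collected liquid states; the function and variable names here (train_readout, liquid_state, target) are illustrative assumptions, not from the source.

    import numpy as np

    def train_readout(liquid_state, target, ridge=1e-3):
        """Fit readout weights W_out so that liquid_state @ W_out approximates target.

        liquid_state: (n_steps, n_res) reservoir activations over time.
        target:       (n_steps, n_out) desired output at each time step.
        """
        X = liquid_state
        n_res = X.shape[1]
        # Regularized least squares: one simple choice of linear readout.
        return np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ target)

    def apply_readout(liquid_state, W_out):
        """The readout output is just a linear combination of reservoir activations."""
        return liquid_state @ W_out

    # Example usage with the sketch above: learn a delayed copy of the input.
    # target = np.roll(u, 10, axis=0)
    # W_out = train_readout(liquid_state, target)
    # prediction = apply_readout(liquid_state, W_out)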

The word liquid in the name comes from the analogy drawn to dropping a stone into a still body of water or other liquid. The falling stone will generate ripples in the liquid. The input (motion of the falling stone) has been converted into a spatio-temporal pattern of liquid displacement (ripples).

LSMs have been put forward as a way to explain the operation of brains. LSMs are argued to be an improvement over the theory of artificial neural networks because:

  1. Circuits are not hard coded to perform a specific task.
  2. Continuous-time inputs are handled "naturally".
  3. Computations on various time scales can be done using the same network.
  4. The same network can perform multiple computations.

Criticisms of LSMs as used in computational neuroscience include the following:

  1. LSMs don't actually explain how the brain functions. At best they can replicate some parts of brain functionality.
  2. There is no guaranteed way to dissect a working network and figure out how or what computations are being performed.
  3. There is very little control over the process.

Universal function approximation

If a reservoir has fading memory and input separability, then, with the help of a readout, it can be proven that the liquid state machine is a universal function approximator, using the Stone–Weierstrass theorem.[1]
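
Informally, and paraphrasing the cited framework with illustrative symbols: if the liquid L^M can separate any two different input streams (input separability) and the readouts f^M can approximate any continuous function of the liquid state, then for every time-invariant target filter F with fading memory and every tolerance ε > 0 there exist a liquid and a readout such that

    \bigl| (F u)(t) - f^{M}\!\bigl( (L^{M} u)(t) \bigr) \bigr| < \varepsilon
    \qquad \text{for all admissible inputs } u \text{ and all times } t .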

See also

  • Echo state network: a similar concept based on recurrent neural networks.
  • Reservoir computing: the conceptual framework.
  • Self-organizing map

Libraries

  • LiquidC#: an implementation of a topologically robust liquid state machine[2] with a neuronal network detector: https://bitbucket.org/Hananel/liquid-state-machine

References

1. ^ Maass, Wolfgang; Markram, Henry (2004). "On the Computational Power of Recurrent Circuits of Spiking Neurons". Journal of Computer and System Sciences. 69 (4): 593–616. doi:10.1016/j.jcss.2004.04.001.
2. ^ Hazan, Hananel; Manevitz, Larry M. (2012). "Topological constraints and robustness in liquid state machines". Expert Systems with Applications. 39 (2): 1597–1606. doi:10.1016/j.eswa.2011.06.052.
  • Maass, Wolfgang; Natschläger, Thomas; Markram, Henry (November 2002). "Real-time computing without stable states: a new framework for neural computation based on perturbations". Neural Computation. 14 (11): 2531–2560. doi:10.1162/089976602760407955. PMID 12433288. CiteSeerX 10.1.1.183.2874. Archived from the original (http://ramsesii.upf.es/seminar/Maass_et_al_2002.pdf) on February 22, 2012: https://web.archive.org/web/20120222154641/http://ramsesii.upf.es/seminar/Maass_et_al_2002.pdf
  • Maass, Wolfgang; Natschläger, Thomas; Markram, Henry (2004). "Computational Models for Generic Cortical Microcircuits". In Computational Neuroscience: A Comprehensive Approach, Ch. 18, pp. 575–605.

Category: Artificial neural networks
