Incremental computing

  1. Static versus Dynamic

  2. Specialized versus general-purpose approaches

  3. Static methods

      3.1. Program derivatives

      3.2. View maintenance

  4. Dynamic methods

  5. Existing systems

      5.1. Compiler and language support

      5.2. Frameworks and libraries

  6. Applications

  7. See also

  8. References

Incremental computing, also known as incremental computation, is a software feature which, whenever a piece of data changes, attempts to save time by only recomputing those outputs which depend on the changed data.[1][2][3] When incremental computing is successful, it can be significantly faster than computing new outputs naively. For example, a spreadsheet software package might use incremental computation in its recalculation feature, to update only those cells containing formulas which depend (directly or indirectly) on the changed cells.
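
For illustration, here is a minimal sketch of such a spreadsheet-style recalculation (the cell names and the set_cell/recalculate_all helpers are hypothetical, not the API of any particular spreadsheet package); only the cells whose formulas depend, directly or indirectly, on the changed cell are recomputed:

    # Minimal incremental-recalculation sketch (hypothetical cell model).
    values = {"A1": 2, "A2": 3}                              # input cells
    formulas = {
        "B1": (["A1", "A2"], lambda v: v["A1"] + v["A2"]),   # B1 = A1 + A2
        "C1": (["B1"],       lambda v: v["B1"] * 10),        # C1 = B1 * 10
        "D1": (["A2"],       lambda v: v["A2"] - 1),         # D1 = A2 - 1
    }

    def dependents(changed):
        """All formula cells that read `changed`, directly or indirectly."""
        out, frontier = set(), {changed}
        while frontier:
            cell = frontier.pop()
            for name, (reads, _) in formulas.items():
                if cell in reads and name not in out:
                    out.add(name)
                    frontier.add(name)
        return out

    def recalculate_all():
        for name, (_, f) in formulas.items():
            values[name] = f(values)

    def set_cell(name, value):
        """Incremental update: recompute only the cells affected by `name`."""
        values[name] = value
        affected = dependents(name)
        # `formulas` is declared in dependency order, so one pass suffices here.
        for cell, (_, f) in formulas.items():
            if cell in affected:
                values[cell] = f(values)

    recalculate_all()
    set_cell("A1", 7)        # recomputes B1 and C1 only; D1 is left untouched
    print(values["C1"])      # 100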

When a tool can automatically apply incremental computing to a variety of different pieces of code, that tool is an example of a program analysis tool for optimization.

Static versus Dynamic

Incremental computing techniques can be broadly separated into two types of approaches:

Static approaches attempt to derive an incremental program from a conventional program P using, e.g., either manual design and refactoring, or automatic program transformations. These program transformations occur before any inputs or input changes are provided.

Dynamic approaches record information about executing program P on a particular input (I1) and use this information when the input changes (to I2) in order to update the output (from O1 to O2). The change calculation function ΔP constitutes the core of the incremental program: it relates program P and the pairs of inputs and outputs (I1, O1) and (I2, O2), producing O2 from O1 and the change from I1 to I2 without rerunning P from scratch.
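
As a concrete (hypothetical) illustration of P and ΔP, let P sum a list of numbers; ΔP then produces O2 from O1 and a description of the change from I1 to I2 in constant time, instead of rerunning P:

    def P(xs):
        """From-scratch computation: O = sum of the input list (linear time)."""
        return sum(xs)

    def dP(old_output, old_value, new_value):
        """Change calculation: given O1 and the single-element change
        I1[i] -> new_value, produce O2 in constant time."""
        return old_output - old_value + new_value

    I1 = [3, 1, 4, 1, 5]
    O1 = P(I1)                          # 14, computed from scratch

    i, new_value = 2, 9                 # I2 differs from I1 only at index 2
    O2 = dP(O1, I1[i], new_value)       # 19, computed incrementally
    I2 = I1[:i] + [new_value] + I1[i + 1:]
    assert O2 == P(I2)                  # matches the from-scratch result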

Specialized versus general-purpose approaches

Some approaches to incremental computing are specialized, while others are general purpose.

Specialized approaches require the programmer to explicitly specify the algorithms and data structures that will be used to preserve unchanged sub-calculations.

General-purpose approaches, on the other hand, use language, compiler, or algorithmic techniques to give incremental behavior to otherwise non-incremental programs.[4]

Static methods

Program derivatives

Given a computation y = f(x) and a potential change to the input x, we can insert code before the change (the pre-derivative) and after the change (the post-derivative) to update the value of y faster than rerunning f. Paige has written down a list of rules for the formal differentiation of programs in SETL.[5]
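
As a hedged sketch of the idea (plain Python rather than Paige's SETL rules), the following pre- and post-derivatives keep a derived histogram consistent with a list across the change xs[i] = v, without rebuilding the histogram:

    from collections import Counter

    xs = [3, 1, 4, 1, 5]
    hist = Counter(xs)              # derived value: occurrence count per number

    def assign(i, v):
        hist[xs[i]] -= 1            # pre-derivative: executed before the change
        if hist[xs[i]] == 0:
            del hist[xs[i]]         # keep the histogram free of zero counts
        xs[i] = v                   # the change itself
        hist[xs[i]] += 1            # post-derivative: executed after the change

    assign(2, 9)
    assert hist == Counter(xs)      # invariant maintained incrementally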

View maintenance

In database systems such as DBToaster, views are defined with relational algebra. Incremental view maintenance statically analyzes relational algebra to create update rules that quickly maintain the view in the presence of small updates, such as insertion of a row.[6]
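
As a hedged illustration of the idea (not DBToaster's generated update rules), consider a view computing an aggregate per group; the delta rule for inserting one row touches only the affected group instead of recomputing the view from the whole base table:

    from collections import defaultdict

    # View: SELECT customer, COUNT(*), SUM(amount) FROM orders GROUP BY customer
    orders = []                              # base table of (customer, amount) rows
    view = defaultdict(lambda: [0, 0.0])     # customer -> [count, sum]

    def insert_order(customer, amount):
        """Delta rule for a single-row insertion: constant-time view update."""
        orders.append((customer, amount))
        group = view[customer]
        group[0] += 1
        group[1] += amount

    def view_from_scratch():
        """Full recomputation, used here only to check the incremental result."""
        fresh = defaultdict(lambda: [0, 0.0])
        for customer, amount in orders:
            fresh[customer][0] += 1
            fresh[customer][1] += amount
        return fresh

    insert_order("alice", 30.0)
    insert_order("bob", 12.5)
    insert_order("alice", 7.5)
    assert view == view_from_scratch()       # {'alice': [2, 37.5], 'bob': [1, 12.5]}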

Dynamic methods

A conservative (theoretically sub-optimal) implementation technique for incremental computing is for the software to build a dependency graph of all the data elements that may need to be recalculated, and their dependencies. The elements that need to be updated when a single element changes are given by the transitive closure of the dependency relation of the graph. In other words, if there is a path from the changed element to another element, the latter will be updated (even if the value does not actually change).

The dependency graph may need to be updated as dependencies change, or as elements are added to, or removed from, the system. It is used internally by the implementation, and does not typically need to be displayed to the user.
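
A minimal sketch of this conservative scheme (with a made-up graph): the elements to update are exactly those reachable from the changed element along the "is read by" edges:

    dependents_of = {        # forward edges: x -> elements whose formulas read x
        "a": ["b", "d"],
        "b": ["c"],
        "c": ["d"],
        "d": [],
        "e": [],
    }

    def affected(changed):
        """Transitive closure of the dependents of `changed`."""
        seen, stack = set(), [changed]
        while stack:
            x = stack.pop()
            for y in dependents_of[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        return seen

    # Changing 'a' conservatively marks b, c and d for recomputation, even if
    # some of their recomputed values turn out to equal the old ones.
    print(affected("a"))     # {'b', 'c', 'd'}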

Partial evaluation can be seen as a method for automating the simplest possible case of incremental computing, in which an attempt is made to divide program data into two categories: that which can vary based on the program's input, and that which cannot (and the smallest unit of change is simply "all the data that can vary"). Of course, partial evaluation can be combined with other incremental computing techniques.
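
A small sketch of that reading of partial evaluation (hypothetical code, not the output of any particular partial evaluator): the exponent of a power function is treated as the data that cannot vary, and a residual function specialized to it is produced before the varying base is known:

    def power(base, exp):
        """The general, unspecialized program."""
        result = 1
        for _ in range(exp):
            result *= base
        return result

    def specialize_power(exp):
        """Fix the static input `exp` ahead of time, leaving a residual function
        of the dynamic input `base`.  (A real partial evaluator would go further
        and unroll the loop into base * base * ... * base.)"""
        def residual(base):
            result = 1
            for _ in range(exp):    # exp is now a baked-in constant
                result *= base
            return result
        return residual

    cube = specialize_power(3)      # done once, before the dynamic input exists
    print(cube(2), cube(10))        # 8 1000
    assert cube(7) == power(7, 3)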

Other implementation techniques exist. For example, elements may be reevaluated in a topological order of the dependency graph, as in FrTime, so that no element is recomputed before its dependencies have been brought up to date; this also avoids the problem of unnecessary updates. If a language permits cycles in the dependency graph, a single pass through the graph may not be sufficient to reach a fixed point. In many cases, complete reevaluation of a system is semantically equivalent to incremental evaluation, and may be more efficient in practice if not in theory.[7]
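
A hedged sketch of such an evaluation strategy (not FrTime's actual implementation): dirty nodes are reevaluated in a topological order of an acyclic dependency graph, so each node is visited at most once and only after its inputs are up to date, and propagation stops below nodes whose recomputed value is unchanged:

    from graphlib import TopologicalSorter

    deps = {"a": [], "b": ["a"], "c": ["a"], "d": ["b", "c"]}   # node -> inputs
    rules = {
        "b": lambda v: v["a"] % 2,              # parity of a
        "c": lambda v: v["a"] + 1,
        "d": lambda v: v["b"] * 100 + v["c"],
    }
    values = {"a": 4, "b": 0, "c": 5, "d": 5}
    order = list(TopologicalSorter(deps).static_order())        # e.g. a, b, c, d

    def set_input(name, new_value):
        values[name] = new_value
        dirty = {name}
        for node in order:
            if node in rules and any(x in dirty for x in deps[node]):
                new = rules[node](values)
                if new != values[node]:         # unchanged values stop propagating
                    values[node] = new
                    dirty.add(node)

    set_input("a", 6)    # b keeps its value (parity unchanged); c and d update
    print(values)        # {'a': 6, 'b': 0, 'c': 7, 'd': 7}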

Existing systems

Compiler and language support

  • Automatic Incrementalization (also called "Self-Adjusting Computation" and "Adaptive Functional Programming"),[8] e.g. Delta ML, Haskell Adaptive (https://hackage.haskell.org/package/Adaptive)
  • Cornell Synthesizer Generator[9]
  • IceDust - a custom domain-specific language.

Frameworks and libraries

  • Adapton[10] - with implementations in several languages
  • One-way Dataflow Constraints (Reactive Computation in C++)
  • Differential Dataflow
  • Jane Street Incremental (https://github.com/janestreet/incremental)
  • Incremental Datalog (Logicblox)
  • Incremental Prolog (XSB)[11]
  • Domain-Specific Approaches:
    • Incremental Type Checking

Applications

  • Databases (view maintenance)
  • Build systems
  • Spreadsheets[12]
  • Development Environments
  • Financial Computations
  • Attribute Grammar Evaluation
  • Graph Computations and Queries
  • GUIs (e.g., React and DOM diffing)
  • Scientific applications

See also

  • Reactive programming
    • Functional reactive programming
  • Memoization
  • Bidirectional transformation

References

1. Carlsson, Magnus (2002). "Monads for incremental computing". Proceedings of the Seventh ACM SIGPLAN International Conference on Functional Programming. New York: ACM. pp. 26–35. doi:10.1145/581478.581482. ISBN 1-58113-487-8.
2. Acar, Umut A. (2005). Self-Adjusting Computation (Ph.D. thesis). http://www.cs.cmu.edu/~rwh/theses/acar.pdf
3. Demetrescu, Camil; Finocchi, Irene; Ribichini, Andrea (2011). "Reactive Imperative Programming with Dataflow Constraints". Proceedings of the 26th ACM International Conference on Object-Oriented Programming Systems Languages and Applications (OOPSLA 2011). ACM. pp. 407–426. arXiv:1104.2293. doi:10.1145/2048066.2048100. ISBN 978-1-4503-0940-0. http://code.google.com/p/dc-lib/
4. Chen, Yan; Dunfield, Joshua; Hammer, Matthew A.; Acar, Umut A. "Implicit self-adjusting computation for purely functional programs". ICFP '11. pp. 129–141. https://dl.acm.org/citation.cfm?id=2034792 (archived 2016-10-30).
5. Paige, Robert (1981). Formal Differentiation: A Program Synthesis Technique. UMI Research Press. ISBN 978-0-8357-1213-2.
6. Ahmad, Yanif; Kennedy, Oliver; Koch, Christoph; Nikolic, Milos (2012). "DBToaster: Higher-order Delta Processing for Dynamic, Frequently Fresh Views". Proceedings of the VLDB Endowment 5 (10): 968–979. arXiv:1207.0137. doi:10.14778/2336664.2336670. ISSN 2150-8097.
7. Burchett, Kimberley; Cooper, Gregory H.; Krishnamurthi, Shriram (2007). "Lowering: A static optimization technique for transparent functional reactivity". ACM SIGPLAN Symposium on Partial Evaluation and Semantics-Based Program Manipulation. pp. 71–80. ISBN 978-1-59593-620-2. CiteSeerX 10.1.1.90.5866.
8. Hammer, Matthew A.; Acar, Umut A.; Chen, Yan (2009). "CEAL". Proceedings of the 2009 ACM SIGPLAN Conference on Programming Language Design and Implementation (PLDI '09). p. 25. doi:10.1145/1542476.1542480. ISBN 978-1-60558-392-1.
9. Reps, Thomas; Teitelbaum, Tim (1984). "The synthesizer generator". Proceedings of the First ACM SIGSOFT/SIGPLAN Software Engineering Symposium on Practical Software Development Environments (SDE 1). pp. 42–48. doi:10.1145/800020.808247. ISBN 978-0-89791-131-3.
10. "Adapton: Programming Language Abstractions for Incremental Computation". adapton.org. Retrieved 2016-10-07. http://adapton.org
11. Saha, Diptikalyan; Ramakrishnan, C. R. (2005). "Incremental Evaluation of Tabled Prolog: Beyond Pure Logic Programs". Practical Aspects of Declarative Languages. Lecture Notes in Computer Science, vol. 3819. pp. 215–229. doi:10.1007/11603023_15. ISBN 978-3-540-30947-5. ISSN 0302-9743.
12. Hammer, Matthew; Phang, Khoo; Hicks, Michael; Foster, Jeffrey (2014). "ADAPTON: Composable, Demand-Driven Incremental Computation". PLDI 2014. http://matthewhammer.org/adapton/adapton-pldi2014.pdf
