DFA minimization

  1. Minimum DFA

  2. Unreachable states

  3. Nondistinguishable states

     • Hopcroft's algorithm
     • Moore's algorithm
     • Brzozowski's algorithm

  4. NFA minimization

  5. See also

  6. Notes

  7. References

  8. External links

In automata theory (a branch of theoretical computer science), DFA minimization is the task of transforming a given deterministic finite automaton (DFA) into an equivalent DFA that has a minimum number of states. Here, two DFAs are called equivalent if they recognize the same regular language. Several different algorithms accomplishing this task are known and described in standard textbooks on automata theory.[1]

Minimum DFA

For each regular language there exists a minimal automaton that accepts it, that is, a DFA with a minimum number of states, and this DFA is unique (except that states can be given different names).[2][3] The minimal DFA ensures minimal computational cost for tasks such as pattern matching.

To minimize a DFA, two classes of states can be removed or merged from the original DFA without affecting the language it accepts.

  • Unreachable states are the states that are not reachable from the initial state of the DFA, for any input string.
  • Nondistinguishable states are those that cannot be distinguished from one another for any input string.

DFA minimization proceeds in steps corresponding to the removal or merger of these classes of states. Since the elimination of nondistinguishable states is computationally the most expensive step, it is usually done last.

Unreachable states

A state p of the DFA M = (Q, Σ, δ, q0, F) is unreachable if no string w in Σ* exists for which p = δ*(q0, w). The reachable states can be obtained with the following algorithm:

let reachable_states := {q0};
let new_states := {q0};

do {
    temp := the empty set;
    for each q in new_states do
        for each c in Σ do
            temp := temp ∪ {p such that p = δ(q, c)};
        end;
    end;
    new_states := temp \ reachable_states;
    reachable_states := reachable_states ∪ new_states;
} while (new_states ≠ the empty set);

unreachable_states := Q \ reachable_states;

Unreachable states can be removed from the DFA without affecting the language that it accepts.
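
As an illustration, the reachability computation above can be written as the short Python sketch below. The dictionary-based transition table delta, the function name reachable_states, and the toy three-state DFA are illustrative assumptions rather than part of the article's algorithm.

def reachable_states(alphabet, delta, q0):
    """Worklist version of the pseudocode above: delta is assumed to be a
    dict mapping (state, symbol) to the unique successor state."""
    reachable = {q0}
    new = {q0}
    while new:
        temp = set()
        for q in new:
            for c in alphabet:
                temp.add(delta[(q, c)])
        new = temp - reachable          # states seen for the first time
        reachable |= new
    return reachable

# Toy DFA over {'0', '1'} in which state 'd' cannot be reached from 'a'.
states = {'a', 'b', 'd'}
alphabet = {'0', '1'}
delta = {('a', '0'): 'a', ('a', '1'): 'b',
         ('b', '0'): 'a', ('b', '1'): 'b',
         ('d', '0'): 'd', ('d', '1'): 'a'}
print(reachable_states(alphabet, delta, 'a'))           # {'a', 'b'}
print(states - reachable_states(alphabet, delta, 'a'))  # {'d'}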

Nondistinguishable states

Hopcroft's algorithm

One algorithm for merging the nondistinguishable states of a DFA, due to Hopcroft (1971), is based on partition refinement, partitioning the DFA states into groups by their behavior. These groups represent equivalence classes of the Myhill–Nerode equivalence relation, whereby every two states of the same partition are equivalent if they have the same behavior for all the input sequences. That is, for every two states p1 and p2 that belong to the same equivalence class within the partition P, and every input word w, the transitions determined by w should always take states p1 and p2 to equal states, states that both accept, or states that both reject. It should not be possible for w to take p1 to an accepting state and p2 to a rejecting state or vice versa.

The following pseudocode describes the algorithm:

P := {F, Q \ F};
W := {F};

while (W is not empty) do
    choose and remove a set A from W
    for each c in Σ do
        let X be the set of states for which a transition on c leads to a state in A
        for each set Y in P for which X ∩ Y is nonempty and Y \ X is nonempty do
            replace Y in P by the two sets X ∩ Y and Y \ X
            if Y is in W
                replace Y in W by the same two sets
            else
                if |X ∩ Y| <= |Y \ X|
                    add X ∩ Y to W
                else
                    add Y \ X to W
        end;
    end;
end;

The algorithm starts with a partition that is too coarse: every pair of states that are equivalent according to the Myhill–Nerode relation belong to the same set in the partition, but pairs that are inequivalent might also belong to the same set. It gradually refines the partition into a larger number of smaller sets, at each step splitting sets of states into pairs of subsets that are necessarily inequivalent.

The initial partition is a separation of the states into two subsets of states that clearly do not have the same behavior as each other: the accepting states and the rejecting states. The algorithm then repeatedly chooses a set A from the current partition and an input symbol c, and splits each of the sets of the partition into two (possibly empty) subsets: the subset of states that lead to A on input symbol c, and the subset of states that do not lead to A. Since A is already known to have different behavior than the other sets of the partition, the subsets that lead to A also have different behavior than the subsets that do not lead to A. When no more splits of this type can be found, the algorithm terminates.
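
To make the splitting procedure concrete, here is a minimal Python sketch of the pseudocode above. The encoding of the DFA (a dict delta mapping (state, symbol) pairs to successor states) and the function name hopcroft are illustrative assumptions; the sketch returns the partition of the states into equivalence classes as a set of frozensets.

from collections import defaultdict

def hopcroft(states, alphabet, delta, accepting):
    """Partition `states` into Myhill-Nerode equivalence classes, following
    the pseudocode above (P is the partition, W the worklist of splitters)."""
    # Precompute inverse transitions: which states reach q on symbol c?
    inverse = defaultdict(set)
    for (q, c), p in delta.items():
        inverse[(p, c)].add(q)

    accepting = frozenset(accepting)
    rejecting = frozenset(states) - accepting
    P = {accepting, rejecting} - {frozenset()}   # drop an empty block, if any
    W = {accepting}

    while W:
        A = W.pop()                              # choose and remove a set A
        for c in alphabet:
            # X: states with a transition on c into A
            X = set()
            for q in A:
                X |= inverse[(q, c)]
            for Y in list(P):                    # iterate over a snapshot of P
                inter, diff = frozenset(Y & X), frozenset(Y - X)
                if inter and diff:               # Y is split by X
                    P.remove(Y)
                    P |= {inter, diff}
                    if Y in W:
                        W.remove(Y)
                        W |= {inter, diff}
                    else:
                        # Hopcroft's trick: enqueue only the smaller half.
                        W.add(inter if len(inter) <= len(diff) else diff)
    return P

Applied to a DFA whose unreachable states have already been removed, the returned classes become the states of the minimum DFA, as described below.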

Lemma. Given a fixed character c and an equivalence class Y that splits into equivalence classes B and C, only one of B or C is necessary to refine the whole partition.[4]

Example: Suppose we have an equivalence class Y that splits into equivalence classes B and C. Suppose we also have classes D, E, and F; D and E have states with transitions into B on character c, while F has transitions into C on character c. By the Lemma, we can choose either B or C as the distinguisher, let's say B. Then the states of D and E are split by their transitions into B. But F, which doesn't point into B, simply doesn't split during the current iteration of the algorithm; it will be refined by other distinguisher(s).

Observation. All of B or C is necessary to split referring classes like D, E, and F correctly; subsets won't do.

The purpose of the outermost if statement (if Y is in W) is to patch up W, the set of distinguishers. The preceding statement in the algorithm has just split Y. If Y is in W, it has just become obsolete as a means to split classes in future iterations. So Y must be replaced by both splits because of the Observation above. If Y is not in W, however, only one of the two splits, not both, needs to be added to W because of the Lemma above. Choosing the smaller of the two splits guarantees that the new addition to W is no more than half the size of Y; this is the core of Hopcroft's algorithm and the source of its speed, as explained in the next paragraph.

The worst case running time of this algorithm is O(ns log n), where n is the number of states and s is the size of the alphabet. This bound follows from the fact that, for each of the ns transitions of the automaton, the sets drawn from Q that contain the target state of the transition have sizes that decrease relative to each other by a factor of two or more, so each transition participates in O(log n) of the splitting steps in the algorithm. The partition refinement data structure allows each splitting step to be performed in time proportional to the number of transitions that participate in it.[5] This remains the most efficient algorithm known for solving the problem, and for certain distributions of inputs its average-case complexity is even better, O(n log log n).[6]

Once Hopcroft's algorithm has been used to group the states of the input DFA into equivalence classes, the minimum DFA can be constructed by forming one state for each equivalence class. If S is a set of states in P, s is a state in S, and c is an input character, then the transition in the minimum DFA from the state for S, on input c, goes to the set containing the state that the input automaton would go to from state s on input c. The initial state of the minimum DFA is the one containing the initial state of the input DFA, and the accepting states of the minimum DFA are the ones whose members are accepting states of the input DFA.
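
Continuing the hypothetical hopcroft sketch above, the construction of the minimum DFA from the equivalence classes might look as follows; the names build_min_dfa and partition are again illustrative assumptions, and the unreachable states are assumed to have been removed beforehand.

def build_min_dfa(alphabet, delta, q0, accepting, partition):
    """Build the quotient DFA whose states are the equivalence classes in
    `partition` (a set of frozensets, e.g. the output of hopcroft above)."""
    # Map every original state to its equivalence class.
    cls = {q: block for block in partition for q in block}

    min_delta = {}
    for block in partition:
        rep = next(iter(block))                  # any representative state
        for c in alphabet:
            # The class's transition follows the representative's move.
            min_delta[(block, c)] = cls[delta[(rep, c)]]

    min_q0 = cls[q0]
    min_accepting = {block for block in partition if block & set(accepting)}
    return set(partition), min_delta, min_q0, min_accepting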

Moore's algorithm

Main article: Moore reduction procedure

Moore's algorithm for DFA minimization is due to Edward F. Moore (1956). Like Hopcroft's algorithm, it maintains a partition that starts off separating the accepting from the rejecting states, and repeatedly refines the partition until no more refinements can be made. At each step, it replaces the current partition with the coarsest common refinement of s + 1 partitions, one of which is the current one and the others are the preimages of the current partition under the transition functions for each of the input symbols. The algorithm terminates when this replacement does not change the current partition. Its worst-case time complexity is O(n²s): each step of the algorithm may be performed in time O(ns) using a variant of radix sort to reorder the states so that states in the same set of the new partition are consecutive in the ordering, and there are at most n steps since each one but the last increases the number of sets in the partition. The instances of the DFA minimization problem that cause the worst-case behavior are the same as for Hopcroft's algorithm. The number of steps that the algorithm performs can be much smaller than n, so on average (for constant s) its performance is O(n log n) or even O(n log log n) depending on the random distribution on automata chosen to model the algorithm's average-case behavior (David 2012).[6]
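
A minimal Python sketch of this refinement loop, under the same illustrative DFA encoding as in the earlier examples, could look like the following; the function name moore and the block-numbering scheme are assumptions, and the sketch does not attempt the radix-sort optimization mentioned above.

from collections import defaultdict

def moore(states, alphabet, delta, accepting):
    """Repeatedly refine the partition by the signature (own block, blocks of
    the successors on each symbol) until no further refinement occurs."""
    block = {q: (q in accepting) for q in states}    # accepting vs. rejecting
    symbols = sorted(alphabet)

    while True:
        signature = {q: (block[q],) + tuple(block[delta[(q, c)]] for c in symbols)
                     for q in states}
        order = {sig: i for i, sig in enumerate(sorted(set(signature.values())))}
        new_block = {q: order[signature[q]] for q in states}
        if len(set(new_block.values())) == len(set(block.values())):
            break                                    # the partition is stable
        block = new_block

    # Group states by block index and return the partition as frozensets.
    groups = defaultdict(set)
    for q, b in new_block.items():
        groups[b].add(q)
    return {frozenset(g) for g in groups.values()}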

Brzozowski's algorithm

As Brzozowski (1963) observed, reversing the edges of a DFA produces a non-deterministic finite automaton (NFA) for the reversal of the original language, and converting this NFA to a DFA using the standard powerset construction (constructing only the reachable states of the converted DFA) leads to a minimal DFA for the same reversed language. Repeating this reversal operation a second time produces a minimal DFA for the original language. The worst-case complexity of Brzozowski's algorithm is exponential, as there are regular languages for which the minimal DFA of the reversal is exponentially larger than the minimal DFA of the language,[7] but it frequently performs better than this worst case would suggest.[6]
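
The double-reversal construction can be sketched in Python as below, reusing the illustrative DFA encoding from the earlier examples; reverse_determinize and brzozowski_minimize are assumed names, and the states of the resulting automaton are frozensets of sets of original states.

from itertools import chain

def brzozowski_minimize(alphabet, delta, q0, accepting):
    """Determinize the reversed automaton, then reverse and determinize the
    result again; the second pass yields a minimal DFA for the original
    language.  Worst-case exponential, as noted above."""

    def reverse_determinize(delta, starts, finals):
        """Powerset construction applied to the reversal of the automaton
        (delta, starts, finals); returns (states, delta, start, finals)."""
        # Reversed transitions: p --c--> q becomes q --c--> p.
        rev = {}
        for (p, c), q in delta.items():
            rev.setdefault((q, c), set()).add(p)

        start = frozenset(finals)            # initial subset = old final states
        d, seen, todo = {}, {start}, [start]
        while todo:
            S = todo.pop()
            for c in alphabet:
                T = frozenset(chain.from_iterable(rev.get((q, c), ()) for q in S))
                d[(S, c)] = T
                if T not in seen:
                    seen.add(T)
                    todo.append(T)
        new_finals = {S for S in seen if starts & S}
        return seen, d, start, new_finals

    # First pass: determinize the reversal (a DFA for the reversed language).
    s1, d1, q1, f1 = reverse_determinize(delta, {q0}, set(accepting))
    # Second pass: reverse and determinize again, giving the minimal DFA.
    return reverse_determinize(d1, {q1}, f1)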

NFA minimization

While the above procedures work for DFAs, the method of partitioning does not work for non-deterministic finite automata (NFAs).[8] While an exhaustive search may minimize an NFA, there is no polynomial-time algorithm to minimize general NFAs unless P=PSPACE, an unsolved conjecture in computational complexity theory which is widely believed to be false. However, there are methods of NFA minimization that may be more efficient than brute force search (Kameda & Weiner 1970).

See also

  • State encoding for low power

Notes

1. ^ Hopcroft, Motwani & Ullman (2001), Section 4.4.3, "Minimization of DFA's".
2. ^ Hopcroft & Ullman (1979), Section 3.4, Theorem 3.10, p. 67.
3. ^ Hopcroft, Motwani & Ullman (2001), Section 4.4.3, "Minimization of DFA's", p. 159, and p. 164 (remark after Theorem 4.26).
4. ^ Based on Corollary 10 of Knuutila (2001).
5. ^ Hopcroft (1971); Aho, Hopcroft & Ullman (1974).
6. ^ Berstel et al. (2010).
7. ^ For instance, the language of binary strings whose nth symbol is a one requires only n + 1 states, but its reversal requires 2ⁿ states. Leiss (1981) provides a ternary n-state DFA whose reversal requires 2ⁿ states, the maximum possible. For additional examples and the observation of the connection between these examples and the worst-case analysis of Brzozowski's algorithm, see Câmpeanu et al. (2001).
8. ^ Hopcroft, Motwani & Ullman (2001), Section 4.4, Figure labeled "Minimizing the States of an NFA", p. 163.

References

  • Aho, Alfred V.; Hopcroft, John E.; Ullman, Jeffrey D. (1974), "4.13 Partitioning", The Design and Analysis of Computer Algorithms, Addison-Wesley, pp. 157–162.
  • Berstel, Jean; Boasson, Luc; Carton, Olivier; Fagnot, Isabelle (2010), "Minimization of Automata", Automata: from Mathematics to Applications, European Mathematical Society, arXiv:1010.5318, Bibcode:2010arXiv1010.5318B.
  • Brzozowski, J. A. (1963), "Canonical regular expressions and minimal state graphs for definite events", Proc. Sympos. Math. Theory of Automata (New York, 1962), Polytechnic Press of Polytechnic Inst. of Brooklyn, Brooklyn, N.Y., pp. 529–561, MR 0175719.
  • Câmpeanu, Cezar; Culik, Karel, II; Salomaa, Kai; Yu, Sheng (2001), "State Complexity of Basic Operations on Finite Languages", 4th International Workshop on Automata Implementation (WIA '99), Lecture Notes in Computer Science, vol. 2214, Springer-Verlag, pp. 60–70, doi:10.1007/3-540-45526-4_6, ISBN 978-3-540-42812-1.
  • David, Julien (2012), "Average complexity of Moore's and Hopcroft's algorithms", Theoretical Computer Science, 417: 50–65, doi:10.1016/j.tcs.2011.10.011.
  • Hopcroft, John (1971), "An n log n algorithm for minimizing states in a finite automaton", Theory of machines and computations (Proc. Internat. Sympos., Technion, Haifa, 1971), Academic Press, New York, pp. 189–196, MR 0403320. See also preliminary version, Technical Report STAN-CS-71-190, Stanford University, Computer Science Department, January 1971.
  • Hopcroft, John E.; Ullman, Jeffrey D. (1979), Introduction to Automata Theory, Languages, and Computation, Addison-Wesley, Reading, MA, ISBN 978-0-201-02988-8.
  • Hopcroft, John E.; Motwani, Rajeev; Ullman, Jeffrey D. (2001), Introduction to Automata Theory, Languages, and Computation (2nd ed.), Addison-Wesley.
  • Kameda, Tsunehiko; Weiner, Peter (1970), "On the state minimization of nondeterministic finite automata", IEEE Transactions on Computers, 100 (7): 617–627, doi:10.1109/T-C.1970.222994.
  • Knuutila, Timo (2001), "Re-describing an algorithm by Hopcroft", Theoretical Computer Science, 250 (1–2): 333–363, doi:10.1016/S0304-3975(99)00150-4, MR 1795249.
  • Leiss, Ernst (1981), "Succinct representation of regular languages by Boolean automata", Theoretical Computer Science, 13 (3): 323–330, doi:10.1016/S0304-3975(81)80005-9, MR 603263, http://www.sciencedirect.com/science/article/pii/S0304397581800059/pdf?md5=ae550d58084acf5f9af3fac6b8b20106&pid=1-s2.0-S0304397581800059-main.pdf.
  • Leiss, Ernst (1985), "Succinct representation of regular languages by Boolean automata II", Theoretical Computer Science, 38: 133–136, doi:10.1016/0304-3975(85)90215-4, http://www.sciencedirect.com/science/article/pii/0304397585902154/pdf?md5=00582bb1bdc5f7a3af33b8c19479d3d3&pid=1-s2.0-0304397585902154-main.pdf.
  • Moore, Edward F. (1956), "Gedanken-experiments on sequential machines", Automata studies, Annals of mathematics studies, no. 34, Princeton University Press, Princeton, N. J., pp. 129–153, MR 0078059.
  • Sakarovitch, Jacques (2009), Elements of automata theory, translated from French by Reuben Thomas, Cambridge University Press, ISBN 978-0-521-84425-3, Zbl 1188.68177.

External links

  • DFA minimization using the Myhill–Nerode theorem
