Channel capacity


{{Information theory}}

Channel capacity, in electrical engineering, computer science and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.

Following the terms of the noisy-channel coding theorem, the channel capacity of a given channel is the highest information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.[1][2]

Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.[3]

The notion of channel capacity has been central to the development of modern wireline and wireless communication systems; novel error correction coding mechanisms now achieve performance very close to the limits promised by channel capacity.

Formal definition

The basic mathematical model for a communication system is the following:

$$W \xrightarrow{\ \text{Encoder } f_n\ } X^n \xrightarrow{\ \text{Channel } p(y|x)\ } Y^n \xrightarrow{\ \text{Decoder } g_n\ } \hat{W}$$

where:

  • $W$ is the message to be transmitted;
  • $X$ is the channel input symbol ($X^n$ is a sequence of $n$ symbols) taken in an alphabet $\mathcal{X}$;
  • $Y$ is the channel output symbol ($Y^n$ is a sequence of $n$ symbols) taken in an alphabet $\mathcal{Y}$;
  • $\hat{W}$ is the estimate of the transmitted message;
  • $f_n$ is the encoding function for a block of length $n$;
  • $p(y|x)$ is the noisy channel, which is modeled by a conditional probability distribution; and,
  • $g_n$ is the decoding function for a block of length $n$.

Let $X$ and $Y$ be modeled as random variables. Furthermore, let $p_{Y|X}(y|x)$ be the conditional probability distribution function of $Y$ given $X$, which is an inherent fixed property of the communication channel. Then the choice of the marginal distribution $p_X(x)$ completely determines the joint distribution $p_{X,Y}(x,y)$ due to the identity

$$p_{X,Y}(x,y) = p_{Y|X}(y|x)\,p_X(x),$$

which, in turn, induces a mutual information $I(X;Y)$. The channel capacity is defined as

$$C = \sup_{p_X(x)} I(X;Y),$$

where the supremum is taken over all possible choices of $p_X(x)$.
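
The supremum above is rarely available in closed form, but it can be evaluated numerically for a discrete memoryless channel. As a minimal sketch (not part of the original article), the snippet below scans input distributions for the binary symmetric channel, whose capacity has the well-known closed form $C = 1 - H_b(p)$; for larger alphabets the standard tool is the Blahut–Arimoto algorithm, used further below.

<syntaxhighlight lang="python">
import numpy as np

def mutual_information_bits(px, P):
    """I(X;Y) in bits for input distribution px and channel matrix P[x, y] = p(y|x)."""
    joint = px[:, None] * P               # p(x, y)
    py = joint.sum(axis=0)                # output marginal p(y)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(joint > 0, joint * np.log2(joint / (px[:, None] * py)), 0.0)
    return terms.sum()

# Binary symmetric channel with crossover probability 0.1.
p = 0.1
bsc = np.array([[1 - p, p], [p, 1 - p]])

# Scan input distributions (a, 1-a); the maximum over a is the capacity.
grid = np.linspace(0.001, 0.999, 999)
rates = [mutual_information_bits(np.array([a, 1 - a]), bsc) for a in grid]
print(max(rates))                                  # ~0.5310 bits/channel use, at a = 0.5
print(1 + p * np.log2(p) + (1 - p) * np.log2(1 - p))  # closed form 1 - H_b(0.1)
</syntaxhighlight>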

Additivity of channel capacity

Channel capacity is additive over independent channels.[4] This means that using two independent channels in a combined manner provides the same theoretical capacity as using them independently.

More formally, let $p_1$ and $p_2$ be two independent channels modelled as above: $p_1$ has an input alphabet $\mathcal{X}_1$ and an output alphabet $\mathcal{Y}_1$, and likewise $p_2$ has $\mathcal{X}_2$ and $\mathcal{Y}_2$.

We define the product channel $p_1 \times p_2$ as

$$\forall (x_1,x_2) \in \mathcal{X}_1 \times \mathcal{X}_2,\ (y_1,y_2) \in \mathcal{Y}_1 \times \mathcal{Y}_2,\quad (p_1 \times p_2)\big((y_1,y_2)\,\big|\,(x_1,x_2)\big) = p_1(y_1|x_1)\,p_2(y_2|x_2).$$

This theorem states:

$$C(p_1 \times p_2) = C(p_1) + C(p_2).$$

{{Proof|

We first show that $C(p_1 \times p_2) \geq C(p_1) + C(p_2)$.

Let $X_1$ and $X_2$ be two independent random variables. Let $Y_1$ be a random variable corresponding to the output of $X_1$ through the channel $p_1$, and $Y_2$ for $X_2$ through $p_2$.

By definition $C(p_1 \times p_2) = \sup_{p_{X_1,X_2}} I(X_1,X_2;Y_1,Y_2)$.

Since $X_1$ and $X_2$ are independent, as well as $p_1$ and $p_2$, $(X_1,Y_1)$ is independent of $(X_2,Y_2)$. We can apply the following property of mutual information:

$$I(X_1,X_2;Y_1,Y_2) = I(X_1;Y_1) + I(X_2;Y_2)$$

For now we only need to find a distribution $p_{X_1,X_2}$ such that $I(X_1,X_2;Y_1,Y_2) \geq I(X_1;Y_1) + I(X_2;Y_2)$. In fact, $\pi_1$ and $\pi_2$, two probability distributions for $X_1$ and $X_2$ achieving $C(p_1)$ and $C(p_2)$, suffice:

$$C(p_1 \times p_2) \geq I(X_1,X_2;Y_1,Y_2) = I(X_1;Y_1) + I(X_2;Y_2) = C(p_1) + C(p_2),$$

i.e.

$$C(p_1 \times p_2) \geq C(p_1) + C(p_2).$$

Now let us show that $C(p_1 \times p_2) \leq C(p_1) + C(p_2)$.

Let $\pi_{12}$ be some distribution for the channel $p_1 \times p_2$ defining $(X_1,X_2)$ and the corresponding output $(Y_1,Y_2)$. Let $\mathcal{X}_1$ be the alphabet of $X_1$, $\mathcal{Y}_1$ for $Y_1$, and analogously $\mathcal{X}_2$ and $\mathcal{Y}_2$.

By definition of mutual information, we have

$$I(X_1,X_2;Y_1,Y_2) = H(Y_1,Y_2) - H(Y_1,Y_2 \mid X_1,X_2) \leq H(Y_1) + H(Y_2) - H(Y_1,Y_2 \mid X_1,X_2).$$

Let us rewrite the last term of entropy:

$$H(Y_1,Y_2 \mid X_1,X_2) = \sum_{(x_1,x_2) \in \mathcal{X}_1 \times \mathcal{X}_2} \mathbb{P}(X_1,X_2 = x_1,x_2)\, H(Y_1,Y_2 \mid X_1,X_2 = x_1,x_2).$$

By definition of the product channel, $p_{Y_1,Y_2 \mid X_1,X_2}(y_1,y_2 \mid x_1,x_2) = p_{Y_1|X_1}(y_1|x_1)\, p_{Y_2|X_2}(y_2|x_2)$.

For a given pair $(x_1,x_2)$, the conditional distribution of $(Y_1,Y_2)$ factorizes, so we can rewrite $H(Y_1,Y_2 \mid X_1,X_2 = x_1,x_2)$ as:

$$H(Y_1,Y_2 \mid X_1,X_2 = x_1,x_2) = H(Y_1 \mid X_1 = x_1) + H(Y_2 \mid X_2 = x_2).$$

By summing this equality over all $(x_1,x_2)$, we obtain

$$H(Y_1,Y_2 \mid X_1,X_2) = H(Y_1 \mid X_1) + H(Y_2 \mid X_2).$$

We can now give an upper bound over mutual information:

$$I(X_1,X_2;Y_1,Y_2) \leq H(Y_1) - H(Y_1 \mid X_1) + H(Y_2) - H(Y_2 \mid X_2) = I(X_1;Y_1) + I(X_2;Y_2).$$

This relation is preserved at the supremum. Therefore

$$C(p_1 \times p_2) \leq C(p_1) + C(p_2).$$

Combining the two inequalities we proved, we obtain the result of the theorem:

$$C(p_1 \times p_2) = C(p_1) + C(p_2).$$
}}
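
The theorem can be checked numerically. The following self-contained sketch (not from the article) uses the Blahut–Arimoto algorithm, the standard iterative method for discrete memoryless channel capacity; the product channel's transition matrix is the Kronecker product of the two individual matrices, since $(p_1 \times p_2)((y_1,y_2)|(x_1,x_2)) = p_1(y_1|x_1)\,p_2(y_2|x_2)$.

<syntaxhighlight lang="python">
import numpy as np

def capacity(P, iters=2000):
    """Blahut-Arimoto capacity (bits) of a DMC with matrix P[x, y] = p(y|x)."""
    r = np.full(P.shape[0], 1.0 / P.shape[0])     # input distribution, start uniform
    for _ in range(iters):
        q = r[:, None] * P                        # joint p(x, y)
        q /= q.sum(axis=0, keepdims=True)         # posterior q(x|y)
        with np.errstate(divide="ignore", invalid="ignore"):
            r = np.exp(np.where(P > 0, P * np.log(q), 0.0).sum(axis=1))
        r /= r.sum()
    joint = r[:, None] * P
    py = joint.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(joint > 0, joint * np.log2(joint / (r[:, None] * py)), 0.0)
    return terms.sum()

# Two independent channels: a BSC(0.1) and an arbitrary asymmetric 2x3 channel.
P1 = np.array([[0.9, 0.1], [0.1, 0.9]])
P2 = np.array([[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]])

print(capacity(P1) + capacity(P2))   # sum of the individual capacities
print(capacity(np.kron(P1, P2)))     # capacity of the product channel: equal
</syntaxhighlight>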

Shannon capacity of a graph

{{main|Shannon capacity of a graph}}

If G is an undirected graph, it can be used to define a communications channel in which the symbols are the graph vertices, and two codewords may be confused with each other if their symbols in each position are equal or adjacent. The computational complexity of finding the Shannon capacity of such a channel remains open, but it can be upper bounded by another important graph invariant, the Lovász number.[5]
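
As a small computational sketch (using the third-party networkx package, an assumption not drawn from the article): for the pentagon $C_5$, codes of length $n$ are independent sets of the $n$-fold strong product of the graph with itself. Already at $n = 2$ the achievable rate per symbol improves from 2 to $5^{1/2} \approx 2.236$, which Lovász's bound shows is in fact optimal for $C_5$.

<syntaxhighlight lang="python">
import networkx as nx

# Confusability graph C5 (pentagon): 5 symbols, symbol i confusable with i±1 mod 5.
G = nx.cycle_graph(5)

# Codewords of length 2 are independent sets in the 2-fold strong product.
G2 = nx.strong_product(G, G)

def alpha(H):
    """Independence number, computed as a maximum clique of the complement."""
    return max(len(c) for c in nx.find_cliques(nx.complement(H)))

print(alpha(G))    # 2 -> one-shot code of size 2
print(alpha(G2))   # 5 -> length-2 code of size 5, i.e. sqrt(5) per symbol
# Lovász (1979): Theta(C5) = sqrt(5), so the length-2 code is already optimal.
</syntaxhighlight>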

Noisy-channel coding theorem

The noisy-channel coding theorem states that for any error probability ε > 0 and for any transmission rate R less than the channel capacity C, there is an encoding and decoding scheme transmitting data at rate R whose error probability is less than ε, for a sufficiently large block length. Also, for any rate greater than the channel capacity, the probability of error at the receiver is bounded away from zero and, by the strong converse, tends to one as the block length goes to infinity.

Example application

An application of the channel capacity concept to an additive white Gaussian noise (AWGN) channel with B Hz bandwidth and signal-to-noise ratio S/N is the Shannon–Hartley theorem:

$$C = B \log_2\!\left(1 + \frac{S}{N}\right)$$

C is measured in bits per second if the logarithm is taken in base 2, or nats per second if the natural logarithm is used, assuming B is in hertz; the signal and noise powers S and N are measured in watts or volts², so the signal-to-noise ratio here is expressed as a power ratio, not in decibels (dB); since figures are often cited in dB, a conversion may be needed. For example, a signal-to-noise ratio of 30 dB corresponds to a power ratio of $10^{30/10} = 10^3 = 1000$.
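
A brief numeric sketch of the theorem (the 3 kHz bandwidth and 30 dB figures below are illustrative choices, not from the article):

<syntaxhighlight lang="python">
import math

def shannon_hartley(bandwidth_hz, snr_db):
    """C = B log2(1 + S/N), with the SNR given in dB and converted
    to a plain power ratio before use."""
    snr = 10 ** (snr_db / 10)          # 30 dB -> power ratio of 1000
    return bandwidth_hz * math.log2(1 + snr)

# A telephone-grade 3 kHz channel at 30 dB SNR:
print(shannon_hartley(3000, 30))       # ~29900 bit/s
</syntaxhighlight>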

Channel capacity in wireless communications

This section focuses on the single-antenna, point-to-point scenario.[6] For channel capacity in systems with multiple antennas, see the article on MIMO.

Bandlimited AWGN channel

{{main|Shannon–Hartley theorem}}

If the average received power is $\bar{P}$ [W], the total bandwidth is $W$ [Hz], and the noise power spectral density is $N_0$ [W/Hz], the AWGN channel capacity is

$$C_{\text{AWGN}} = W \log_2\!\left(1 + \frac{\bar{P}}{N_0 W}\right) \ \text{[bits/s]},$$

where $\bar{P}/(N_0 W)$ is the received signal-to-noise ratio (SNR). This result is known as the Shannon–Hartley theorem.[7]

When the SNR is large (SNR ≫ 0 dB), the capacity $C \approx W \log_2 \frac{\bar{P}}{N_0 W}$ is logarithmic in power and approximately linear in bandwidth. This is called the bandwidth-limited regime.

When the SNR is small (SNR ≪ 0 dB), the capacity $C \approx \frac{\bar{P}}{N_0 \ln 2}$ is linear in power but insensitive to bandwidth. This is called the power-limited regime.

The bandwidth-limited regime and power-limited regime are illustrated in the sketch below.
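
In the following sketch the power and noise figures are arbitrary values chosen for illustration (not from the article); as the bandwidth grows, the capacity saturates at the power-limited ceiling $\bar{P}/(N_0 \ln 2)$.

<syntaxhighlight lang="python">
import numpy as np

P = 1e-6     # received power, W   (assumed value for illustration)
N0 = 1e-9    # noise PSD, W/Hz     (assumed value for illustration)

def c_awgn(W):
    """AWGN capacity in bit/s for bandwidth W Hz."""
    return W * np.log2(1 + P / (N0 * W))

for W in [1e2, 1e3, 1e4, 1e5, 1e6, 1e7]:
    print(f"W = {W:10.0f} Hz   C = {c_awgn(W):10.1f} bit/s")

# As W -> infinity the capacity saturates at P/(N0 ln 2): the power-limited regime.
print(P / (N0 * np.log(2)))            # ~1442.7 bit/s
</syntaxhighlight>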

Frequency-selective AWGN channel

The capacity of the frequency-selective channel is given by so-called water filling power allocation,

$$C_{N_c} = \sum_{n=0}^{N_c - 1} \log_2\!\left(1 + \frac{P_n^* |\bar{h}_n|^2}{N_0}\right),$$

where $P_n^* = \max\!\left(\frac{1}{\lambda} - \frac{N_0}{|\bar{h}_n|^2},\, 0\right)$ and $\bar{h}_n$ is the gain of subchannel $n$, with $\lambda$ chosen to meet the power constraint.
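
A minimal sketch of water filling, assuming a handful of illustrative subchannel gains and finding the water level $1/\lambda$ by bisection (one of several standard ways to meet the power constraint):

<syntaxhighlight lang="python">
import numpy as np

def water_filling(gains, N0, P_total):
    """Allocate P_n* = max(1/lambda - N0/|h_n|^2, 0), with the water
    level 1/lambda found by bisection so that sum(P_n*) = P_total."""
    noise = N0 / np.abs(gains) ** 2            # per-subchannel noise floor
    lo, hi = 0.0, P_total + noise.max()        # bracket for the water level
    for _ in range(100):
        level = (lo + hi) / 2
        powers = np.maximum(level - noise, 0.0)
        if powers.sum() > P_total:
            hi = level
        else:
            lo = level
    capacity = np.sum(np.log2(1 + powers * np.abs(gains) ** 2 / N0))
    return powers, capacity                    # bits/s/Hz over all subchannels

h = np.array([1.0, 0.7, 0.3, 0.1])             # assumed subchannel gains
p, c = water_filling(h, N0=1.0, P_total=4.0)
print(p, c)   # strong subchannels get more power; the weakest may get none
</syntaxhighlight>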

Slow-fading channel

In a slow-fading channel, where the coherence time is greater than the latency requirement, there is no definite capacity as the maximum rate of reliable communications supported by the channel, $\log_2(1 + |h|^2\, \mathrm{SNR})$, depends on the random channel gain $|h|^2$, which is unknown to the transmitter. If the transmitter encodes data at rate $R$ [bits/s/Hz], there is a non-zero probability that the decoding error probability cannot be made arbitrarily small,

$$p_{\text{out}} = \mathbb{P}\!\left(\log_2(1 + |h|^2\, \mathrm{SNR}) < R\right),$$

in which case the system is said to be in outage. With a non-zero probability that the channel is in deep fade, the capacity of the slow-fading channel in strict sense is zero. However, it is possible to determine the largest value of $R$ such that the outage probability $p_{\text{out}}$ is less than $\epsilon$. This value is known as the $\epsilon$-outage capacity.

Fast-fading channel

In a fast-fading channel, where the latency requirement is greater than the coherence time and the codeword length spans many coherence periods, one can average over many independent channel fades by coding over a large number of coherence time intervals. Thus, it is possible to achieve a reliable rate of communication of $\mathbb{E}\big[\log_2(1 + |h|^2\, \mathrm{SNR})\big]$ [bits/s/Hz] and it is meaningful to speak of this value as the capacity of the fast-fading channel.
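
Under the same illustrative Rayleigh assumption as above, this ergodic capacity is a simple Monte Carlo average:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
snr = 10.0                                    # 10 dB average SNR as a power ratio
h2 = rng.exponential(1.0, size=1_000_000)     # Rayleigh fading: |h|^2 ~ Exp(1)
ergodic = np.mean(np.log2(1 + h2 * snr))      # E[log2(1 + |h|^2 SNR)]
print(ergodic)                                # ~2.9 bits/s/Hz
print(np.log2(1 + snr))                       # non-fading AWGN benchmark, ~3.46
</syntaxhighlight>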

See also

  • Bandwidth (computing)
  • Bandwidth (signal processing)
  • Bit rate
  • Code rate
  • Error exponent
  • Nyquist rate
  • Negentropy
  • Redundancy
  • Sender, Encoder, Decoder, Receiver
  • Shannon–Hartley theorem
  • Spectral efficiency
  • Throughput

Advanced Communication Topics

  • MIMO
  • Cooperative diversity

External links

  • {{springer|title=Transmission rate of a channel|id=p/t093890}}
  • AWGN Channel Capacity with various constraints on the channel input (interactive demonstration)

References

1. ^{{cite web |url=http://www.cs.ucl.ac.uk/staff/S.Bhatti/D51-notes/node31.html |author=Saleem Bhatti |title=Channel capacity |work=Lecture notes for M.Sc. Data Communication Networks and Distributed Systems D51 -- Basic Communications and Networks |deadurl=yes |archiveurl=https://web.archive.org/web/20070821212637/http://www.cs.ucl.ac.uk/staff/S.Bhatti/D51-notes/node31.html |archivedate=2007-08-21 |df= }}
2. ^{{cite web | url = http://www.st-andrews.ac.uk/~www_pa/Scots_Guide/iandm/part8/page1.html | title = Signals look like noise! | author = Jim Lesurf | work = Information and Measurement, 2nd ed.}}
3. ^{{cite book| author = Thomas M. Cover, Joy A. Thomas | title = Elements of Information Theory | publisher = John Wiley & Sons, New York |year=2006}}
4. ^{{cite book |last1=Cover |first1=Thomas M. |last2=Thomas |first2=Joy A. |title=Elements of Information Theory |publisher=Wiley-Interscience |edition=Second |date=2006 |pages=206-207 |chapter=Chapter 7: Channel Capacity |isbn=978-0-471-24195-9}}
5. ^{{citation | first = László | last = Lovász | authorlink = László Lovász | title = On the Shannon Capacity of a Graph | journal = IEEE Transactions on Information Theory | volume = IT-25 | issue = 1 | year = 1979 | doi = 10.1109/tit.1979.1055985 }}.
6. ^{{citation | author = David Tse, Pramod Viswanath | title = Fundamentals of Wireless Communication | publisher = Cambridge University Press, UK | year=2005}}
7. ^{{cite book|title=The Handbook of Electrical Engineering|year=1996|publisher=Research & Education Association|isbn=9780878919819|page=D-149|url=https://books.google.com/books?id=-WJS3VnvomIC&pg=RA1-SL4-PA41&dq=%22Shannon%E2%80%93Hartley+theorem%22&hl=en&sa=X&ei=7PVqUqaCNcPs2wXMmIGADw&ved=0CC4Q6AEwAA#v=onepage&q=%22Shannon%E2%80%93Hartley%20theorem%22&f=false}}
{{Mobile phones}}{{Refimprove|date=January 2008}}

Categories: Information theory | Telecommunication theory | Television terminology
