
The bandwidth of the communication channel. Capacity of information transmission systems

Capacity of information transmission systems

In addition to those listed above, one of the main characteristics of any information transmission system is its capacity.

Bandwidth is the maximum possible amount of useful information transmitted per unit of time:

c = I_max / T_C, [c] = bit/s.

Sometimes the information transfer rate is defined as the maximum amount of useful information carried by one elementary signal:

s = I_max / n, [s] = bit/element.

The considered characteristics depend only on the communication channel and its characteristics and do not depend on the source.

The bandwidth of a discrete communication channel without interference. In a communication channel without interference, information can be transmitted by a non-redundant signal. In this case the number of symbols n = m, and the elementary-signal entropy is H_Cmax = log K, where K is the size of the signal alphabet.

max(I_C) = n·H_Cmax = m·H_Cmax.

Here τ₀ denotes the chip (elementary signal) duration and F_C the width of the signal spectrum.

Communication channel bandwidth without interference

Let us introduce the rate at which the source of information generates elementary signals: B = n / T_C = 1/τ₀.

Then, using this new quantity, the formula for the information transfer rate can be rewritten as

c = B·H_Cmax = B·log K.

The resulting formula determines the maximum possible information transfer rate in a discrete communication channel without interference. This follows from the assumption that the entropy of the signal is maximal.

If H_C < H_Cmax, then c = B·H_C, and this is not the maximum possible rate for the given communication channel.
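As a quick illustration of these relations, here is a minimal sketch (Python; the values of B, K and the symbol probabilities are assumptions chosen only for the example) that computes c = B·H_C for a redundant signal and compares it with the noiseless-channel maximum c = B·log K:

```python
import math

def entropy(probs):
    """Entropy of one elementary signal, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

B = 1000                             # elementary signals generated per second (assumed)
K = 4                                # alphabet size of the elementary signal (assumed)

p_uniform = [1 / K] * K              # non-redundant signal: H_C = H_Cmax = log2(K)
p_skewed = [0.7, 0.1, 0.1, 0.1]      # redundant signal: H_C < H_Cmax

c_max = B * entropy(p_uniform)       # maximum possible rate, B * log2(K)
c = B * entropy(p_skewed)            # actual rate, B * H_C

print(f"c_max = {c_max:.0f} bit/s, c = {c:.0f} bit/s")
```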

The bandwidth of a discrete communication channel with interference. In a discrete communication channel with interference, the situation is as shown in Fig. 6.

Considering the property of additivity, as well as Shannon's formulas for determining the amount of information discussed above, we can write

I_C = T_C·F_C·log(a_K·P_C),

I_INT = T_P·F_P·log(a_K·P_P),

where P_C and P_P are the powers of the signal and of the interference.

For the receiver, the source of useful information and the source of interference are equivalent; therefore, on the receiving side it is impossible to separate the interference component from the resulting signal information.

I_RES = T_C·F_C·log[a_K·(P_P + P_C)], if T_C = T_P and F_C = F_P.

The receiver can be narrowband, and the interference is in other frequency ranges. In this case, it will not affect the signal.

We will determine the resulting useful information for the most “unpleasant” case, when the parameters of the signal and of the noise are close to each other or coincide. The useful information is defined by the expression

I_USEFUL = I_RES − I_INT = T_C·F_C·log[(P_C + P_P)/P_P],

so that the transmission rate is

c = F_C·log(1 + P_C/P_P).

This formula was obtained by Shannon. It determines the rate of information transmission over a communication channel in which the signal has power P_C and the interference has power P_P. All messages at this rate are transmitted with absolute reliability. The formula does not answer the question of how such a rate can be achieved, but it gives the maximum possible value of c in a communication channel with interference, that is, the transmission rate at which the received information is still absolutely reliable. In practice it is more economical to allow a certain fraction of errors in the message, since the transmission rate then increases.

Consider the case P_C >> P_P. Introducing the signal-to-noise ratio ρ = P_C/P_P, the condition P_C >> P_P means that ρ >> 1. Then

c ≈ F_C·log ρ.

The resulting formula reflects the maximum rate for a strong signal in the communication channel. If P_C << P_P, then c tends to zero; that is, the signal is received against a background of interference, and in such a channel no information can be obtained per unit of time. In real situations the interference can never be filtered out completely, so the receiver obtains the useful information together with a certain number of erroneous symbols. For this situation the communication channel can be represented as shown in Fig. 7, taking the information source to be the set of transmitted symbols {X} and the receiver to be the set of received symbols {Y}.
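The following small sketch (Python; the bandwidth and the signal-to-noise ratios are assumptions chosen for illustration) compares the exact Shannon rate with the high-SNR approximation c ≈ F_C·log₂ρ discussed above:

```python
import math

F_C = 3_000.0                                  # signal spectrum width, Hz (assumed)

for rho in (0.1, 1, 10, 100, 1000):            # signal-to-noise ratio P_C / P_P
    c_exact = F_C * math.log2(1 + rho)         # c = F_C * log2(1 + P_C/P_P)
    c_approx = F_C * math.log2(rho) if rho > 1 else 0.0   # valid only when rho >> 1
    print(f"rho = {rho:>6}: exact = {c_exact:8.1f} bit/s, approx = {c_approx:8.1f} bit/s")
```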

Fig. 7 Graph of transition probabilities of the K-ary communication channel

There is a definite correspondence between the transmitted symbols {X} and the received symbols {Y}. If there is no interference, the probability of a one-to-one correspondence is equal to one; otherwise it is less than one.

If q_i is the probability of receiving y_i when x_i was transmitted, and p_ij = p(y_j/x_i) are the error probabilities, then

q_i + Σ_{j≠i} p_ij = 1.

The graph of transition probabilities reflects the final result of the influence of interference on the signal. As a rule, it is obtained experimentally.

The useful information can be estimated as I_USEFUL = n·H(X·Y), where n is the number of elementary symbols in the signal and H(X·Y) is the mutual entropy of source X and source Y.

In this case, source X is the source of the useful information and source Y is the receiver. The relation defining the useful information follows from the meaning of the mutual entropy: the shaded region of the diagram corresponds to the messages transmitted by source X and received by receiver Y; the open regions correspond to the signals of source X that did not reach the receiver and to the extraneous signals picked up by the receiver that were not transmitted by the source.

The transmission rate is then c = B·H(X·Y), where B is the rate of generation of elementary symbols at the source output.

To obtain the maximum of c, one must increase H(Y) and, where possible, decrease H(Y/X). Graphically, this corresponds to increasing the overlap of the circles in the diagram (Fig. 2d).

If the circles do not intersect at all, X and Y exist independently of each other. In what follows it will be shown how the general expression for the maximum transmission rate can be used in the analysis of specific communication channels.
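As a sketch of how these quantities are evaluated in practice, the snippet below (Python; the input distribution and the transition matrix are assumptions chosen for illustration) computes H(Y), H(Y/X) and the mutual entropy H(X·Y) = H(Y) − H(Y/X) for a discrete channel specified by its transition probabilities:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = [0.5, 0.3, 0.2]                  # input symbol probabilities (assumed)
P_y_given_x = [                        # transition matrix p(y_j / x_i) (assumed)
    [0.90, 0.05, 0.05],
    [0.05, 0.90, 0.05],
    [0.05, 0.05, 0.90],
]

# output distribution p(y_j) = sum_i p(x_i) * p(y_j / x_i)
p_y = [sum(p_x[i] * P_y_given_x[i][j] for i in range(len(p_x)))
       for j in range(len(P_y_given_x[0]))]

H_y = entropy(p_y)
H_y_x = sum(p_x[i] * entropy(P_y_given_x[i]) for i in range(len(p_x)))
I_xy = H_y - H_y_x                     # mutual entropy H(X*Y)

print(f"H(Y) = {H_y:.3f} bit, H(Y/X) = {H_y_x:.3f} bit, H(X*Y) = {I_xy:.3f} bit/symbol")
```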

When characterizing a discrete channel, two concepts of speed are used: technical and informational.

The technical transmission rate RT, also called the keying rate, is the number of symbols (chips) transmitted over the channel per unit of time. It depends on the properties of the communication line and the speed of the channel equipment.

Taking into account the differences in symbol durations, the technical rate is defined as

R_T = 1/τ_av,

where τ_av is the average symbol duration.

The unit of measurement is the baud; one baud corresponds to the transmission of one symbol per second.

The information rate, or information transmission rate, is determined by the average amount of information transmitted over the channel per unit of time. It depends both on the characteristics of the particular channel (such as the size of the alphabet of symbols used, the technical rate of their transmission, and the statistical properties of noise in the line) and on the probability distribution of the symbols arriving at the input and their statistical relationships.

With a known keying rate, the information transmission rate over the channel is given by

R = R_T·Ī,

where Ī is the average amount of information carried by one symbol.
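A minimal sketch of this distinction (Python; the symbol durations and probabilities are assumptions for illustration) computes the technical rate from the average symbol duration and then the information rate R = R_T·Ī:

```python
import math

durations = [0.5e-3, 1.0e-3, 1.5e-3, 1.0e-3]    # symbol durations, s (assumed)
probs     = [0.4, 0.3, 0.2, 0.1]                # symbol probabilities (assumed)

tau_avg = sum(p * t for p, t in zip(probs, durations))   # average symbol duration
R_T = 1 / tau_avg                                        # technical (keying) rate, baud

I_avg = -sum(p * math.log2(p) for p in probs)            # average information per symbol, bit
R = R_T * I_avg                                          # information rate, bit/s

print(f"R_T = {R_T:.0f} baud, I_avg = {I_avg:.2f} bit/symbol, R = {R:.0f} bit/s")
```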



For practice, it is important to find out to what limit and in what way it is possible to increase the speed of information transfer over a specific channel. The limiting capabilities of a channel for transmitting information are characterized by its bandwidth.

The bandwidth of a channel with given transition probabilities is equal to the maximum of the transmitted information over all input distributions of the symbols of source X:

C = max I(X, Y).

From a mathematical point of view, finding the throughput of a discrete memoryless channel reduces to finding the probability distribution of the input symbols of source X that maximizes the transmitted information I(X, Y). A constraint is imposed on the probabilities of the input symbols: p(x_i) ≥ 0, Σ_i p(x_i) = 1.

In the general case, finding the maximum under the given constraints is possible with the method of Lagrange multipliers. However, such a solution is prohibitively expensive.

In the special case of discrete symmetric channels without memory, the bandwidth (the maximum of the transmitted information) is achieved with a uniform distribution of the input symbols of source X.

Then, for a DSC without memory, assuming a given error probability ε and equiprobable input symbols p(x_1) = p(x_2) = 1/2, the throughput of such a channel is obtained from the known expression:

C = 1 − H(ε) [bit/symbol],

where H(ε) = −ε·log₂ε − (1 − ε)·log₂(1 − ε) is the entropy of the binary symmetric channel for a given error probability ε.
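A minimal sketch (Python; the error probabilities are assumptions chosen for illustration) of the per-symbol throughput C = 1 − H(ε) of a binary symmetric channel:

```python
import math

def binary_entropy(eps):
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

for eps in (0.0, 0.01, 0.1, 0.5):
    C = 1 - binary_entropy(eps)          # bit per symbol
    print(f"eps = {eps:<5}: C = {C:.4f} bit/symbol")
```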

The boundary cases are of interest:

1. Transmission of information over a noiseless channel (no interference):

C = 1 − H(0) = 1 [bit/symbol].

With fixed basic technical characteristics of the channel (for example, frequency band, average and peak transmitter power), which determine the value of the technical rate, the throughput of the channel without interference is C = R_T [bit/s].

In any communication system, information is transmitted through the channel. Its transmission rate was defined in § 4.2. As can be seen from (4.25), this rate depends not only on the channel itself, but also on the properties of the signal supplied to its input, and therefore cannot characterize the channel as a means of information transmission. Let's try to find a way to estimate the channel's ability to transmit information. Let us first consider a discrete channel through which v symbols from an alphabet of size m are transmitted per unit time. When transmitting each symbol, on average, the amount of information passes through the channel

I(A, B) = H(A) − H(A|B) = H(B) − H(B|A), (4.35)

where A and B are random symbols at the input and output of the channel. Of the four entropies appearing here, H(A), the intrinsic information of the transmitted symbol, is determined by the source of the discrete signal* and does not depend on the properties of the channel. The other three entropies generally depend on both the signal source and the channel.

* (The source of a discrete signal in a communication system (see Fig. 1.5) is a combination of a message source and an encoder.)

Let us imagine that symbols from different sources, characterized by different probability distributions P(A), can be fed to the channel input (but, of course, with the same values of m and v). For each such source, the amount of information transmitted over the channel takes on its own value. The maximum amount of transmitted information, taken over all possible sources of the input signal, characterizes the channel itself and is called the channel bandwidth per symbol:

C_symbol = max I(A, B), (4.36)

where maximization* is performed over all multivariate probability distributions P(A). One can also define the bandwidth C of the channel per unit of time (for example, per second):

C = v·C_symbol. (4.37)

* (If such a maximum does not exist (which can happen when the number of possible sources is infinite), the throughput is defined as the least upper bound sup I(A, B), that is, a value which I(A, B) can approach arbitrarily closely but cannot exceed.)

Equality (4.37) follows from the additivity of entropy. In what follows, wherever it is not specifically stated otherwise, throughput will mean the throughput per second.

As an example, let us calculate the throughput of a symmetric memoryless channel for which the transition probabilities are given by (3.36). According to (4.36),

C_symbol = max[H(B) − H(B|A)].

The quantity H(B|A) in this case is easy to calculate, since the conditional (transition) probability P(b_j|a_i) takes only two values: p/(m − 1) if b_j ≠ a_i and 1 − p if b_j = a_i. The first of these values occurs with probability p, and the second with probability 1 − p. In addition, since a memoryless channel is considered, the results of receiving individual symbols are independent of each other. So

H(B|A) = −(1 − p)·log(1 − p) − p·log[p/(m − 1)]. (4.38)

Consequently, H (B | A) does not depend on the probability distribution in the ensemble A, but is determined only by the transition probabilities of the channel. This property is retained for all channel models with additive noise.

Substituting (4.38) into (4.37), we obtain

C = v·max{H(B) + (1 − p)·log(1 − p) + p·log[p/(m − 1)]}.

Since on the right-hand side only the term H(B) depends on the probability distribution P(A), it is this term that must be maximized. The maximum value of H(B), according to (4.6), is equal to log m, and it is attained when all received symbols b_j are equally probable and independent of each other. It is easy to verify that this condition is satisfied if the input symbols are equiprobable and independent, since in this case P(b_j) = Σ_i P(a_i)·P(b_j|a_i) = 1/m.

Moreover, H(B) = log m and

C_symbol = log m + (1 − p)·log(1 − p) + p·log[p/(m − 1)].

Hence the throughput per unit of time is

C = v{log m + (1 − p)·log(1 − p) + p·log[p/(m − 1)]}.

For a binary symmetric channel (m = 2) the bandwidth in binary units per unit of time is

C = v[1 + p·log₂p + (1 − p)·log₂(1 − p)]. (4.42)

The dependence of C / v on p according to (4.42) is shown in Fig. 4.3.

For p = 1/2 the bandwidth of the binary channel C = 0, since with such an error probability the sequence of output binary symbols can be obtained without transmitting any signals through the channel at all, simply by choosing them at random (for example, according to the results of a coin toss); that is, at p = 1/2 the sequences at the output and at the input of the channel are independent. The case C = 0 is called a channel break. The fact that the throughput at p = 1 in a binary channel is the same as at p = 0 (a channel without noise) is explained by the fact that at p = 1 it is sufficient to invert all the output symbols (i.e., to replace 0 with 1 and 1 with 0) in order to restore the input signal correctly.

The bandwidth of a continuous channel is calculated in a similar way. Let, for example, the channel have a limited bandwidth of width F. Then the signals U(t) and Z(t) at the input and output of the channel, according to the Kotelnikov theorem, are determined by their samples taken at intervals of 1/(2F), and therefore the information passing over the channel during a time T is equal to the sum of the amounts of information transmitted for each such sample*. The channel bandwidth per one such sample is

C_sample = max[h(Z) − h(Z|U)]. (4.43)

Here U and Z are random variables - the cross sections of the processes U (t) and Z (t) at the input and output of the channel, respectively, and the maximum is taken over all admissible input signals, i.e., over all distributions U.

* (Instead of the Kotelnikov series, one can use the decomposition of signals in any orthogonal basis and consider the amount of transmitted information for each member of the series.)

The throughput C is defined as the sum of the values of C_sample taken over all samples per second. In this case, of course, the differential entropies in (4.43) must be calculated taking into account the probabilistic connections between the samples.

Let us calculate, for example, the bandwidth of a continuous memoryless channel with additive white Gaussian noise having a bandwidth of width F, if the average signal power (the variance of U) does not exceed a given value P_s. The power (variance) of the noise in the band F is denoted by P_w. The samples of the input and output signals, as well as of the noise N, are related by the equality

Z = U + N. (4.44)

Since N has a normal distribution with zero mathematical expectation, the conditional probability density w(z|u) for a fixed u will also be normal, with mathematical expectation u and variance P_w.

Let us find the throughput per sample (4.43). According to (4.34), the differential entropy h(Z|U) of the normal distribution w(z|u) does not depend on the mathematical expectation and is equal to

h(Z|U) = (1/2)·log(2πe·P_w).

Therefore, to find C_sample, one should find the distribution density w(U) at which h(Z) is maximized. From (4.44), taking into account that U and N are independent random variables, we have for the variances:

D(Z) = D(U) + D(N) = P_s + P_w. (4.45)

Thus, the variance of Z is fixed, since P_s and P_w are given. As noted (see page 114), for a fixed variance the maximum differential entropy is provided by a normal distribution. It can be seen from (4.44) that for a normal one-dimensional distribution of U the distribution of Z will also be normal, and therefore the maximum differential entropy (4.34) is attained:

h(Z)_max = (1/2)·log[2πe(P_s + P_w)], whence

C_sample = (1/2)·log(1 + P_s/P_w). (4.46)

Moving on to the throughput C per second, we note that the information transmitted over several samples is maximal when the signal samples are independent. This can be achieved if the signal U(t) is chosen so that its spectral density is uniform in the band F. As was shown in § 2.2 [cf. (2.48)], samples separated by intervals that are multiples of 1/(2F) are mutually uncorrelated, and for Gaussian quantities uncorrelatedness means independence.

Therefore, the throughput C (per second) can be found by adding the throughputs (4.46) for 2F independent samples:

C = 2F·C_sample = F·log(1 + P_s/P_w). (4.47)

It is realized if U (t) is a Gaussian process with a uniform spectral density in the frequency band F (quasi-white noise).

From (4.47) it can be seen that if the signal power P_s were not limited, the throughput would be arbitrarily large. The throughput is zero if the signal-to-noise ratio P_s/P_w in the channel is zero. As this ratio grows, the throughput increases without bound, but slowly, owing to the logarithmic dependence.

Relation (4.47) is often called Shannon's formula. This formula is important in information theory, since it determines the dependence of the throughput of the continuous channel under consideration on such technical characteristics as the bandwidth and the signal-to-noise ratio. Shannon's formula indicates the possibility of exchanging bandwidth for signal power, and vice versa. However, since C depends on F linearly and on P_s/P_w only logarithmically, it is usually not advantageous to compensate for a possible reduction in bandwidth by increasing the signal power. The reverse exchange of signal power for bandwidth is more efficient.
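The bandwidth-for-power exchange mentioned above can be illustrated with a short sketch (Python; the bandwidth and signal-to-noise ratio are assumptions chosen for the example): to keep the same throughput with half the bandwidth, the required signal-to-noise ratio must roughly be squared, which is why trading power for bandwidth is usually the more efficient direction:

```python
import math

F = 3_000.0                  # original bandwidth, Hz (assumed)
snr = 100.0                  # original P_s / P_w (assumed)

C = F * math.log2(1 + snr)   # Shannon's formula (4.47)
print(f"F = {F:.0f} Hz, SNR = {snr:.0f}  ->  C = {C:.0f} bit/s")

# Same C with half the bandwidth: (1 + snr_new) must equal (1 + snr)**2,
# i.e. the signal power has to grow roughly quadratically.
F_half = F / 2
snr_new = (1 + snr) ** (F / F_half) - 1
C_check = F_half * math.log2(1 + snr_new)
print(f"F = {F_half:.0f} Hz needs SNR = {snr_new:.0f} for the same C = {C_check:.0f} bit/s")
```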

Note that for P_s/P_w >> 1 expression (4.50) coincides with the characteristic (1.2), called in § 1.2 the capacity (volume) of the channel.

It should be emphasized that Shannon's formula (4.47) is valid only for a channel with constant parameters and additive Gaussian white (or quasi-white) noise. If the distribution of the additive interference is not normal or its spectrum is uneven in the channel bandwidth, then its bandwidth is greater than that calculated by formula (4.47). Multiplicative interference (signal fading) usually degrades the bandwidth of the channel.

Fig. 4.5 shows the dependence of C/F on the average ratio P_s/P_w for a channel with constant parameters (1) and for a channel with Rayleigh fading (2). Analysis of the curves shows that slow Rayleigh fading reduces the channel bandwidth by no more than 17%.

5.1. Information transfer rate in a discrete communication system

In a discrete communication system, in the absence of interference, the information at the output of the communication channel (the information transmission channel) completely coincides with the information at its input; therefore, the information transfer rate is numerically equal to the performance of the message source:

R = H'(x). (5.1)

In the presence of interference, part of the information from the source is lost and the information transfer rate turns out to be lower than the performance of the source. At the same time, information about interference is added to the message at the channel output (Fig. 12).

Therefore, in the presence of interference, not all of the information produced by the source must be counted at the channel output, but only the mutual information:

R = I'(x, y) bit/s. (5.2)

Based on formula (5.1), we have

R = H'(x) − H'(x/y) = H'(y) − H'(y/x), (5.3)

where H'(x) is the source performance;

H'(x/y) is the "unreliability" of the channel (the loss) per unit of time;

H'(y) is the entropy of the output message per unit of time;

H'(y/x) = H'(n) is the entropy of the interference (noise) per unit of time.

The bandwidth of the communication channel (information transmission channel) C is the maximum possible rate of information transmission over the channel:

C = max R. (5.4)

The maximum is taken over all possible sources at the channel input and over all possible encoding methods.

Thus, the throughput of the communication channel is equal to the maximum performance of the source at the input of the channel, fully matched to the characteristics of this channel, minus the loss of information in the channel due to interference.

In a channel without interference C = max H'(x), since H'(x/y) = 0. When a uniform code with base k, consisting of n elements of duration τ_e, is used, then in a channel without interference

C = (n·log₂k)/(n·τ_e) = (log₂k)/τ_e;

at k = 2,

C = 1/τ_e bit/s. (5.5)

For the effective use of the channel capacity, it must be matched with the source of information at the input. Such matching is possible both for communication channels without interference, and for channels with interference on the basis of two theorems proved by K. Shannon.

1st theorem (for a communication channel without interference):

If a message source has entropy H (bits per symbol) and the communication channel has a bandwidth C (bits per second), then the messages can always be encoded in such a way that they are transmitted over the channel at an average rate arbitrarily close to C but not exceeding it.

K. Shannon also proposed a method of such coding, which is called statistical or optimal coding. Later the idea of such coding was developed in the works of Fano and Huffman and is now widely used in practice for “message compression”.
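As a sketch of the statistical (optimal) coding idea mentioned above, the snippet below (Python) builds a Huffman prefix code; the source alphabet and its probabilities are assumptions chosen so that the average code length reaches the source entropy:

```python
import heapq

def huffman_code(probs):
    """Build a binary prefix code for a dict {symbol: probability}."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)              # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # assumed source statistics
code = huffman_code(source)
avg_len = sum(p * len(code[s]) for s, p in source.items())
print(code, f"average length = {avg_len:.3f} bit/symbol")
```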

5.2. Bandwidth of a homogeneous symmetric communication channel

In a homogeneous communication channel, the conditional (transition) probabilities p(y_j/x_i) do not depend on time. The graph of states and transitions of a homogeneous binary communication channel is shown in Fig. 13.

In this figure, x_1 and x_2 are the signals at the input of the communication channel, and y_1 and y_2 are the output signals. If signal x_1 was transmitted and signal y_1 was received, the first signal (index 1) was not distorted. If the first signal (x_1) was transmitted and the second signal (y_2) was received, the first signal was distorted. The transition probabilities are shown in Fig. 13. If the channel is symmetric, the transition probabilities are pairwise equal.

Let us denote p(y_2/x_1) = p(y_1/x_2) = p_e, the probability of distortion of a signal element, and p(y_1/x_1) = p(y_2/x_2) = 1 − p_e, the probability of correct reception of a signal element.

In accordance with formulas (5.1) and (5.3),

R = B·[H(y) − H(y/x)].

If the signals x_1 and x_2 have the same duration τ_e, then B = 1/τ_e. The channel capacity is then

C = (1/τ_e)·max[H(y) − H(y/x)]. (5.7)

In this formula max H(y) = log k. For a binary channel (k = 2) max H(y) = 1, and formula (5.4) takes the form

C = (1/τ_e)·[1 − H(y/x)]. (5.8)

It remains to determine the conditional entropy H(y/x). For a binary source we have

H(y/x) = −p_e·log₂p_e − (1 − p_e)·log₂(1 − p_e).

Substituting this value of the conditional entropy into (5.8), we finally obtain

C = (1/τ_e)·[1 + p_e·log₂p_e + (1 − p_e)·log₂(1 − p_e)]. (5.9)


Fig. 14 shows the dependence of the binary channel throughput on the error probability.

For a communication channel with k > 2 the throughput is determined by a similar formula:

C = (1/τ_e)·[log₂k + p_e·log₂(p_e/(k − 1)) + (1 − p_e)·log₂(1 − p_e)] bit/s.
In conclusion, consider an example. Let there be a binary source with a performance of 1000 bit/s.

If the probability of distortion is p_e = 0.01, then out of 1000 signal elements transmitted in one second, on average 990 elements will be received without distortion and only 10 elements will be distorted. It would seem that the throughput in this case would be 990 bit/s. However, the calculation by formula (5.9) gives a much smaller value (C = 919 bit/s). What is the matter here? The point is that we would obtain C = 990 bit/s if we knew exactly which elements of the message were corrupted. Not knowing this (and it is practically impossible to know) means that 10 corrupted elements devalue the received message so much that the throughput decreases sharply.
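A minimal check of this example with formula (5.9) (Python; v = 1000 elements per second and p_e as in the text):

```python
import math

def bsc_throughput(v, p_e):
    """Formula (5.9): C = v * [1 + p*log2(p) + (1-p)*log2(1-p)]."""
    if p_e in (0.0, 1.0):
        return float(v)
    return v * (1 + p_e * math.log2(p_e) + (1 - p_e) * math.log2(1 - p_e))

print(f"p_e = 0.01: C = {bsc_throughput(1000, 0.01):.0f} bit/s")   # about 919 bit/s
print(f"p_e = 0.5 : C = {bsc_throughput(1000, 0.5):.0f} bit/s")    # 0 bit/s (channel break)
```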

Another example. If p_e = 0.5, then out of 1000 transmitted elements 500 will not be corrupted. However, the throughput will not be 500 bit/s, as one might expect; formula (5.9) gives C = 0. Indeed, at p_e = 0.5 the signal effectively does not pass through the communication channel, and the channel is simply equivalent to a noise generator.

As p_e → 1, the throughput approaches its maximum value. However, in this case the signals at the output of the communication system must be inverted.

The continuous information transmission channel contains a set of means for transmitting continuous signals; instead of encoding and decoding devices, converters of various kinds (modulators, etc.) are used. The input and output signals of a continuous communication channel are ensembles of continuous functions with the corresponding probability density distributions.

If a continuous signal X(t) of duration T arrives at the input of the continuous communication channel, then because of the interference f(t) the output signal Y(t) differs from the input one. The amount of information contained in the signal Y(t) about the signal X(t) is

I(Y, X) = H(X) − H(X/Y). (13)

A continuous signal can be treated as a discrete one: it can be represented as a lattice function, and on the receiving side the original continuous waveform can be restored from individual samples taken at intervals Δt.

The sampling step is Δt = T/n, where n is the number of samples. According to the Kotelnikov theorem, Δt = 1/(2f_c), where f_c is the cutoff frequency, and n = 2T·f_c is the signal base.

In this case, in expression (13) for the mutual information, instead of the difference of the entropies one can write the difference of the corresponding differential entropies of the individual samples:

I(Y, X) = n·[h(x) − h(x/y)] = n·[h(y) − h(y/x)].

The bandwidth of the continuous communication channel is

C = max I(Y, X)/T = 2f_c·max[h(y) − h(y/x)]. (14)
For a discrete communication channel, the maximum transmission rate corresponds to equiprobable symbols of the alphabet. For a continuous channel, when the constraint is the average signal power, the maximum rate is obtained with a normally distributed, centered random signal.

If the signal is centered (m_x = 0), i.e., has no constant component, the power of the constant component is zero (P_0 = 0). Centering provides the maximum variance for a given average signal power.
If the signal has a normal distribution, then the a priori differential entropy of each sample is maximum.
Therefore, when calculating the throughput of a continuous channel, we assume that a continuous signal with a limited average power P_c is transmitted over the channel, and that the additive interference (y = x + f), also with a limited average power P_n, is of the white (Gaussian) noise type. Since the interference is additive, the variance of the output signal is

σ_y² = P_c + P_n.

For the entropy of a power-limited signal to be maximal, the signal must be Gaussian; then

h(y) = (1/2)·log₂[2πe(P_c + P_n)].

For the entropy of the interference to be maximal, the interference must also be Gaussian:

h(y/x) = h(f) = (1/2)·log₂(2πe·P_n).

In this case, with the channel passband equal to the signal spectrum width f_c, the bandwidth of the continuous channel is

C = f_c·log₂(1 + P_c/P_n). (15)
Thus, the information transmission rate with a limited average power is maximal when both the signal and the interference are Gaussian random processes.

The channel bandwidth can be changed by varying the signal spectrum width f_c and the signal power P_c. Widening the spectrum, however, also increases the interference power P_n; therefore, the relationship between the channel bandwidth and the interference level is chosen as a compromise.

If the distribution f(x) of the source of continuous messages differs from the normal one, the information transfer rate C will be lower. A signal with a normal distribution law can be obtained by means of a functional converter.

Usually P_c/P_n >> 1, and the throughput of the continuous channel is C_n = F_k·D_k, where D_k is the channel dynamic range. The relationship between the volume (capacity) of the communication channel and its throughput has the form V_k = T_k·F_k·D_k = T_k·C_n.

Shannon's theorem for a continuous channel with noise. If the entropy of the source of continuous messages is arbitrarily close to the bandwidth of the channel but does not exceed it, then there exists a transmission method with which all messages of the source are transmitted with arbitrarily high fidelity of reproduction.


Example. A useful signal X(t) is transmitted over a continuous communication channel with bandwidth F_k = 1 kHz; X(t) is a normal random process with zero mathematical expectation and variance σ_x² = 4 mV². The channel is subject to signal-independent Gaussian noise F(t) with zero mathematical expectation and variance σ_f² = 1 mV².
Determine:
- the differential entropy of the input signal;
- the differential entropy of the output signal;
- the conditional differential entropy;
- the amount of information in one continuous sample of the process Y(t) relative to a sample of X(t);
- the rate of information transfer over a continuous channel with discrete time;
- the throughput of the continuous communication channel;
- the capacity (volume) of the communication channel if its operating time is T = 10 min;
- the amount of information that can be transmitted in 10 minutes of channel operation;
- show that the information capacity of a continuous memoryless channel with additive Gaussian noise under a peak-power limitation is no greater than the information capacity of the same channel with the same value of the average-power limitation.
Solution:
Differential entropy of the input signal:

h(x) = (1/2)·log₂(2πe·σ_x²) = (1/2)·log₂(2πe·4) = 3.05 bits/sample.

Differential entropy of the output signal:

h(y) = (1/2)·log₂[2πe·(σ_x² + σ_f²)] = (1/2)·log₂(2πe·5) = 3.21 bits/sample.

Conditional differential entropy:

h(y/x) = (1/2)·log₂(2πe·σ_f²) = (1/2)·log₂(2πe·1) = 2.05 bits/sample.

The amount of information in one continuous sample of the process Y(t) relative to a sample of X(t) is determined by the formula

I(X, Y) = h(x) − h(x/y) = h(y) − h(y/x) = 3.21 − 2.05 = 1.16 bits/sample.

The information transfer rate over a continuous channel with discrete time is determined by the formula

R = 2F_k·I(X, Y) = 2 × 10³ × 1.16 = 2320 bit/s.

The throughput of the continuous noisy channel is determined by the formula

C = F_k·log₂(1 + σ_x²/σ_f²) = 10³·log₂(1 + 4) = 2322 bit/s.
Let us prove that the information capacity of a continuous channel without memory with additive Gaussian noise under the limitation on the peak power is not more than the information capacity of the same channel with the same value of the limitation on the average power.
For a uniform distribution symmetric about zero on the interval [−a, a]:

the expected value is m_x = 0;

the mean square is E[x²] = a²/3;

the variance is σ² = a²/3.

Thus, for a uniformly distributed process the average power is one third of the peak power a².

The differential entropy of a signal with a uniform distribution is

h(x) = log₂(2a).

The difference between the differential entropies of a normal and a uniformly distributed process does not depend on the value of the variance and is approximately 0.3 bit/sample.
Thus, the throughput and the capacity of the communication channel are higher for a process with a normal distribution than for a uniform one.

Let us determine the capacity (volume) of the communication channel:

V_k = T_k·C_k = 10 × 60 × 2322 = 1.3932 Mbit.

Let us determine the amount of information that can be transmitted in 10 minutes of channel operation:

I = 10 × 60 × 2322 = 1.3932 Mbit.
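The numbers of this example can be reproduced with a short sketch (Python; it simply re-evaluates the formulas used above):

```python
import math

F_k = 1e3                    # channel bandwidth, Hz
var_x, var_f = 4.0, 1.0      # signal and noise variances, mV^2
T = 10 * 60                  # operating time, s

h_x = 0.5 * math.log2(2 * math.pi * math.e * var_x)             # ~3.05 bit/sample
h_y = 0.5 * math.log2(2 * math.pi * math.e * (var_x + var_f))   # ~3.21 bit/sample
h_y_x = 0.5 * math.log2(2 * math.pi * math.e * var_f)           # ~2.05 bit/sample

I_sample = h_y - h_y_x                  # ~1.16 bit/sample
R = 2 * F_k * I_sample                  # ~2322 bit/s (2320 in the text, which rounds I first)
C = F_k * math.log2(1 + var_x / var_f)  # ~2322 bit/s
V = T * C                               # ~1.3932 Mbit

print(f"h(x) = {h_x:.2f}, h(y) = {h_y:.2f}, h(y/x) = {h_y_x:.2f} bit/sample")
print(f"I = {I_sample:.2f} bit/sample, R = {R:.0f} bit/s, C = {C:.0f} bit/s, V = {V/1e6:.4f} Mbit")
```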

Tasks

1. Messages composed of the alphabet symbols x_1, x_2 and x_3 are transmitted over the communication channel with probabilities p(x_1) = 0.2, p(x_2) = 0.3 and p(x_3) = 0.5.
The channel matrix is given; the elements of each of its rows sum to one.
Calculate:
1. The entropy of the information source, H(X), and of the receiver, H(Y).
2. The joint entropy H(X, Y) and the conditional entropy H(Y/X).
3. The loss of information in the channel during the transmission of k symbols (k = 100).
4. The amount of information received during the transmission of k symbols.
5. The information transmission rate, if the transmission time of one symbol is t = 0.01 ms.
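The channel matrix itself is not reproduced above, so as a sketch of how Task 1 can be approached, the snippet below (Python) uses a hypothetical 3×3 matrix chosen only for illustration; with the real matrix substituted, the same steps give the required quantities:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = [0.2, 0.3, 0.5]        # given source probabilities
P = [                        # hypothetical channel matrix p(y_j / x_i)
    [0.98, 0.01, 0.01],
    [0.01, 0.98, 0.01],
    [0.01, 0.01, 0.98],
]
k, tau = 100, 0.01e-3        # number of transmitted symbols and symbol duration, s

p_y = [sum(p_x[i] * P[i][j] for i in range(3)) for j in range(3)]
H_x, H_y = entropy(p_x), entropy(p_y)                  # source and receiver entropies
H_y_x = sum(p_x[i] * entropy(P[i]) for i in range(3))  # conditional entropy H(Y/X)
H_xy = H_x + H_y_x                                     # joint entropy H(X, Y)
I = H_y - H_y_x                                        # information per received symbol
losses = k * (H_x - I)                                 # information lost over k symbols, k*H(X/Y)
received = k * I                                       # information received over k symbols
rate = I / tau                                         # transmission rate, bit/s

print(H_x, H_y, H_xy, H_y_x, losses, received, rate)
```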
2. The alphabet symbols x_1, x_2, x_3 and x_4 are transmitted over the communication channel with given probabilities. Determine the amount of information received during the transmission of 300 symbols if the effect of interference is described by the given channel matrix.
3. Determine the loss of information in the communication channel when equiprobable symbols of the alphabet are transmitted, if the channel matrix is given and the transmission time of one symbol is t = 0.001 s.
4. Determine the loss of information when 1000 symbols of the source alphabet x_1, x_2 and x_3 are transmitted with probabilities p(x_1) = 0.2, p(x_2) = 0.1 and p(x_3) = 0.7, if the influence of interference in the channel is described by the given channel matrix.
5. Determine the amount of information received when 600 symbols are transmitted, if the probabilities of the appearance of the symbols at the output of source X are given and the effect of interference during transmission is described by the given channel matrix.
6. Messages consisting of the symbols of the alphabet are transmitted over the communication channel, the probabilities of the appearance of the alphabet symbols being given. The communication channel is described by a given channel matrix. Determine the transmission rate if the transmission time of one symbol is given in ms.
7. Signals x_1, x_2 and x_3 are transmitted over the communication channel with probabilities p(x_1) = 0.2, p(x_2) = 0.1 and p(x_3) = 0.7. The effect of interference in the channel is described by the given channel matrix. Determine the total conditional entropy and the share of the information losses that falls on the signal x_1 (the partial conditional entropy).
8. The alphabet symbols x_1, x_2, x_3 and x_4 are transmitted over the communication channel with given probabilities. The channel noise is specified by the given channel matrix. Determine the bandwidth of the communication channel if the transmission time of one symbol is t = 0.01 s. Determine the amount of information received when 500 symbols are transmitted, if the probabilities of the appearance of symbols at the input of receiver Y are given and the effect of interference during transmission is described by the given channel matrix.


Topic 2.5. Communication channel bandwidth

In any communication system, information is transmitted through the channel. Its transmission rate depends not only on the channel itself, but also on the properties of the signal supplied to its input and therefore cannot characterize the channel as a means of information transmission. Let's find a way to assess the channel's ability to transmit information. For each source, the amount of information transmitted over the channel takes on its own value.

The maximum amount of transmitted information, taken from all possible sources of the input signal, characterizes the channel itself and is called the channel bandwidth per character:

C_symbol = max I(A, B) [bit/symbol]

(where maximization is performed over all multivariate probability distributions P (A))

You can also determine the bandwidth of the C channel per unit of time.

Let us calculate the bandwidth of a symmetric memoryless channel:

C_symbol = max[H(B) − H(B/A)]. (2.26)

The quantity H(B/A) in this case is easy to calculate, since the conditional (transition) probability P(b_j/a_i) takes only two values: P/(m − 1) if b_j ≠ a_i and (1 − P) if b_j = a_i.

The first of these values occurs with probability P, and the second with probability (1 − P). In addition, since a memoryless channel is considered, the results of receiving individual symbols are independent of each other, so

H(B/A) = −(1 − P)·log(1 − P) − P·log[P/(m − 1)]. (2.27)

Consequently, H (B / A) does not depend on the probability distribution in the ensemble A, but is determined only by the transition probabilities of the channel. This property is retained for all models with additive noise.

Substituting (2.27) into (2.26), we obtain

C_symbol = max{H(B) + (1 − P)·log(1 − P) + P·log[P/(m − 1)]}.

Since on the right-hand side only the term H(B) depends on the probability distribution P(A), it is this term that must be maximized.

The maximum value of H(B) is equal to log m, and it is realized when all received symbols are equally probable and independent of each other. It is easy to verify that this condition is satisfied if the input symbols are equiprobable and independent, since in this case P(b_j) = Σ_i P(a_i)·P(b_j/a_i) = 1/m.

Moreover, H(B) = log m and

C_symbol = log m + (1 − P)·log(1 − P) + P·log[P/(m − 1)].

Hence the throughput per unit of time is

C = v{log m + (1 − P)·log(1 − P) + P·log[P/(m − 1)]},

where v is the number of symbols transmitted per unit of time.

For a binary symmetric channel (m = 2) the bandwidth in binary units per unit of time is

C = v[1 + P·log₂P + (1 − P)·log₂(1 − P)]. (2.31)

The dependence of C/v on P according to formula (2.31) is shown in the figure.

With P = 1/2, the bandwidth of the binary channel C = 0, since with such an error probability the sequence of output binary symbols can be obtained without transmitting any signals through the channel at all, simply by choosing them at random (for example, according to the results of a coin toss); that is, with P = 1/2 the sequences at the output and at the input of the channel are independent. The case C = 0 is called a channel break. The fact that the bandwidth at P = 1 in the binary channel is the same as at P = 0 (a channel without noise) is explained by the fact that at P = 1 it is sufficient to invert all the output symbols (that is, to replace 0 with 1 and 1 with 0) in order to restore the input signal correctly.
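A small simulation sketch (Python; the sequence length and the error probabilities are arbitrary assumptions) illustrating the channel break: at P = 1/2 the output of a binary symmetric channel is statistically independent of the input, so the empirical mutual information is close to zero:

```python
import math
import random

def empirical_mi(xs, ys):
    """Empirical mutual information (bits) between two binary sequences."""
    n = len(xs)
    p_xy = {(a, b): 0.0 for a in (0, 1) for b in (0, 1)}
    for a, b in zip(xs, ys):
        p_xy[(a, b)] += 1 / n
    p_x = [sum(p_xy[(a, b)] for b in (0, 1)) for a in (0, 1)]
    p_y = [sum(p_xy[(a, b)] for a in (0, 1)) for b in (0, 1)]
    return sum(p_xy[(a, b)] * math.log2(p_xy[(a, b)] / (p_x[a] * p_y[b]))
               for a in (0, 1) for b in (0, 1) if p_xy[(a, b)] > 0)

random.seed(0)
n = 100_000
xs = [random.randint(0, 1) for _ in range(n)]
for P in (0.0, 0.1, 0.5):
    ys = [x ^ (1 if random.random() < P else 0) for x in xs]   # binary symmetric channel
    print(f"P = {P}: empirical I(X;Y) = {empirical_mi(xs, ys):.4f} bit/symbol")
```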

The bandwidth of a continuous channel is calculated in the same way. Let, for example, the channel have a limited bandwidth of width F. Then the signals U(t) and Z(t) at the input and output of the channel, according to the Kotelnikov theorem, are determined by their samples taken at intervals of 1/(2F), and therefore the information passing through the channel during a time T is equal to the sum of the amounts of information transmitted for each such sample. The channel bandwidth per one such sample is

C_sample = max[h(Z) − h(Z/U)].

Here U and Z are random variables - the cross sections of the processes U (t) and Z (t) at the input and output of the channel, respectively, and the maximum is taken over all admissible input signals, that is, over all distributions U.

The throughput C is defined as the sum of the values of C_sample taken over all samples per second. In this case, of course, the differential entropies in (2.35) should be calculated taking into account the probabilistic connections between the samples.

Let us calculate the throughput of a continuous memoryless channel with additive white Gaussian noise having a bandwidth of width F, if the average signal power (the variance of U) does not exceed a given value P_s. The power (variance) of the noise in the band F is denoted by P_w. The samples of the input and output signals, as well as of the noise N, are related by the equality

Z = U + N. (2.33)

Since N has a normal distribution with zero mathematical expectation, the conditional probability density for a fixed U will also be normal, with mathematical expectation U and variance P_w.

The throughput per sample is determined by formula (2.32). According to (2.24), the conditional differential entropy h(Z/U) of the normal distribution does not depend on the mathematical expectation and is equal to (1/2)·log(2πe·P_w). Therefore, to find C_sample, it is necessary to find the distribution density w(U) at which h(Z) is maximized. From (2.33), taking into account that U and N are independent random variables, we have for the variances

D(Z) = D(U) + D(N) = P_s + P_w.

Thus, the variance of Z is fixed, since P_s and P_w are given. As is known, for a fixed variance the maximum differential entropy is provided by a normal distribution. It is seen from (2.33) that for a normal one-dimensional distribution of U the distribution of Z will also be normal and, therefore, the maximum of the differential entropy (2.24) is attained:

C_sample = (1/2)·log(1 + P_s/P_w). (2.34)

Turning to the throughput C per second, we note that the information transmitted over several samples is maximal when the signal samples are independent. This can be achieved if the signal U(t) is chosen so that its spectral density is uniform in the band F. Samples separated by intervals that are multiples of 1/(2F) are mutually uncorrelated, and for Gaussian quantities uncorrelatedness means independence. Therefore, the throughput C (per second) can be found by adding the throughputs (2.35) for 2F independent samples:

C = 2F·C_sample = F·log(1 + P_s/P_w). (2.36)

It is realized if U (t) is a Gaussian process with a uniform spectral density in the frequency band F (quasi-white noise).

From (2.36) it can be seen that if the signal power were not limited, then the throughput would be arbitrarily large. The bandwidth is zero if the signal-to-noise ratio of the channel is zero. With the growth of this ratio, the throughput increases indefinitely, but slowly, due to the logarithmic dependence.

Relation (2.36) is called Shannon's formula. This formula is important in information theory, since it determines the dependence of the throughput of the continuous channel under consideration on such technical characteristics as the bandwidth and the signal-to-noise ratio. Shannon's formula indicates the possibility of exchanging bandwidth for signal power, and vice versa. However, since C depends linearly on F and on P_s/P_w only logarithmically, it is usually not advantageous to compensate for a possible reduction in bandwidth by increasing the signal power. The reverse exchange of signal power for bandwidth is more efficient.

The maximum amount of information that can be transmitted on average over the continuous channel in a time T is I_max = T·C. For a Gaussian channel

I_max = T·F·log(1 + P_s/P_w). (2.37)

Note that for P_s/P_w >> 1, expression (2.37) coincides with the characteristic called the capacity (volume) of the channel.
