
The amount of information contained in a certain message. Joint problem solving

The amount of information as a measure of reducing the uncertainty of knowledge. The information that a person receives can be considered a measure of reducing the uncertainty of knowledge. If some message leads to a decrease in the uncertainty of our knowledge, then we can say that such a message contains information.

Messages usually contain information about some events. The amount of information for events with different probabilities is determined by Shannon's formula:

I = -Σ pᵢ log₂ pᵢ, i = 1..N (2.1)

For N equally probable events it reduces to Hartley's formula:

I = log₂ N (2.2)

or can be found from the exponential equation:

N = 2^I (2.3)
Example 2.1. After a computer science exam that your friends took, the grades ("2", "3", "4" or "5") are announced. How much information is conveyed by the grade message for student A, who has learned only half of the tickets, and by the grade message for student B, who has learned all the tickets?

Experience shows that for student A all four grades (events) are equally probable, so the amount of information carried by the grade message can be calculated using formula 2.2:

I = log₂ 4 = 2 bits

Based on experience, we can also assume that for student B the most likely grade is "5" (p₁ = 1/2), the probability of grade "4" is half as large (p₂ = 1/4), and the probabilities of grades "2" and "3" are half as large again (p₃ = p₄ = 1/8). Since the events are not equally probable, we use formula 2.1 to calculate the amount of information in the message:

I = -(1/2 · log₂ 1/2 + 1/4 · log₂ 1/4 + 1/8 · log₂ 1/8 + 1/8 · log₂ 1/8) bits = 1.75 bits

The calculations show that with equiprobable events we receive more information than with non-equiprobable events.
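These two results are easy to check with a short program. A minimal Python sketch of formula 2.1 (the function name is ours, not from the lesson):

```python
import math

def shannon_information(probabilities):
    """Formula 2.1: I = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probabilities)

# Student A: four equally probable grades (formula 2.1 reduces to 2.2 here)
print(shannon_information([1/4] * 4))             # 2.0 bits
# Student B: grades "5", "4", "3", "2" with probabilities 1/2, 1/4, 1/8, 1/8
print(shannon_information([1/2, 1/4, 1/8, 1/8]))  # 1.75 bits
```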

Example 2.2. An opaque bag contains 10 white, 20 red, 30 blue and 40 green balls. How much information will the visual message about the color of a drawn ball contain?

Since the numbers of balls of different colors differ, the probabilities of the visual messages about the color of the drawn ball also differ and are equal to the number of balls of a given color divided by the total number of balls:

p_white = 0.1; p_red = 0.2; p_blue = 0.3; p_green = 0.4

The events are not equally probable; therefore, to determine the amount of information contained in the message about the color of the ball, we use formula 2.1:

I = -(0.1 · log₂ 0.1 + 0.2 · log₂ 0.2 + 0.3 · log₂ 0.3 + 0.4 · log₂ 0.4) bits ≈ 1.85 bits
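A quick Python check of this value (variable names are ours):

```python
import math

p = [0.1, 0.2, 0.3, 0.4]  # white, red, blue, green
I = -sum(pi * math.log2(pi) for pi in p)
print(round(I, 2))  # 1.85 bits
```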

Example 2.3. How many questions is it enough to ask your interlocutor to determine for sure the month in which he was born?

Let's consider the 12 months as 12 possible events. If you ask about each specific month of birth in turn, you may have to ask as many as 11 questions (if the first 11 questions are answered negatively, the 12th question is unnecessary, since the answer is then certain).

It is correct to ask "binary" questions, i.e. questions that can only be answered "Yes" or "No". For example, "Were you born in the second half of the year?" Each such question breaks the set of options into two subsets, one for the answer "Yes" and the other for the answer "No".

The correct strategy is to ask questions so that the number of possible options is halved each time. Then the number of possible events in each of the obtained subsets will be the same and their guessing is equally probable. In this case, at each step, the answer ("Yes" or "No") will carry the maximum amount of information (1 bit).

Using formula 2.2 and a calculator, we get:

I = log₂ 12 ≈ 3.6 bits

The number of bits of information received corresponds to the number of questions asked, but the number of questions cannot be fractional. Rounding up to the nearest integer gives the answer: with the right strategy, you need to ask no more than 4 questions.
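The halving strategy is simply a binary search over the list of months. A small illustrative Python sketch (the function and the month numbering are ours):

```python
import math

months = list(range(1, 13))               # 12 possible events
print(math.ceil(math.log2(len(months))))  # 4 questions are always enough

def guess(secret, candidates):
    """Ask 'is it in the first half?' until one candidate remains."""
    asked = 0
    while len(candidates) > 1:
        half = candidates[:len(candidates) // 2]
        asked += 1  # one yes/no answer carries at most 1 bit
        candidates = half if secret in half else candidates[len(candidates) // 2:]
    return candidates[0], asked

print(guess(12, months))  # (12, 4) -- the worst case takes 4 questions
```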

Units for measuring the amount of information

Units for measuring the amount of information. The unit of information is 1 bit: the amount of information contained in a message that halves the uncertainty of knowledge.

The following system of units for measuring the amount of information is adopted:

1 byte = 8 bits

1 KB = 2^10 bytes

1 MB = 2^10 KB = 2^20 bytes

1 GB = 2^10 MB = 2^20 KB = 2^30 bytes

Determination of the amount of information presented using sign systems

If we consider the symbols of the alphabet as a set of N possible messages (events), then the amount of information carried by one character can be determined from formula 2.1. If the appearance of each character of the alphabet in the text is considered an equally probable event, then formula 2.2 or equation 2.3 can be used.

The more characters the alphabet includes, i.e. the greater the power of the alphabet, the more information one character of that alphabet carries.

The amount of information contained in a message encoded using the sign system is equal to the amount of information that one character carries, multiplied by the number of characters in the message.

Example 2.5. What is the power of the alphabet in which a message containing 2048 characters is written, if the message's size is 1.25 KB?

Convert the information volume of the message into bits:

I = 1.25 · 1024 · 8 bits = 10 240 bits

Determine the number of bits per character:

10 240 bits / 2 048 characters = 5 bits per character

Using formula 2.3, we determine the number of characters in the alphabet: N = 2^I = 2^5 = 32 characters.
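The same computation in Python (a sketch; 1 KB is taken to be 1024 bytes, as in the unit table above):

```python
size_bits = 1.25 * 1024 * 8       # 1.25 KB -> 10240 bits
bits_per_char = size_bits / 2048  # 5.0 bits per character
print(2 ** int(bits_per_char))    # 32 -- the power of the alphabet (N = 2^I)
```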

Author details

Chetvergova Yu. N.

Place of work, position:

MOU "Secondary School No. 1 in Porkhov", teacher

Pskov region

Lesson characteristics (lessons)

The level of education:

Secondary (complete) general education

The target audience:

Teacher

Class(es):

Subject(s):

Informatics and ICT

The purpose of the lesson:

Repetition, consolidation, control of knowledge and skills

Lesson type:

Lesson on the integrated application of students' knowledge, abilities, and skills

Students in the classroom (audience):

Methodical literature used:

Lesson-by-lesson developments in computer science. Grade 10. O. L. Sokolova

Equipment used:

Calculator program

Calculator

Theme. The amount of information. Hartley and Shannon formulas

Course of the lesson

Review of previously covered material. Additions. (10 minutes)

Training cards. Group work (20 minutes)

Solving problems. Pair work (10 minutes)

Test. (40 minutes)

Peer checking. Working on mistakes.

Basic knowledge, skills and competencies

Knowledge:

Which events are equiprobable, which are not equiprobable;

How to find the probability of an event;

How to find the amount of information in a message for different events.

Skills:

Distinguish between equiprobable and non-equiprobable events;

Find the amount of information for different events.

Competencies:

Cooperation

Communicativeness

Creativity and curiosity

Critical thinking (value judgment)

Review of previously covered material

What events are equally probable, what are not equally probable?

In 1928, the American engineer R. Hartley proposed a scientific approach to the evaluation of messages. The formula he proposed was as follows:

I = log₂ K,
where K is the number of equally probable events and I is the number of bits in the message that any one of the K events has occurred. Then K = 2^I.
Sometimes the Hartley formula is written like this:

I = log₂ K = log₂(1/p) = -log₂ p,
since each of the K events has the equally probable outcome p = 1/K, and hence K = 1/p.

The ball is in one of three boxes: A, B or C. Determine how many bits of information the message contains that it is in box B.

Solution.

Such a message contains I = log₂ 3 ≈ 1.585 bits of information.

But not all situations are equally probable: there are many situations in which the probabilities of the outcomes differ, for example, tossing a biased coin, or the "sandwich rule".

“Once, as a child, I dropped a sandwich. Watching me guiltily wipe up the butter stain left on the floor, my older brother reassured me:

“Don't grieve, the law of the sandwich was at work.”

“What law is that?” I asked.

“The law that says: 'A sandwich always falls butter side down.' That's a joke, of course,” my brother continued. “There is no such law. It's just that a sandwich really does behave rather strangely: most of the time the butter ends up on the bottom.”

“Let's drop the sandwich a couple more times and check,” I suggested. “We'll have to throw it out anyway.”

We checked. Out of ten drops, the sandwich fell butter side down eight times.

And then I wondered: is it possible to know in advance whether a sandwich will fall butter side down or up?

Our experiments were interrupted by our mother...”
(An excerpt from the book "The Secret of Great Commanders" by V. Abchuk)

In 1948, the American engineer and mathematician K. Shannon proposed a formula for calculating the amount of information for events with different probabilities.
If I is the amount of information,
K the number of possible events, and pᵢ the probabilities of the individual events,
then the amount of information for events with different probabilities can be determined by the formula:

I = -Σ pᵢ log₂ pᵢ, where i takes values from 1 to K.

Hartley's formula can now be viewed as a special case of Shannon's formula:

I = -Σ (1/K) log₂(1/K) = log₂ K.

In the case of equiprobable events, the amount of information received is maximal.
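Both formulas are easy to compare numerically. A minimal Python sketch (function names are ours) showing that Hartley's formula is the equiprobable special case of Shannon's, and that equiprobable events give the maximum amount of information:

```python
import math

def hartley(K):
    """I = log2(K) for K equally probable events."""
    return math.log2(K)

def shannon(probs):
    """I = -sum(p_i * log2(p_i)) for events with the given probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(hartley(4))                     # 2.0 bits
print(shannon([1/4] * 4))             # 2.0 bits -- the same value
print(shannon([1/2, 1/4, 1/8, 1/8]))  # 1.75 bits -- less than the maximum
```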

How do you find the probability of an event?

If the information contained in a message is new and understandable to a person and replenishes his knowledge, i.e. leads to a decrease in the uncertainty of knowledge, then the message contains information.

1 bit is the amount of information contained in a message that reduces the uncertainty of knowledge by a factor of 2.

Example

When a coin is tossed, 2 events (outcomes) are possible: the coin lands heads or tails, and both events are equally probable (over a large number of tosses, heads and tails occur equally often). After receiving a message about the result of the toss, the uncertainty of knowledge has decreased by a factor of 2; therefore, the amount of information received in this case equals 1 bit.

How to find the amount of information in a message for different events?

Calculation of the amount of information for equiprobable events.

If the events are equally probable, then the amount of information can be calculated using the formula:

N = 2^I

where N - the number of possible events,

I - the amount of information in bits.

The formula was proposed by the American engineer R. Hartley in 1928.

Problem 1. There are 32 pencils in the box, all of different colors. A red one was pulled out at random. How much information was received?

Solution.

Since pulling a pencil of any color out of the 32 pencils in the box is equally probable, the number of possible events equals 32.

N = 32, I =?

N = 2^I, 32 = 2^5, I = 5 bits.

Answer: 5 bits.

Calculating the amount of information for events with different probabilities.

There are many situations where possible events have different probabilities of being realized. Let's consider examples of such events.

1. There are 20 pencils in the box, of which 15 are red and 5 are black. The probability of pulling out a red pencil at random is greater than that of a black one.

2. If the sandwich is accidentally dropped, it is more likely to fall with the butter down (heavier side) than the butter up.

3. The pond is home to 8,000 crucian carp, 2,000 pike and 40,000 minnows. The most likely catch for a fisherman in this pond is a minnow, in second place a crucian carp, in third a pike.

The amount of information in a message about a certain event depends on the event's probability: the less likely an event is, the more information the message about it carries.

p = K / N, where K is the number of cases in which one particular outcome of the event occurs, and N is the total number of possible outcomes.

I = log₂(1/p), where I is the amount of information and p is the probability of the event.

Problem 1. There are 50 balls in the box, of which 40 are white and 10 are black. Determine the amount of information in the messages about drawing a white ball and a black ball at random.

Solution.
The probability of pulling out a white ball:

p₁ = 40/50 = 0.8

The probability of pulling out a black ball:

p₂ = 10/50 = 0.2

The amount of information in the message about pulling out the white ball:

I₁ = log₂(1/0.8) = log₂ 1.25 = log 1.25 / log 2 ≈ 0.32 bits

The amount of information in the message about pulling out the black ball:

I₂ = log₂(1/0.2) = log₂ 5 = log 5 / log 2 ≈ 2.32 bits

Answer: 0.32 bits, 2.32 bits

What is a logarithm?

The logarithm of b to base a is the exponent to which the number a must be raised to obtain the number b:

a^(log_a b) = b, a > 0, b > 0, a ≠ 1

Analysis of tasks
Determine the amount of information received when one of the events occurs, if we throw
a) an asymmetrical tetrahedral pyramid;
b) a symmetrical and homogeneous tetrahedral pyramid.

Solution.

a) Throw the asymmetrical tetrahedral pyramid.
The probabilities of the individual events are:
p₁ = 1/2,
p₂ = 1/4,
p₃ = 1/8,
p₄ = 1/8,
and the amount of information received after one of these events occurs is calculated by the formula:
I = -(1/2 log₂ 1/2 + 1/4 log₂ 1/4 + 1/8 log₂ 1/8 + 1/8 log₂ 1/8) = 1/2 + 2/4 + 3/8 + 3/8 = 14/8 = 1.75 bits.
b) Now let's calculate the amount of information obtained when throwing a symmetrical and homogeneous tetrahedral pyramid:
I = log₂ 4 = 2 bits.
2. The probability of the first event is 0.5, and the probabilities of the second and third are 0.25 each. How much information will we get after one of them occurs?
3. How much information will be obtained when playing roulette with 32 sectors?
4. How many different numbers can you encode using 8 bits?
Solution: I = 8 bits, K = 2^I = 2^8 = 256 different numbers.

Problem 2. A lake is inhabited by crucian carp and perch. It is estimated that there are 1500 crucian carp and 500 perch. How much information is contained in the messages that a fisherman caught a crucian carp, that he caught a perch, and that he caught a fish?

Solution.
The events of catching a crucian carp or a perch are not equally probable, since there are fewer perch in the lake than crucian carp.

The total number of crucian carp and perch in the pond is 1500 + 500 = 2000.
The probability of catching a crucian carp is

p₁ = 1500/2000 = 0.75, and of a perch p₂ = 500/2000 = 0.25.

I₁ = log₂(1/p₁), I₂ = log₂(1/p₂), where p₁ and p₂ are the probabilities of catching a crucian carp and a perch, respectively.

I₁ = log₂(1/0.75) ≈ 0.42 bits, I₂ = log₂(1/0.25) = 2 bits — the amount of information in the messages about catching a crucian carp and catching a perch, respectively.

The amount of information in the message about catching a fish (crucian carp or perch) is calculated using Shannon's formula:

I = -p₁ log₂ p₁ - p₂ log₂ p₂

I = -0.75 · log₂ 0.75 - 0.25 · log₂ 0.25 = -0.75 · (log 0.75 / log 2) - 0.25 · (log 0.25 / log 2) ≈ 0.311 + 0.5 = 0.811

Answer: the message contains 0.811 bits of information.
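The same calculation in Python (variable names are ours):

```python
import math

p_carp, p_perch = 1500/2000, 500/2000
print(math.log2(1 / p_carp))   # ~0.415 bits: "a crucian carp was caught"
print(math.log2(1 / p_perch))  # 2.0 bits: "a perch was caught"
# "a fish was caught" -- Shannon's formula:
print(-(p_carp * math.log2(p_carp) + p_perch * math.log2(p_perch)))  # ~0.811
```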

Practice cards (20 minutes)

№1

1. The box contained 32 multi-colored pencils. How much information does the message that a red pencil has been taken out of the box carry?

2. The message that your friend lives on the 9th floor carries 4 bits of information. How many floors are there in the house?

3. How many kilobytes is a message composed of 384 characters in a 16-character alphabet?

4. The book, typed with a computer, contains 250 pages; each page contains 40 lines, each line contains 60 characters. How much information is in the book?

5. Write down the following numbers in binary notation: 37 and 52.

№2

2. There are 8 racks with books in the school library. Each rack has 4 shelves. The librarian told Vasya that the book he needed was on the fifth rack, on the second shelf from the top. How much information did the librarian give Vasya?

4. What amount of information contains a message that halves the uncertainty of knowledge?

5. Write down the following numbers in binary notation: 12 and 49.

1. When guessing an integer in a certain range, 8 bits of information were received. How many numbers does this range contain?

2. You came to a traffic light when the red light was on. After that, the yellow light came on. How much information did you get?

3. The Pulti tribe has a 16-character alphabet. The Multi tribe uses a 32-character alphabet. The tribal leaders exchanged letters. The letter of the Pulti tribe contained 90 characters, and the letter of the Multi tribe contained 70 characters. Compare the amount of information contained in the letters.

4. How many kilobytes is a message composed of 384 characters in an 8-character alphabet?

5. Write down the following numbers in binary notation: 33 and 15.

2. The message is 2 pages long and contains 1/16 KB of information. Each page contains 256 characters. How much information does one letter of the alphabet used carry?

3. A message written in letters from a 128-character alphabet contains 11 characters. How much information does it carry?

4. The box contains 64 different colored pencils. How much information is there in the message that a green pencil was taken out of the box?

5. Write down the following numbers in binary notation: 17 and 42.

1. How much information will the second player receive after the first player's first move in the tic-tac-toe game on the 4x4 field?

2. There are 8 balls in the lottery drum. How much information does the message contain about the first drawn number, for example, number 2 dropped out?

3. The number of bits of information in the message “Misha took one of 16 places at the Olympiad in Informatics”?

4. The raster graphic file contains a black-and-white image with 16 shades of gray, 10x10 pixels in size. What is the information volume of this file?

5. Write down the following numbers in binary notation: 28 and 51.

1. The alphabet of the Multi tribe consists of 8 letters. How much information does a 13-character message contain?

2. A raster graphic file contains a black and white image (without grayscale) with a size of 100x100 pixels. What is the information volume of this file?

3. When guessing an integer in a certain range, 5 bits of information were received. How many numbers does this range contain?

4. A telegram was received: "Meet, car 6". It is known that the train has 16 carriages. How much information was received?

5. Write down the following numbers in binary notation: 23 and 38.

1. A symmetrical tetrahedral pyramid is thrown. How much information do we receive in a visual message about its fall on one of the faces?

2. What is the information volume of the text containing the word ENCODING in 8-bit encoding?

3. Color (with a palette of 256 colors) raster graphic image has a size of 10x10 pixels. How much memory will this image take?

4. The message that your friend lives on the 8th floor carries 4 bits of information. How many floors are there in the house?

5. Write down the following numbers in binary notation: 19 and 46.

1. One card is selected from a deck of 32 cards. How much information do we get in the visual message about the choice of a particular card?

2. How much information is required to binary encode each character in a 256 character set?

3. The text takes up 0.5 Kbytes of computer memory. How many characters does this text contain?

4. The alphabet of the Pulti tribe consists of 128 letters. How much information does one letter of this alphabet carry?

5. Write down the following numbers in binary notation: 11 and 35.

1. “Is your friend at home?” A student was asked at school. “No,” he replied. How much information does the answer contain?

2. The message is 3 pages, 25 lines each. Each line contains 60 characters. How many characters are in the alphabet used if the entire message contains 1125 bytes?

3. The box contains 16 colored balls. How much information does the message contain that a yellow ball has been taken out of the box?

4. When guessing an integer in a certain range, 5 bits of information were received. How many numbers does this range contain?

5. Write down the following numbers in binary notation: 13 and 41.

1. What is the number of bits of information in the message “Vanya took one of 8 places at the Olympiad in Informatics”?

2. The book, typed with the help of a computer, contains 150 pages; each page contains 40 lines, each line contains 60 characters. How much information is in the book? Define in Kbytes.

3. When guessing an integer in the range from 1 to N, 8 bits of information were received. What is N?

4. A message written in letters from a 32-character alphabet contains 30 characters. How much information does it carry?

5. Write down the following numbers in binary notation: 16 and 39.

1. The alphabet of the Multi tribe consists of 16 letters. How much information does one letter of this alphabet carry?

2. The message that your friend lives on the 8th floor carries 5 bits of information. How many floors are there in the house?

3. Find the maximum number of books (each with 200 pages, 60 lines per page, and 80 characters per line) that can be stored entirely on a laser disk with a capacity of 600 MB.

4. How much information is needed to guess one of the 64 numbers?

5. Write down the following numbers in binary notation: 14 and 53.

1. A telegram was received: "Meet, car 4". It is known that the train has 8 carriages. How much information was received?

2. The size of a message containing 2048 characters was 1/512 of a MB. What is the power of the alphabet (how many characters does it contain?) with which the message is written?

3. "Are you getting off at the next stop?" - asked the man on the bus. “Yes,” he replied. How much information does the answer contain?

4. A message written in letters from a 16-character alphabet contains 25 characters. How much information does it carry?

5. Write down the following numbers in binary notation: 26 and 47.

1. How many kilobytes is a message containing 12288 bits?

2. What amount of information contains a message that reduces the uncertainty of knowledge by 4 times?

3. How many characters does a message written using a 16-character alphabet contain, if its size is 1/16 of a MB?

4. A group of schoolchildren came to the pool, which has 8 swimming lanes. The coach announced that the group would swim in lane 4. How much information did the students receive from this message?

5. Write down the following numbers in binary notation: 18 and 25.

1. You came to a traffic light when the yellow light was on. After that, the green light turned on. How much information did you get?

2. A 256-character alphabet was used to write the text. Each page contains 30 lines of 60 characters per line. How much information does 6 pages of text contain?

3. There are 64 balls in the lottery drum. How much information does the message contain about the first drawn number (for example, number 32)?

4. When guessing an integer in a certain range, 7 bits of information were received. How many numbers does this range contain?

5. Write down the following numbers in binary notation: 27 and 56.

1. The message that Petya lives in the first entrance carries 2 bits of information. How many entrances are there in the house?

2. A message written in letters from a 128-character alphabet contains 40 characters. How much information does it carry?

3. An informational message of 1.5 Kbytes contains 3072 characters. How many characters are in the alphabet with which this message was written?

4. How many kilobytes is a message composed of 284 characters in a 16-character alphabet?

5. Write down the following numbers in binary notation: 10 and 29.

1. How much information will the second player receive after the first player's first move in the tic-tac-toe game on the 4x4 field?

2. How many bytes of information are contained in 1 MB?

3. What was the number of possible events if after the implementation of one of them we received the amount of information equal to 7 bits?

4. A 64-character alphabet was used to record the message. Each page contains 30 lines. The entire message contains 8775 bytes of information and is 6 pages long. How many characters are there in a line?

5. Write down the following numbers in binary notation: 22 and 59.

1. A message written in letters from a 128-character alphabet contains 40 characters. How much information does it carry?

2. How much information will the second player receive in the game “Guess the Number” with the correct strategy, if the first player guessed a number in the range from 1 to 64?

3. A 256-character alphabet was used to write the text. Each page contains 30 lines of 70 characters per line. How much information does 3 pages of text contain?

4. The text takes up 0.25 KB of computer memory. How many characters does this text contain?

5. Write down the following numbers in binary notation: 32 and 51.

1. How many bits of information are contained in 1 Kbyte?

2. The first tribe has a 16-character alphabet. The second tribe uses a 32-character alphabet. The tribal leaders exchanged letters. The letter of the first tribe contained 90 characters, and the letter of the second tribe contained 80 characters. Compare the amount of information contained in the letters.

3. How much information will be obtained when playing roulette with 32 sectors?

4. Information is transmitted at a speed of 2.5 Kb/s. How much information will be transmitted in 20 minutes?

5. Write down the following numbers in binary notation: 21 and 48.

Optional problem solving (20 minutes)

№1

The message is written using an alphabet of 8 characters. How much information does one letter of this alphabet carry? Solution: I = log₂ 8 = 3 bits.

Answer: 3 bits.

№2

The information volume of one symbol of a certain message is 6 bits. How many characters are in the alphabet with which this message was composed? Solution: N = 2^I = 2^6 = 64 characters.

Answer: 64 characters.

№3

The information volume of one symbol of a certain message is 5 bits. What are the limits (maximum and minimum values) of the power of the alphabet with which this message is composed?

Solution: N = 2^I = 2^5 = 32 is the maximum power of the alphabet. If the alphabet contained even one more character, 6 bits would be required to encode a symbol.

The minimum value is 17 characters, since 4 bits would suffice for 16 or fewer characters.

Answer: the power of the alphabet is between 17 and 32 characters.

№4

A message written in letters from a 128-character alphabet contains 30 characters. How much information does it carry?

Given: N = 128, K = 30.

Find: I_t — ?

Solution:

1) I_t = K · I, where I is unknown;

2) I = log₂ N = log₂ 128 = 7 bits — the size of one symbol;

3) I_t = 30 · 7 = 210 bits — the size of the entire message.

Answer: 210 bits — the size of the entire message.

№5

The message is written using a 32-character alphabet and contains 80 characters. Another message is written using a 64-character alphabet and contains 70 characters. Compare the amount of information contained in the messages.

Given: N₁ = 32, K₁ = 80, N₂ = 64, K₂ = 70.

Find: compare I_t1 and I_t2.

Solution:

1) I₁ = log₂ N₁ = log₂ 32 = 5 bits — the size of one character of the first message;

2) I₂ = log₂ N₂ = log₂ 64 = 6 bits — the size of one character of the second message;

3) I_t1 = K₁ · I₁ = 80 · 5 = 400 bits — the volume of the first message;

4) I_t2 = K₂ · I₂ = 70 · 6 = 420 bits — the volume of the second message.

Answer: the second message carries more information (420 bits versus 400).

We are all accustomed to the fact that everything around can be measured. We can determine the mass of the parcel, the length of the table, the speed of the vehicle. But how do you determine the amount of information contained in a message? The answer to the question is in the article.

So let's select a message first. Let it be "Printer - an information output device." Our task is to determine how much information is contained in this message; in other words, how much memory is required to store it.

Determining the amount of information in a message

To solve the problem, we need to determine how much information one character of the message carries and then multiply this value by the number of characters. And while counting the total number of characters is easy, the information weight of a character has to be calculated. To do this, we count the number of different characters in the message. Let me remind you that punctuation marks and the space are also symbols. In addition, if the message contains both the uppercase and the lowercase version of a letter, we count them as two different characters. Let's get started.

In the word "printer" there are 6 different characters (r occurs twice and is counted once); the space and the dash add two more distinct characters (the space is counted only once, even though it also appears after the dash). The word "device" contains 10 characters, of which only 7 are different, since the letters s, t and o repeat; moreover, the letters t and r already occurred in the word "printer", so "device" contributes only 5 new symbols. Counting in this way through the whole message (the counts refer to the original Russian text), we get 20 different characters in the message.

2^i = N

Substituting the number of different characters for N, we find how much information one character carries, in bits. In our case the formula becomes:

2^i = 20

Recall that i lies between 4 and 5 (since 2^4 = 16 and 2^5 = 32). Since the number of bits per character must be a whole number, we round i up to 5. Indeed, if we took i = 4, we could encode only 2^4 = 16 characters, and we have 20. Therefore i = 5; that is, each character in our message carries 5 bits of information.

It remains to count how many characters there are in our message, this time counting every character whether or not it repeats. The message consists of 39 characters, and since each character carries 5 bits of information, multiplying 5 by 39 we get:

5 bits x 39 characters = 195 bits

This is the answer to the question of the problem: the message contains 195 bits of information. To summarize, here is the algorithm for finding the amount of information in a message:

  • count the number of different characters in the message;
  • substitute this value for N in the formula 2^i = N and find the weight i of one symbol (rounding up);
  • count the total number of characters and multiply this number by the weight of one character.
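This algorithm is short enough to write down directly. A Python sketch (the function name is ours; note that the English rendering of the sample message gives different counts from the Russian original analyzed above, which had 20 distinct and 39 total characters):

```python
import math

def message_volume_bits(message):
    distinct = len(set(message))             # step 1: number of different characters
    weight = math.ceil(math.log2(distinct))  # step 2: bits per character, rounded up
    return weight * len(message)             # step 3: weight times total characters

# 18 distinct characters -> 5 bits each; 36 characters in total -> 180 bits
print(message_volume_bits("Printer - information output device."))
```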

In order to be able to compare different sources of messages and different lines and channels of communication, it is necessary to introduce some quantitative measure that allows one to evaluate the information contained in the message and carried by the signal. Such a measure in the form of the amount of information was introduced by K. Shannon on the basis of the concept of choice, which allowed him to build a fairly general mathematical theory of communication.

Let us consider the main ideas of this theory as applied to a discrete source issuing a sequence of elementary messages. Let's try to find a convenient measure of the amount of information contained in a message. The main idea of information theory is that this measure is determined not by the specific content of the message, but by the fact that the source selects a given elementary message from a finite set. This idea is justified by the fact that on its basis it was possible to obtain a number of far-reaching and at the same time non-trivial results, which agree well with intuitive ideas about the transfer of information. The main of these results will be presented below.

So, if the source selects one elementary message x from the alphabet of volume m, then the amount of information it gives out depends not on the specific content of this element, but on how this choice is made. If the selected message element is predetermined, then it is natural to assume that the information contained in it is equal to zero. Therefore, we will assume that a letter x is chosen with some probability p(x). This probability may, generally speaking, depend on which sequence preceded the given letter. Let us assume that the amount of information contained in an elementary message is a continuous function f(p) of this probability, and let us try to determine the form of this function so that it satisfies some of the simplest intuitive ideas about information.

For this purpose, we make a simple transformation of the message: each pair of "letters" created sequentially by the source is regarded as one enlarged "letter". This transformation will be called alphabet enlargement. The set of enlarged "letters" forms an alphabet of volume m², since after each of the m elements of the alphabet any of the m elements can, generally speaking, be selected. Let p(x, y) be the probability that the source makes a sequential selection of the elements x and y. Then, considering the pair (x, y) as a letter of the new alphabet, it can be argued that this pair contains the amount of information f(p(x, y)).

It is natural to require that the amount of information contained in a pair of letters satisfy the additivity condition, that is, equal the sum of the amounts of information contained in each of the letters of the original alphabet. The information contained in the letter x is equal to f(p(x)), where p(x) is the probability of choosing x after all the letters preceding it. To determine the information contained in the letter y, it is necessary to take the probability of choosing y after the letter x, taking into account also all the letters preceding x. We denote this conditional probability p(y | x). Then the amount of information in the letter y is expressed by the function f(p(y | x)).

On the other hand, the probability of choosing the pair of letters, according to the rule of multiplication of probabilities, is

p(x, y) = p(x) · p(y | x).

The requirement of additivity of the amount of information under the operation of enlarging the alphabet leads to the equality

f(p(x) · p(y | x)) = f(p(x)) + f(p(y | x)).

Let p = p(x) and q = p(y | x). Then for any p and q the equation

f(pq) = f(p) + f(q) (1.3)

must be satisfied. We exclude the cases p = 0 and q = 0 from consideration, since, due to the finite number of letters of the alphabet, these equalities would mean that the choice of the pair of letters by the source is an impossible event.

Equality (1.3) is a functional equation from which the form of the function f can be determined. We differentiate both sides of equation (1.3) with respect to p:

q f′(pq) = f′(p).

We multiply both sides of the resulting equation by p and introduce the notation s = pq; then

s f′(s) = p f′(p). (1.4)

This equation must hold for any p and any q. The last restriction is not essential, since equation (1.4) is symmetric with respect to p and s and, therefore, must be satisfied for any pair of positive values of the arguments not exceeding one. But this is possible only if both sides of (1.4) represent some constant value C, whence

p f′(p) = C.

Integrating the resulting equation, we find

f(p) = C ln p + C₀, (1.5)

where C₀ is an arbitrary constant of integration.

Formula (1.5) defines the class of functions that express the amount of information when a letter of probability p is chosen and that satisfy the additivity condition. To determine the constant of integration, we use the condition stated above, according to which a predetermined message element, i.e. one having probability p = 1, contains no information. Therefore f(1) = C ln 1 + C₀ = 0, whence it immediately follows that C₀ = 0. Taking C = -1 gives the amount of information in natural units (based on e, the base of natural logarithms); taking C = -1/ln 2 gives it in bits. In other words, the amount of information

i(p) = -log p (1.6)

is equal to the information contained in the message that an event has occurred whose probability was p, assuming that the logarithm is taken to any base, as long as this base is preserved throughout the problem being solved.

Due to the property of additivity of information, expression (1.6) makes it possible to determine the amount of information not only in a single letter of the message, but in any arbitrarily long message. It is only necessary to take for p the probability of choosing this message from all possible ones, taking into account the previously selected messages.

The main content of the topic: there are two known approaches to measuring information, the meaningful approach and the alphabetical approach. The alphabetical approach is used to measure the amount of information in a text presented as a sequence of characters of a certain alphabet; this approach is not related to the content of the text, and the amount of information in this case is called the information volume of the text. The meaningful approach to measuring information addresses the amount of information in a message received by a person.

Practical work 2. Solving problems using the Hartley formula

Objective: determining the amount of information using the meaningful approach.

1) a person receives a message about some event; in this case, the uncertainty of a person's knowledge of the expected event is known in advance. The uncertainty of knowledge can be expressed either by the number of possible variants of an event, or by the probability of expected variants of an event;

2) as a result of receiving the message, the uncertainty of knowledge is removed: from a certain possible number of options, one was chosen;

3) the formula calculates the amount of information in the received message, expressed in bits.

The formula used to calculate the amount of information depends on which of two situations holds:

1. All possible variants of the event are equally probable. Their number is finite and equal to N.

2. The probabilities pᵢ of the possible variants of the event are different and known in advance: pᵢ, i = 1..N.

If events are equally probable, then the values ​​of i and N are related to each other by the Hartley formula:

2^i = N, (1)

where i is the amount of information in the message that one of N equiprobable events has occurred, measured in bits, and

N is the number of possible variants of the event.

Hartley's formula is an exponential equation. If i is the unknown quantity, then the solution of equation (1) is:

i = log₂ N. (2)

Formulas (1) and (2) are identical to each other.

Equipment:

1. Review the following examples of problems with solutions. Write them down in a notebook.

Problem 1. Find the amount of information in a message about an event that has only one possible outcome.

Solution:

N = 1 ⇒ 2^i = 1 ⇒ i = 0 bits

Problem 2. Measure the amount of information in the answer to the question: "What precipitation is expected tomorrow?" (four equally probable variants are assumed).

Solution:

N = 4 ⇒ 2^i = 4 ⇒ i = 2 bits

Problem 3. A 10-bit message was received. How many different messages of this length are possible?

Solution:

i = 10 ⇒ N = 2^10 = 1024 messages

1. How much information does the message contain that the Queen of Spades was taken from the deck of cards?

2. How much information does the message that the face with the number 3 came up on a six-sided die contain?

3. Someone has thought of a natural number in the range from 1 to 32. What is the minimum number of questions you must ask to be guaranteed to guess the number? Answers can only be "yes" or "no".

4. (The counterfeit coin problem.) There are 27 coins, of which 26 are real and one is counterfeit. What is the minimum number of weighings on a balance scale needed to be guaranteed to find the one counterfeit coin among the 27, using the fact that the counterfeit coin is lighter than a real one? A balance scale has two pans; it can only establish whether the contents of the pans weigh the same and, if not, which pan is heavier.

5. How many questions should be asked and how should they be formulated to find out which of the 16 tracks your train departs from?

6. How much information will the first player receive after the first move of the second player in the game "tic-tac-toe" on a 4 x 4 field?

7. After the implementation of one of the possible events, we received the amount of information equal to 15 bits. How many possible events were there originally?

8. Determine the strategy of guessing one card from a deck of 32 playing cards (all four sixes are missing), if the answers are “yes” or “no”.

9. When playing dice, a cube with six sides is used. How many bits of information does the player get on each die roll?

10. The message that your friend lives on the 6th floor carries 4 bits of information. How many floors are there in the house?

11. The message that a green ball was taken out of a basket containing a number of colored balls carries 0.375 bytes of information. How many balls were in the basket?

12. The library has 16 racks. Each rack has 8 shelves. The librarian told Olya that the book she was interested in was on rack 3, on the 2nd shelf from the top. How much information did Olya receive?

13. The bag contains 30 balls, of which 10 are white and 20 are black. How much information is there in the message that you got a white ball, a black ball?

14. There are 30 people in the class. For a test in mathematics, 6 fives, 15 fours, 8 threes and 1 two were received. How much information is in the message that Ivanov received a four?



15. The basket contains 32 balls of wool. Among them there are 4 red ones. How much information is there in the message that a ball of red wool was taken out?

16. The box contains 64 colored pencils. The message that a white pencil was taken out carries 4 bits of information. How many white pencils were in the box?

17. A box contains pairs of gloves (white and black). Among them are 2 black pairs. The message that a pair of black gloves was taken out of the box carries 4 bits of information. How many pairs of gloves were there in the box?

Control questions:

1. What is the basis for measuring the amount of information?

2. How is the unit of the amount of information determined in the cybernetic approach?

3. What is taken as the minimum unit of information in terms of reducing the uncertainty of knowledge by 2 times?

4. In what cases is the Hartley formula applied?

Practical work 3. Calculation of the amount of information based on a probabilistic approach

Objective: improving the skill of determining the amount of information using a probabilistic approach.

Brief theoretical background: see practical work 2.

Equipment: didactic materials on the topic "Determining the amount of information"

Sequence of execution:

Problem 1. There are only 20 different words in the language of the Mumbo-Jumbo tribe. How many bits are needed to encode any of these words?

Solution.

· According to the condition of the problem, we have 20 different options.

· The number of bits of information needed to specify 20 equally probable options can be calculated using the formula:

h = log₂ 20 ≈ 4.32 bits,

so, when a two-character alphabet is chosen for encoding, a word of 5 bits is sufficient.

Problem 2. A house has 14 windows. How many different signals can be given by turning the lights in the windows on and off? How many bits of information does each such signal carry?

Solution.

· Each window carries 1 bit of information: on or off.

· The number of different equally probable signals that can be transmitted with 14 bits is 2^14 = 16 384.

· Each of the 16 384 signals carries 14 bits of information.
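A two-line check in Python (a sketch):

```python
import math

signals = 2 ** 14          # each of the 14 windows is either on or off
print(signals)             # 16384 different signals
print(math.log2(signals))  # 14.0 bits in each signal
```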

2. Solve the following tasks. Write down the results in a notebook.

1. There are balls in the basket, all of different colors. The message that the blue ball was taken out carries 5 bits of information. How many balls are in the basket?

2. There are 4 teams participating in the competition. How much information is in the message that the 3rd team won?

3. A group of schoolchildren came to the pool, which has 4 swimming lanes. The coach announced that the group would swim in lane 3. How much information did the students receive from this message?

4. There are 5 blue and 15 red balls in the box. How much information is conveyed by the message that a blue ball has been taken out of the box?

5. The box contains cubes of three colors: red, yellow and green. There are twice as many yellow cubes as red ones, and 6 more green cubes than yellow ones. The message that a yellow cube was drawn at random from the box carried 2 bits of information. How many green cubes were there?

6. The students of a group each study one of three languages: English, German or French; 12 students do not study English. The message that a randomly selected student, Petrov, studies English carries log₂ 3 bits of information, and the message that Ivanov studies French carries 1 bit. How many students study German?

7. A train consists of 16 cars, among which there are compartment (K), reserved-seat (P) and sleeping (SV) cars. The message that your friend is travelling in an SV car carries 3 bits of information. How many SV cars are in the train?

8. A student group consists of 21 people who study German or French. The message that student A studies German carries log₂ 3 bits of information. How many people study French?

9. How much information is in the message that a number was guessed in the range of integers from 684 to 811?

10. For the remote transmission of various commands to a robot, 6-bit signals are used, while a 5-bit signal is not enough to transmit all the commands. Can the total number of commands for this robot be equal to:

42 commands? 70 commands?

28 commands? 55 commands?

What are the smallest and the largest possible numbers of commands for this robot?

11. Eleven classmates decide by voting where to go after school. Each can vote either "for" or "against". How many different voting outcomes can there be? How many bits are needed to encode the voting result?

12. What is the minimum number of bits of information required to encode all letters of the Russian alphabet?

13. Friends in neighboring houses agreed to send messages to each other in the form of light signals. How many bulbs do they need to encode 10 different words?

14. In a computer game, 65 different control commands are recognized. How many bits must be allocated in a memory block to encode each command? Are the allocated bits enough to encode 100 commands?

Control questions:

1. What events are equally probable?

2. Give examples from life of equiprobable events.

3. What formula connects the number of possible events and the amount of information?

4. How does the amount of information depend on the number of possible events?

5. Is it true that the greater the number of possible events, the less information the message about the result of the experiment contains?

Justify the answer.

Practical work 4. Solving problems using Shannon's formula

Objective: acquiring the skill of determining the amount of information based on a probabilistic approach

Brief theoretical background:

The degree of uncertainty is a characteristic of a random event called entropy; it is denoted H(α). The unit of entropy is the uncertainty contained in an experiment that has two equally probable outcomes. There are many situations in which the possible events have different probabilities of occurring. For example, if a coin is not symmetrical (one side heavier than the other), then when it is tossed the probabilities of getting heads and tails will differ. The formula for calculating the amount of information in the case of events with different probabilities was proposed by K. Shannon in 1948. In this case, the amount of information is determined by the formula:

I = -Σ pᵢ log₂ pᵢ, where I is the amount of information, N is the number of possible events, and pᵢ are the probabilities of the individual events (the sum runs over i = 1..N). For equally probable events, pᵢ = 1/N.

To solve problems of this type, we also need the formula for calculating the probability of an outcome. It looks like this:

p = M / N,

where M shows how many times the outcome in question occurs, and N is the total number of possible outcomes of the process.

Note that the probabilities of all possible outcomes sum to one (in percentage terms, to 100%).

Equipment: didactic materials on the topic "Determining the amount of information".

Sequence of execution:

Problem 1. Sixteen cards (all the "pictures" and the aces) were taken from a deck and placed on the table face down. The top card was turned over and turned out to be a black queen. How much information is contained in the message about which card was on top?

Solution.

Even after the message about the outcome of the random event, there is no complete certainty: the card on top could be of either of the two black suits.

Since information is a reduction in the uncertainty of knowledge:

Before the card was turned over, the uncertainty (entropy) was

H₁ = log₂ N₁; after it, H₂ = log₂ N₂

(where, under the conditions of the problem, N₁ = 16 and N₂ = 2).

As a result, the information received is calculated as follows:

I = H₁ - H₂ = log₂ N₁ - log₂ N₂ = log₂(N₁/N₂) = log₂(16/2) = 3 bits.
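The entropy-difference calculation in Python (a sketch; variable names are ours):

```python
import math

N1, N2 = 16, 2                     # outcomes before and after the card is revealed
I = math.log2(N1) - math.log2(N2)  # I = H1 - H2 = log2(N1/N2)
print(I)                           # 3.0 bits
```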

Problem 2. The probability of the first event is 0.5, and the probabilities of the second and third are 0.25 each. How much information will we get after one of them occurs?

Solution.

p₁ = 0.5; p₂ = p₃ = 0.25 ⇒ I = -(0.5 log₂ 0.5 + 0.25 log₂ 0.25 + 0.25 log₂ 0.25) = 1.5 bits.

Problem 3. Determine the amount of information received when one of the events occurs, if we throw

a) an asymmetrical tetrahedral pyramid;

b) a symmetrical and homogeneous tetrahedral pyramid.

Solution.

a) We will throw an asymmetrical tetrahedral pyramid.

The probabilities of the individual events (as in the example above) are p₁ = 1/2, p₂ = 1/4, p₃ = p₄ = 1/8,

then the amount of information received after one of these events occurs is calculated using Shannon's formula, since the events are not equiprobable:

I = -(1/2 log₂ 1/2 + 1/4 log₂ 1/4 + 1/8 log₂ 1/8 + 1/8 log₂ 1/8) = 1/2 + 2/4 + 3/8 + 3/8 = 14/8 = 1.75 bits.

b) Now let's calculate the amount of information obtained when throwing a symmetrical and homogeneous tetrahedral pyramid, i.e. for equiprobable events:

I = log₂ 4 = 2 bits.

2. Solve the following tasks. Write down the results in a notebook.

1. There are 30 people in the class. For a test in computer science, 15 fives, 6 fours, 8 threes and 1 two were received. How much information is in the message that Andreev received a five?

2. An opaque bag contains 10 white, 20 red, 30 blue and 40 green balls. How much information will the visual message about the color of a drawn ball contain?

3. For a test in computer science, 8 fives, 13 fours, 6 threes and 2 twos were received. How much information did Vasechkin receive when he learned his grade?

4. It is known that there are 20 balls in the box: 10 black, 4 white, 4 yellow and 2 red. How much information does the message about the color of a drawn ball carry?

5. Banker Bogateev's safe contains banknotes in denominations of 1, 10 and 100 thalers. The banker opened his safe and pulled out one banknote at random. The information volume of the message "A 10-thaler banknote was taken from the safe" is 3 bits. The amount of information in the message "The banknote taken from the safe is not a 100-thaler one" is 3 - log₂ 5 bits. Determine the information volume of the visual message about the denomination of the drawn banknote.

3. Do the exercise

Below are 11 events:

1. The first person you meet will be male.

2. Tuesday will come after Monday.

3. You can get an "excellent" for the test.

4. Of the five family members, it will be the youngest son who answers the phone.

6. After summer there will be winter.

7. Each of the 15 students attending these classes will enroll in a mathematics major.

8. Ticket number 777777 will win the lottery.

9. A tossed coin will land with the coat of arms up.

10. A tossed die will show six points.

11. A card with the number 5 will be drawn at random from a set of numbered cards.

Exercise: among the 11 events, write down the numbers of those that are:

1. Certain _________________________________________________

2. Impossible ________________________________________________

3. Uncertain ______________________________________________

4. Among the uncertain ones, indicate those that have 2 equally possible outcomes ______________________________________________________

5. Arrange the uncertain events in ascending order of the number of equally probable outcomes _______________________________________

6. Name the most uncertain event ____________________________

7. Name the least uncertain event ___________________________

8. Taking into account tasks 6 and 7, establish how the degree of uncertainty depends on the number of equally probable outcomes. ____________________________________________________________

9. Draw the same conclusion using the concept of probability. ____________________________________________________________

Control questions:

1. What kinds of events are there?

2. Give examples of equiprobable and non-equiprobable events.

3. How is the probability of a certain event occurring determined?

4. For what kinds of events is Shannon's formula used to determine the amount of information in a message?

5. Under what condition does Hartley's formula become a special case of Shannon's formula?

Practical work 5. Solving problems to determine the amount of information

Objective: acquiring the skill of determining the amount of information using the probabilistic and meaningful approaches.

Brief theoretical background: as the main characteristic of a message, information theory takes a quantity called the amount of information. This concept does not touch on the meaning or importance of the transmitted message, but is associated with the degree of its uncertainty.

Claude Shannon defined the amount of information through entropy - a quantity known in thermodynamics and statistical physics as a measure of the disorder in a system, and took what was later called a bit as a unit of information. The amount of information per message element (sign, letter) is called entropy. Entropy and the amount of information are measured in the same units - in bits.

Since modern information technology is based on elements that have two stable states, the base of the logarithm is usually taken equal to two, i.e. the entropy is expressed as H₀ = log₂ m.

In the general case, the entropy H of an arbitrary system X (a random variable), which can be in m different states x₁, x₂, ..., x_m with probabilities p₁, p₂, ..., p_m, is calculated by Shannon's formula.

Equipment: didactic materials on the topic "Determining the amount of information".

Sequence of execution:

1. Review problem solving examples

Problem 1. Determine the amount of information contained in a television signal corresponding to one scan frame. Let there be 625 lines in a frame, and let the signal corresponding to one line be a sequence of 600 pulses of random amplitude, where the pulse amplitude can take any of 8 values with a fixed step.

Solution.

In the case under consideration, the length of the message corresponding to one line is equal to the number of random-amplitude pulses in it: n = 600.

The number of message elements (symbols) in one line is equal to the number of values that the pulse amplitude can take: m = 8.

The amount of information in one line: I = n log₂ m = 600 · log₂ 8 = 1800 bits, and the amount of information

in a frame: I = 625 · 1800 = 1 125 000 bits ≈ 1.125 · 10⁶ bits.

Problem 2. The cyclocross race is attended by 119 athletes. A special device registers each participant's passage of the intermediate finish, recording the athlete's number using the minimum possible number of bits, the same for each athlete. What is the information volume of the message recorded by the device after 70 cyclists have passed the intermediate finish?

1) 70 bits 2) 70 bytes 3) 490 bits 4) 119 bytes

Solution.

1) there were 119 cyclists, they have 119 different numbers, that is, we need to encode 119 options;

2) according to the table of powers of two, at least 7 bits are required (128 options can be encoded, so there is even some margin); hence 7 bits per recorded number;

3) when 70 cyclists have passed the intermediate finish, 70 counts are recorded in the device's memory;

4) therefore the message contains 70 · 7 = 490 bits of information (answer 3).
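The same reasoning in Python (variable names are ours):

```python
import math

athletes, finished = 119, 70
bits_per_number = math.ceil(math.log2(athletes))  # 7, since 2**7 = 128 >= 119
print(bits_per_number * finished)                 # 490 bits
```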

2. Solve the following tasks. Write down the results in a notebook.

1. In the zoo, 32 monkeys live in two enclosures, A and B. One of the monkeys is an albino (all white). The message "The albino monkey lives in enclosure A" contains 4 bits of information. How many monkeys live in enclosure B?

2. There are 32 balls of wool in the basket, 4 of them are red. How many bits of information are there in the message that a ball of red wool has been retrieved?

3. Two people play tic-tac-toe on a 4-by-4 board. How much information did the second player receive after learning the first player's move?

4. In some country, a license plate 7 characters long is composed of capital letters (26 letters in total) and decimal digits in any order. Each character is encoded with the same, minimum possible number of bits, and each plate number with the same, minimum possible number of bytes. Determine the amount of memory required to store 20 license plates.

5. 678 athletes take part in a cyclocross race. A special device registers the passage of the intermediate finish by each participant, recording his number using the minimum possible number of bits, the same for every athlete. What is the information volume of the message recorded by the device after 200 cyclists have passed the intermediate finish?

Control questions:

1. Give a definition of entropy.

2. How are the concepts of the amount of information and entropy related?

3. What approaches to determining the amount of information do you know?

4. What is the meaning of each of the approaches to determining the amount of information?

5. What is called the measurement of information?

6. What methods of determining the amount of information are there?

7. Give a definition of the amount of information.

Practical work 6. Solving problems to determine the amount of information

Objective: acquiring the skill of quantifying information based on the alphabetical approach.

Brief theoretical background:

The alphabetical approach is based on the fact that any message can be encoded using a finite sequence of characters of some alphabet.

An alphabet is an ordered set of characters used to encode messages in a particular language.

The cardinality (power) of an alphabet is the number of characters in it. The binary alphabet contains 2 characters, so its cardinality is two. Messages written using ASCII characters use a 256-character alphabet. Messages in Unicode use an alphabet of 65,536 characters.

To determine the amount of information in a message using the alphabetical approach, solve the following subproblems in order:

1. Determine the amount of information (i) in one symbol by the formula

2 i = N, where N is the cardinality of the alphabet.

2. Determine the number of characters in the message (K).

3. Calculate the amount of information using the formula: I = i * K.

The amount of information in the entire text (I), consisting of K characters, is equal to the product of the information weight of one character and K:

I = i * K.

This value is the informational volume of the text.
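For checking answers, here is a minimal Python sketch of this procedure (the function name text_volume_bits is ours; math.ceil also covers alphabets whose cardinality is not an exact power of two):

```python
import math

def text_volume_bits(K, N):
    """Alphabetical approach: i bits per character (2**i >= N), I = i * K."""
    i = math.ceil(math.log2(N))  # information weight of one character
    return i * K

print(text_volume_bits(100, 256))  # 8 bits/character * 100 characters = 800 bits
```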

Information units

The basic unit of measurement of information is the bit; 8 bits make 1 byte. Along with bytes, larger units are used to measure the amount of information:

1 KB = 2 10 bytes = 1024 bytes;

1 MB = 2 10 KB = 1024 KB;

1 GB = 2 10 MB = 1024 MB;

1 Terabyte (TB) = 1024 GB = 2 40 bytes;

1 Petabyte (PB) = 1024 TB = 2 50 bytes.
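These relations translate into a short Python sketch for converting between units (illustrative only; variable names are ours):

```python
bits = 81920                  # an example message size in bits
size_bytes = bits // 8        # 10240 bytes
size_kb = size_bytes / 1024   # 10.0 KB
size_mb = size_kb / 1024      # ~0.0098 MB
print(size_bytes, size_kb, size_mb)
```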

Equipment: didactic materials on the topic "Determining the amount of information".

Sequence of execution:

1. Go through the problem solving examples and write them down in your notebook.

Objective 1. A 256-character alphabet was used to write a text. Each page contains 32 lines of 64 characters each. How much information do 5 pages of this text contain?

Solution:

N = 256, => 2 i = 256, => i = 8 bit

k = 32 * 64 * 5 characters

I = i * k = 8 * 32 * 64 * 5 bits = 32 * 64 * 5 bytes = 32 * 64 * 5 / 1024 KB = 10 KB

Objective 2. Will a book of 432 pages fit on one floppy disk, if each page contains 46 lines and each line 62 characters?

Solution:

Since the book is in electronic form, we are dealing with the computer alphabet. Then N = 256 => 2 i = 256 => i = 8 bits

k = 46 * 62 * 432 characters

I = i * k = 8 * 46 * 62 * 432 bits = 46 * 62 * 432 bytes = 46 * 62 * 432 / 1024 KB = 1203.1875 KB ≈ 1.17 MB

Since the capacity of a floppy disk is 1.44 MB and the volume of the book is 1.17 MB, the book will fit on the floppy disk.
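A minimal Python check of Objective 2 (variable names are ours):

```python
chars = 46 * 62 * 432  # one character takes one byte at i = 8 bits
kb = chars / 1024      # 1203.1875 KB
mb = kb / 1024         # ~1.17 MB
print(mb < 1.44)       # True: the book fits on the floppy disk
```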

Problem 3. The information flow rate is 20 bit/s. How many minutes will it take to transfer 10 kilobytes of information?

Solution:

t = I / v = 10 KB / 20 bit/s = 10 * 1024 * 8 bits / 20 bit/s = 4096 s ≈ 68.3 min
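A minimal Python check of Problem 3 (variable names are ours):

```python
volume_bits = 10 * 1024 * 8    # 10 KB expressed in bits
speed = 20                     # bit/s
seconds = volume_bits / speed  # 4096.0 s
print(seconds, seconds / 60)   # about 68.3 minutes
```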

Problem 4. A laser printer prints at an average speed of 7 kbit/s. How long will it take to print a 12-page document, if each page contains on average 45 lines of 60 characters?

Solution:

Since the document is in electronic form, ready for printing, we are dealing with the computer alphabet. Then N = 256 => 2 i = 256 => i = 8 bits

K = 45 * 60 * 12 characters

I = i * k = 8 * 45 * 60 * 12 bits = 45 * 60 * 12 bytes = 45 * 60 * 12 / 1024 KB ≈ 31.6 KB

t = I / v = 31.6 KB / 7 kbit/s = 31.6 * 8 kbit / 7 kbit/s ≈ 36 s

Task 5. An automatic device transcoded an information message in Russian from Unicode to KOI-8. As a result, the message became 480 bits shorter. How many characters are in the message?

Solution:

The size of 1 character in KOI-8 encoding is 1 byte, and in Unicode encoding - 2 bytes.

Let x be the length of the message in characters; then I KOI-8 = 1 * x bytes and I Unicode = 2 * x bytes.

We get 2 * x * 8 bits - 1 * x * 8 bits = 480 bits, so 8x = 480 and x = 60 characters in the message.
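A minimal Python check of Task 5 (variable names are ours):

```python
saved_bits = 480
saved_per_char = (2 - 1) * 8         # Unicode (2 bytes) vs KOI-8 (1 byte) per character
print(saved_bits // saved_per_char)  # 60 characters
```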

2. Solve the following tasks. Write down the results in a notebook.

1. Some alphabet contains 128 characters. A message contains 10 characters. Determine the information volume of the message.

2. Assuming that one character is encoded in 8 bits, estimate the informational volume of the following proverb in the KOI-8 encoding: A faithful friend is better than a hundred servants.

3. The same text in Russian is written in different encodings. The text written in the 16-bit Unicode encoding is 120 bits larger than the same text written in the 8-bit KOI-8 encoding. How many characters does the text contain?

4. How many gigabytes does a file of 2 35 bits contain?

5. The text file copia.txt is 40960 bytes in size. How many of these files can be recorded on a 5MB media?

6. A 2.5 MB drawing was added to the 46080 byte text message. How many kbytes of information does the received message contain?

7. The alphabet of some language contains two characters, X and O. A word consists of four characters, for example: OOXO, XOOX. Indicate the maximum possible number of words in this language.

8. A 64-character alphabet was used to write the text. How many characters are in the text if its size is 8190 bits?

9. Specify the largest natural number that can be encoded with 8 bits (if all numbers are encoded sequentially, starting from one).

10. Some alphabet contains 2 characters. A message occupies 2 pages, each containing 16 lines of 32 characters. Determine the information volume of the message.

11. How many bits of information are in a 1/4 kilobyte message?

12. Find x from the following relation: 8 x bits = 16 MB.

13. A color raster graphic image with a palette of 256 colors has a size of 64x128 pixels. What is the information volume of the image?

14. To store a 64x128 pixel raster image, 4 Kbytes of memory were allocated. What is the maximum possible number of colors in the image palette?

Control questions:

1. How is information measured in the content-based (meaningful) approach?

2. What is the alphabetical approach to determining the amount of information?

3. What is an alphabet? What is the cardinality (power) of an alphabet? What is the information volume of a text?

4. What is the informational weight of a computer alphabet symbol?

5. Why is the information capacity of the Russian letter "a" greater than that of the English letter "a"?

6. What units of measurement of information are there?

Practical work 7. Complex work on determining the amount of information

Objective: control of skills in determining the amount of information.

Brief theoretical background: see practical works 1-6.

Equipment: control materials from the CBS on the discipline "Fundamentals of Information Theory".

Sequence of execution:

· Complete TK # 1: Test 3, "Units of measurement of information". In the test, select only one answer from the proposed options. Carry out the test on your own, without using notes, textbooks, or other reference literature.

· Perform PZ # 2. Tasks 1-10.
