
Information concept. Concept and types of information, transmission and processing, search and storage of information

Information is information about something


Information: definitions

Information is any intelligence received, transmitted, or stored by various sources. Information is the whole body of data about the world around us and about all kinds of processes occurring in it that can be perceived by living organisms, electronic machines, and other information systems.

Information is meaningful data about something, where the form of its presentation is itself information; that is, information has a formatting function in accordance with its own nature.

Information is everything that can supplement our knowledge and assumptions.

Information is data about something, regardless of the form of its presentation.

Information is a product of the psyche of any psychophysical organism, produced by it using whatever it employs as a means of conveying information.

Information is data perceived by a person and/or special devices as a reflection of the facts of the material or spiritual world in the process of communication.

Information is data organized in a way that makes sense to the person dealing with it.

Information is the meaning that a person assigns to data on the basis of the known conventions used to represent it.

Information is intelligence, explanation, exposition.

Information is any data or information that interests anyone.

Information is data about objects and phenomena of the environment, their parameters, properties, and state, that are perceived by information systems (living organisms, control machines, etc.) in the process of living and working.

The same information message (a newspaper article, advertisement, letter, telegram, reference, story, drawing, radio broadcast, etc.) may contain different amounts of information for different people, depending on their prior knowledge, their level of understanding of the message, and their interest in it.

When we speak of automated work with information using technical devices, what matters is not the content of the message but how many characters the message contains.


As applied to computer data processing, information is understood as a certain sequence of symbolic designations (letters, digits, encoded graphic images and sounds, etc.) that carries a semantic load and is presented in a form a computer can process. Each new character in such a sequence increases the informational volume of the message.
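This "volume" measure of a message can be sketched in a few lines. It is a minimal illustration, not a standard library routine; the 8-bits-per-character figure is an assumption corresponding to common single-byte encodings.

```python
# The alphabetic (volume) measure of information: every character of a
# message adds a fixed number of bits, regardless of the message's meaning.

def message_volume_bits(message: str, bits_per_char: int = 8) -> int:
    """Return the informational volume of a message in bits."""
    return len(message) * bits_per_char

volume = message_volume_bits("HELLO")
print(volume)  # 5 characters x 8 bits = 40 bits
```

Note that by this measure a meaningful word and random gibberish of the same length have exactly the same volume, which is the point the text makes.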

Currently there is no single definition of information as a scientific term. From the point of view of various fields of knowledge, this concept is described by its own specific set of features. For example, the concept of "information" is basic in the course of computer science, and it is impossible to define it through other, more "simple" concepts (just as in geometry, for example, it is impossible to express the content of the basic concepts of "point", "line", and "plane" through simpler concepts).

The content of the basic, foundational concepts in any science must be explained by examples or revealed by comparison with the content of other concepts. In the case of the concept of "information", the problem of definition is even more complicated, since it is a general scientific concept. It is used in various sciences (computer science, cybernetics, biology, physics, etc.), and in each science the concept of "information" is associated with a different system of concepts.

Information concept

In modern science, two types of information are considered:

Objective (primary) information is the property of material objects and phenomena (processes) to generate a variety of states, which through interactions (fundamental interactions) are transmitted to other objects and imprinted in their structure.

Subjective (semantic, secondary) information is the semantic content of objective information about objects and processes of the material world, formed by human consciousness with the help of semantic images (words, images, and sensations) and recorded on some material medium.

In the everyday sense, information is information about the surrounding world and the processes occurring in it, perceived by a person or a special device.

According to C. Shannon's concept, information is removed uncertainty: information should remove, to one degree or another, the uncertainty that exists in the recipient before it is received, and expand the recipient's understanding of the object with useful data.

From the point of view of Gregory Bateson, the elementary unit of information is "a difference that makes a difference", an effective difference for some larger perceiving system. Differences that are not perceived he calls "potential", and perceived ones "effective": "Information consists of differences that are not indifferent"; "Any perception of information is necessarily the receipt of information about a difference." From the point of view of informatics, information has a number of fundamental properties: novelty, relevance, reliability, objectivity, completeness, value, and so on. The analysis of information is the concern primarily of the science of logic. The word "information" comes from the Latin informatio, which means intelligence, clarification, familiarization. The concept of information was already considered by ancient philosophers.


Until the industrial revolution, defining the essence of information remained the prerogative of philosophers. Later, the questions of information theory were taken up by cybernetics, a science new at that time.

Sometimes, in order to comprehend the essence of a concept, it is useful to analyze the meaning of the word that denotes this concept. Clarifying the inner form of a word and studying the history of its use can throw unexpected light on its meaning, overshadowed by the usual "technological" use of the word and modern connotations.

The word "information" entered the Russian language in the era of Peter the Great. It is first recorded in the "Spiritual Regulations" of 1721 in the meaning of "an idea, a concept of something". (In European languages it was fixed earlier, around the 14th century.)


Based on this etymology, information can be considered any significant change of form or, in other words, any materially recorded traces formed by the interaction of objects or forces and amenable to understanding. Information is thus a converted form of energy. The carrier of information is the sign, and its mode of existence is interpretation: identifying the meaning of a sign or a sequence of signs.

The meaning can be an event reconstructed from a sign that caused its occurrence (in the case of "natural" and involuntary signs, such as traces, evidence, etc.), or a message (in the case of conventional signs inherent in the sphere of language). It is the second type of signs that makes up the body of human culture, which, according to one of the definitions, is "the aggregate of non-hereditarily transmitted information."


Messages can contain information about facts or an interpretation of facts (from Latin interpretatio: interpretation, translation).

A living being receives information through the senses, as well as through reflection or intuition. The exchange of information between subjects is communication (from Latin communicatio, "message, transmission", derived in turn from communico, "to make common, to communicate, to converse, to connect").

From a practical point of view, information is always presented as a message. An informational message is associated with the source of the message, the recipient of the message, and the communication channel.

Returning to the Latin etymology of the word information, let's try to answer the question of what exactly is given the form here.

It is obvious that, first, it is a certain sense which, being initially formless and unexpressed, exists only potentially and must be "built up" in order to become perceivable and transmittable.

Second, the human mind, which is brought up to think structurally and clearly. Third, a society which, precisely because its members share these meanings, gains unity and functionality.


Information as expressed, reasoned meaning is knowledge that can be stored, transmitted, and serve as the basis for generating other knowledge. The forms of preserving knowledge (historical memory) are diverse: from myths, chronicles, and pyramids to libraries, museums, and computer databases.

Information is data about the surrounding world and the processes taking place in it, perceived by living organisms, control machines, and other information systems.

The word "information" is Latin. Over its long life its meaning has evolved, now expanding, now sharply narrowing its boundaries. Initially the word "information" meant "presentation" or "concept"; later, "intelligence" or "the transmission of messages".

In recent times, scientists have decided that the usual (generally accepted) meaning of the word "information" is too elastic and vague, and have given it a narrower meaning: "a measure of the certainty in a message".


Information theory was brought into being by the needs of practice. Its emergence is associated with Claude Shannon's work "A Mathematical Theory of Communication", published in 1948. The foundations of information theory rest on results obtained by many scientists. By the second half of the 20th century the globe was buzzing with information transmitted over telephone and telegraph cables and radio channels. Later, electronic computers appeared: information processors. At that time the main task of information theory was, first of all, to increase the efficiency of communication systems. The difficulty in designing and operating the means, systems, and channels of communication is that it is not enough for the designer and engineer to solve the problem from a physical and energy standpoint. From those points of view a system may be perfect and economical. But when creating transmission systems it is also important to pay attention to how much information will pass through them. Information can be measured quantitatively, counted. In such calculations one proceeds in the most usual way: one abstracts from the meaning of the message, just as one abandons concreteness in familiar arithmetic operations (passing from adding two apples and three apples to adding numbers in general: 2 + 3).

The scientists stated that they "completely ignored the human assessment of information". To a sequential series of 100 letters, for example, they assign a certain amount of information regardless of whether the text has meaning or any practical application. The quantitative approach is the most developed branch of information theory. By this definition, any collection of 100 letters (a 100-letter phrase from a newspaper, from a Shakespeare play, or from a theorem of Einstein's) contains exactly the same amount of information.

This quantification of information is eminently useful and practical. It exactly corresponds to the task of the communications engineer, who must transmit all the information contained in the submitted telegram, regardless of the value of this information to the addressee. The communication channel is soulless. One thing is important to the transmitting system: to transmit the required amount of information in a certain time. How do you calculate the amount of information in a particular message?


The estimation of the amount of information is based on the laws of probability theory; more precisely, it is determined through the probabilities of events. This is understandable: a message has value, carries information, only when we learn from it the outcome of an event of a random nature, when it is to some extent unexpected. A message about something already known contains no information. If, for example, someone calls you on the telephone and says, "It is light during the day and dark at night," such a message will surprise you only by the absurdity of stating the obvious, not by the news it contains. Another matter is, for example, the result of a horse race. Who will come in first? The outcome is difficult to predict. The more random outcomes an event of interest has, the more valuable the message about its result, and the more information it carries. A message about an event that has only two equally probable outcomes contains one unit of information, called a bit. The choice of this unit is not accidental: it is related to the most common, binary way of encoding information during transmission and processing. Let us try, at least in the most simplified form, to present the general principle of quantifying information, which is the cornerstone of all information theory.
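The bit defined above can be computed directly. This is a minimal sketch of Shannon's self-information formula, I = log2(1/p) bits for an outcome of probability p; the probabilities below are illustrative.

```python
import math

def self_information(p: float) -> float:
    """Information, in bits, carried by a message about an outcome of probability p."""
    return -math.log2(p)

# Two equally likely outcomes (a fair coin): the message carries one bit.
print(self_information(0.5))    # 1.0
# A less likely outcome is more surprising and carries more information.
print(self_information(0.125))  # 3.0
```

A certain outcome (p = 1) yields zero bits, matching the remark that a message about the already known contains no information.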

We already know that the amount of information depends on the probabilities of the outcomes of an event. If an event has, as scientists say, two equally probable outcomes, the probability of each outcome is 1/2: this is the probability of "heads" or "tails" when a coin is tossed. If an event has three equally probable outcomes, the probability of each is 1/3. Note that the sum of the probabilities of all outcomes always equals one: after all, one of the possible outcomes will surely occur. An event can also have unequally probable outcomes. In a football match between a strong team and a weak one, for example, the probability of the strong team winning is high, say 4/5; the probability of a draw is much smaller, say 3/20; and the probability of defeat is very small.
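Averaging self-information over all outcomes gives the Shannon entropy: the overall uncertainty of an event in bits. A minimal sketch using the coin toss and the football-match probabilities from the text:

```python
import math

def entropy_bits(probs) -> float:
    """Shannon entropy: the average uncertainty of an event, in bits."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to one"
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally likely outcomes (the coin toss) carry maximal uncertainty.
print(entropy_bits([0.5, 0.5]))  # 1.0 bit
# The lopsided football match (win 4/5, draw 3/20, loss 1/20) is far
# more predictable, so a message about its result carries less on average.
print(round(entropy_bits([4/5, 3/20, 1/20]), 3))  # ~0.884 bits
```

The guard on the sum of probabilities mirrors the observation in the text that the probabilities of all outcomes must add up to one.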

It turns out that the amount of information is a measure of the reduction of uncertainty in a given situation. Various amounts of information are transmitted through communication channels, and the amount of information passing through a channel cannot exceed its bandwidth, which is determined by how much information passes through per unit of time. One of the heroes of Jules Verne's novel "The Mysterious Island", the journalist Gideon Spilett, dictated a chapter of the Bible over the telephone so that his competitors could not use the telephone connection. In this case the channel was fully loaded, yet the amount of information transmitted was zero, since information already known to the subscriber was being sent. The channel ran idle, passing a strictly defined number of pulses without loading them with anything. Meanwhile, the more information each of a given number of pulses carries, the more fully the channel's bandwidth is used. Therefore information must be encoded intelligently; one must find an economical, sparing language for transmitting messages.

Information is "sifted" in the most careful way. In telegraphy, frequently occurring letters, combinations of letters, and even whole phrases are represented by shorter sets of zeros and ones, and rarer ones by longer sets. When the length of the codeword is reduced for frequently occurring symbols and increased for rare ones, we speak of efficient coding of information. In practice, however, it quite often happens that even a code produced by the most careful "sifting", a convenient and economical code, can distort the message because of interference, which, unfortunately, always occurs in communication channels: sound distortion on the telephone, atmospheric interference on the radio, distortion or darkening of the image in television, transmission errors in telegraphy. This interference, or noise, as experts call it, strikes at the information, and from it come the most incredible and, naturally, unpleasant surprises.
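The efficient coding described above (short codewords for frequent symbols, long ones for rare symbols) is realized by Huffman coding. A minimal sketch; the sample text and its letter frequencies are made-up examples:

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    """Build a prefix code in which frequent symbols get shorter codewords."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate case: only one distinct symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, unique tie-breaker, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two rarest subtrees, prefixing their codes with 0 and 1.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes("aaaabbc")
# 'a' occurs most often, so its codeword is the shortest.
print(codes)
```

Encoding the frequent symbol in one bit and the rare ones in two is exactly the "economical, sparing language" the text calls for.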

Therefore, to increase the reliability of the transmission and processing of information, extra characters must be introduced as a kind of protection against distortion. These extra symbols carry no actual content of the message; they are redundant. From the point of view of information theory, everything that makes a language colorful, flexible, rich in shades, multifaceted, and many-valued is redundancy. How redundant, from such a standpoint, is Tatyana's letter to Onegin! How much informational excess it contains compared to the short and understandable message "I love you"! And how informationally precise are the drawn symbols, understandable to everyone who enters the metro today, where laconic pictorial signs reading "Entrance" and "Exit" replace worded announcements.
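Protective redundancy can be sketched with the simplest such scheme, a single even-parity bit. The parity bit carries no message content of its own, but it lets the receiver detect a single-bit distortion; this is an illustrative example of the idea, not a scheme named in the text.

```python
def add_parity(bits: list) -> list:
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(bits_with_parity: list) -> bool:
    """True if no single-bit error is detected."""
    return sum(bits_with_parity) % 2 == 0

sent = add_parity([1, 0, 1, 1])  # -> [1, 0, 1, 1, 1]
print(parity_ok(sent))           # True: the message arrived intact
corrupted = sent.copy()
corrupted[2] ^= 1                # noise flips one bit in transit
print(parity_ok(corrupted))      # False: the distortion is detected
```

One redundant bit buys detection only; correcting errors requires more redundancy, which is why real channel codes add several extra symbols.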

In this regard it is useful to recall the anecdote once told by the famous American scientist Benjamin Franklin about a hat maker who invited his friends to discuss a draft of his sign. The sign was to show a hat and read: "John Thompson, hat maker, makes and sells hats for cash." One friend remarked that the words "for cash" were unnecessary: such a reminder would offend the buyer. Another found the word "sells" superfluous, since it goes without saying that a hat maker sells hats rather than giving them away. A third thought the words "hat maker" and "makes hats" a needless tautology, and the latter words were dropped. A fourth proposed dropping the words "hat maker": the drawn hat says clearly enough who John Thompson is. Finally, a fifth insisted that it made no difference to the buyer whether the hat maker was called John Thompson or something else, and suggested dispensing with the name, so that in the end nothing remained on the sign but the hat. Of course, if people used only such codes, without redundancy in their messages, all "information forms" (books, reports, articles) would be extremely short. But they would lose in clarity and beauty.

Information can be divided into types according to various criteria: by truthfulness: true and false;

by the way of perception:

Visual - perceived by the organs of vision;

Auditory - perceived by the organs of hearing;

Tactile - perceived by tactile receptors;

Olfactory - perceived by the olfactory receptors;

Gustatory - perceived by taste buds.

by the form of presentation:

Text - transmitted in the form of symbols intended to indicate the lexemes of the language;

Numeric - in the form of numbers and signs indicating mathematical operations;

Graphic - in the form of images, objects, graphs;

Sound - oral or in the form of a recording, the transmission of language lexemes by means of an auditory route.

by appointment:

Mass - contains trivial information and operates with a set of concepts understandable to most of society;

Special - contains a specific set of concepts; when it is used, information is transmitted that may not be understood by the bulk of society but is necessary and understandable within the narrow social group where it is used;

Secret - transmitted to a narrow circle of people and through closed (protected) channels;

Personal (private) - a set of information about a person that determines the social status and types of social interactions within the population.

by value:

Relevant - information that is valuable at the present moment;

Reliable - information received without distortion;

Understandable - information expressed in a language understandable to the person to whom it is intended;

Complete - information sufficient for making a correct decision or for understanding;

Useful - the usefulness of information is determined by the subject who received it, depending on the range of possibilities for its use.

The value of information in various fields of knowledge

Many systems, methods, approaches, and ideas are being developed in information theory today, and scientists believe that new trends and new ideas will continue to appear. As proof of the correctness of this assumption they cite the "living", developing nature of science and point out that information theory is being introduced surprisingly quickly and firmly into the most diverse areas of human knowledge. Information theory has penetrated physics, chemistry, biology, medicine, philosophy, linguistics, pedagogy, economics, logic, the technical sciences, and aesthetics. According to the experts themselves, the doctrine of information, which arose from the needs of communication theory and cybernetics, has stepped beyond their bounds. We may now speak of information as a scientific concept that gives researchers an information-theoretic method with which one can penetrate many sciences of animate and inanimate nature and of society, allowing one not only to look at all problems from a new side but also to see what has not yet been seen. That is why the term "information" has become so widespread in our time, entering into such concepts as information system, information culture, and even information ethics.

Many scientific disciplines use information theory to emphasize a new direction in the old sciences. Thus arose, for example, information geography, the information economy, and information law. But the term "information" has acquired especially great importance in connection with the development of the latest computer technology, the automation of mental labor, the development of new means of communication and information processing, and above all with the emergence of informatics. One of the most important tasks of information theory is the study of the nature and properties of information and the creation of methods for processing it, in particular the transformation of the most varied modern information into computer programs, through which the automation of mental work takes place: a kind of strengthening of intelligence, and therefore a development of the intellectual resources of society.

The word "information" comes from the Latin informatio, which means intelligence, clarification, familiarization. The concept of "information" is basic in the course of computer science, but it is impossible to define it through other, more "simple" concepts. The concept of "information" is used in various sciences, and in each science it is associated with a different system of concepts. Information in biology: biology studies living nature, and there the concept of "information" is associated with the purposeful behavior of living organisms. In living organisms, information is transmitted and stored by means of objects of various physical natures (e.g., the state of DNA), which are treated as signs of biological alphabets. Genetic information is inherited and stored in all the cells of living organisms. The philosophical approach: information is interaction, reflection, cognition. The cybernetic approach: information is a characteristic of a control signal transmitted over a communication line.

The role of information in philosophy

The traditionalism of the subjective has always dominated in early definitions of information as a category, concept, or property of the material world. Information exists outside our consciousness and can be reflected in our perception only as a result of interaction: reflection, reading, reception in the form of a signal or stimulus. Information is immaterial, as are all properties of matter. Information stands in the series: matter, space, time, consistency, function, etc., which are the fundamental concepts of a formalized reflection of objective reality in its distribution and variability, diversity and manifestations. Information is a property of matter and reflects its properties (state or ability to interact) and quantity (measure) through interaction.

From a material point of view, information is the order in which objects of the material world follow one another. For example, the order of letters on a sheet of paper according to certain rules is written information. The order of multi-colored dots on a sheet of paper according to certain rules is graphic information. The order of musical notes is musical information. The order of genes in DNA is hereditary information. The order of bits in a computer is computer information, and so on. For information exchange to take place, necessary and sufficient conditions are required.


The necessary conditions:

The presence of at least two different objects of the material or non-material world;

Objects have a common property that allows them to be identified as a carrier of information;

The presence of a specific property in objects that allows you to distinguish objects from each other;

The presence of a space property that allows you to determine the order of objects. For example, the placement of written information on paper is a specific property of paper that allows letters to be positioned from left to right and top to bottom.

There is only one sufficient condition: the presence of a subject capable of recognizing information. This may be a human being and human society, animal communities, robots, and so on. An information message is constructed by selecting copies of objects from a basis and arranging these objects in space in a certain order. The length of an information message is defined as the number of copies of basis objects and is always expressed as an integer. One must distinguish between the length of an information message, which is always measured as an integer, and the amount of knowledge contained in it, which is measured in an unknown unit. From a mathematical point of view, information is a sequence of integers written as a vector. The numbers are the indices of the objects in the basis of the information. The vector is called the invariant of the information, since it does not depend on the physical nature of the basis objects. One and the same information message can be expressed in letters, words, sentences, files, pictures, notes, songs, video clips, or any combination of these.
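The construction just described, a message as a vector of indices of basis objects, can be sketched directly. The basis of objects here is a made-up example; the physical nature of the objects does not matter.

```python
# A hypothetical basis of objects; any agreed-upon set would do.
BASIS = ["sun", "rain", "wind"]

def encode(message: list) -> list:
    """Map each object of the message to its index in the basis (the invariant)."""
    return [BASIS.index(obj) for obj in message]

def decode(vector: list) -> list:
    """Recover the message by selecting copies of basis objects in order."""
    return [BASIS[i] for i in vector]

vec = encode(["rain", "rain", "sun"])
print(vec)       # [1, 1, 0]
print(len(vec))  # the message length: always an integer
```

The same vector [1, 1, 0] could equally index letters, notes, or pictures, which is what makes it an invariant of the information.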


The role of information in physics

Information is data about the surrounding world (an object, process, phenomenon, or event) that is the object of transformation (including storage, transmission, etc.) and is used to develop behavior, make decisions, exercise control, or learn.

The characteristic features of the information are as follows:

It is the most important resource of modern production: it reduces the need for land, labor, and capital and cuts the consumption of raw materials and energy. For example, if you are able to archive your files (that is, if you have such information), you need not spend money on buying new floppy disks;

Information gives rise to new industries. For example, the invention of the laser beam was the reason for the emergence and development of the production of laser (optical) discs;

Information is a commodity, and the seller does not lose it after it is sold. Thus, if a student tells a friend the class schedule for the semester, he does not lose this data himself;

Information adds value to other resources, in particular labor. Indeed, an employee with a higher education is valued more than an employee with a secondary education.

As follows from the definition, three concepts are always associated with information:

The source of information is the element of the surrounding world (object, phenomenon, event) information about which is the object of transformation. Thus, the source of the information the reader of this textbook is currently receiving is informatics as a sphere of human activity;

The acquirer of information is the element of the surrounding world that uses the information (to develop behavior, make decisions, exercise control, or learn). The acquirer of this information is the reader;

A signal is a material medium that fixes information for transfer from source to acquirer. In this case the signal is electronic. If the student takes this manual from the library, the same information will have a paper carrier. Once read and memorized by the student, the information will acquire yet another carrier, a biological one, when it is "recorded" in the student's memory.

The signal is the most important element in this circuit. The forms of its presentation, as well as the quantitative and qualitative characteristics of the information contained in it, which are important for the acquirer of information, are discussed later in this section of the textbook. The main characteristics of a computer as the main tool for mapping the information source into a signal (link 1 in the figure) and “bringing” the signal to the acquirer of information (link 2 in the figure) are given in the Computer part. The structure of the procedures that implement links 1 and 2 and constitute the information process is the subject of consideration in the part of the Information process.

Objects of the material world are in a state of continuous change, which is characterized by the exchange of energy of the object with the environment. A change in the state of one object always leads to a change in the state of some other object in the environment. This phenomenon, regardless of how, which states and which particular objects have changed, can be considered as a signal transmission from one object to another. Changing the state of an object when transmitting a signal to it is called signal registration.

A signal, or a sequence of signals, forms a message that can be perceived by the recipient in one form or another and in one volume or another. Information in physics is a term that qualitatively generalizes the concepts of "signal" and "message". If signals and messages can be quantified, then signals and messages can be said to be units of measurement of the amount of information. Different systems interpret the same message (signal) in their own way. For example, one long beep followed by two short beeps is, in Morse code terminology, the letter D; in the terminology of the Award BIOS, it is a video card malfunction.
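The point that different systems interpret the same signal in their own ways can be sketched with the two interpretations named above; the lookup tables below are reduced to this single signal for illustration.

```python
# The same physical signal: one long beep followed by two short beeps.
SIGNAL = ("long", "short", "short")

# Two receiving systems, each with its own convention for the signal.
MORSE = {("long", "short", "short"): "D"}
AWARD_BIOS = {("long", "short", "short"): "video card malfunction"}

def interpret(signal, system: dict) -> str:
    """Each receiving system maps the same signal to its own meaning."""
    return system.get(signal, "unknown signal")

print(interpret(SIGNAL, MORSE))       # D
print(interpret(SIGNAL, AWARD_BIOS))  # video card malfunction
```

A system with no convention for the signal extracts nothing from it, echoing the veche-bell example later in the text.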


The role of information in mathematics

In mathematics, information theory (the mathematical theory of communication) is a branch of applied mathematics that defines the concept of information and its properties and establishes limiting relations for data transmission systems. The main branches of information theory are source coding (compression coding) and channel (noise-resistant) coding. Mathematics is more than a scientific discipline: it creates a single language for all of science.

The subjects of study in mathematics are abstract objects: numbers, functions, vectors, sets, and others. Moreover, most of them are introduced axiomatically, i.e., without any connection with other concepts and without definition.


Information is not among the subjects of study of mathematics. Nevertheless, the word "information" is used in mathematical terms: self-information and mutual information, which belong to the abstract (mathematical) part of information theory. In mathematical theory, however, the concept of "information" is associated exclusively with abstract objects, random variables, while in modern information theory the concept is treated much more broadly, as a property of material objects. The connection between these two uses of the term is undeniable. It was the mathematical apparatus of random variables that the author of information theory, Claude Shannon, used. By the term "information" he himself meant something fundamental (irreducible). Shannon's theory intuitively assumes that information has content: information reduces overall uncertainty and information entropy, and the amount of information is measurable. However, he warned researchers against mechanically transferring concepts from his theory to other areas of science.
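The mathematical term "mutual information" mentioned above measures how much knowing one random variable reduces uncertainty about another. A minimal sketch for a made-up joint distribution of two perfectly correlated binary variables:

```python
import math

def entropy(probs) -> float:
    """Shannon entropy of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint: dict) -> float:
    """I(X;Y) = H(X) + H(Y) - H(X,Y), in bits, from a joint distribution."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# Perfectly correlated binary variables: knowing Y removes all
# uncertainty about X, so one full bit of information is shared.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(joint))  # 1.0
```

For independent variables the joint entropy equals the sum of the marginals and the mutual information drops to zero, which is the abstract counterpart of a message that tells the recipient nothing new.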

"The search for ways to apply information theory in other areas of science does not reduce to a trivial transfer of terms from one area of science to another. This search is carried out in a long process of proposing new hypotheses and testing them experimentally." K. Shannon.
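Shannon's claim that the amount of information is measurable can be made concrete with a short sketch. The entropy formula below is the standard one; the example probability distributions are, of course, arbitrary.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equiprobable outcomes carry maximal uncertainty (1 bit);
# a skewed distribution carries less, so a message that resolves it
# conveys correspondingly less information.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.47
```

Receiving a message reduces this entropy, which is exactly the sense in which information "reduces overall uncertainty."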

The role of information in cybernetics

The founder of cybernetics, Norbert Wiener, spoke about information as follows:

"Information is not matter or energy; information is information." But the basic definition of information, which he gave in several of his books, is this: information is a designation of the content received by us from the outside world in the process of adapting ourselves, and our senses, to it.

Information is the basic concept of cybernetics, just as economic information is the basic concept of economic cybernetics.

There are many definitions of this term, and they are complex and contradictory. The reason, evidently, is that several different sciences deal with information as a phenomenon, and cybernetics is only the youngest of them. Information is a subject of study for such sciences as management science, mathematics, genetics, the theory of mass media (press, radio, television), and informatics, which deals with problems of scientific and technical information, and so on. Finally, philosophers have recently shown great interest in the problems of information: they tend to regard information as one of the basic universal properties of matter, associated with the concept of reflection. All interpretations of the concept of information presuppose the existence of two objects: a source of information and its acquirer (recipient); the relationship between a signal and its meaning is established by agreement. For example, a blow on the veche bell meant that one had to gather in the square, but to anyone who did not know of this arrangement, it conveyed no information at all.

In the veche-bell situation, a person party to the agreement on the meaning of the signal knows that at the given moment there are two alternatives: the veche meeting will take place, or it will not. In the language of information theory, an uncertain event (the veche) has two outcomes. The received signal reduces the uncertainty: the person now knows that the event (the veche) has only one outcome, namely that it will take place. However, if it was known in advance that the veche would take place at such and such an hour, the bell conveyed nothing new. It follows that the less probable (i.e., the more unexpected) a message is, the more information it contains, and conversely, the more likely the outcome was before the event occurred, the less information the signal carries. Reasoning of roughly this kind led in the 1940s to the emergence of the statistical, or "classical", theory of information, which defines the concept of information through a measure of the reduction of uncertainty of knowledge about the occurrence of an event (this measure was called entropy). At the origins of this science stood N. Wiener, K. Shannon, and the Soviet scientists A.N. Kolmogorov, V.A. and others; they succeeded in deriving mathematical regularities for measuring the capacity of communication channels, the capacity of information storage devices, and so on, which served as a powerful stimulus to the development of cybernetics as a science and of electronic computing technology as a practical application of its achievements.
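The bell reasoning corresponds to the standard formula for the self-information of an event, I = -log2(p): the less probable the outcome, the more bits the signal carries. The probabilities below are invented purely for illustration.

```python
import math

def self_information(p):
    """Information, in bits, conveyed by learning that an event of probability p occurred."""
    return -math.log2(p)

# Two equally likely outcomes (veche or no veche): the bell resolves 1 bit.
print(self_information(0.5))   # 1.0
# If the gathering was already almost certain beforehand, the bell says very little:
print(self_information(0.99))  # about 0.014
```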

As for determining the value and usefulness of information for the recipient, much here remains unresolved and unclear. If we proceed from the needs of economic management and, consequently, of economic cybernetics, then information can be defined as all the facts, knowledge, and messages that help solve a particular control problem (i.e., reduce the uncertainty of its outcomes). Some possibilities for evaluating information then open up: it is the more useful and valuable, the sooner or at the lower cost it leads to the solution of the problem. The concept of information is close to the concept of data. There is, however, a difference between them: data are signals from which information still has to be extracted, and data processing is the process of bringing them into a form suitable for this.

The process of transferring data from the source to the acquirer and perceiving them as information can be regarded as passing through three filters:

Physical, or statistical (purely quantitative limitation on the bandwidth of the channel, regardless of the content of the data, that is, from the point of view of syntactics);

Semantic (selection of those data that can be understood by the recipient, that is, correspond to the thesaurus of his knowledge);

Pragmatic (selection among understood information of those that are useful for solving a given problem).

This is well illustrated by a diagram taken from E.G. Yasin's book on economic information. Correspondingly, three aspects of studying the problems of information are distinguished: syntactic, semantic, and pragmatic.

In terms of content, information is subdivided into socio-political, socio-economic (including economic information), scientific and technical, and so on. In general there are many classifications of information, built on different bases. As a rule, owing to the closeness of the concepts, classifications of data are constructed in the same way. For example, information is subdivided into static (constant) and dynamic (variable), and data, correspondingly, into constants and variables. Another division is into primary, derived, and output information (data are classified the same way). A third division is into controlling and informing information; a fourth into redundant, useful, and false; a fifth into complete (continuous) and selective. Wiener's thought quoted above gives a direct indication of the objectivity of information, i.e., of its existence in nature independently of human consciousness (perception).

Modern cybernetics defines objective information as the objective property of material objects and phenomena to generate a variety of states, which, through fundamental interactions of matter, are transferred from one object (process) to another, and are imprinted in its structure. A material system in cybernetics is considered as a set of objects that themselves can be in different states, but the state of each of them is determined by the states of other objects in the system.

In nature, the set of states of a system is information, the states themselves are the primary code, or source code. Thus, each material system is a source of information. Cybernetics defines subjective (semantic) information as the meaning or content of a message.

The role of information in informatics

The subject of study of informatics is precisely data: methods of their creation, storage, processing, and transmission. Content (also "site content") is a term denoting all types of information (both textual and multimedia: images, audio, video) that make up the content of a website as displayed to a visitor. It serves to separate the information that makes up the internal structure of a page or site (its code) from that which is ultimately shown on screen.

The word "information" comes from the Latin word informatio, which means information, clarification, familiarization. The concept of "information" is basic in the course of computer science, but it is impossible to define it through other, more "simple" concepts.

The following approaches to the definition of information can be distinguished:

Traditional (everyday) - used in computer science: information is information, knowledge, messages about the state of affairs that a person perceives from the outside world with the help of the senses (sight, hearing, taste, smell, touch).

Probabilistic - used in information theory: information is information about objects and phenomena of the environment, their parameters, properties and state, which reduce the degree of uncertainty and incompleteness of knowledge about them.

Information is stored, transmitted and processed in symbolic (sign) form. The same information can be presented in different forms:

Written sign form, consisting of various signs, among which are symbolic signs in the form of text, numbers, and special characters, as well as graphic, tabular, and other forms;

The form of gestures or signals;

Oral verbal form (conversation).

Information is presented by means of languages as sign systems, which are built on the basis of a particular alphabet and have rules for performing operations on signs. A language is a particular sign system for presenting information. There exist:

Natural languages: spoken and written languages. In some cases spoken language can be replaced by the language of facial expressions and gestures, or the language of special signs (for example, road signs);

Formal languages: special languages for various areas of human activity, characterized by a rigidly fixed alphabet and stricter rules of grammar and syntax. These include the language of music (notes), the language of mathematics (numbers, mathematical signs), number systems, programming languages, etc. At the heart of any language lies an alphabet: a set of symbols or signs. The total number of symbols in an alphabet is usually called the cardinality of the alphabet.
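The cardinality of an alphabet determines how many bits a fixed-length binary code needs per symbol. A minimal sketch of that relationship:

```python
import math

def bits_per_symbol(cardinality):
    """Smallest whole number of bits that can distinguish all symbols
    of an alphabet of the given cardinality (fixed-length coding)."""
    return math.ceil(math.log2(cardinality))

print(bits_per_symbol(2))   # a binary alphabet needs 1 bit per symbol
print(bits_per_symbol(10))  # the ten decimal digits need 4 bits
print(bits_per_symbol(33))  # a 33-letter alphabet needs 6 bits
```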

A storage medium is a medium or physical body used for the transmission, storage, and reproduction of information. (Examples are electrical, light, thermal, sound, and radio signals, magnetic and laser disks, printed matter, photographs, etc.)

Information processes are processes associated with the receipt, storage, processing, and transmission of information (i.e., actions performed with information). That is, they are processes in the course of which the content of information or the form of its presentation changes.

To support an information process, a source of information, a communication channel, and an acquirer of information are needed. The source transmits (sends) the information, and the receiver receives (perceives) it. The transmitted information travels from source to receiver by means of a signal (code). It is the change of the signal that carries the information.

As an object of transformation and use, information is characterized by the following properties:

Syntax is the property that determines the way information is presented on a medium (in a signal). Thus, this information is presented on an electronic medium using a certain font. Parameters of presentation such as the style and color of the font, its size, and the line spacing can also be considered here. Which parameters must be singled out as syntactic properties is obviously determined by the intended method of transformation. For example, for a visually impaired person the font size and color are essential; if the text is to be entered into a computer through a scanner, the paper size is important;

Semantics is the property that determines the meaning of information as the correspondence of a signal to the real world. Thus, the semantics of the signal "informatics" lies in the definition given earlier. Semantics can be thought of as an agreement, known to the acquirer of the information, about what each signal means (the so-called rule of interpretation). For example, it is the semantics of signals that a novice motorist studies when learning the rules of the road and road signs (here the signs themselves serve as the signals). A learner of a foreign language studies the semantics of words (signals). One could say that the point of studying informatics is to learn the semantics of various signals, i.e., the essence of the key concepts of this discipline;

Pragmatics is the property that determines the influence of information on the behavior of its acquirer. Thus, the pragmatics of the information received by the reader of this textbook consists, at the very least, in successfully passing the computer science exam. One would like to believe that the pragmatics of this work will not be limited to that, and that it will serve the reader's further study and professional activity.

It should be noted that signals differing in syntax may have the same semantics. For example, the signals "computer" and "computing machine" both denote an electronic device for transforming information; in this case one usually speaks of the synonymy of signals. On the other hand, one signal (i.e., information with one and the same syntactic property) can have different pragmatics for different consumers, and different semantics. Thus, the road sign known as a "brick", which has a quite definite semantics ("no entry"), means a ban on entry for a motorist but does not affect a pedestrian in any way. At the same time, the signal "key" can have different semantics: a treble clef, a spring (a source of water), a key that opens a lock, or a key used in computer science to encode a signal in order to protect it from unauthorized access (in this case one speaks of the homonymy of the signal). There are also signals that are antonyms, with opposite semantics: for example, "cold" and "hot", "fast" and "slow", etc.

The subject of study of the science of informatics is precisely data: methods of their creation, storage, processing, and transmission. The information itself recorded in the data, its meaningful content, is of interest to users of information systems, who are specialists in various sciences and fields of activity: a physician is interested in medical information, a geologist in geological information, a businessman in commercial information, and so on (including the computer scientist, who is interested in information on working with data).

Semiotics - the science of information

Information cannot be imagined without its receipt, processing, transmission, and so on, that is, outside the framework of information exchange. All acts of information exchange are carried out by means of symbols or signs, with the help of which one system acts on another. Therefore the principal science studying information is semiotics, the science of signs and sign systems in nature and society (the theory of signs). In every act of information exchange one can identify three "participants", three elements: the sign, the object it designates, and the recipient (user) of the sign.

Depending on which elements' relations are considered, semiotics is divided into three sections: syntactics, semantics, and pragmatics. Syntactics studies signs and the relations between them, abstracting from the content of the sign and from its practical meaning for the recipient. Semantics studies the relations between signs and the objects they designate, abstracting from the recipient of the signs and from their value for him; clearly, studying the regularities of the semantic representation of objects in signs is impossible without taking into account and using the general regularities of the construction of any sign systems, which syntactics studies. Pragmatics studies the relations between signs and their users. Within pragmatics, all the factors distinguishing one act of information exchange from another are studied, along with all questions of the practical results of using information and its value for the recipient.

In doing so, many aspects of the relations of signs to one another and to the objects they designate are inevitably touched upon. Thus the three sections of semiotics correspond to three levels of abstraction from the features of specific acts of information exchange. The study of information in all its diversity corresponds to the pragmatic level. Abstracting from the recipient of the information, excluding him from consideration, we pass to studying it at the semantic level. Abstracting from the content of signs, we transfer the analysis of information to the level of syntactics. This interpenetration of the main sections of semiotics, associated with different levels of abstraction, can be represented by the scheme "Three sections of semiotics and their interrelation". Information is measured, correspondingly, in three aspects: syntactic, semantic, and pragmatic. The need for such different measurements of information, as will be shown below, is dictated by the practice of designing and organizing the work of information systems. Consider a typical production situation.

At the end of a shift, the site planner prepares data on the fulfillment of the production schedule. These data are sent to the information and computing center (ICC) of the enterprise, where they are processed and issued to managers in the form of reports on the current state of production. On the basis of the data received, the shop manager decides whether to change the production plan for the next planning period or to take other organizational measures. Obviously, for the shop manager the amount of information contained in the summary depends on the economic effect obtained from using it in decision-making, on how useful the information received was. For the site planner, the amount of information in the same message is determined by the accuracy of its correspondence to the actual state of affairs on the site and by the degree of unexpectedness of the reported facts: the more unexpected they are, the sooner management must be informed of them, and the more information the message contains. For the employees of the ICC, the number of characters, the length of the message carrying the information, is of paramount importance, since it determines the load on the computers and communication channels. At the same time they are interested in neither the usefulness of the information nor any quantitative measure of its semantic value.

Naturally, when organizing a production management system and building a model for decision-making, we will use the usefulness of information as the measure of the informativeness of messages. When building an accounting and reporting system that provides management with data on the progress of the production process, the novelty of the information received should be taken as the measure of the amount of information. Organizing procedures for the mechanical processing of information, however, requires measuring the volume of messages as the number of characters processed. These three essentially different approaches to measuring information do not contradict or exclude one another. On the contrary, by measuring information on different scales, they allow a fuller and more comprehensive assessment of the information content of each message and a more effective organization of the production management system. In the apt words of Prof. N.E. Kobrinsky, when it comes to the rational organization of information flows, the quantity, novelty, and usefulness of information are as interconnected as the quantity, quality, and cost of products in production.

Information in the material world

Information is one of the general concepts associated with matter. Information exists in any material object in the form of the variety of its states, and it is transmitted from object to object in the process of their interaction. The existence of information as an objective property of matter follows logically from the known fundamental properties of matter: structure, continuous change (motion), and the interaction of material objects.

The structural nature of matter manifests itself as the internal articulation of a whole, a regular order of connection of elements within the whole. In other words, any material object, from a subatomic particle to the Metagalaxy as a whole, is a system of interconnected subsystems. Owing to continuous motion, understood broadly as movement in space and development in time, material objects change their states. The states of objects also change in interactions with other objects. The set of states of a material system and of all its subsystems represents information about the system.

Strictly speaking, owing to the inexhaustibility and infinity of the properties of structure, the amount of objective information in any material object is infinite. This information is called complete. It is, however, possible to single out structural levels with finite sets of states. Information existing at a structural level with a finite number of states is called partial. For partial information the concept of the amount of information is meaningful.

The choice of a unit of measurement for the amount of information follows from the above in a logical and simple way. Imagine a system that can be in only two equiprobable states. Assign one of them the code "1" and the other the code "0". This is the minimum amount of information the system can contain. It is the unit of measurement of information and is called the bit. There are other, harder-to-define methods and units for measuring the amount of information.

Depending on the material form of the carrier, information is of two main kinds: analog and discrete. Analog information changes continuously in time and takes values from a continuum of values. Discrete information changes at certain moments of time and takes values from a certain set of values. Any material object or process is a primary source of information. All its possible states make up the code of the information source, and the instantaneous value of a state is represented as a symbol ("letter") of this code. For information to be transmitted from one object to another, the receiver, there must be some intermediate material carrier interacting with the source. In nature such carriers are, as a rule, rapidly propagating processes of a wave structure: cosmic, gamma, and X-rays, electromagnetic and sound waves, and the potentials (and perhaps waves not yet discovered) of the gravitational field. When electromagnetic radiation interacts with an object, as a result of absorption or reflection its spectrum changes, i.e., the intensities of certain wavelengths change. The harmonics of sound vibrations likewise change during interactions with objects. Information is also transmitted in mechanical interaction; however, mechanical interaction as a rule leads to large changes in the structure of the objects (up to their destruction), and the information is greatly distorted. Distortion of information during its transmission is called disinformation.

The transfer of source information to the structure of a carrier is called encoding; in this process the source code is converted into a carrier code. A carrier with the source code transferred onto it in the form of a carrier code is called a signal. The receiver of a signal has its own set of possible states, called the receiver code. Interacting with the receiver object, a signal changes its states; the process of converting the signal code into the receiver code is called decoding. The transmission of information from a source to a receiver can be regarded as an information interaction. Information interaction is fundamentally different from other interactions. In all other interactions of material objects there is an exchange of matter and (or) energy: one of the objects loses matter or energy and the other gains it. This property of interactions is called symmetry. In an information interaction, the receiver gains information while the source loses nothing; information interaction is asymmetric. Objective information is not itself material; it is a property of matter, like structure or motion, and exists on material carriers in the form of its codes.
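The chain just described (source code, carrier code as the signal, receiver code, decoding) can be sketched with a toy example. The states and codes below are invented purely for illustration.

```python
# Toy source with two states; the carrier code is a string of bits.
def encode(source_state):
    """Transfer the source state onto the carrier (source code -> carrier code)."""
    return "1" if source_state == "on" else "0"

def decode(signal):
    """Convert the carrier code into the receiver's own states (decoding)."""
    return {"0": "lamp dark", "1": "lamp lit"}[signal]

for state in ("off", "on"):
    signal = encode(state)  # the carrier bearing the source code is the signal
    print(state, "->", signal, "->", decode(signal))
```

Note the asymmetry described above: after the exchange the receiver holds the information, while the source has lost nothing.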

Information in nature

Living nature is complex and diverse. Its sources and receivers of information are living organisms and their cells. An organism possesses a number of properties that distinguish it from inanimate material objects.

Basic:

Continuous exchange of matter, energy and information with the environment;

Irritability, the ability of the organism to perceive and process information about changes in the environment and in the internal medium of the organism;

Excitability, the ability to respond to stimuli;

Self-organization, manifested as changes in the organism to adapt to the conditions of the external environment.

An organism considered as a system has a hierarchical structure. Relative to the organism itself, this structure is subdivided into internal levels: the molecular, the cellular, the organ level, and finally the organism itself. But the organism also participates in supra-organismal living systems, whose levels are the population, the ecosystem, and living nature as a whole (the biosphere). Flows not only of matter and energy but also of information circulate among all these levels. Information interactions in living nature occur in the same way as in inanimate nature. At the same time, in the course of evolution living nature has created a great variety of sources, carriers, and receivers of information.

The reaction to the influences of the outside world is manifested in all organisms, since it is due to irritability. In higher organisms, adaptation to the external environment has the character of a complex activity, which is effective only with sufficiently complete and timely information about the environment. Their receivers of information from the external environment are the sense organs, which include vision, hearing, smell, taste, touch and the vestibular apparatus. In the internal structure of organisms there are numerous internal receptors associated with the nervous system. The nervous system consists of neurons, the processes of which (axons and dendrites) are analogous to information transmission channels. The main organs providing storage and processing of information in vertebrates are the spinal cord and the brain. In accordance with the characteristics of the sense organs, the information perceived by the body can be classified as visual, auditory, gustatory, olfactory and tactile.

Falling on the retina of the human eye, a signal excites its constituent cells in a particular way. The nerve impulses of the cells are transmitted through axons to the brain, and the brain remembers this sensation as a certain combination of states of its constituent neurons. (The example is continued in the section "Information in human society".) Accumulating information, the brain creates on its own structure a connected information model of the surrounding world. In living nature, an important characteristic of information for the organism receiving it is its availability. The amount of information that the human nervous system is capable of delivering to the brain when reading text is about 1 bit per 1/16 of a second.

The study of organisms is hindered by their complexity. The abstraction of structure as a mathematical set, acceptable for inanimate objects, is hardly acceptable for a living organism, because to create a more or less adequate abstract model of an organism one must take into account all the hierarchical levels of its structure. It is therefore difficult to introduce a measure of the amount of information, and very difficult to determine the connections between the components of the structure. If it is known which organ is the source of the information, then what is the signal and what is the receiver?

Before the advent of computing machines, biology, which studies living organisms, used only qualitative, i.e., descriptive, models. In a qualitative model it is practically impossible to take the information links between the components of a structure into account. Electronic computing technology has made it possible to apply new methods in biological research, in particular the method of machine modeling, which involves a mathematical description of known phenomena and processes occurring in the organism, the addition of hypotheses about certain unknown processes, and the calculation of possible variants of the organism's behavior. The resulting variants are then compared with the real behavior of the organism, which makes it possible to determine the truth or falsity of the hypotheses put forward. Information interaction can also be taken into account in such models. The information processes that ensure the existence of life itself are extremely complex. And although it is intuitively clear that this property is directly connected with the formation, storage, and transmission of complete information about the structure of the organism, an abstract description of the phenomenon until recently seemed impossible. Nevertheless, the information processes that ensure the existence of this property have been partially revealed thanks to the deciphering of the genetic code and the reading of the genomes of various organisms.

Information in human society

The development of matter in the process of motion is directed toward the complication of the structure of material objects. One of the most complex structures is the human brain. So far it is the only structure known to us that possesses the property man himself calls consciousness. Speaking of information, we, as thinking beings, assume a priori that information, besides being present in the form of the signals we receive, also has meaning. Forming in his consciousness a model of the surrounding world as an interconnected set of models of its objects and processes, a person operates with semantic concepts rather than with information. Meaning is the essence of any phenomenon, which does not coincide with the phenomenon itself and connects it with the wider context of reality. The word itself directly indicates that the semantic content of information can be formed only by thinking receivers of information. In human society it is not information itself that is decisive but its semantic content.

Example (continued). Having experienced such a sensation, a person assigns the concept "tomato" to the object and the concept "red" to its state. In addition, his consciousness fixes the connection "tomato - red". This is the meaning of the received signal. (The example is continued later in this section.) The ability of the brain to create semantic concepts and connections between them is the basis of consciousness. Consciousness can be regarded as a self-developing conceptual model of the surrounding world. Meaning is not information. Information exists only on a material carrier, whereas human consciousness is considered immaterial. Meaning exists in a person's consciousness in the form of words, images, and sensations. A person can pronounce words not only aloud but also "to himself"; he can also "silently" create (or recall) images and sensations. He can, however, recover the information corresponding to this meaning by speaking or writing the words.

Example (continued). If the words "tomato" and "red" are the meaning of the concepts, then where is the information? The information is contained in the brain in the form of certain states of its neurons. It is also contained in the printed text consisting of these words: when the letters are encoded in a three-bit binary code, its amount is 120 bits. If the words are spoken aloud, there will be much more information, but the meaning will remain the same. A visual image carries the largest amount of information; this is reflected even in folklore: "better to see once than to hear a hundred times." Information recovered in this way is called semantic information, since it encodes the meaning of some primary information (semantics). Hearing (or seeing) a phrase uttered (or written) in a language he does not know, a person receives information but cannot determine its meaning. Therefore, to transmit the semantic content of information, certain agreements between the source and the receiver on the semantic content of the signals, i.e., the words, are required. Such agreements can be reached through communication, which is one of the essential conditions for the existence of human society.
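The contrast drawn above (the same meaning, very different amounts of information depending on the form) can be sketched with a simple count under an assumed fixed-width character code. The 8 bits per character and the audio parameters below are assumptions for illustration, not the code behind the example's figure of 120 bits.

```python
def encoded_length_bits(text, bits_per_char):
    """Length of a message under an assumed fixed-width character code."""
    return len(text) * bits_per_char

phrase = "red tomato"
print(encoded_length_bits(phrase, 8))  # 80 bits under an assumed 8-bit code

# A spoken recording of the same phrase, e.g. one second of 8 kHz, 8-bit audio,
# would take 64000 bits: far more information, yet the same meaning.
print(8000 * 8)  # 64000
```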

In the modern world, information is one of the most important resources and, at the same time, one of the driving forces of the development of human society. Information processes occurring in the material world, living nature and human society are studied (or at least taken into account) by all scientific disciplines from philosophy to marketing. The increasing complexity of scientific research tasks has led to the need to involve large teams of scientists of different specialties in their solution. Therefore, almost all theories discussed below are interdisciplinary. Historically, two complex branches of science - cybernetics and informatics - are directly involved in information research.

Modern cybernetics is a multidisciplinary branch of science that investigates very complex systems, such as:

Human Society (Social Cybernetics);

Economics (economic cybernetics);

Living organism (biological cybernetics);

The human brain and its function is consciousness (artificial intelligence).

Informatics, which emerged as a science in the middle of the last century, separated from cybernetics and is engaged in research on methods of obtaining, storing, transmitting and processing semantic information. Both branches use several fundamental scientific theories, including information theory and its sections: coding theory, the theory of algorithms and the theory of automata. Studies of the semantic content of information are based on a complex of scientific theories under the general name of semiotics. Information theory is a complex, mainly mathematical theory that includes the description and assessment of methods for extracting, transmitting, storing and classifying information. It treats information carriers as elements of an abstract (mathematical) set and the interactions between carriers as ways of arranging elements in this set. This approach makes it possible to describe an information code formally, that is, to define an abstract code and study it by mathematical methods. For these studies, information theory uses probability theory, mathematical statistics, linear algebra, game theory and other mathematical theories.

The foundations of this theory were laid by the American scientist R. Hartley in 1928, who determined the measure of the amount of information for certain communication problems. Later the theory was significantly developed by the American scientist C. Shannon and the Russian scientists A. N. Kolmogorov, V. M. Glushkov and others. Modern information theory includes as sections coding theory, the theory of algorithms, the theory of digital automata (see below) and some others. There are also alternative information theories, for example the "qualitative information theory" proposed by the Polish scientist M. Mazur. Every person is familiar with the concept of an algorithm, without even knowing it. Here is an example of an informal algorithm: "Cut the tomatoes into circles or slices. Put chopped onion on them, pour over with vegetable oil, then sprinkle with finely chopped paprika and mix. Before serving, sprinkle with salt, put in a salad bowl and garnish with parsley." (Tomato salad.)

The first rules in human history for solving arithmetic problems were developed by one of the famous scientists of antiquity, Al-Khwarizmi, in the 9th century AD. In his honor, formalized rules for achieving a goal are called algorithms. The subject of the theory of algorithms is finding methods for constructing and evaluating effective (including universal) computational and control algorithms for information processing. To substantiate such methods, the theory of algorithms uses the mathematical apparatus of information theory. The scientific concept of the algorithm as a method of information processing was introduced in the works of E. Post and A. Turing in the 1930s (the Turing machine). A great contribution to the development of the theory of algorithms was made by the Russian scientists A. Markov (Markov's normal algorithm) and A. Kolmogorov. The theory of automata is a branch of theoretical cybernetics that studies mathematical models of actually existing or fundamentally possible devices that process discrete information at discrete moments of time.

The concept of an automaton originated in the theory of algorithms. If there are universal algorithms for solving computational problems, then there must be devices (albeit abstract ones) for implementing such algorithms. In fact, the abstract Turing machine considered in the theory of algorithms is at the same time an informally defined automaton. The theoretical substantiation of the construction of such devices is the subject of the theory of automata, which uses the apparatus of mathematical theories: algebra, mathematical logic, combinatorial analysis, graph theory, probability theory, etc. The theory of automata, together with the theory of algorithms, is the main theoretical basis for creating electronic computers and automated control systems. Semiotics is a complex of scientific theories that study the properties of sign systems. The most significant results have been achieved in one of its sections, semantics, whose subject of research is the semantic content of information.

A sign system is a system of concrete or abstract objects (signs, words), with each of which a certain meaning is associated in a certain way. The theory shows that there can be two such correspondences. The first type of correspondence directly determines the material object that a given word denotes and is called the denotation (or, in some works, the nominatum). The second type of correspondence determines the meaning of a sign (word) and is called the concept. Such properties of correspondences as "meaning", "truth", "definability", "entailment" and "interpretation" are investigated, using the apparatus of mathematical logic and mathematical linguistics. The ideas of semantics, outlined by G. W. Leibniz and F. de Saussure, were formulated and developed by C. S. Peirce (1839-1914), C. Morris (1901-1979), R. Carnap (1891-1970) and others. The main achievement of the theory is the creation of an apparatus of semantic analysis that makes it possible to represent the meaning of a text in a natural language as a record in a formalized semantic language. Semantic analysis is the basis for creating devices (programs) for machine translation from one natural language to another.

Information is stored by transferring it to some material carrier. Semantic information recorded on a material storage medium is called a document. Humanity learned to store information long ago. The most ancient forms of storing information used the arrangement of objects: shells and stones in the sand, knots on a rope. A significant development of these methods was writing - the graphic depiction of symbols on stone, clay, papyrus and paper. Of great importance in the development of this direction was the invention of printing. Over its history, mankind has accumulated an enormous amount of information in libraries, archives, periodicals and other written documents.

At present, storage of information in the form of sequences of binary symbols is of particular importance. A variety of storage devices are used to implement these methods; they are the central link of information storage systems. In addition, such systems use information retrieval tools (search systems), means of obtaining references (information and reference systems) and means of displaying information (output devices). Information systems formed for these purposes include databases, data banks and knowledge bases.

The transfer of semantic information is the process of its spatial transfer from the source to the recipient (addressee). A person learned to transmit and receive information even before storing it. Speech is a method of transmission that our distant ancestors used in direct contact (conversation) - we still use it now. To transmit information over long distances, it is necessary to use much more complex information processes. To carry out such a process, information must be formalized (presented) in some way. To represent information, various sign systems are used - sets of pre-agreed semantic symbols: objects, pictures, written or printed words of a natural language. Semantic information about an object, phenomenon or process presented with their help is called a message.

Obviously, in order to transmit a message over a distance, information must be transferred to some kind of mobile carrier. Carriers can move through space using vehicles, as is the case with letters sent by mail. This method ensures complete reliability of the information transfer, since the addressee receives the original message, but it takes a significant amount of time. Since the middle of the 19th century, methods of transmitting information using a naturally propagating carrier - electromagnetic oscillations (electrical oscillations, radio waves, light) - have become widespread. Implementing these methods requires:

Preliminary transfer of the information contained in the message to the carrier - encoding;

Ensuring the transmission of the signal thus obtained to the addressee via a special communication channel;

Reverse transformation of the signal code into the message code - decoding.

The use of electromagnetic carriers makes the delivery of a message to the addressee almost instantaneous, but requires additional measures to ensure the quality (reliability and accuracy) of the transmitted information, since real communication channels are subject to natural and man-made interference. Devices that implement the data transmission process form communication systems. Depending on the method of presenting information, communication systems can be subdivided into sign systems (telegraph, fax), sound systems (telephone), video and combined systems (television). The most developed communication system of our time is the Internet.

Data processing

Since information is not material, its processing consists of various transformations. Any transfer of information from one carrier to another can be regarded as processing. Information subjected to processing is called data. The main type of processing of primary information received by various devices is transformation into a form that can be perceived by the human sense organs. Thus, photographs of space taken in X-rays are converted into ordinary color photographs using special spectrum converters and photographic materials. Night-vision devices convert infrared (thermal) images into visible ones. Some communication and control tasks require the conversion of analog information, for which analog-to-digital and digital-to-analog signal converters are used.

The most important type of processing of semantic information is determining the meaning (content) contained in a message. Unlike primary information, semantic information has no statistical characteristics, that is, no quantitative measure: the meaning is either there or it is not, and how much of it there is, if any, cannot be established. The meaning of a message is described in an artificial language that reflects the semantic connections between the words of the source text. The dictionary of such a language, called a thesaurus, resides in the receiver of the message. The meaning of the words and phrases of a message is determined by assigning them to certain groups of words or phrases whose meaning has already been established. The thesaurus thus makes it possible to establish the meaning of a message and, at the same time, is replenished with new semantic concepts. This type of information processing is used in information retrieval systems and machine translation systems.

One of the widespread types of information processing is the solution of computational and automatic control problems using computers. Information processing is always done for some purpose, and to achieve it the order of actions on the information leading to that goal must be known. This procedure is called an algorithm. Besides the algorithm itself, a device is needed that implements it; in scientific theories such a device is called an automaton. The most important feature of information is that, owing to the asymmetry of information interaction, new information appears during processing while the original information is not lost.

Analog and digital information

Sound consists of wave vibrations in a medium, for example in air. When a person speaks, the vibrations of the vocal cords are converted into wave vibrations of the air. If we consider sound not as a wave but as vibrations at a single point, these vibrations can be represented as air pressure changing over time. A microphone can capture the pressure changes and convert them into electrical voltage: the air pressure has been converted into voltage fluctuations.

Such a transformation can occur according to various laws; most often it is linear, for example:

U(t) = K * (P(t) - P_0),

where U(t) is the electrical voltage, P(t) is the air pressure, P_0 is the average air pressure, and K is the conversion factor.

Both electrical voltage and air pressure are continuous functions of time. The functions U(t) and P(t) carry information about the vibrations of the vocal cords. These functions are continuous, and such information is called analog. Music is a special case of sound, and it too can be represented as a function of time; this is an analog representation of music. But music is also written down in the form of notes. Each note has a duration that is a multiple of a predetermined unit and a pitch (do, re, mi, fa, sol, etc.). If these data are converted into numbers, we obtain a digital representation of the music.

Human speech is also a special case of sound and can likewise be represented in analog form. But just as music can be broken down into notes, speech can be broken down into letters. If each letter is assigned its own set of numbers, we obtain a digital representation of speech. The difference between analog and digital information is that analog information is continuous while digital information is discrete. Conversion of information from one type to another is named according to the kind of conversion: simple conversions are called just that - "conversion", e.g. digital-to-analog or analog-to-digital conversion; complex transformations are called "coding", e.g. delta coding or entropy coding; conversions between characteristics such as amplitude, frequency or phase are called "modulation", e.g. amplitude-frequency modulation or pulse-width modulation.
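The analog-to-digital conversion described above can be sketched in a few lines of Python. The sampling rate, bit depth, tone frequency and the assumption that the signal stays within ±1 volt are illustrative choices for the example, not values from the text:

```python
import math

def sample_and_quantize(signal, duration_s, sample_rate_hz, bits):
    """Sample a continuous signal (a function of time, in volts) at a
    fixed rate, then quantize each sample to an integer code.
    Assumes the signal stays within [-1.0, 1.0] volts."""
    levels = 2 ** bits                  # number of quantization levels
    n = int(duration_s * sample_rate_hz)
    samples = []
    for i in range(n):
        t = i / sample_rate_hz          # sampling instant
        u = signal(t)                   # continuous (analog) value
        # map [-1, 1] onto the integer codes 0 .. levels-1
        code = int((u + 1.0) / 2.0 * (levels - 1))
        samples.append(code)
    return samples

# A 440 Hz tone playing the role of the "analog" voltage U(t)
def tone(t):
    return math.sin(2 * math.pi * 440 * t)

digital = sample_and_quantize(tone, duration_s=0.001, sample_rate_hz=8000, bits=8)
print(digital)  # eight 8-bit integer codes for the first millisecond
```

The continuous function goes in, a discrete list of integers comes out: exactly the analog-to-digital step the text describes.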

Usually analog conversions are quite simple, and various devices invented by man cope with them easily. A tape recorder converts the magnetization of the tape into sound and sound into magnetization of the tape; a video camera converts light into magnetization of the tape; an oscilloscope converts an electric voltage or current into an image, and so on. Converting analog information into digital form is much harder, and some conversions the machine fails at or manages only with great difficulty - for example, converting speech to text, or converting a concert recording into sheet music. Even converting what is digital by nature is difficult: it is very hard for a machine to turn text on paper into the same text in computer memory.

Why then use a digital representation of information if it is so difficult? The main advantage of digital information over analog is noise immunity. When digital information is copied, it is copied as-is and can be copied an almost infinite number of times, whereas analog information becomes noisy during copying and its quality deteriorates. Usually analog information can be copied no more than about three times. If you have a dual-cassette audio recorder, you can perform an experiment: try re-recording the same song several times from cassette to cassette, and after several such re-recordings you will notice how much the recording quality has deteriorated, because the information on the cassette is stored in analog form. Music in mp3 format, by contrast, can be rewritten as many times as you like without the quality deteriorating, because the information in an mp3 file is stored digitally.

Amount of information

A person, or some other receiver of information, resolves some uncertainty by receiving a portion of information. Let us take the same tree as an example. When we saw the tree, we resolved a number of uncertainties: we learned the height of the tree, its species, the density of its foliage, the color of its leaves and, if it is a fruit tree, we saw its fruits and how ripe they were. Before we looked at the tree we did not know all this; after we looked at it, we resolved the uncertainty - we received information.

If we go out to a meadow and look at it, we will receive information of a different kind: how big the meadow is, how tall the grass is and what color the grass is. If a biologist comes to the same meadow, then, among other things, he will be able to find out what kinds of grasses grow there, what type of meadow it is, which flowers have bloomed and which are only about to bloom, whether the meadow is suitable for grazing cows, and so on. That is, he will receive more information than we do, since before he looked at the meadow he had more questions; the biologist will resolve more uncertainties.

The more uncertainty is resolved in the process of obtaining information, the more information we receive. But this is a subjective measure of the amount of information, and we would like an objective one. There is a formula for calculating the amount of information: given an uncertainty with N possible outcomes, each with a certain probability of occurring, the amount of information received can be calculated using the formula Shannon proposed:

I = -(p_1 log2(p_1) + p_2 log2(p_2) + ... + p_N log2(p_N)), where

I is the amount of information;

N is the number of outcomes;

p_1, p_2, ..., p_N are the probabilities of the outcome.
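Shannon's formula is easy to check numerically. A minimal sketch in Python, applied to the examples discussed in this section:

```python
import math

def shannon_information(probabilities):
    """Amount of information in bits by Shannon's formula:
    I = -(p_1*log2(p_1) + ... + p_N*log2(p_N)).
    Outcomes with zero probability contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: two equiprobable outcomes -> 1 bit
print(shannon_information([0.5, 0.5]))       # 1.0

# A die: six equiprobable outcomes -> about 2.585 bits
print(shannon_information([1 / 6] * 6))

# The "Martian dinosaur": p = 1e-10 -> a vanishingly small amount
p = 1e-10
dino_bits = shannon_information([p, 1 - p])
print(dino_bits)                             # about 3.4e-9 bits
```

The rarer the resolved outcome is compared to its alternative, the less the whole observation tells us on average, which is exactly the dinosaur example below.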

The amount of information is measured in bits, an abbreviation of the English words BInary digiT, meaning binary digit.

For equiprobable events, the formula can be simplified:

I = log2(N), where

I is the amount of information;

N is the number of outcomes.

Let's take a coin as an example and drop it on the table. It will land either heads or tails. We have 2 equally probable events. After we flip the coin, we get log2(2) = 1 bit of information.

Let's try to find out how much information we get after we roll a die. A die has six faces - six equally probable events. We get log2(6) ≈ 2.6. After we roll the die on the table, we have received approximately 2.6 bits of information.

The probability that we will see a Martian dinosaur when we leave the house is one in ten billion. How much information do we get about the Martian dinosaur after we leave the house?

-((1/10^10) log2(1/10^10) + (1 - 1/10^10) log2(1 - 1/10^10)) ≈ 3.4 · 10^-9 bits.

Let's say we toss 8 coins. There are 2^8 ways the coins can fall. So after tossing the coins we get log2(2^8) = 8 bits of information.
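The coin, die and 8-coin examples above can be verified with the simplified formula for equiprobable events, I = log2(N):

```python
import math

def info_equiprobable(n_outcomes):
    """I = log2(N) for N equally probable outcomes."""
    return math.log2(n_outcomes)

print(info_equiprobable(2))       # coin: 1.0 bit
print(info_equiprobable(6))       # die: ~2.585 bits
print(info_equiprobable(2 ** 8))  # 8 coins: 8.0 bits
```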

When we ask a question and can equally likely get an answer "yes" or "no", then after answering the question we get one bit of information.

Surprisingly, if we apply Shannon's formula to analog information, we get an infinite amount of information. For example, the voltage at a point of an electrical circuit can take any value from zero to one volt with equal probability. The number of outcomes is infinite, and substituting this into the formula for equiprobable events we get infinity - an infinite amount of information.

Now I will show how to encode "War and Peace" with just one mark on a metal rod. Let us encode every letter and sign occurring in "War and Peace" with a two-digit number - there should be enough of them. For example, we give the letter "A" the code "00", the letter "B" the code "01", and so on for the punctuation marks, letters and digits. Recoding "War and Peace" with this code produces a long number, for example 70123856383901874..., and prepending "0." gives a number between zero and one (0.70123856383901874...). Now let us scratch a mark on a metal rod so that the ratio of the length of the rod's left part (up to the mark) to the length of the whole rod equals exactly this number. If we later want to read "War and Peace", we simply measure the left part of the rod and the length of the whole rod, divide one by the other, obtain the number and recode it back into letters ("00" into "A", "01" into "B", etc.).
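The digit-pair half of this thought experiment can be sketched in Python. The alphabet and two-digit codes below are an illustrative choice for the example (the text does not fix an actual code table), and only the encoding into a number between zero and one is shown, not the physical rod:

```python
# Every character gets a two-digit code; the codes are glued together
# into one number 0.xxxx..., and decoding splits the digits back into
# pairs. A 28-character toy alphabet, not the full set a real book needs.
ALPHABET = "abcdefghijklmnopqrstuvwxyz ."
TO_CODE = {ch: f"{i:02d}" for i, ch in enumerate(ALPHABET)}
FROM_CODE = {code: ch for ch, code in TO_CODE.items()}

def encode(text):
    """Turn text into a decimal fraction, written as a string."""
    return "0." + "".join(TO_CODE[ch] for ch in text)

def decode(number_string):
    """Split the fractional digits back into two-digit codes."""
    digits = number_string[2:]                      # drop the "0."
    pairs = [digits[i:i + 2] for i in range(0, len(digits), 2)]
    return "".join(FROM_CODE[p] for p in pairs)

n = encode("war and peace")
print(n)          # one number between zero and one
print(decode(n))  # -> "war and peace"
```

The round trip works perfectly on strings of digits; as the text goes on to explain, it is the infinitely precise physical measurement that is impossible.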

In reality we cannot do this, because we cannot measure lengths with infinite precision. Engineering problems prevent us from increasing measurement accuracy indefinitely, and quantum physics shows that beyond a certain limit quantum laws will interfere. Intuitively we understand that the lower the measurement accuracy, the less information we receive, and the higher the accuracy, the more information we receive. Shannon's formula is not suitable for measuring the amount of analog information, but there are other methods for this, discussed in information theory. In computer technology, a bit corresponds to the physical state of the information carrier: magnetized - not magnetized, hole - no hole, charged - not charged, reflects light - does not reflect light, high electrical potential - low electrical potential. One state is usually denoted by the digit 0 and the other by the digit 1. Any information - text, images, sound, etc. - can be encoded as a sequence of bits.

Along with the bit, a unit called the byte is often used; it is usually equal to 8 bits. While a bit allows you to choose one equally probable option out of two, a byte chooses 1 out of 256 (2^8). To measure the amount of information, larger units are also customary:

1 KB (one kilobyte) = 2^10 bytes = 1024 bytes

1 MB (one megabyte) = 2^10 KB = 1024 KB

1 GB (one gigabyte) = 2^10 MB = 1024 MB

Strictly speaking, the SI prefixes kilo-, mega- and giga- should denote multipliers of 10^3, 10^6 and 10^9 respectively, but historically the practice of using multipliers that are powers of two has developed.
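A small helper, assuming the binary units defined above (1 KB = 1024 bytes and so on), converts a byte count into these units:

```python
def to_units(n_bytes):
    """Express a byte count in the binary units used above
    (1 KB = 1024 bytes, 1 MB = 1024 KB, 1 GB = 1024 MB)."""
    for unit in ("bytes", "KB", "MB", "GB"):
        if n_bytes < 1024 or unit == "GB":
            return f"{n_bytes:g} {unit}"
        n_bytes /= 1024  # step up to the next unit

print(to_units(512))        # 512 bytes
print(to_units(1024))       # 1 KB
print(to_units(3 * 2**20))  # 3 MB
```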

A Shannon bit and a bit used in computer technology coincide if the probabilities of a zero or a one occurring in the computer bit are equal. If the probabilities are unequal, the Shannon amount of information is smaller - we saw this in the example of the Martian dinosaur. The computer measure of the amount of information thus gives an upper estimate. After power is applied, volatile memory is usually initialized to some value, for example all ones or all zeros. It is clear that right after power-up the memory contains no information, since the values in the memory cells are strictly defined and there is no uncertainty: the memory can store a certain amount of information, but immediately after power-up it holds none.

Disinformation is knowingly false information provided to an enemy or business partner for more effective conduct of hostilities or cooperation, for checking for information leaks and the direction of those leaks, or for identifying potential clients. The process of manipulating information - misleading someone by providing incomplete information, or complete but no longer needed information, distorting the context, or distorting part of the information - is also called disinformation.

The goal of such influence is always the same: the opponent must act as the manipulator needs. The action of the target of the disinformation may consist in making the decision the manipulator needs or in refusing to make a decision unfavorable to the manipulator. In either case, the ultimate goal is the action the opponent will take.

Disinformation is thus a product of human activity, an attempt to create a false impression and, accordingly, to push toward desired actions and/or inaction.

Types of disinformation:

Misleading a specific person or group of persons (including a whole nation);

Manipulation (by the actions of one person or a group of persons);

Creation of public opinion about a problem or object.

Misleading is nothing more than outright deception, the provision of false information. Manipulation is a method of influence aimed directly at changing the direction of people's activity. The following levels of manipulation are distinguished:

Strengthening values (ideas, attitudes) already existing in people's minds that are beneficial to the manipulator;

Partial change of views on this or that event or circumstance;

A radical change in attitudes.

The creation of public opinion is the formation in society of a certain attitude towards the chosen problem.



Data operations

During the information process, data are converted from one type to another using certain methods. Data processing includes many different operations, among which the following can be distinguished:

1. Data collection - accumulation of information in order to ensure sufficient completeness for decision-making.

2. Data formalization - bringing data from various sources to one form.

3. Filtering data - filtering out "unnecessary" data that is not needed for making decisions.

4. Sorting data - ordering data according to a given criterion.

5. Data archiving - organization of data storage in a convenient and accessible form; serves to reduce the economic costs of data storage.

6. Data protection - a set of measures aimed at preventing the loss, reproduction and modification of data.

7. Transporting data - receiving and transmitting data between remote participants in the information process.

8. Transformation of data - transfer of data from one form to another.
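A few of the operations listed above (filtering and sorting) can be sketched on illustrative data; the records and the selection criterion below are assumptions made for the example:

```python
# Collected data: illustrative temperature records
records = [
    {"city": "Kyiv",  "temp_c": 22},
    {"city": "Lviv",  "temp_c": 18},
    {"city": "Odesa", "temp_c": 25},
]

# Filtering: keep only the records needed for a decision (warm cities)
warm = [r for r in records if r["temp_c"] >= 20]

# Sorting: order the remaining data by a given criterion
warm.sort(key=lambda r: r["temp_c"], reverse=True)

print([r["city"] for r in warm])  # ['Odesa', 'Kyiv']
```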

Information coding is the operation of converting information from one sign system to another.

Coding is the process of presenting information in a form convenient for its storage and/or transmission.

There are three main ways of encoding:

1) Graphic, using special drawings or icons;

2) Numeric - using numbers,

3) Symbolic - using alphabet symbols

The coding tool is a table of correspondence between sign systems, which establishes a one-to-one correspondence between signs or groups of signs of two different sign systems.

In the process of exchanging information, one often has to perform the operations of encoding and decoding information. For example, when an alphabet character is entered into a computer by pressing the corresponding key on the keyboard, the character is encoded, i.e. converted into computer code. When the character is displayed on a monitor screen or printer, the reverse process, decoding, occurs: the computer code is converted into the sign's graphic image.

Coding systems include human languages, alphabets, the notation of mathematical expressions, telegraph code, the naval flag alphabet, etc.

Encryption is also coding, but with a secret method known only to the source and the addressee.

Decryption is the process of converting ciphertext back into plain (source) text.

The science that deals with encryption methods is called cryptography.
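As a toy illustration of these definitions (not real cryptography), here is a Caesar cipher in which the "secret method" shared by source and addressee is a shift of the Latin alphabet:

```python
def caesar(text, shift):
    """Shift every Latin letter by `shift` positions, wrapping around
    the alphabet; other characters pass through unchanged."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

secret = caesar("attack at dawn", 3)   # encryption
print(secret)                          # dwwdfn dw gdzq
print(caesar(secret, -3))              # decryption: attack at dawn
```

Decryption is simply encryption with the opposite shift; modern ciphers are built on far stronger schemes, which is exactly what cryptography studies.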

Computer technology uses its own coding system, called binary coding. In binary coding, one binary digit carries one unit of information, called 1 bit. This system is based on representing data as a sequence of just two characters, 0 and 1 (machine code).



One bit can express two values: 0 or 1 (yes or no, true or false, signal or no signal).

With two bits, you can express four different values: 00, 01, 10, 11.

Three bits can encode eight different values:

000,001,010,011,100,101,110,111

By increasing the number of bits in the binary coding system by one, we double the number of values that can be expressed in this system:

N = 2^i,

where N is the number of independent encoded values and i is the bit depth of the binary encoding.

To encode integers from 0 to 255, 8 bits of binary code are enough (256 = 2^8):

16 bits allow you to encode integers from 0 to 65535, and 24 bits already give more than 16.7 million different values.

Binary code can also encode text information: 8 bits give 256 characters, 16 bits give 65536 characters (Unicode).

Graphics data are also encoded (a 32-bit bitmap gives full color), as is audio information.

This is one of the most important topics in computer science and is discussed in detail in the school curriculum. Knowledge of information and information processes is a prerequisite for successfully passing the Unified State Exam and for admission to the relevant university faculties; it alone lets you score 15 test points (15%) with ease. Below, such concepts as measuring the amount of information and the alphabetical and probabilistic approaches for equiprobable and non-equiprobable events are discussed in detail. Exams contain a large number of problems on this topic, and the ability to solve them is one of the requirements for applicants. For each topic of the section, in addition to detailed theoretical material, almost all possible variants of tasks for self-study are provided. In addition, you can download ready-made detailed solutions to these tasks, illustrating different ways of obtaining the correct answer.


Information

Information is information that we receive from the world around us.

Information is a general scientific concept that includes the exchange of information between people, a person and an automaton, an automaton and an automaton; exchange of signals in the animal and plant world; transmission of signs from cell to cell, from organism to organism. In the original and narrowest sense, information is an attribute of thinking beings, people: information, data, facts obtained from experience, observation or through thinking, recorded in material form for communication to other thinking beings or to oneself. Any information inevitably contains two components - meaningful (makes sense, understandable to those to whom it is intended) and material (must be presented in a tangible form on one or another physical medium).

Informatics is a science that studies the properties of information, as well as ways of presenting, accumulating, processing and transmitting information using technical means.

Information is knowledge of what the perceived information means for a given person. This definition concerns only the semantic properties of information and, moreover, for a specific person.

    Forms of existence of the world:
  • substance - a variety of material objects;
  • energy is the interaction of objects;
  • information - information about the surrounding world.
    Consideration of information in different fields of activity:
  • Information in everyday life is information about the surrounding world and the processes taking place in it.
  • Information in technology is a sequence of signs and signals.
  • Information in science is a measure of reducing the uncertainty of knowledge.
  • Information in cybernetics is the part of knowledge that is used for controlling information processes.
  • Information within the framework of semantic theory is considered as something new (novelty).
    Distinguishing information:
  • by the way of perception: visual, auditory, tactile, olfactory, gustatory;
  • by the form of presentation: text, numerical, graphic, sound, combined;
  • by public importance: public, personal, special, etc.
    Information properties:
  • Objectivity is independence from a person's opinion.

    Information does not depend on anyone's opinion, judgment.

    For example, the message "It's warm outside" is subjective information, while the message "It's 22 °C outside" is objective.

    Objective information can be obtained using properly working sensors and measuring instruments. But once it is reflected in the consciousness of a particular person, it ceases to be objective, since it is transformed (to a greater or lesser extent) depending on the experience, opinions, judgments, and other qualities of that particular subject.

  • Completeness - sufficiency for making a decision.

    Information is complete if it is sufficient to make a decision.

    For example, historical information is never complete and its completeness decreases with distance from us in the historical era.

  • Credibility is a reflection of the true state of affairs.

    Information is reliable if it reflects the true state of affairs.

    Objective information is always reliable, but reliable information can be both objective and subjective. The main mechanisms for obtaining inaccurate information: 1. deliberate distortion (disinformation); 2. distortion due to interference ("broken phone"); 3. exaggeration or understatement of a real fact (rumors, fishing and hunting stories, etc.).

    For example, historical or socio-political information is subject to all three ways of receiving and transmitting inaccurate information.

  • Adequacy - compliance with the current moment.

    Adequacy of information is a certain level of correspondence of the image created with the help of the received information to a real object, process, phenomenon, etc.

    In real life, a situation in which one can count on completely adequate information is hardly possible; there is always some degree of uncertainty. The correctness of human decision-making depends on how adequate the information is to the real state of the object or process.

    Example: You have successfully finished school and want to continue your education in an economic direction. After talking with friends, you will find out that similar training can be obtained in different universities. As a result of such conversations, you receive very contradictory information that does not allow you to make a decision in favor of one or another option, i.e. the information received is inadequate to the real state of affairs. In order to get more reliable information, you buy a reference book for applicants to universities, from which you get comprehensive information. In this case, we can say that the information you received from the reference book adequately reflects the areas of study in universities and helps you to make your final choice.

    The possibility and efficiency of using information is determined by such basic consumer quality indicators as representativeness, meaningfulness, sufficiency, availability, relevance, timeliness, accuracy, reliability, sustainability.

  • Availability is the ability to receive.

    Availability (of information, or of the resources of an automated information system) (Eng. Availability) is the state of information (or of the resources of an automated information system) in which subjects who have access rights can exercise those rights without hindrance.

    Access rights include: the right to read, change, copy, destroy information, as well as the right to change, use, destroy resources.

  • Relevance - importance at the moment.

    The relevance of information is its importance, relevance for the present.

    Timeliness of information plays an important role in an objective assessment of the situation and in the decision-making process.

    Reasons why information loses relevance:

    1. Obsolescence;
    2. Loss of usefulness.

Information processes

Information processes are actions performed on information.

    Information processes:
  • Information processing (transfer of information from one type to another according to certain rules).
  • Data storage.
  • Transfer of information.

Collection of information - search and selection according to any criteria.

    Methods of collecting information:
  • Automated (with measuring instruments).
  • Mechanized (without measuring instruments).
  • Automatic (sensors, counters, etc. are used. A person acts as an observer).
    Search for information:
  • Observation.
  • Communication with specialists.
  • Literature.
  • Television.
  • Radio.
  • Banks and databases, etc.
    Systematization of information:
  • Data libraries.
  • Photo and video archives / albums, etc.

Information encoding - converting one set of characters to another.

Registration - fixing on a medium.

Data are registered signals.

Medium - an object designed to store and transmit information.

    Types of media:
  • Human readable.
  • Machine readable.

Storage - placing information in storage for subsequent retrieval and use.

    Memory (by location):
  • Internal.
  • External.
    Memory (by duration):
  • Long-term.
  • Operational.

General information transfer scheme: source → encoder → communication channel → decoder → receiver.

Measuring the amount of information

A person perceives information in analog form, i.e. as a continuous flow. A computer processes information in discrete, or digital, form. Hence the process of sampling (discretization): splitting a continuous information flow into separate signals and signal sequences. A digital signal thus consists of a sequence of discrete values.

A bit is the smallest unit of information.

A byte is the basic unit of information.

Below is a table of units of information:

Name       Symbol  Factor (bytes)
Kilobyte   KB      2^10
Megabyte   MB      2^20
Gigabyte   GB      2^30
Terabyte   TB      2^40
Petabyte   PB      2^50
Exabyte    EB      2^60
Zettabyte  ZB      2^70
Yottabyte  YB      2^80

This table is used to convert "large" units to bytes.

1 byte = 2^3 bits = 8 bits.

For example: 2 KB = 2 * 2^10 bytes = 2 * 2^13 bits = 2^14 bits = 16384 bits (2^10 = 1024).
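The same conversions can be sketched in a few lines of Python (an illustration, not part of the original article): each unit in the table is 2^10 times the previous one, and a byte is 2^3 bits.

```python
# Unit exponents from the table above: value in bytes = count * 2^exponent.
UNITS = {"B": 0, "KB": 10, "MB": 20, "GB": 30, "TB": 40,
         "PB": 50, "EB": 60, "ZB": 70, "YB": 80}

def to_bits(value, unit):
    """Convert a value in the given unit to bits (1 byte = 2^3 bits)."""
    return value * 2 ** UNITS[unit] * 8

print(to_bits(2, "KB"))  # 2 KB = 2 * 2^10 bytes = 2^14 bits = 16384
```

Running `to_bits(2, "KB")` reproduces the worked example above: 16384 bits.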

To calculate the probability of an individual event (p_i), use the following formula:

    p_i = N_i / N

    In this formula:
  • p_i is the probability of the event.
  • N_i is the number of outcomes in which the event occurs.
  • N is the total number of possible events.

There is a formula for calculating the amount of information about one event from a set:

    I_i = log2 (1 / p_i)

    In this formula:
  • I_i is the amount of information about the event.
  • p_i is the probability of the event.
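Both formulas can be checked with a short Python sketch (the function names here are illustrative, not from the article):

```python
import math

def probability(n_i, n):
    """p_i = N_i / N: probability of an event that occurs in n_i of n outcomes."""
    return n_i / n

def information_bits(p_i):
    """I_i = log2(1 / p_i): amount of information about an event, in bits."""
    return math.log2(1 / p_i)

p = probability(1, 8)       # one of 8 equally likely events
print(information_bits(p))  # 3.0 bits
```

For equiprobable events, p_i = 1/N, and the formula reduces to log2(N), as in the guessing tasks below.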

Tasks on the topic "".

More complex tasks on the topic "".


Using the alphabetical approach fully justifies itself when technical means are used to work with information. In this case, the notions of "new vs. old" or "understandable vs. incomprehensible" information lose their meaning, because this method does not relate the amount of information to the content of the message.

When the alphabetical approach to defining information is used, the length of the code becomes important. If earlier we did not take the length of the answer into account, now, when calculating the amount of information, every character in the code, every letter in the message, carries weight.

The alphabetic approach is an objective way of measuring information as opposed to the subjective probabilistic approach.

The alphabetic approach does not consider the content of information, and messages are considered as sequences of characters of certain sign systems.

    Languages:
  • Natural (human languages).
  • Formal (man-made systems of signs, symbols are used).

To write a message in a formal language, a specific alphabet is used. In the alphabetical approach, the number of different characters used in a given alphabet is called the cardinality of the alphabet (N). It is related to the informational weight of one character by the formula:

    N = 2^i

    In this formula:
  • N is the cardinality of the alphabet.
  • i is the informational weight of one character (in bits).

From here we can express the informational weight of one symbol (i):

    i = log2 N

The information capacity of a message in the alphabetical approach can be found by the following formula:

    I = k * i

    In this formula:
  • I is the amount of information contained in the message.
  • k is the number of characters in the message.
  • i is the informational weight of one character.
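The alphabetical-approach formulas i = log2 N and I = k * i can be computed directly; the sketch below uses hypothetical function names:

```python
import math

def char_weight(alphabet_size):
    """i = log2(N): informational weight of one character, in bits."""
    return math.log2(alphabet_size)

def message_info(num_chars, alphabet_size):
    """I = k * i: information capacity of a k-character message."""
    return num_chars * char_weight(alphabet_size)

# A 10-character message over a 32-symbol alphabet: i = 5 bits, I = 50 bits.
print(message_info(10, 32))  # 50.0
```

Note that the content of the message plays no role here; only the alphabet size and the message length matter.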

Tasks on the topic "".

More complex tasks on the topic "Alphabetical approach to measuring information".

How to solve logarithms

By popular demand, I am posting a small and, in my opinion, fairly simple explanation of how to solve logarithms.

So, let's analyze the topic using a simple example: i = log2 N.

In fact, this formula answers the question: "How do we find i from the formula N = 2^i?"

Thus, when we see the expression i = log2 N, we should ask: "To what power must 2 be raised to get N?" That power is the answer; for example, if N = 4, then i = 2 (because 2 squared is 4).

Let's look at a couple more examples on this topic:

    Calculate:
  1. i = log2 16.
  2. i = log3 81.
  3. I = log2 (1/4).
  4. I = log5 (1/125).
    Solution:
  1. To what power must 2 be raised to get 16? To the 4th power (2 * 2 * 2 * 2 = 2^4 = 16). Answer: i = 4.
  2. To what power must 3 be raised to get 81? To the 4th power (3 * 3 * 3 * 3 = 3^4 = 81). Answer: i = 4.
  3. To what power must 2 be raised to get 1/4? To the power -2
    (remember: a^(-x) = 1/a^x, so 1/(2*2) = 2^(-2) = 1/4).
    Answer: I = -2.
  4. To what power must 5 be raised to get 1/125? To the power -3 (1/(5*5*5) = 5^(-3) = 1/125). Answer: I = -3.
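The four worked examples can be verified numerically (a sketch, not part of the original solutions). `math.log2` is exact for powers of two; for other bases we round away floating-point noise:

```python
import math

print(math.log2(16))                # 4.0  (example 1)
print(math.log2(1 / 4))             # -2.0 (example 3)
print(round(math.log(81, 3)))       # 4    (example 2)
print(round(math.log(1 / 125, 5)))  # -3   (example 4)
```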

Challenges and solutions

Time to move on to solving possible problems on the topic "" ...

This includes some "unusual" tasks (after all, encoding does not have to be binary ...). However, the difference is only in the number of different signals, so their solution is reduced to similar formulas.

    1. Choose the correct definitions for the term "bit":
  • Bit is the smallest unit of information measurement.
  • Bit - the amount of information equal to one eighth of a byte.
  • A bit is an amount of information that halves the uncertainty of knowledge.
  • A bit can only take two values ​​- 0 or 1.
  • Bit is the basic unit of measurement of information.
  • Bit - the amount of information required to transmit the "Yes" / "No" message.

Note: if the answer in the problem is not an integer, then select the next integer (example: if you get 2.16 bits, the answer is 3 bits).
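The rounding rule in the note above can be sketched as a small helper (an illustration with a hypothetical function name, not part of the tasks):

```python
import math

def bits_needed(n_outcomes):
    """Minimum whole number of bits to distinguish n equally likely outcomes:
    ceil(log2(n)), rounding any fractional bit count up."""
    return math.ceil(math.log2(n_outcomes))

print(bits_needed(8))    # 3  (a number from 1 to 8)
print(bits_needed(100))  # 7  (a number from 1 to 100)
print(bits_needed(50))   # 6  (50 different messages)
```

This single helper covers the equiprobable guessing tasks below (numbers, dice, signal boards, and so on).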

2. A number from 1 to 8 was thought of. What amount of information (in bits) is contained in the message about which number was thought of?

3. A six-sided die is thrown. How much information is in the message about the number that came up?

4. A number from 1 to 100 was thought of. The person who thought of the number answers every question only "Yes" or "No". What is the smallest number of questions needed to guess the number with certainty?

5. For the exchange of messages, sequences of characters of the same length are used, consisting only of the characters "A" and "B". What is the minimum length of these sequences so that they can encode any of 50 different messages?

6. The light board consists of light bulbs, each of which can be in two states ("on" or "off"). What is the smallest number of light bulbs on the scoreboard so that 200 different signals can be transmitted with it?

7. Ellochka the Cannibal (whose vocabulary, as you know, contained 30 words) utters a phrase consisting of 50 words. How much information, in bits, does Ellochka communicate?

8. The cyclocross is attended by 119 athletes. A special device registers the passage of each participant of the intermediate finish, recording his number using the minimum possible number of bits, the same for each athlete. What is the information volume in bits of the message recorded by the device after 70 cyclists have passed the intermediate finish?

9. 125 people take a rehearsal exam at the school. Each of them is assigned a special number. When a participant registers, the minimum possible number of bits, the same for each participant, is used to record his number. What is the amount of information in bits recorded by the device after 60 participants have registered?

10. For the transmission of a secret message, a code consisting of decimal digits is used. In this case, all digits are encoded with the same (minimum possible) number of bits. Determine the information size in bits of such a 150-character message.

11. The meteorological station monitors air humidity. The result of a single measurement is an integer between 0 and 100 percent, which is written using the fewest possible bits. The station made 80 measurements. Determine the information volume in bits of the measurement results.

12. To record the results of the children's game "Zarnitsa", a table is used, in each cell of which is written either the number of points received by the team in the corresponding type of competition (1, 2, 3), or a dash (if the team did not participate in this type of competition). In "Zarnitsa" 30 teams compete in 10 types of competitions. How much information in bits does the table contain?

13. Vasya sends Petya a message consisting only of uppercase and lowercase Latin letters, spaces, and the punctuation marks (. , ! ?). The message consists of 200 characters and is transmitted in 2 minutes. What is the transfer rate (bits per second)?

14. The leader of a tribe whose vocabulary contains only 64 different words delivers a fiery speech of 100 words to his fellow tribesmen in 2 minutes. What is the transfer rate (bits per second)?

15. The flag signaller uses 36 different gestures to transmit the message. The signalman transmits a message consisting of 50 gestures in 30 seconds. What is the message transmission rate (bits per second)?

16. How many kilobytes of information does a message of 2^24 bits contain?

17. How many kilobits of information does a message of 2^14 bytes contain?

18. During a cable TV broadcast, the system collects information from viewers about the movie they would like to watch. There are 4 films to choose from. The minimum possible number of bits is used to encode each wish. A total of 102,400 viewers expressed their opinion. How many kilobytes should the system analyze?

19. The data transfer rate through the ADSL connection is 128000 bps. Across this connection a 625KB file is transferred. Determine the file transfer time in seconds.

20. Sasha wants to download a 240 MB video from the Internet. The download speed is limited to 16 kilobytes per second. How many minutes does Sasha need?

21. A file is transmitted through a communication channel at a speed of 64 kilobytes per second for 10 minutes. How many megabytes are in the file?

22. The data transfer rate through the ADSL connection is 256000 bps. A file is transferred over this connection in 2 minutes. Determine the information weight of the file in kilobytes.

23. The light board consists of light bulbs. Each light can be in one of three states ("on", "off" or "blinking"). What is the smallest number of lamps that need to be on the scoreboard to transmit 27 different signals?

24. Morse code allows you to encode characters for radio communications by specifying a combination of dots and dashes. How many different characters can you encode using Morse code with a length of at least five and at most six signals?

25. Vasya and Petya send messages to each other using blue, red and green flashlights. They do this by turning on one flashlight for an equally short time in a certain sequence. The number of flashes in one message is 3 or 4. There are pauses between messages. How many different messages can boys send?

26. To transmit 300 different messages, 5 consecutive color flashes are used. Colored lamps are switched on for an equally short time in a certain sequence. What is the minimum number of different lamp colors that should be used in transmission?

27. To transmit 1000 different messages, 5 consecutive color flashes are used. Color lamps turn on for an equally short time in a certain sequence. How many different colors of lamps should be used for transmission (minimum)?

28. 12,500 perches, 25,000 minnows, 6,250 crucians and 6,250 pikes swim in the lake. How much information will we get when we catch any fish?

Note: all "individual probabilities" must add up to 1.
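Task 28 can be checked with a short sketch of the non-equiprobable formula I_i = log2(1/p_i); the counts below come from the task itself:

```python
import math

# Fish counts from task 28; the probabilities n/total must sum to 1.
counts = {"perch": 12500, "minnow": 25000, "crucian": 6250, "pike": 6250}
total = sum(counts.values())  # 50000

for fish, n in counts.items():
    p = n / total
    print(fish, math.log2(1 / p))  # perch 2.0, minnow 1.0, crucian 3.0, pike 3.0
```

The rarer the fish, the more information the message about catching it carries.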

29. After the computer science exam, grades are announced (“2”, “3”, “4” or “5”). How much information is there in the grade report for student A, who learned only half of the tickets, and the grade report for student B, who learned all the tickets?

30. There are only black, white, and gray cars in the principality. There are 18 white cars. The message that a black car got into an accident carries 7 bits of information. The message that the car in the accident was not gray carries 5 bits of information. How many black cars are there in the principality?

Information concept. Properties of information. Information processes: receiving, transferring, transforming and storing information

Information is one of the basic concepts of science. Along with such concepts as matter, energy, space and time, it forms the basis of the modern scientific picture of the world. It cannot be defined through simpler concepts.

The term information comes from the Latin word informatio, which means clarification, message, awareness.

Information in everyday life (everyday aspect) is understood as information about the surrounding world and the processes occurring in it, perceived by a person or special devices.

In technology, information is understood as messages transmitted in the form of signs or signals.

Information in information theory does not mean just any information, but only that which completely removes or reduces existing uncertainty. According to C. Shannon's definition, information is removed uncertainty.

According to N. Wiener's definition, information in cybernetics means that part of knowledge that is used for orientation, active action, control, i.e. in order to preserve, improve, develop the system.

Information in semantic theory (the meaning of a message) is understood as information with novelty.

Information is a reflection of the outside world using signs and signals.

Properties of information, i.e. its qualitative features:

Objectivity. Information is objective if it does not depend on anyone's opinion.

Credibility. Information is reliable if it reflects the true state of affairs.

Completeness. Information can be considered complete if it is sufficient for understanding and making a decision.

Relevance - importance, materiality for the present time.

Adequacy - a certain level of correspondence of the image created with the help of the received information to a real object, process, or phenomenon.

Information processes

The exchange, storage and processing of information are inherent in nature, man, society, and technical devices. In systems of different natures, the actions performed on information (exchange, storage, processing) are the same. These actions are called INFORMATION PROCESSES.

Let us consider in more detail the various types of information processes between an automaton and an automaton (technical devices).

Information exchange

The transmission and reception of information is called the exchange of information. The transfer of information between the machines is carried out using technical means of communication. The relay tower transmits information, which is perceived by the receiving unit of the TV. The radio station transmits information, which is perceived by the receiving unit of the radio receiver. The VCR transmits information from the videotape to the screen.

When exchanging information, you need a source of information and a receiver of information. Information transmitted from the source reaches the receiver through a sequence of signals called a MESSAGE. Signals can be sound, electrical, electromagnetic, etc. Information can be received continuously or discretely, that is, in the form of a sequence of signals separated from each other by intervals of time or space.

Information transformation

Information processing is the transformation of information from one type to another, carried out according to strict formal rules.

Information processing according to the "black box" principle is a process in which only the input and output information is important and necessary for the user, while the rules by which the transformation takes place are of no interest to the user and are not taken into account.

The possibility of automated processing of information is based on the fact that processing information does not require comprehending it.

Data storage

Information for a tape recorder, video recorder, or movie camera is stored on special devices: audio cassettes, videotapes, film. A device for storing information is called a MEDIUM. An information carrier can be of various natures: mechanical, magnetic, electrical. Information carriers differ in the form of information presentation, in the reading principle, and in the types of material.

Information is stored in the form of signals or signs. With the help of a microphone and other tape recorder devices, sound information is recorded on a magnetic tape, i.e. information is stored on a magnetic tape. With the help of the magnetic head of the tape recorder, information is read from the magnetic tape. Information is RECORDED on the medium by changing the physical, chemical or mechanical properties of the environment. Recording and reading of information is carried out as a result of physical impact on the storage medium of the recording and reading devices.

Information. Transfer of information

Information is transmitted in the form of messages from a source of information to its receiver via a communication channel between them. The source sends a message, which is encoded into a transmitted signal. This signal is sent over the communication channel. As a result, a received signal appears at the receiver; it is decoded and becomes the received message.

Examples:

  1. A message containing a weather forecast is transmitted to the receiver (the viewer) from the source (a meteorologist) via a communication channel (television transmitting equipment and a TV set).
  2. A living creature perceives information from the outside world with its sense organs (eyes, ears, skin, tongue, etc.), processes it into a certain sequence of nerve impulses, transfers the impulses along nerve fibers, keeps it in memory in the form of states of the neural structures of the brain, reproduces it in the form of sounds, movements, etc., and uses it in the course of its life.

The transmission of information through communication channels is often accompanied by interference, causing distortion and loss of information.

Information properties

Information properties:

Information is reliable if it reflects the true state of affairs. Inaccurate information can lead to misunderstandings or incorrect decisions.

Reliable information can become inaccurate over time, since it tends to become obsolete, that is, it ceases to reflect the true state of affairs.

Information is complete if it is sufficient for understanding and making decisions. Both incomplete and redundant information hinder decision-making and can lead to errors.

The accuracy of information is determined by the degree of its proximity to the real state of an object, process, or phenomenon.

The value of information depends on how important it is for solving the problem at hand, as well as on how widely it will later find application in human activities.

Only information received in a timely manner can bring the expected benefit. Both the premature delivery of information (when it cannot yet be assimilated) and its delay are equally undesirable.

If valuable and timely information is expressed in an incomprehensible way, it can become useless.

Information becomes understandable if it is expressed in a language spoken by those for whom it is intended.

Information should be presented in an accessible (according to the level of perception) form. Therefore, the same questions are presented differently in school textbooks and in scientific publications.

Information on the same issue can be presented concisely (succinctly, without irrelevant details) or at length (in detail, verbosely). Concise information is needed in reference books, encyclopedias, textbooks, and all kinds of instructions.

Data processing

Data processing is obtaining some information objects from other information objects by executing certain algorithms.

Processing is one of the main operations performed on information, and the main means of increasing the volume and variety of information.

Information processing tools are all kinds of devices and systems created by mankind, first of all the computer, a universal machine for processing information.

Computers process information by performing certain algorithms.

Living organisms and plants process information using their organs and systems.

Ticket number 1. Information concept. Information processes and systems. Information resources and technologies.

Information is information about objects and phenomena of the environment, their parameters, properties and state, which are perceived by information systems (living organisms, control machines, etc.) in the process of life and work. In informatics, information is understood as a message that reduces the degree of uncertainty in knowledge about the state of objects or phenomena and helps to solve a problem.

Information can exist in the form:

1. texts, pictures, drawings, photographs;

2. light or sound signals;

3. radio waves;

4. electrical and nerve impulses;

5. magnetic records;

6. gestures and facial expressions;

7. smells and tastes;

8. chromosomes, through which the traits and properties of organisms are inherited, etc.

Information processes are processes associated with the search, storage, transmission, processing and use of information.

1. Search.
Ways of searching for information:

1. direct observation;

2. communication with experts on the issue of interest;

3. reading relevant literature;

4. watching video and TV programs;

5. listening to radio broadcasts and audio cassettes;

6. working in libraries and archives.

2. Collection and storage.
Data storage is a way of disseminating information in space and time. The way information is stored depends on its medium (book: library; painting: museum; photograph: album). The computer is designed for compact storage of information with the possibility of quick access to it.
An information system is a repository of information equipped with procedures for entering, searching, placing, and issuing information. The presence of such procedures is the main feature of information systems, distinguishing them from simple accumulations of information materials. For example, a personal library, in which only its owner can navigate, is not an information system. In public libraries, the order of placement of books is always strictly defined; thanks to it, searching for and issuing books, as well as placing new acquisitions, are standard, formalized procedures.

3. Transfer.
In the process of transferring information, a source and a receiver of information are necessarily involved: the first transmits the information, the second receives it. Between them operates an information transfer channel, the link.
A link is a set of technical devices that ensure the transmission of a signal from a source to a receiver.
An encoder is a device designed to transform the original message of the source into a form convenient for transmission.
A decoder is a device for converting the encoded message back into its original form.
Human activity is always associated with the transfer of information.
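The source -> encoder -> link -> decoder -> receiver chain described above can be sketched in Python. This is a minimal illustration with hypothetical function names, assuming a noise-free channel and a fixed 8-bit binary code per character:

```python
def encode(message):
    """Encoder: turn the source message into a transmittable bit string
    (8 bits per character)."""
    return "".join(format(ord(ch), "08b") for ch in message)

def decode(signal):
    """Decoder: recover the original message from the received bit string."""
    chunks = [signal[i:i + 8] for i in range(0, len(signal), 8)]
    return "".join(chr(int(bits, 2)) for bits in chunks)

received = decode(encode("info"))  # an ideal channel: no interference
print(received)                    # info
```

With a real channel, interference could flip bits between `encode` and `decode`, which is exactly the distortion discussed in the section on information transfer.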

4. Processing.
Data processing is the transformation of information from one type to another, carried out according to strict formal rules. Processing information on the "black box" principle is a process in which only the input and output information is important and necessary for the user, while the rules by which the transformation takes place are of no interest to the user and are not taken into account.

1. The reliability, completeness, objectivity of the information received will provide you with the opportunity to make the right decision.

2. Your ability to clearly and easily present information will be useful in communicating with others.

3. The ability to communicate, that is, to exchange information, is becoming one of the main skills of a person in the modern world.
Computer literacy presupposes:

1. Knowledge of the purpose and user characteristics of the main computer devices;

2. Knowledge of the main types of software and types of user interfaces;

3. The ability to search for, store, and process text, graphic, and numerical information using the appropriate software.

Information security is aimed at preventing:

1. access to information by persons who do not have the appropriate permission (unauthorized, illegal access);

2. intentional or unlawful use, alteration, or destruction of information.

Information security, in a broader sense, is understood as a complex of organizational, legal and technical measures to prevent threats to information security and eliminate their consequences.

Information resources and technologies.

Informational resources - these are the ideas of mankind and instructions for their implementation, accumulated in a form that allows their reproduction.

Informational resources (unlike all other types of resources: labor, energy, mineral, etc.) grow faster the more they are used.

Information technology is a collection of methods and devices used by people to process information.

Currently, the term "information technology" is used in connection with the use of computers for processing information. Information technology covers all computing and communication technology and, in part, consumer electronics, television and radio broadcasting.

It finds application in industry, commerce, management, the banking system, education, healthcare, medicine and science, transport and communications, agriculture, and the social security system; it serves as an aid to people of various professions and to housewives.

The peoples of the developed countries recognize that improving information technology is the most important, albeit costly and difficult, task.

Currently, the creation of large-scale information technology systems is economically feasible, and this leads to the emergence of national research and educational programs designed to stimulate their development.

Question 1 (The concept of information. Information processes and systems. Information resources and technologies.)

The term information comes from the Latin word informatio, which means "information, clarification, presentation".

Information is the name given to any data or information that interests anyone.

In technology, information means information about objects and phenomena of the environment, their parameters and state, which reduces the degree of uncertainty and incompleteness of knowledge about them.

The characteristic features of information are as follows:

  1. It is the most important resource of modern production: it reduces the need for land, labor, capital, and reduces the consumption of raw materials and energy.
  2. It gives rise to new industries.
  3. It is a commodity, and the seller of information does not lose it after the sale.
  4. Gives added value to other resources, in particular labor. Indeed, an employee with a higher education is valued more than an employee with a secondary education.
  5. Information can accumulate.

Information processes are the processes associated with the receipt, storage, processing and transmission of information (i.e. actions performed on information). That is, these are processes during which the content of information or the form of its presentation changes.

Information processes

  1. Reception (reading a book, newspaper or watching and listening to TV (radio), preparing for an exam);
  2. Storage (we remember what we read from the book);
  3. Transfer (Broadcast) (we can retell the content of the book to our friend);
  4. Processing (after reading the book, we can process the information received and draw some conclusions for ourselves);
  5. Usage (After receiving information that it was raining, we took an umbrella).
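The five processes above can be sketched as a small, purely illustrative Python example; all function names and the "umbrella" scenario are hypothetical, chosen to mirror the list:

```python
# A minimal sketch of the five information processes, using a weather
# message as the piece of information. All names are illustrative.

storage = []  # our "memory"

def receive(source: str) -> str:
    """Reception: read information from a source (book, TV, radio)."""
    return source

def store(info: str) -> None:
    """Storage: remember what we received."""
    storage.append(info)

def process(info: str) -> str:
    """Processing: transform the information into a conclusion."""
    return f"Conclusion drawn from: {info}"

def transmit(info: str) -> str:
    """Transfer: retell the information to someone else."""
    return f"Retelling: {info}"

def use(info: str) -> str:
    """Usage: act on the information (e.g. take an umbrella)."""
    return "take an umbrella" if "rain" in info else "no action"

message = receive("it will rain today")
store(message)
print(process(message))
print(transmit(message))
print(use(message))  # acting on the received information
```

Each function changes either the content of the information or the form of its presentation, which is exactly what distinguishes an information process.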

    1. The origin of data - the formation of primary messages that record the results of certain operations, properties of objects and subjects of management, process parameters, the content of regulatory and legal acts, etc.

    2. Accumulation and systematization of data - organizing their placement so as to ensure quick search and selection of the necessary information, systematic updating of the data, and protection against distortion, loss, and corruption of integrity.

    4. Displaying data - their presentation in a form suitable for human perception. First of all, this is printing, that is, the creation of documents on the so-called hard (paper) carriers. The construction of graphic illustrative materials (graphs, diagrams) and the formation of sound signals are widely used.
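The "accumulation and systematization" stage above - organizing data so that the needed information can be found quickly - can be illustrated with a toy keyword index; the record contents and all names here are hypothetical:

```python
# A toy illustration of systematizing accumulated data for quick
# search: messages are indexed by keyword so that retrieval does not
# require scanning every record. All names are illustrative.

from collections import defaultdict

index = defaultdict(list)   # keyword -> list of message ids
records = {}                # message id -> full text

def accumulate(msg_id: int, text: str) -> None:
    """Store a message and index it by each of its words."""
    records[msg_id] = text
    for word in text.lower().split():
        index[word].append(msg_id)

def search(keyword: str) -> list:
    """Quickly select the messages containing the keyword."""
    return [records[i] for i in index.get(keyword.lower(), [])]

accumulate(1, "Quarterly production report")
accumulate(2, "Production process parameters")
accumulate(3, "Legal acts update")

print(search("production"))
```

The index trades extra storage for fast selection, which is the point of systematizing accumulated data rather than keeping it as an unordered pile.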

    Information resources - information used in production, technology, and social management (scientific and technical knowledge, works of literature and art, and a wealth of information recorded in any form on any medium).

    Countrywide information resources are national information resources. A country's information resources determine its scientific and technical potential (STP) and its economic and strategic power.

    Information resources do not disappear; they accumulate and change.


    Information technology is understood as the totality of methods and of production, software, and technological means, united in a technological chain, that ensure the collection, storage, processing, output, and dissemination of information.
