
Lecture notes on the topic: "Information. Properties of information"

The word "information" comes from the Latin word informatio , which means mixing, clarification, familiarization. Information is a fundamental concept of computer science, very capacious and deep, which cannot be defined unambiguously. In science, technology, life, this word has different meanings. Therefore, we will consider the concept of information from different points of view. In mathematics, information is information that a person has created with the help of inferences; in biology, it is the human genetic code; in cybernetics, information is associated with control processes in complex systems; in ordinary life, this is information about the world around. From the point of view of computing technology, information is signals.
Thus, in informatics the concept of information is treated as the knowledge that a person receives from the surrounding world and works with by means of computer technology.
What all these points of view have in common is that information must be created, transmitted, processed and stored. These processes are called information processes.

An information process is a process in the course of which information is received, transmitted, processed and used. Work with information can be represented by the following scheme:

Any information can be divided into input information, which a person or device receives, and output information, which is obtained after processing by the person or device.
In the transfer of information, information is transmitted in the form of messages from a source of information to a receiver through a communication channel between them.
The source sends the message, which is encoded into a transmitted signal. This signal is sent over the communication channel. As a result, a received signal appears at the receiver; it is decoded and becomes the received message.
In general, the process of receiving and transmitting information can be represented by the following diagram:

A living being or a technical device can act as a source of information. From the source, information passes to an encoder, which converts the original message into a form convenient for transmission. Through the communication channel the information reaches the recipient's decoding device, which converts the encoded message into a form understandable to the recipient. Among the most sophisticated decoding devices are the human ear and eye.
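As a minimal illustration (not part of the original notes), this chain can be sketched in code; the function names and the sample message below are assumptions chosen for the demonstration, and the channel is ideal, introducing no interference:

```python
# A toy model of the chain: source -> encoder -> communication channel -> decoder -> receiver.

def encode(message: str) -> bytes:
    """Encoder: convert the original message into a form convenient for transmission."""
    return message.encode("utf-8")

def channel(signal: bytes) -> bytes:
    """Communication channel: ideal here, i.e. it introduces no interference."""
    return signal

def decode(signal: bytes) -> str:
    """Decoder: convert the received signal back into a message the recipient understands."""
    return signal.decode("utf-8")

sent = "Weather forecast: clear, +20 C"    # message from the source
received = decode(channel(encode(sent)))   # message at the receiver
assert received == sent                     # an ideal channel loses nothing
```

With a real channel, the `channel` function is where loss and distortion would occur, which is what the next paragraph turns to.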
In the process of transmission, information can be lost or distorted. This is caused by various kinds of interference, both in the communication channel and during the encoding and decoding of information. Questions related to methods of encoding and decoding information are dealt with by a special science, cryptography.
Example: a message containing a weather forecast is transmitted to a receiver (a TV viewer) from a source (a meteorologist) through a communication channel (television transmitting equipment and a television set).

Data processing is the production of new information objects from existing information objects by executing algorithms.
Processing is one of the main operations performed on information and the main means of increasing its volume and variety.

Examples of information processing can be seen in the table.



Information processing tools are all the devices and systems created by mankind for this purpose, first of all the computer, a universal machine for processing information. Computers process information by executing algorithms.
Data storage is the accumulation of information on various media. A storage medium is a physical medium for recording and storing information.

Information is information about something

Concept and types of information, transmission and processing, search and storage of information

Information: definitions

Information is any intelligence received, transmitted or stored by various sources; it is the entire body of information about the surrounding world and about all kinds of processes occurring in it that can be perceived by living organisms, electronic machines and other information systems.

Information is meaningful information about something, in which the form of presentation is itself information; that is, it has a formatting function in accordance with its own nature.

Information is everything that can supplement our knowledge and assumptions.

Information is information about something, regardless of the form of its presentation.

Information is a product of the psychic activity of any psychophysical organism, produced by it using any means, called a means of information.

Information is information perceived by a person and (or) special devices as a reflection of the facts of the material or spiritual world in the process of communication.

Information is data organized in a way that makes sense to the person dealing with it.

Information is the meaning that a person assigns to data on the basis of the known conventions used to represent it.

Information is intelligence, explanation, exposition.

Information is any data or information that interests anyone.

Information is information about objects and phenomena of the environment, their parameters, properties and state, which is perceived by information systems (living organisms, control machines, etc.) in the process of life and work.

The same information message (a newspaper article, an advertisement, a letter, a telegram, a reference, a story, a drawing, a radio broadcast, etc.) may contain a different amount of information for different people, depending on their prior knowledge, their level of understanding of the message and their interest in it.

When automated work with information by means of technical devices is discussed, what is of interest is not the content of the message but the number of characters the message contains.


With regard to computer data processing, information is understood as a sequence of symbolic designations (letters, digits, encoded graphic images and sounds, etc.) that carries a semantic load and is presented in a form understandable to a computer. Each new character in such a sequence increases the information volume of the message.

Currently there is no single definition of information as a scientific term. From the point of view of different fields of knowledge, this concept is described by its own specific set of features. For example, the concept of "information" is basic in a course of computer science, and it is impossible to define it through other, "simpler" concepts (just as in geometry it is impossible to express the content of the basic concepts "point", "line" and "plane" through simpler concepts).

The content of the basic concepts of any science must be explained by examples or revealed by comparison with the content of other concepts. In the case of the concept of "information", the problem of definition is even more complicated, since it is a general scientific concept used in various sciences (computer science, cybernetics, biology, physics, etc.), and in each science it is associated with a different system of concepts.

The concept of information

In modern science, two types of information are considered:

Objective (primary) information is the property of material objects and phenomena (processes) to generate a variety of states which, through interactions (fundamental interactions), are transmitted to other objects and imprinted in their structure.

Subjective (semantic, secondary) information is the semantic content of objective information about the objects and processes of the material world, formed by human consciousness with the help of semantic images (words, images and sensations) and recorded on some material medium.

In the everyday sense, information is information about the surrounding world and the processes occurring in it, perceived by a person or a special device.

According to Claude Shannon's conception, information is removed uncertainty: information removes, to one degree or another, the uncertainty that existed in the recipient before it was received, and expands his understanding of the object with useful data.

From the point of view of Gregory Bateson, an elementary unit of information is a "difference that is not indifferent", an effective difference for some larger perceiving system. Differences that are not perceived he calls "potential", and those perceived "effective": "Information consists of differences that are not indifferent"; "Any perception of information is necessarily the receipt of information about a difference." From the point of view of informatics, information has a number of fundamental properties: novelty, relevance, reliability, objectivity, completeness, value, and so on. The analysis of information is primarily the concern of the science of logic. The word "information" comes from the Latin informatio, meaning exposition, clarification, familiarization. The concept of information was already considered by ancient philosophers.


Until the industrial revolution, defining the essence of information remained the prerogative of philosophers. Later, the questions of information theory were taken up by cybernetics, a new science at that time.

Sometimes, in order to comprehend the essence of a concept, it is useful to analyze the meaning of the word that denotes this concept. Clarifying the inner form of a word and studying the history of its use can throw unexpected light on its meaning, overshadowed by the usual "technological" use of the word and modern connotations.

The word "information" entered the Russian language in the era of Peter the Great. It is first recorded in the "Spiritual Regulations" of 1721 in the meaning of "an idea, a concept of something". (In European languages it was fixed earlier, around the 14th century.)


Based on this etymology, any significant change of form or, in other words, any materially recorded traces formed by the interaction of objects or forces and amenable to understanding can be considered information. Information is thus a converted form of energy. The carrier of information is the sign, and its mode of existence is interpretation: identifying the meaning of a sign or a sequence of signs.

The meaning identified may be an event reconstructed from the sign that caused it (in the case of "natural" and involuntary signs such as traces or evidence) or a message (in the case of conventional signs belonging to the sphere of language). It is the second type of sign that makes up the body of human culture, which, according to one definition, is "the aggregate of non-hereditarily transmitted information".


Messages can contain information about facts or an interpretation of facts (from the Latin interpretatio: interpretation, translation).

A living being receives information through its senses, as well as through reflection or intuition. The exchange of information between subjects is communication (from the Latin communicatio: message, transmission, derived in turn from the Latin communico: to make common, to impart, to converse, to connect).

From a practical point of view, information is always presented as a message. An informational message is associated with the source of the message, the recipient of the message, and the communication channel.

Returning to the Latin etymology of the word "information", let us try to answer the question of what exactly is given form here.

Obviously, first of all, a certain meaning which, being initially formless and unexpressed, exists only potentially and must be "built" in order to become perceivable and transmittable.

Secondly, the human mind, which is brought up to think structurally and clearly. Thirdly, a society which, precisely because its members share these meanings, gains unity and functionality.


Information as an expressed, rational meaning is knowledge that can be stored, transmitted and serve as a basis for generating other knowledge. The forms of knowledge preservation (historical memory) are diverse: from myths, chronicles and pyramids to libraries, museums and computer databases.

Information is information about the surrounding world and the processes taking place in it, perceived by living organisms, control machines and other information systems.

The word "information" is Latin. Over its long life, its meaning has undergone evolution, now expanding, then extremely narrowing its boundaries. Initially, the word "information" meant: "presentation", "concept", then - "information", "message transmission".

In recent years scientists have decided that the usual (generally accepted) meaning of the word "information" is too elastic and vague, and have given it a narrower meaning: "a measure of the certainty in a message".


Information theory was brought into being by the needs of practice. Its emergence is associated with Claude Shannon's work "A Mathematical Theory of Communication", published in 1948. The foundations of information theory rest on results obtained by many scientists. By the second half of the 20th century the globe was buzzing with information being transmitted over telephone and telegraph cables and radio channels. Later, electronic computers appeared: processors of information. For that time, the main task of information theory was, first of all, to increase the efficiency of communication systems. The difficulty in designing and operating communication means, systems and channels is that it is not enough for the designer and engineer to solve the problem from the physical and energy standpoints: from those points of view the system may be the most perfect and economical. It is also important, when creating transmission systems, to pay attention to how much information will pass through them. Information can be measured quantitatively, counted; and such calculations proceed in the most usual way: by abstracting from the meaning of the message, just as arithmetic abandons the concreteness of objects (moving from adding two apples and three apples to adding numbers in general: 2 + 3).

The scientists declared that they had "completely ignored the human assessment of information". To a sequential series of 100 letters, for example, they assign a certain amount of information, regardless of whether that information has meaning or whether it makes sense in practical application. The quantitative approach is the most developed branch of information theory. Under this definition, a collection of 100 letters - a 100-letter phrase from a newspaper, from a Shakespeare play or from Einstein's theorem - contains exactly the same amount of information.

This quantification of information is eminently useful and practical. It corresponds exactly to the task of the communications engineer, who must transmit all the information contained in a submitted telegram, regardless of the value of this information for the addressee. The communication channel is soulless. One thing matters to the transmitting system: to transmit the required amount of information in a given time. How, then, is the amount of information in a particular message calculated?


The estimation of the amount of information is based on the laws of probability theory; more precisely, it is determined through the probabilities of events. This is understandable: a message has value, carries information, only when we learn from it about the outcome of an event of a random nature, when it is to some extent unexpected. A message about what is already known contains no information. If, for example, someone calls you on the telephone and says, "It is light during the day and dark at night," such a message will surprise you only by the absurdity of stating the obvious and well known, not by the news it contains. Another matter is, for example, the result of a horse race. Who will come in first? The outcome here is difficult to predict. The more random outcomes an event of interest to us has, the more valuable the message about its result, and the more information it carries. A message about an event that has only two equally possible outcomes contains one unit of information, called a bit. The choice of this unit is not accidental: it is related to the most common, binary way of encoding information in transmission and processing. Let us try, at least in the most simplified form, to convey the general principle of the quantitative assessment of information, the cornerstone of all information theory.

We already know that the amount of information depends on the probabilities of particular outcomes of an event. If an event, as scientists say, has two equally likely outcomes, the probability of each outcome is 1/2; this is the probability of getting "heads" or "tails" when tossing a coin. If an event has three equally likely outcomes, then the probability of each is 1/3. Note that the sum of the probabilities of all outcomes always equals one: after all, one of all the possible outcomes will certainly occur. An event may also have unequally probable outcomes. In a football match between a strong team and a weak one, for instance, the probability of the strong team winning is high - say 4/5; the probability of a draw is much lower, say 3/20; and the probability of defeat is very small, 1/20 (the three together sum to one).
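These examples can be made computable. The sketch below is our illustration, not the lecture's own formula set; it uses Shannon's standard measures, where an outcome of probability p carries -log2(p) bits and the average over all outcomes is the entropy:

```python
from math import log2

def self_information(p: float) -> float:
    """Bits of information in learning an outcome whose probability is p."""
    return -log2(p)

def entropy(probabilities) -> float:
    """Average information per outcome: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(self_information(1/2))        # 1.0 bit: one toss of a fair coin
print(entropy([1/2, 1/2]))          # 1.0 bit on average per toss
# The football example above: win 4/5, draw 3/20, defeat 1/20.
print(self_information(4/5))        # ~0.32 bits: the expected win is hardly news
print(self_information(1/20))       # ~4.32 bits: an upset defeat is big news
print(entropy([4/5, 3/20, 1/20]))   # ~0.88 bits on average per match result
```

Note how the lopsided match carries less average information than the fair coin: the more predictable the event, the less its outcome tells us.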

It turns out that the amount of information is a measure of the reduction of uncertainty in a certain situation. Various amounts of information are transmitted through communication channels, and the amount of information passing through a channel cannot exceed its bandwidth, which is determined by how much information can pass per unit of time. One of the heroes of Jules Verne's novel "The Mysterious Island", the journalist Gideon Spilett, transmitted a chapter from the Bible over the telephone so that his competitors could not use the line. In this case the channel was fully loaded, yet the amount of information transmitted was equal to zero, since information already known to the recipient was being transmitted. The channel was running idle, passing a strictly defined number of pulses without loading them with anything. Meanwhile, the more information each of a given number of pulses carries, the more fully the channel's bandwidth is used. Therefore, one needs to encode information intelligently and find an economical, parsimonious language for transmitting messages.

Information is "sifted" in the most careful way. In the telegraph, frequently encountered letters, combinations of letters, even whole phrases are depicted with a shorter set of zeros and ones, and those that are less common - with a longer one. In the case when the length of the codeword is reduced for frequently occurring symbols and increased for rarely occurring, one speaks of efficient coding of information. But in practice, it quite often happens that the code resulting from the most careful "sifting", the code is convenient and economical, can distort the message due to interference, which, unfortunately, always occurs in communication channels: sound distortion in the phone, atmospheric interference in, distortion or darkening of the image in television, errors in transmission telegraph... These interferences, or, as experts call them, noises, strike the information. And from this there are the most incredible and, naturally, unpleasant surprises.

Therefore, to increase the reliability of the transmission and processing of information, extra characters are introduced - a kind of protection against distortion. These extra symbols do not carry the actual content of the message; they are redundant. From the point of view of information theory, everything that makes a language colorful, flexible, rich in shades, multifaceted and polysemous is redundancy. How redundant, from such a position, is Tatyana's letter to Onegin! How much informational excess it contains compared with the short and understandable message "I love you"! And how informationally precise are the drawn signs, understandable to everyone who enters the metro today, where laconic symbolic signs stand in place of worded announcements: "Entrance", "Exit".

In this regard it is useful to recall an anecdote told at one time by the famous American scientist Benjamin Franklin about a hat maker who invited his friends to discuss a draft of his shop sign. The sign was to show a hat and read: "John Thompson, hat maker, makes and sells hats for cash." One friend remarked that the words "for cash" were unnecessary: such a reminder would offend the buyer. Another found the word "sells" superfluous, since it goes without saying that a hat maker sells hats rather than giving them away for free. A third thought that the words "hat maker" and "makes hats" were a needless tautology, so the latter words were dropped. A fourth proposed dropping the words "hat maker" as well: the drawn hat says clearly enough who John Thompson is. Finally, a fifth insisted that it made no difference to the buyer whether the hat maker was called John Thompson or otherwise, and suggested dispensing with this indication too, so that in the end nothing remained on the sign but the hat. Of course, if people used only codes of this kind, without redundancy in their messages, then all "information forms" - books, reports, articles - would be extremely short. But they would lose in clarity and beauty.
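The simplest concrete form of the protective redundancy mentioned above is the parity bit: one extra symbol that carries no content of its own but lets the receiver detect a single distorted bit. A minimal sketch (our example, not Franklin's):

```python
def add_parity(bits: str) -> str:
    """Append one redundant check bit so that the total count of '1's is even."""
    return bits + str(bits.count("1") % 2)

def is_intact(received: str) -> bool:
    """An odd number of '1's means some single bit was distorted in transit."""
    return received.count("1") % 2 == 0

word = add_parity("1001011")   # -> '10010110'; the last bit adds no new content
print(is_intact(word))          # True: the message arrived undistorted
corrupted = word[:-1] + "1"     # channel noise flips one bit
print(is_intact(corrupted))     # False: the redundancy exposed the distortion
```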

Information can be divided into types according to various criteria. By truthfulness: true and false;

by the way of perception:

Visual - perceived by the organs of vision;

Auditory - perceived by the organs of hearing;

Tactile - perceived by tactile receptors;

Olfactory - perceived by the olfactory receptors;

Gustatory - perceived by taste buds.

by the form of presentation:

Text - transmitted in the form of symbols intended to indicate the lexemes of the language;

Numeric - in the form of numbers and signs indicating mathematical operations;

Graphic - in the form of images, objects, graphs;

Sound - oral speech or a recording; the transmission of the lexemes of a language by auditory means.

by purpose:

Mass - contains trivial information and operates with a set of concepts understandable to most of society;

Special - contains a specific set of concepts; when it is used, information is transmitted that may not be understood by the bulk of society but is necessary and understandable within the narrow social group where it is used;

Secret - transmitted to a narrow circle of people and through closed (protected) channels;

Personal (private) - a set of information about a person that determines the social status and types of social interactions within the population.

by value:

Relevant - information that is valuable at a given time;

Reliable - information received without distortion;

Understandable - information expressed in a language understandable to the person to whom it is intended;

Complete - information sufficient to make a correct decision or understanding;

Useful - the usefulness of the information is determined by the subject who received the information, depending on the scope of possibilities for its use.

The value of information in various fields of knowledge

Many systems, methods, approaches and ideas are being developed in information theory today, and scientists believe that new ideas and new directions will be added to the current ones. As proof of the correctness of their assumptions they cite the "living", developing nature of science and point out that information theory is being introduced, surprisingly quickly and firmly, into the most diverse areas of human knowledge. Information theory has penetrated physics, chemistry, biology, medicine, philosophy, linguistics, pedagogy, economics, logic, the technical sciences and aesthetics. According to the experts themselves, the doctrine of information, which arose from the needs of communication theory and cybernetics, has stepped beyond their bounds. And now, perhaps, we have the right to speak of information as a scientific concept that puts in researchers' hands a theoretical-informational method with which one can penetrate many sciences of animate and inanimate nature and of society; this allows not only looking at all problems from a new angle but also seeing what has not yet been seen. That is why the term "information" has become so widespread in our time, entering into such concepts as information system, information culture and even information ethics.

Many scientific disciplines use information theory to emphasize a new direction in the old sciences. This is how, for example, information geography, information economy, information law arose. But the term "information" has acquired extremely great importance in connection with the development of the latest computer technology, the automation of mental labor, the development of new means of communication and information processing, and especially with the emergence of informatics. One of the most important tasks of information theory is the study of the nature and properties of information, the creation of methods for its processing, in particular the transformation of various modern information into computer programs, with the help of which the automation of mental work takes place, a kind of strengthening of intelligence, and therefore the development of the intellectual resources of society.

The word "information" comes from the Latin word informatio, which means information, clarification, familiarization. The concept of "information" is basic in the course of computer science, but it is impossible to define it through other, more "simple" concepts. The concept of "information" is used in various sciences, while in each science the concept of "information" is associated with different systems of concepts. Information in biology: Biology studies living nature and the concept of "information" is associated with the appropriate behavior of living organisms. In living organisms, information is transmitted and stored using objects of various physical nature (state of DNA), which are considered as signs of biological alphabets. Genetic information is inherited and stored in all cells of living organisms. Philosophical approach: information is interaction, reflection, cognition. Cybernetic Approach: Information is Characteristics manager signal transmitted over the communication line.

The role of information in philosophy

Early definitions of information as a category, a concept, a property of the material world were dominated by the traditionalism of the subjective. Information exists outside our consciousness and can be reflected in our perception only as a result of interaction: reflection, reading, reception in the form of a signal or stimulus. Information, like any property of matter, is not material. Information stands in the series: matter, space, time, consistency, function, etc., which are fundamental concepts of the formalized reflection of objective reality in its distribution and variability, diversity and manifestation. Information is a property of matter and reflects its properties (state or ability to interact) and quantity (measure) through interaction.

From the material point of view, information is the order in which the objects of the material world follow one another. For example, the order of letters on a sheet of paper according to certain rules is written information; the order of colored dots on a sheet of paper according to certain rules is graphic information; the order of musical notes is musical information; the order of genes in DNA is hereditary information; the order of bits in a computer is computer information, and so on. For information exchange to take place, necessary and sufficient conditions are required.


The necessary conditions:

The presence of at least two different objects of the material or non-material world;

Objects have a common property that allows them to be identified as a carrier of information;

The presence of a specific property in objects that allows you to distinguish objects from each other;

The presence of a spatial property that allows the order of the objects to be determined. For example, the placement of written information on paper is a specific property of paper that allows letters to be arranged from left to right and from top to bottom.

There is only one sufficient condition: the presence of a subject capable of recognizing information - a human being, human society, animal societies, robots and so on. An information message is constructed by selecting copies of objects from a basis and arranging these objects in space in a certain order. The length of an information message is defined as the number of copies of basis objects and is always expressed as an integer. One must distinguish the length of an information message, which is always measured as an integer, from the amount of knowledge contained in it, which is measured in an unknown unit of measurement. From the mathematical point of view, information is a sequence of integers written as a vector. The numbers are the index numbers of the objects in the information basis. The vector is called the invariant of the information, since it does not depend on the physical nature of the basis objects. One and the same information message can be expressed in letters, words, sentences, files, pictures, notes, songs, video clips, or any combination of all of these.
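A hedged illustration of this "invariant": if we choose the Latin alphabet as the basis (the choice of basis is ours, not the text's), a message becomes a vector of integers - the index numbers of its objects in that basis - independent of how the objects are physically rendered:

```python
BASIS = "abcdefghijklmnopqrstuvwxyz"   # an illustrative basis of 26 objects

def to_invariant(message: str) -> list[int]:
    """Message -> vector of basis indices (the invariant of the information)."""
    return [BASIS.index(ch) for ch in message]

def render(vector: list[int], basis) -> list:
    """The same vector realized with physically different basis objects."""
    return [basis[i] for i in vector]

vec = to_invariant("info")
print(vec)                         # [8, 13, 5, 14] - a sequence of integers
print(len(vec))                    # 4: the message length, always an integer
print(render(vec, BASIS.upper()))  # ['I', 'N', 'F', 'O'] - same information, new carrier
```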


The role of information in physics

Information is information about the surrounding world (an object, process, phenomenon or event) which is an object of transformation (including storage, transmission, etc.) and is used to develop behavior, to make decisions, to control or to learn.

The characteristic features of the information are as follows:

It is the most important resource of modern production: it reduces the need for land, labor and capital and cuts the expenditure of raw materials and energy. So, for example, if you have the ability to archive your files (that is, if you have such information), you need not spend money on new floppy disks;

Information gives rise to new industries. For example, the invention of the laser beam was the reason for the emergence and development of the production of laser (optical) discs;

Information is a commodity, and its seller does not lose it after the sale. So, if a student tells his friend the schedule of classes for the semester, he does not lose these data for himself;

Information adds value to other resources, in particular labor. Indeed, an employee with a higher education is valued more than an employee with a secondary education.

As follows from the definition, three concepts are always associated with information:

The source of information is that element of the surrounding world (object, phenomenon, event), information about which is the object of transformation. So, the source of information that the reader of this textbook is currently receiving is informatics as a sphere of human activity;

The acquirer of information is that element of the surrounding world that uses the information (to develop behavior, to make decisions, to control or to learn). The acquirer of this particular information is the reader himself;

A signal is a material carrier that fixes information for transferring it from the source to the acquirer. In this case the signal is electronic. If the student takes this manual from the library, the same information will have a paper carrier. Once read and memorized by the student, the information will acquire yet another carrier, a biological one, when it is "recorded" in the student's memory.

The signal is the most important element of this scheme. The forms of its presentation, as well as the quantitative and qualitative characteristics of the information it contains that matter to the acquirer of information, are discussed later in this section of the textbook. The main characteristics of the computer as the chief tool that maps the source of information into a signal (link 1 in the figure) and "brings" the signal to the acquirer of information (link 2 in the figure) are given in the part Computer. The structure of the procedures that implement links 1 and 2 and constitute the information process is the subject of the part Information process.

Objects of the material world are in a state of continuous change, which is characterized by an exchange of energy between the object and the environment. A change in the state of one object always leads to a change in the state of some other object in the environment. This phenomenon, regardless of how, which states and which particular objects have changed, can be considered as the transmission of a signal from one object to another. The change in the state of an object when a signal is transmitted to it is called signal registration.

A signal or a sequence of signals forms a message that can be perceived by the recipient in one form or another and in one volume or another. Information in physics is a term that qualitatively generalizes the concepts "signal" and "message". If signals and messages can be quantified, then one can say that signals and messages are units of measurement of the amount of information. The same message (signal) is interpreted differently by different systems. For example, successively one long and two short beeps is, in Morse code terminology, the letter D; in the terminology of the Award BIOS, it is a video card malfunction.
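This dependence on the interpreting system can be shown directly: the same signal decoded under two different conventions yields two different meanings. Both codebooks below are tiny illustrative fragments of our own, not complete standards:

```python
MORSE = {"-..": "the letter D"}                 # one long beep, two short beeps
AWARD_BIOS = {"-..": "video card malfunction"}  # same pattern, another convention

signal = "-.."   # one long and two short beeps

for name, codebook in (("Morse code", MORSE), ("Award BIOS", AWARD_BIOS)):
    print(f"{name}: {codebook.get(signal, 'unknown signal')}")
```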


The role of information in mathematics

In mathematics, information theory (the mathematical theory of communication) is a branch of applied mathematics that defines the concept of information and its properties and establishes limiting relations for data transmission systems. The main branches of information theory are source coding (compression coding) and channel (noise-immune) coding. Mathematics is more than a scientific discipline: it creates a single language for all of science.

The subjects of research in mathematics are abstract objects: number, function, vector, set and others. Moreover, most of them are introduced axiomatically, i.e. without any connection with other concepts and without any definition.


Information is not among the subjects of study of mathematics. Nevertheless, the word "information" is used in mathematical terms - self-information and mutual information - which belong to the abstract (mathematical) part of information theory. In the mathematical theory, however, the concept of "information" is associated exclusively with abstract objects, random variables, while in modern information theory this concept is considered much more broadly, as a property of material objects. The connection between these two uses of the same term is undeniable. It was the mathematical apparatus of random variables that was used by the author of information theory, Claude Shannon. He himself meant by the term "information" something fundamental (irreducible). Shannon's theory intuitively assumes that information has content: information reduces overall uncertainty and information entropy, and the amount of information is measurable. However, he warned researchers against the mechanical transfer of concepts from his theory to other areas of science.

"The search for ways to apply information theory in other areas of science is not reduced to a trivial transfer of terms from one area of ​​science to another. This search is carried out in the long process of proposing new hypotheses and their experimental verification." K. Shannon.


The role of information in cybernetics

The founder of cybernetics, Norbert Wiener, spoke about information as follows:

"Information is not matter or energy; information is information." But the basic definition of information that he gave in several of his books is the following: information is a designation of content that we receive from the external world in the process of our adaptation to it, and of the adaptation of our senses.

Information is the basic concept of cybernetics, just as economic information is the basic concept of economic cybernetics.

There are many definitions of this term, and they are complex and contradictory. The reason, obviously, is that various sciences deal with information as a phenomenon, and cybernetics is only the youngest of them. Information is the subject of study of such sciences as management science, mathematics, genetics, the theory of mass media (press, radio, television), and informatics, which deals with problems of scientific and technical information, etc. Finally, in recent years philosophers have shown great interest in the problems of information: they are inclined to regard information as one of the basic universal properties of matter, associated with the concept of reflection. All interpretations of the concept of information assume the existence of two objects: a source of information and an acquirer (recipient) of information; the connection between the signal and the information it conveys is established by agreement. For example, a blow on the veche bell meant that one had to gather in the square; but for those who did not know of this arrangement, it conveyed no information at all.

In the situation with the veche bell, a person participating in the agreement about the meaning of the signal knows that at the given moment there are two alternatives: the veche meeting will take place or it will not. Or, in the language of information theory, an uncertain event (the veche) has two outcomes. The received signal leads to a decrease in uncertainty: the person now knows that the event has only one outcome - it will take place. However, if it was known in advance that the veche would take place at such and such an hour, the bell announced nothing new. It follows that the less probable (i.e., the more unexpected) a message is, the more information it contains; and conversely, the greater the probability of the outcome before the event occurs, the less information the signal carries. Reasoning of approximately this kind led, in the 1940s, to the emergence of the statistical, or "classical", theory of information, which defines the concept of information through the measure of the reduction of uncertainty of knowledge about the occurrence of an event (this measure was called entropy). At the origins of this science stood N. Wiener, K. Shannon and Soviet scientists such as A.N. Kolmogorov; their work made it possible to calculate, among other things, the throughput of communication channels and the capacity of information storage devices, which served as a powerful stimulus to the development of cybernetics as a science and of electronic computing technology as a practical application of its achievements.

As for determining the value and usefulness of information for the recipient, much here remains unresolved and unclear. If we proceed from the needs of economic management and, consequently, of economic cybernetics, then information can be defined as all the knowledge and messages that help to solve a particular control problem (i.e., to reduce the uncertainty of its outcomes). Then certain possibilities open up for evaluating information: it is the more useful and valuable, the sooner or at the lower cost it leads to the solution of the problem. The concept of information is close to the concept of data; however, there is a difference between them: data are signals from which information still has to be extracted, and data processing is the process of bringing them into a form suitable for this.

The process of transferring data from the source to the acquirer, and of perceiving them as information, can be regarded as passing through three filters:

Physical, or statistical (purely quantitative limitation on the bandwidth of the channel, regardless of the content of the data, that is, from the point of view of syntactics);

Semantic (selection of those data that can be understood by the recipient, that is, correspond to the thesaurus of his knowledge);

Pragmatic (selection among understood information of those that are useful for solving a given problem).

This is well illustrated by a diagram taken from E.G. Yasin's book on economic information. Correspondingly, three aspects of the study of information problems are distinguished: syntactic, semantic and pragmatic.

In terms of content, information is subdivided into socio-political, socio-economic (including economic information), scientific-technical, etc. In general there are many classifications of information, built on different grounds. As a rule, because the concepts are similar, classifications of data are constructed in the same way. For example, information is subdivided into static (constant) and dynamic (variable), and data, correspondingly, into constants and variables. Another division: primary, derivative and output information (data are classified in the same way). A third division: controlling and informing information. A fourth: redundant, useful and false. A fifth: complete (continuous) and selective. Wiener's thought quoted above points directly to the objectivity of information, i.e., to its existence in nature independently of human consciousness (perception).


Modern cybernetics defines objective information as the objective property of material objects and phenomena to generate a variety of states which, through the fundamental interactions of matter, are transferred from one object (process) to another and are imprinted in its structure. A material system is considered in cybernetics as a set of objects that can themselves be in different states, but the state of each of which is determined by the states of the other objects of the system.


In nature, the set of states of a system is information, the states themselves are the primary code, or source code. Thus, each material system is a source of information. Cybernetics defines subjective (semantic) information as the meaning or content of a message.

The role of information in informatics

The subject of study of the science of informatics is precisely data: the methods of their creation, storage, processing and transmission. Content (also "site content") is a term denoting all the types of information (both textual and multimedia: images, audio, video) that make up the content of a web site as visualized for the visitor. It is used to separate the concept of the information that makes up the internal structure of a page or site (the code) from what will eventually be displayed on the screen.

The word "information" comes from the Latin word informatio, which means information, clarification, familiarization. The concept of "information" is basic in the course of computer science, but it is impossible to define it through other, more "simple" concepts.

The following approaches to the definition of information can be distinguished:

Traditional (everyday) - used in computer science: information is facts, knowledge and messages about the state of affairs that a person perceives from the outside world with the help of the senses (sight, hearing, taste, smell, touch).

Probabilistic - used in information theory: information is information about objects and phenomena of the environment, their parameters, properties and state, which reduce the degree of uncertainty and incompleteness of knowledge about them.

Information is stored, transmitted and processed in symbolic (sign) form. The same information can be presented in different forms:

Written sign form, consisting of various signs, among them symbolic signs in the form of text, numerals and special characters, as well as graphic, tabular, etc.;

The form of gestures or signals;

Oral verbal form (conversation).

Information is presented by means of languages as sign systems, which are built on the basis of a certain alphabet and have rules for performing operations on signs. A language is a certain sign system for presenting information. There exist:

Natural languages: spoken and written languages. In some cases spoken language can be replaced by the language of facial expressions and gestures or by the language of special signs (for example, road signs);

Formal languages: special languages for various areas of human activity, characterized by a rigidly fixed alphabet and stricter rules of grammar and syntax. These are the language of music (notes), the language of mathematics (numerals, mathematical signs), number systems, programming languages, etc. Every language is based on an alphabet, a set of symbols or signs. The total number of symbols of an alphabet is usually called the power of the alphabet.
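The power of the alphabet yields a simple measure of message size, often called the alphabetic approach: an alphabet of power N requires i = log2(N) bits per symbol, so a message of K symbols occupies V = K * i bits. A sketch under that assumption:

```python
from math import ceil, log2

def bits_per_symbol(alphabet_power: int) -> float:
    """i = log2(N): bits needed to encode one symbol of an alphabet of power N."""
    return log2(alphabet_power)

def message_volume(length: int, alphabet_power: int) -> float:
    """V = K * i: information volume of a K-symbol message."""
    return length * bits_per_symbol(alphabet_power)

print(bits_per_symbol(32))        # 5.0 bits per symbol for a 32-character alphabet
print(message_volume(20, 32))     # 100.0 bits in a 20-symbol message
print(ceil(bits_per_symbol(33)))  # 6: whole bits are needed once N exceeds a power of two
```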

A storage medium is a medium or physical body for the transmission, storage and reproduction of information (electrical, light, thermal, sound and radio signals, magnetic and laser disks, printed matter, photographs, etc.).

Information processes are processes associated with the reception, storage, processing and transmission of information (i.e., actions performed with information); that is, processes in the course of which the content of information or the form of its presentation changes.

To support an information process one needs a source of information, a communication channel and an acquirer of information. The source transmits (sends) the information, and the receiver receives (perceives) it. The transmitted information travels from source to receiver by means of a signal (code); it is the change in the signal that conveys the information.

As an object of transformation and use, information is characterized by the following properties:

Syntax is the property that determines the way information is presented on a carrier (in a signal). Thus, this information is presented on an electronic carrier in a certain font. Parameters of presentation such as the typeface and color of the font, its size and the line spacing can also be considered here. The selection of the needed parameters as syntactic properties is obviously determined by the intended method of transformation: for a visually impaired person, the font size and color are essential; if this text is to be entered into a computer through a scanner, the paper size is important;

Semantics is the property that determines the meaning of information as the correspondence of a signal to the real world. Thus, the semantics of the signal "informatics" lies in the definition given earlier. Semantics can be thought of as a certain agreement, known to the acquirer of the information, about what each signal means (the so-called rule of interpretation). For example, it is the semantics of signals that a novice motorist studies when learning the rules of the road and road signs (here the signs themselves are the signals); the semantics of words (signals) is learned by the student of a foreign language. One can say that the point of teaching computer science is to study the semantics of various signals, i.e., the essence of the key concepts of this discipline;

Pragmatics is the property that determines the influence of information on the behavior of the acquirer. Thus the pragmatics of the information received by the reader of this textbook lies, at the very least, in successfully passing the computer science exam. One would like to believe that the pragmatics of this work will not be limited to that, and that it will serve the further education and professional activity of the reader.


It should be noted that signals of different syntax may have the same semantics. For example, the signals "computer" and "electronic computing machine" denote the same electronic device for converting information; in this case one speaks of the synonymy of signals. On the other hand, one and the same signal (i.e., information with one and the same syntactic property) can have different pragmatics for different consumers and different semantics. For example, the road sign known as a "brick", which has a well-defined semantics ("no entry"), means a ban on entry for a motorist but does not affect a pedestrian in any way. At the same time, the signal "key" can have different semantics: a treble clef, a spring, a key for opening a lock, or a key used in computer science to encode a signal in order to protect it from unauthorized access (in such cases one speaks of the homonymy of the signal). There are also signal antonyms, with opposite semantics: for example, "cold" and "hot", "fast" and "slow", etc.

The subject of study of the science of informatics is precisely data: the methods of their creation, storage, processing and transmission. The information itself recorded in the data, its semantic content, is of interest to the users of information systems, specialists in various sciences and fields of activity: a physician is interested in medical information, a geologist in geological, a businessman in commercial, and so on (including a computer scientist, who is interested in information about working with data).

Semiotics - the science of information

Information cannot be imagined without its reception, processing, transmission, etc., that is, outside the framework of information exchange. All acts of information exchange are carried out by means of symbols or signs, with the help of which one system acts on another. Therefore the main science that studies information is semiotics: the science of signs and sign systems in nature and society (the theory of signs). In each act of information exchange one can find three of its "participants", three elements: the sign, the object it designates, and the recipient (user) of the sign.

Depending on which relations between elements are considered, semiotics is divided into three branches: syntactics, semantics and pragmatics. Syntactics studies signs and the relations between them, abstracting from the content of the sign and from its practical meaning for the recipient. Semantics studies the relations between signs and the objects they designate, abstracting from the recipient of the signs and from their value for him; it is clear that studying the laws of the semantic representation of objects in signs is impossible without taking into account and using the general laws of the construction of any sign systems, which are studied by syntactics. Pragmatics studies the relations between signs and their users. Within the framework of pragmatics all the factors that distinguish one act of information exchange from another are studied, together with all questions of the practical results of using information and of its value for the recipient.

At the same time, many aspects of the relations of signs among themselves and with the objects they designate are inevitably touched upon. Thus the three branches of semiotics correspond to three levels of abstraction (distancing) from the features of specific acts of information exchange. The study of information in all its diversity corresponds to the pragmatic level. Abstracting from the recipient of information, excluding him from consideration, we pass to studying it at the semantic level; abstracting from the content of signs, we transfer the analysis of information to the level of syntactics. This interpenetration of the main branches of semiotics, associated with different levels of abstraction, can be represented by the scheme "Three branches of semiotics and their interrelation". Measurement of information is carried out, correspondingly, in three aspects: syntactic, semantic and pragmatic. The need for such different measurements of information is, as will be shown below, dictated by the practice of designing and organizing the work of information systems. Consider a typical production situation.

At the end of a shift, the site planner prepares data on the fulfillment of the production schedule. These data are sent to the information and computing center (ITC) of the enterprise, where they are processed and issued to managers in the form of reports on the current state of production. On the basis of the data received, the shop manager makes a decision to change the production plan or to take some other organizational measures. Obviously, for the shop manager the amount of information contained in the summary depends on the magnitude of the economic effect obtained from its use in decision-making, on how useful the information received was. For the site planner, the amount of information in the same message is determined by the accuracy of its correspondence to the actual state of affairs on the site and by the degree of unexpectedness of the reported facts: the more unexpected they are, the sooner they must be reported to management, and the more information the message contains. For the employees of the ITC, the number of characters, the length of the message carrying the information, is of paramount importance, since it determines the loading time of the computers and communication channels; they are practically uninterested either in the usefulness of the information or in a quantitative measure of its semantic value.

Naturally, when organizing a production management system and building models of decision-making, we use the usefulness of information as the measure of the informativeness of messages. When building a system of accounting and reporting that provides management with data on the progress of the production process, the novelty of the information obtained is taken as the measure of the amount of information. The organization of procedures for the mechanical processing of information requires measuring the volume of messages in the form of the number of processed characters. These three essentially different approaches to measuring information do not contradict or exclude one another. On the contrary, by measuring information on different scales they allow a fuller and more comprehensive assessment of the information content of each message and a more efficient organization of the production management system. In the apt words of Prof. N.E. Kobrinsky, when it comes to the rational organization of information flows, the quantity, novelty and usefulness of information are as interconnected as the quantity, quality and cost of products in production.

Information in the material world

Information is one of the general concepts associated with matter. Information exists in every material object in the form of the variety of its states, and it is transmitted from object to object in the process of their interaction. The existence of information as an objective property of matter follows logically from the known fundamental properties of matter: structure, continuous change (movement) and the interaction of material objects.

The structural nature of matter manifests itself as the internal dismemberment of integrity, as a natural order of connections among elements within the whole. In other words, any material object, from a subatomic particle to the Universe as a whole, is a system of interconnected subsystems. Owing to continuous movement, understood broadly as movement in space and development in time, material objects change their states. The states of objects also change in interactions with other objects. The set of states of a material system and of all its subsystems represents information about the system.

Strictly speaking, owing to the uncertainty, infinity and structural properties of matter, the amount of objective information in any material object is infinite. This information is called complete. However, one can distinguish structural levels with finite sets of states. Information that exists at a structural level with a finite number of states is called private. For private information the concept of the amount of information is meaningful.

The choice of the unit of measurement of the amount of information follows from the above in a logical and simple way. Imagine a system that can be in only two equally probable states. Let us assign the code "1" to one of them and "0" to the other. This is the minimum amount of information the system can contain. It is the unit of measurement of information and is called the bit. There are other, harder-to-define methods and units for measuring the amount of information.
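A small numeric check of this definition (our sketch, not the text's): two equally probable states need exactly one bit, and N states need log2(N) bits, rounded up to a whole number of binary digits:

```python
from math import ceil, log2

def bits_needed(states: int) -> int:
    """Whole bits required to give each of the system's states its own code."""
    return ceil(log2(states))

print(bits_needed(2))   # 1 bit:  the codes "0" and "1" from the text
print(bits_needed(8))   # 3 bits: "000" ... "111"
print(bits_needed(5))   # 3 bits: 2 bits would cover only 4 states
```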

Depending on the material form of the carrier, information is of two main types: analog and discrete. Analog information changes continuously in time and takes values from a continuum. Discrete information changes at certain moments of time and takes values from a fixed set. Any material object or process is a primary source of information. All of its possible states make up the code of the information source, and the instantaneous value of a state is represented as a symbol (a "letter") of this code. For information to be transmitted from one object to another, the receiver, there must be some intermediate material carrier interacting with the source. In nature such carriers are, as a rule, rapidly propagating wave processes: cosmic, gamma and X-rays, electromagnetic and sound waves, and the potentials (and perhaps waves not yet discovered) of the gravitational field. When electromagnetic radiation interacts with an object, its spectrum changes as a result of absorption or reflection, i.e. the intensities of certain wavelengths change. The harmonics of sound vibrations also change during interactions with objects. Information is transmitted during mechanical interaction as well; however, mechanical interaction, as a rule, leads to large changes in the structure of the objects (up to their destruction), and the information is greatly distorted. Distortion of information in the course of its transmission is called disinformation.

The transfer of source information onto the structure of a carrier is called encoding; the source code is converted into the carrier code. A carrier onto which the source code has been transferred in the form of a carrier code is called a signal. The receiver of a signal has its own set of possible states, called the receiver code. A signal, interacting with the receiving object, changes its states; the process of converting the signal code into the receiver code is called decoding. The transmission of information from a source to a receiver can be viewed as an information interaction, and it differs fundamentally from other interactions. In all other interactions of material objects there is an exchange of matter and (or) energy: one of the objects loses matter or energy and the other gains it. This property of interactions is called symmetry. In an information interaction, the receiver gains information while the source does not lose it; information interaction is asymmetric. Objective information itself is not material; it is a property of matter, like structure or motion, and it exists on material carriers in the form of its codes.
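This chain of source code, signal (carrier code) and receiver code can be made concrete with a minimal sketch in Python. The three-symbol alphabet and both code tables below are invented purely for illustration:

```python
# Toy model of the chain: source code -> signal (carrier code) -> receiver code.
# The alphabet and the code tables are invented for this illustration.

SOURCE_TO_CARRIER = {"A": "10", "B": "01", "C": "11"}   # encoding table
CARRIER_TO_RECEIVER = {code: ch.lower() for ch, code in SOURCE_TO_CARRIER.items()}

def encode(message: str) -> str:
    """Transfer the source code onto the carrier; the result is the signal."""
    return "".join(SOURCE_TO_CARRIER[ch] for ch in message)

def decode(signal: str) -> str:
    """Convert the signal code into the receiver's own code."""
    pairs = (signal[i:i + 2] for i in range(0, len(signal), 2))
    return "".join(CARRIER_TO_RECEIVER[p] for p in pairs)

source_message = "ABCA"
signal = encode(source_message)      # "10011110"
received = decode(signal)            # "abca"

# The interaction is asymmetric: the receiver now has the information,
# while the source still holds source_message unchanged.
print(source_message, signal, received)
```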

Information in nature

Wildlife is complex and varied. Sources and receivers of information in it are living organisms and their cells. An organism has a number of properties that distinguish it from inanimate material objects.

The main ones are:

Continuous exchange of matter, energy and information with the environment;

Irritability, the ability of the body to perceive and process information about changes in the environment and the internal environment of the body;

Excitability, the ability to respond to stimuli;

Self-organization, manifested as changes in the body to adapt to environmental conditions.

An organism considered as a system has a hierarchical structure. Relative to the organism itself, this structure is subdivided into internal levels: molecular, cellular, the organ level and, finally, the organism itself. However, the organism also interacts with supra-organismic living systems, whose levels are the population, the ecosystem and all living nature as a whole (the biosphere). Between all these levels circulate flows not only of matter and energy, but also of information. Information interactions in living nature occur in the same way as in inanimate nature. At the same time, living nature in the process of evolution has created a wide variety of sources, carriers and receivers of information.

The reaction to the influences of the outside world is manifested in all organisms, since it is due to irritability. In higher organisms, adaptation to the external environment has the character of a complex activity, which is effective only with sufficiently complete and timely information about the environment. Their receivers of information from the external environment are the sense organs, which include vision, hearing, smell, taste, touch and the vestibular apparatus. In the internal structure of organisms there are numerous internal receptors associated with the nervous system. The nervous system consists of neurons, the processes of which (axons and dendrites) are analogous to information transmission channels. The main organs providing storage and processing of information in vertebrates are the spinal cord and the brain. In accordance with the characteristics of the sense organs, the information perceived by the body can be classified as visual, auditory, gustatory, olfactory and tactile.

Falling on the retina of the human eye, a signal excites the retina's constituent cells in a particular way. Nerve impulses from the cells are transmitted along axons to the brain, and the brain remembers this sensation as a certain combination of states of its constituent neurons. (The example is continued in the section "Information in human society".) By accumulating information, the brain builds on its structure a coherent information model of the surrounding world. In living nature, an important characteristic for an organism receiving information is the rate at which the information becomes available to it. The amount of information that the human nervous system is able to deliver to the brain when reading text is approximately 1 bit per 1/16 of a second.


The study of organisms is hampered by their complexity. The abstraction of structure as a mathematical set, acceptable for inanimate objects, is hardly acceptable for a living organism, because to create a more or less adequate abstract model of an organism one must take into account all the hierarchical levels of its structure. It is therefore difficult to introduce a measure of the amount of information, and very difficult to define the connections between the components of the structure. Even if it is known which organ is the source of information, what is the signal and what is the receiver?

Before the advent of computers, the biology that studied living organisms used only qualitative, i.e. descriptive, models. In a qualitative model it is practically impossible to take into account the information connections between the components of a structure. Electronic computing technology has made it possible to apply new methods in biological research, in particular the method of machine modelling, which involves a mathematical description of the known phenomena and processes occurring in the organism, the addition of hypotheses about some unknown processes, and the calculation of possible variants of the organism's behaviour. The resulting variants are compared with the real behaviour of the organism, which makes it possible to establish the truth or falsity of the hypotheses put forward. Information interaction can also be taken into account in such models. The information processes that ensure the existence of life itself are extremely complex. And although it is intuitively clear that this property is directly connected with the formation, storage and transmission of complete information about the structure of the organism, an abstract description of this phenomenon seemed until recently impossible. Nevertheless, the information processes that ensure the existence of this property have been partially revealed thanks to the deciphering of the genetic code and the reading of the genomes of various organisms.

Information in human society

The development of matter in the process of motion is directed towards the complication of the structure of material objects. One of the most complex structures is the human brain. So far this is the only structure known to us that possesses a property which man himself calls consciousness. Speaking about information, we, as thinking beings, assume a priori that information, besides being present in the form of the signals we receive, also has some meaning. Forming in his consciousness a model of the surrounding world as an interconnected set of models of its objects and processes, a person operates with semantic concepts rather than with information. Meaning is the essence of a phenomenon that does not coincide with the phenomenon itself and connects it with the wider context of reality. The word itself indicates directly that the semantic content of information can be formed only by thinking receivers of information. In human society it is not information itself that is decisive but its semantic content.

Example (continued). Having experienced such a sensation, a person assigns the concept "tomato" to the object and the concept "red" to its state. In addition, his consciousness fixes the connection "tomato" - "red". This is the meaning of the received signal. (The example is continued later in this section.) The ability of the brain to create semantic concepts and connections between them is the basis of consciousness. Consciousness can be viewed as a self-developing conceptual model of the surrounding world. Meaning is not information: information exists only on a material carrier, whereas human consciousness is considered immaterial. Meaning exists in a person's mind in the form of words, images and sensations. A person can pronounce words not only aloud but also "to himself"; he can also "silently" create (or recall) images and sensations. He can, however, recover the information corresponding to this meaning by speaking or writing the words.


Example (continued). If the words "tomato" and "red" are the meaning of the concepts, then where is the information? The information is contained in the brain in the form of certain states of its neurons. It is also contained in the printed text consisting of these words: when the letters are encoded with a three-bit binary code, its amount is 120 bits. If the words are spoken aloud, there will be much more information, but the meaning will remain the same; the greatest amount of information is carried by a visual image. This is reflected even in folklore: "better to see once than to hear a hundred times." Information recovered in this way is called semantic information, since it encodes the meaning of some primary information (semantics). Hearing (or seeing) a phrase uttered (or written) in a language he does not know, a person receives information but cannot determine its meaning. Therefore, to transmit the semantic content of information, certain agreements between the source and the receiver about the semantic content of the signals, i.e. of the words, are required. Such agreements can be reached through communication, and communication is one of the most important conditions for the existence of human society.

In the modern world, information is one of the most important resources and, at the same time, one of the driving forces of the development of human society. Information processes occurring in the material world, living nature and human society are studied (or at least taken into account) by all scientific disciplines from philosophy to marketing. The increasing complexity of scientific research tasks has led to the need to involve large teams of scientists of different specialties in their solution. Therefore, almost all theories discussed below are interdisciplinary. Historically, two complex branches of science - cybernetics and informatics - are directly involved in information research.

Modern cybernetics is a multidisciplinary branch of science that investigates super-complex systems, such as:

Human society (social cybernetics);

Economics (economic cybernetics);

Living organism (biological cybernetics);

The human brain and its function, consciousness (artificial intelligence).

Informatics, which emerged as a science in the middle of the last century, separated from cybernetics and is engaged in research in the field of methods for obtaining, storing, transmitting and processing semantic information. Both of these branches use several fundamental scientific theories, among them information theory and its sections: coding theory, the theory of algorithms and automata theory. Studies of the semantic content of information are based on a complex of scientific theories under the general name of semiotics. Information theory is a complex, mainly mathematical theory that includes the description and evaluation of methods for extracting, transmitting, storing and classifying information. It treats information carriers as elements of an abstract (mathematical) set and the interactions between carriers as a way of arranging the elements in this set. This approach makes it possible to describe the information code formally, that is, to define an abstract code and study it by mathematical methods. For these studies it uses the methods of probability theory, mathematical statistics, linear algebra, game theory and other mathematical theories.

The foundations of this theory were laid by the American scientist R. Hartley in 1928, who determined a measure of the amount of information for certain communication problems. Later the theory was substantially developed by the American scientist C. Shannon and the Russian scientists A.N. Kolmogorov, V.M. Glushkov and others. Modern information theory includes as sections coding theory, the theory of algorithms, the theory of digital automata (see below) and some others. There are also alternative information theories, for example the "qualitative information theory" proposed by the Polish scientist M. Mazur. Every person is familiar with the concept of an algorithm without even knowing it. Here is an example of an informal algorithm: "Cut the tomatoes into circles or slices. Add chopped onion, pour over with vegetable oil, then sprinkle with finely chopped capsicum and stir. Before serving, sprinkle with salt, put in a salad bowl and garnish with parsley." (Tomato salad.)

The first rules in the history of mankind for solving arithmetic problems were worked out by one of the famous scientists of antiquity, al-Khwarizmi, in the 9th century AD. In his honour, formalized rules for achieving a goal are called algorithms. The subject of the theory of algorithms is the search for methods of constructing and evaluating effective (including universal) computational and control algorithms for information processing. To substantiate such methods, the theory of algorithms uses the mathematical apparatus of information theory. The modern scientific conception of algorithms as methods of processing information was introduced in the works of E. Post and A. Turing in the 1930s (the Turing machine). The Russian scientists A. Markov (the normal Markov algorithm) and A. Kolmogorov made a great contribution to the development of the theory of algorithms. Automata theory is a branch of theoretical cybernetics that investigates mathematical models of actually existing or fundamentally possible devices that process discrete information at discrete moments of time.

The concept of an automaton originated in the theory of algorithms. If there are some universal algorithms for solving computational problems, then there must be devices (albeit abstract) for the implementation of such algorithms. Actually, the abstract Turing machine, considered in the theory of algorithms, is at the same time an informally defined automaton. The theoretical substantiation of the construction of such devices is the subject of the theory of automata. The theory of automata uses the apparatus of mathematical theories - algebra, mathematical logic, combinatorial analysis, graph theory, probability theory, etc. The theory of automata, together with the theory of algorithms, is the main theoretical basis for creating electronic computers and automated control systems. Semiotics is a complex of scientific theories that study the properties of sign systems. The most significant results have been achieved in the section of semiotics - semantics. The subject of semantics research is the semantic content of information.

A sign system is a system of concrete or abstract objects (signs, words), with each of which a certain meaning is associated in a certain way. The theory shows that there can be two kinds of such correspondence. The first kind of correspondence directly determines the material object the word denotes and is called the denotatum (or, in some works, the nominee). The second kind of correspondence determines the meaning of the sign (word) and is called the concept. Such properties of the correspondences as "meaning", "truth", "definability", "entailment" and "interpretation" are investigated, using the apparatus of mathematical logic and mathematical linguistics. The ideas of semantics, outlined by G.W. Leibniz and F. de Saussure, were formulated and developed by C. Peirce (1839-1914), C. Morris (b. 1901), R. Carnap (1891-1970) and others. The main achievement of the theory is the creation of an apparatus of semantic analysis that makes it possible to represent the meaning of a text in natural language as a record in some formalized semantic language. Semantic analysis is the basis for creating devices (programs) for machine translation from one natural language to another.

Information is stored by transferring it onto material carriers. Semantic information recorded on a material storage medium is called a document. Humanity learned to store information long ago. The most ancient forms of information storage used the arrangement of objects: shells and stones on sand, knots on a rope. A significant development of these methods was writing, the graphic representation of symbols on stone, clay, papyrus and paper. The invention of printing was of great importance in the development of this direction. Over its history mankind has accumulated a huge amount of information in libraries, archives, periodicals and other written documents.

At present, the storage of information as sequences of binary symbols is of particular importance. A variety of storage devices are used to implement these methods; they are the central link of information storage systems. In addition to them, such systems use means of information retrieval (search systems), means of obtaining references (information and reference systems) and means of displaying information (output devices). Collections of information formed for a specific purpose make up databases, data banks and knowledge bases.

The transfer of semantic information is the process of its spatial transfer from the source to the recipient (addressee). A person learned to transmit and receive information even before storing it. Speech is a method of transmission that our distant ancestors used in direct contact (conversation) - we still use it now. To transmit information over long distances, it is necessary to use much more complex information processes. To carry out such a process, information must be formalized (presented) in some way. To represent information, various sign systems are used - sets of pre-agreed semantic symbols: objects, pictures, written or printed words of a natural language. Semantic information about an object, phenomenon or process presented with their help is called a message.

Obviously, to transmit a message over a distance, the information must be transferred to some mobile carrier. Carriers can move through space with the help of vehicles, as happens with letters sent by mail. This method guarantees complete reliability of the transfer, since the addressee receives the original message, but it requires considerable time. Since the middle of the 19th century, methods of transmitting information have spread that use a naturally propagating carrier: electromagnetic oscillations (electrical oscillations, radio waves, light). Implementing these methods requires:

Preliminary transfer of the information contained in the message to the carrier - encoding;

Ensuring the transmission of the signal thus obtained to the addressee via a special communication channel;

Reverse transformation of the signal code into the message code - decoding.


The use of electromagnetic carriers makes the delivery of a message to the addressee almost instantaneous, but it requires additional measures to ensure the quality (reliability and accuracy) of the transmitted information, since real communication channels are subject to natural and artificial interference. Devices that implement the data transmission process form communication systems. Depending on the way information is presented, communication systems can be subdivided into sign (e.g., telefax), sound, video and combined systems (television). The most developed communication system of our time is the Internet.

Data processing

Since information is not material, its processing consists in various transformations. Any transfer of information from a carrier to another carrier can be referred to as processing processes. The information to be processed is called data. The main type of processing of primary information received by various devices is transformation into a form that ensures its perception by the human sense organs. Thus, X-ray photographs of space are converted into ordinary color photographs using special spectrum converters and photographic materials. Night vision devices convert infrared (thermal) images into visible images. For some communication and control tasks, conversion of analog information is necessary. For this, analog-to-digital and digital-to-analog signal converters are used.

The most important kind of processing of semantic information is determining the meaning (content) contained in a message. Unlike primary information, semantic information has no statistical characteristics, that is, no quantitative measure: meaning is either there or it is not, and how much of it there is, if any, cannot be established. The meaning contained in a message is described in an artificial language that reflects the semantic connections between the words of the source text. A dictionary of such a language, called a thesaurus, resides in the receiver of the message. The meaning of the words and phrases of a message is determined by assigning them to certain groups of words or phrases whose meaning has already been established. The thesaurus thus makes it possible to establish the meaning of a message and, at the same time, is replenished with new semantic concepts. This kind of information processing is used in information retrieval systems and machine translation systems.

One of the widespread kinds of information processing is the solution of computational problems and problems of automatic control with the help of computers. Information processing is always carried out with some purpose, and to achieve it the order of actions on the information leading to the given goal must be known. Such a procedure is called an algorithm. Besides the algorithm itself, a device is needed that implements it; in scientific theories such a device is called an automaton. The most important feature of information should be noted here: owing to the asymmetry of information interaction, new information appears when information is processed, while the original information is not lost.

Analog and digital information

Sound consists of wave vibrations in a medium, for example in air. When a person speaks, the vibrations of the vocal cords are converted into wave vibrations of the air. If we consider sound not as a wave but as vibrations at a single point, then these vibrations can be represented as air pressure changing over time. With a microphone the pressure changes can be captured and converted into electrical voltage; the air pressure is thus converted into voltage fluctuations.

Such a conversion can follow various laws. Most often it is linear, for example:

U(t) = K · (P(t) − P₀),

where U(t) is the electrical voltage, P(t) is the air pressure, P₀ is the average air pressure, and K is the conversion factor.
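As a sketch, this linear law can be modelled directly; the pressure samples and the coefficient K below are invented values, chosen only to show the shape of the computation:

```python
# U(t) = K * (P(t) - P0): linear conversion of air pressure into voltage.
# The coefficient and the pressure samples are invented for illustration.

P0 = 101_325.0    # average air pressure, Pa
K = 0.001         # conversion factor, V/Pa (hypothetical)

pressure = [101_325.2, 101_324.7, 101_325.9, 101_325.0]   # P(t) at sample times

voltage = [K * (p - P0) for p in pressure]                 # U(t)
print(voltage)    # small positive and negative deviations around 0 V
```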

Both the electrical voltage and the air pressure are continuous functions of time. The functions U(t) and P(t) carry information about the vibrations of the vocal cords. These functions are continuous, and such information is called analog. Music is a special case of sound, and it too can be represented as some function of time; this is an analog representation of music. But music is also written down in the form of notes. Each note has a duration that is a multiple of a predetermined unit and a pitch (do, re, mi, fa, sol, etc.). If these data are converted into numbers, we obtain a digital representation of the music.

Human speech is also a special case of sound. It too can be represented in analog form. But just as music can be broken down into notes, speech can be broken down into letters. If each letter is assigned its own set of numbers, we obtain a digital representation of speech. The difference between analog and digital information is that analog information is continuous while digital information is discrete. The transformation of information from one kind to the other is named differently depending on the kind of transformation: simple transformations are called "conversion", such as digital-to-analog and analog-to-digital conversion; complex transformations are called "coding", for example delta coding or entropy coding; transformations between characteristics such as amplitude, frequency or phase are called "modulation", for example amplitude-frequency modulation or pulse-width modulation.
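The passage from a continuous function to a discrete set of values, analog-to-digital conversion, can be sketched in a few lines. Here a pure 440 Hz tone stands in for the analog signal, and the sample rate and bit depth are arbitrary choices:

```python
import math

SAMPLE_RATE = 8000    # samples per second (arbitrary)
LEVELS = 256          # 8-bit quantization: 2**8 possible values

def analog_signal(t: float) -> float:
    """A 'continuous' signal: a 440 Hz tone in the range [-1, 1]."""
    return math.sin(2 * math.pi * 440 * t)

def adc(duration_s: float) -> list[int]:
    """Sample the signal at discrete moments and round each sample
    to one of LEVELS discrete values."""
    n = int(SAMPLE_RATE * duration_s)
    return [round((analog_signal(i / SAMPLE_RATE) + 1) / 2 * (LEVELS - 1))
            for i in range(n)]

print(adc(0.001))     # 8 integers in 0..255 instead of a continuous function
```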


Usually analog conversions are quite simple, and various devices invented by man cope with them easily: a tape recorder converts the magnetization of the tape into sound, an audio recorder converts sound into magnetization of the tape, a video camera converts light into magnetization of the tape, an oscilloscope converts electrical voltage or current into an image, and so on. Converting analog information into digital form is much more difficult. Some transformations a machine cannot manage at all, or manages only with great difficulty: for example, converting speech into text, or converting a recording of a concert into sheet music; even with a representation that is digital by nature, it is very difficult for a machine to convert text on paper into the same text in computer memory.


Why then use the digital representation of information if it is so difficult? The main advantage of digital information over analog is noise immunity. That is, in the process of copying, digital information is copied as it is and can be copied an almost infinite number of times, whereas analog information becomes noisy during copying and its quality deteriorates. Usually analog information can be copied no more than three times. If you have a dual-cassette tape recorder, you can perform an experiment: try re-recording the same song several times from cassette to cassette; after a few such re-recordings you will notice how much the recording quality has deteriorated, because the information on a cassette is stored in analog form. Music in mp3 format, on the other hand, can be re-recorded as many times as you like without the quality deteriorating, because the information in an mp3 file is stored digitally.
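This difference can be imitated numerically: suppose each analog re-recording adds a little random noise, while a digital copy is reproduced bit for bit. A sketch (the noise level and the number of generations are arbitrary):

```python
import random

original = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]   # some "recording"

def analog_copy(signal, noise=0.05):
    """Each analog copy picks up a bit of random noise."""
    return [s + random.uniform(-noise, noise) for s in signal]

def digital_copy(signal):
    """A digital copy is exact."""
    return list(signal)

analog, digital = original, original
for _ in range(10):                    # ten generations of copies
    analog = analog_copy(analog)
    digital = digital_copy(digital)

print(digital == original)                                 # True: no degradation
print(max(abs(a - o) for a, o in zip(analog, original)))   # accumulated error
```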

Amount of information

A person, or some other receiver of information, by receiving a portion of information resolves some uncertainty. Let us take a tree as an example. When we look at the tree, we resolve a number of uncertainties: we learn its height, its species, the density of its foliage, the colour of its leaves and, if it is a fruit tree, we see its fruit and how ripe it is, and so on. Before we looked at the tree we did not know all this; after looking at it, we resolved the uncertainty and received information.

If we go out into a meadow and look at it, we receive information of another kind: how large the meadow is, how tall the grass is and what colour it is. If a biologist goes out into the same meadow, then among other things he will be able to determine what species of grass grow in the meadow, what type of meadow it is, which flowers have already bloomed and which are yet to bloom, whether the meadow is suitable for grazing cows, and so on. That is, he will receive more information than we do, because before he looked at the meadow he had more questions; the biologist resolves more uncertainties.


The more uncertainty is resolved in the process of obtaining information, the more information we receive. But this is a subjective measure of the amount of information, and we would like an objective one. There is a formula for calculating the amount of information: if an uncertainty has N possible outcomes and each outcome has a certain probability, then the amount of information received can be calculated by the following formula, suggested by Shannon:

I = -(p1·log₂p1 + p2·log₂p2 + ... + pN·log₂pN), where

I is the amount of information;

N is the number of outcomes;

p1, p2, ..., pN are the probabilities of the outcomes.
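A direct transcription of this formula into Python (a sketch; the function name is our own):

```python
import math

def shannon_information(probabilities):
    """I = -(p1*log2(p1) + ... + pN*log2(pN)), in bits.
    The probabilities must sum to 1; zero-probability outcomes
    contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_information([0.5, 0.5]))    # 1.0 bit   (a fair coin)
print(shannon_information([0.9, 0.1]))    # ~0.47 bits (a biased coin)
```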


The amount of information is measured in bits - an abbreviation for the English words BInary digiT, which means a binary digit.

For equiprobable events, the formula can be simplified:

I = log₂N, where

I is the amount of information;

N is the number of equiprobable outcomes.

Let us take a coin, for example, and drop it on the table. It will land either heads or tails: we have 2 equiprobable events. After we have tossed the coin we receive log₂2 = 1 bit of information.

Let us try to find out how much information we receive after rolling a die. A die has six faces, six equiprobable events. We get log₂6 ≈ 2.6: after rolling a die onto the table we receive approximately 2.6 bits of information.
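Both worked examples follow directly from the simplified formula:

```python
import math

# I = log2(N) for N equiprobable outcomes.
print(math.log2(2))   # 1.0 bit      - a coin: heads or tails
print(math.log2(6))   # ~2.585 bits  - a six-sided die
```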

The probability that we will see a Martian dinosaur when we leave the house is one in ten billion. How much information do we get about the Martian dinosaur after we leave the house?

I = -((1/10¹⁰)·log₂(1/10¹⁰) + (1 - 1/10¹⁰)·log₂(1 - 1/10¹⁰)) ≈ 3.47·10⁻⁹ bits.
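Substituting p = 10⁻¹⁰ into Shannon's formula confirms how tiny this amount is; a quick check in Python:

```python
import math

p = 1e-10          # probability of meeting the Martian dinosaur
q = 1 - p          # probability of not meeting one

I = -(p * math.log2(p) + q * math.log2(q))
print(I)           # ~3.47e-09 bits: practically no information
```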

Suppose we toss 8 coins. We have 2⁸ equiprobable outcomes of the toss, so after tossing the coins we receive log₂(2⁸) = 8 bits of information.

When we ask a question and can equally likely get an answer "yes" or "no", then after answering the question we get one bit of information.

Surprisingly, if Shannon's formula is applied to analog information, we obtain an infinite amount of information. For example, the voltage at a point in an electrical circuit can take any value from zero to one volt with equal probability. The number of outcomes is then infinite, and substituting this value into the formula for equiprobable events we obtain infinity: an infinite amount of information.

Now I will show how to encode "War and Peace" with just a single mark on a metal rod. Let us encode all the letters and signs occurring in "War and Peace" with two-digit numbers; they should suffice. For example, we give the letter "A" the code "00", the letter "B" the code "01", and so on, encoding punctuation marks, Latin letters and digits as well. Recoding "War and Peace" with this code, we obtain a long number, for example 70123856383901874..., and prepend "0." to it (0.70123856383901874...). The result is a number between zero and one. Now let us put a mark on a metal rod so that the ratio of the length of the left part of the rod to the length of the whole rod equals exactly this number. If we then suddenly want to read "War and Peace", we simply measure the length from the left end of the rod to the mark and the length of the whole rod, divide one number by the other, obtain the number and recode it back into letters ("00" into "A", "01" into "B", and so on).
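The encoding half of this thought experiment is easy to sketch. In the code below the alphabet and the two-digit code table are invented, and the resulting number is kept as a string of digits, since no floating-point type could hold it exactly:

```python
# "Mark on a rod" encoding: each character becomes a two-digit code,
# so the whole text becomes one number between 0 and 1.
# The alphabet and code table are invented for this illustration.

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ .,"        # up to 100 symbols would fit
CHAR_TO_CODE = {ch: f"{i:02d}" for i, ch in enumerate(ALPHABET)}
CODE_TO_CHAR = {code: ch for ch, code in CHAR_TO_CODE.items()}

def text_to_number(text: str) -> str:
    """The position of the mark: a number in [0, 1) as a digit string."""
    return "0." + "".join(CHAR_TO_CODE[ch] for ch in text.upper())

def number_to_text(number: str) -> str:
    digits = number[2:]                            # strip the leading "0."
    return "".join(CODE_TO_CHAR[digits[i:i + 2]]
                   for i in range(0, len(digits), 2))

n = text_to_number("WAR AND PEACE")
print(n)                    # 0.22001726001303261504000204
print(number_to_text(n))    # WAR AND PEACE
```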


In reality we would not be able to do this, since we cannot determine lengths with infinite precision. Engineering problems prevent us from increasing the accuracy of measurement, and quantum physics shows that beyond a certain limit quantum laws will interfere. Intuitively we understand that the lower the accuracy of measurement, the less information we receive, and the higher the accuracy, the more. Shannon's formula is not suitable for measuring the amount of analog information, but there are other methods for this, considered in information theory. In computer technology one bit corresponds to the physical state of the information carrier: magnetized or not magnetized, hole or no hole, charged or not charged, reflects light or does not reflect, high or low electrical potential. One state is usually denoted by the digit 0, the other by the digit 1. Any information can be encoded as a sequence of bits: text, images, sound, etc.

Along with the bit, a unit called the byte is often used; as a rule it is equal to 8 bits. While a bit allows one equiprobable variant out of two possible to be chosen, a byte chooses 1 out of 256 (2⁸). To measure the amount of information, larger units are also customary:

1 KB (one kilobyte) = 2¹⁰ bytes = 1024 bytes

1 MB (one megabyte) = 2¹⁰ KB = 1024 KB

1 GB (one gigabyte) = 2¹⁰ MB = 1024 MB

Strictly speaking, the SI prefixes kilo-, mega- and giga- should be used for multipliers of 10³, 10⁶ and 10⁹ respectively, but historically the practice of using multipliers that are powers of two has taken hold.
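The gap between the two conventions grows with every prefix, which a short computation makes visible (a sketch):

```python
# Binary ("historical") vs decimal (SI) readings of the same prefixes.
for name, power in [("kilo", 1), ("mega", 2), ("giga", 3)]:
    binary, decimal = 1024 ** power, 1000 ** power
    print(f"{name}: {binary} vs {decimal} (+{binary / decimal - 1:.1%})")
# kilo: 1024 vs 1000 (+2.4%)
# mega: 1048576 vs 1000000 (+4.9%)
# giga: 1073741824 vs 1000000000 (+7.4%)
```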

A Shannon bit and the bit used in computer technology coincide if the probabilities of a zero or a one appearing in the computer bit are equal. If the probabilities are not equal, the amount of information in Shannon's sense becomes smaller; we saw this in the example of the Martian dinosaur. The computer measure of the amount of information gives an upper bound on the amount of information. Volatile memory, after power is applied to it, is usually initialized with some value, for example all ones or all zeros. It is clear that just after power-up the memory contains no information, since the values in the memory cells are strictly determined and there is no uncertainty: the memory can store a certain amount of information, but right after power is applied there is none in it.
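The statement about unequal probabilities can be checked directly: the Shannon information of a single binary digit reaches one full bit only when zero and one are equally likely (a sketch, with an invented function name):

```python
import math

def bit_information(p_one: float) -> float:
    """Shannon information, in bits, of one binary digit
    that equals 1 with probability p_one."""
    return -sum(p * math.log2(p) for p in (p_one, 1 - p_one) if p > 0)

print(bit_information(0.5))    # 1.0   - a full bit only for equal probabilities
print(bit_information(0.9))    # ~0.47 - a biased bit carries less
print(bit_information(1.0))    # 0.0   - a known value (e.g. freshly
                               #         initialized memory) carries none
```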

Disinformation is deliberately false information provided to an adversary or a business partner for the more effective conduct of hostilities or cooperation, for checking for information leaks and the direction of those leaks, or for identifying potential black-market customers. Disinformation (also: misinformation) also refers to the very process of manipulating information: misleading someone by providing incomplete information, or complete but no longer relevant information, distorting the context, or distorting part of the information.

The goal of such influence is always the same: the opponent must act as the manipulator needs. The action of the target of the disinformation may consist in making the decision the manipulator needs or in refraining from a decision unfavourable to the manipulator. But in either case the ultimate goal is the action that the opponent will take.

Disinformation is thus a product of human activity, an attempt to create a false impression and, accordingly, to push towards the desired actions and/or inaction.


Types of disinformation:

Misleading a specific person or group of persons (including a whole nation);

Manipulation (by the actions of one person or a group of persons);

Creation of public opinion about a problem or object.


Misleading is nothing more than outright deception, the provision of false information. Manipulation is a method of influence aimed directly at changing the direction of people's activity. The following levels of manipulation are distinguished:

Reinforcing values (ideas, attitudes) that already exist in people's minds and are advantageous to the manipulator;

Partial change in views on a particular event or circumstance;

A radical change in attitudes.

The creation of public opinion is the formation in society of a certain attitude towards the chosen problem.


Information concept

Different meanings are put into the concept of "information" (from the Latin informatio: information, explanation, exposition) depending on the field in which the concept is considered: science, technology, everyday life, etc. Usually information is understood as any data or knowledge that interests anyone (reports about any events, about someone's activities, etc.).

In the literature, you can find a large number of definitions of the term "information", which reflect different approaches to its interpretation:

Definition 1

  • Information is information (messages, data) regardless of the form of its presentation (Federal Law of the Russian Federation of 27.07.2006 No. 149-FZ "On Information, Information Technologies and Information Protection");
  • Information is information about the surrounding world and the processes occurring in it, perceived by a person or by a special device (Ozhegov's Explanatory Dictionary of the Russian Language).

Speaking about computer processing of data, information is understood as a certain sequence of symbols or signs (letters, numbers, encoded graphics and sounds, etc.), which carries a semantic load and is presented in a form understandable for a computer.

In computer science, the following definition of this term is most often used:

Definition 2

Information is consciously apprehended information (knowledge expressed in signals, messages, news, notifications, etc.) about the surrounding world, which is an object of storage, transformation, transmission and use.

One and the same informational message (an article in a magazine, an advertisement, a story, a letter, a reference, a photograph, a TV show, etc.) may carry a different amount and content of information for different people, depending on their accumulated knowledge, on the level of accessibility of this message and on the level of interest in it. For example, a news item written in Chinese does not convey any information to a person who does not know this language, but may be useful for a person who knows Chinese. News presented in a familiar language will not contain any new information if its content is not clear or is already known.

Information is considered a characteristic not of the message but of the relationship between the message and its recipient.

Types of information

Information can exist in various types:

  • text, pictures, drawings, photographs;
  • light or sound signals;
  • radio waves;
  • electrical and nerve impulses;
  • magnetic records;
  • gestures and facial expressions;
  • smells and tastes;
  • chromosomes through which the traits and properties of organisms are inherited, etc.

The main types of information are distinguished according to the form in which it is presented and the methods of encoding and storing it:

  • graphic- one of the most ancient types, with the help of which information about the world around was stored in the form of rock paintings, and then in the form of paintings, photographs, diagrams, drawings on various materials (paper, canvas, marble, etc.), which depict pictures of the real world;
  • sound (acoustic) - a device for recording sound information was invented in 1877; for musical information a method of encoding with special symbols was developed, which makes it possible to store it like graphic information;
  • text- encodes a person's speech using special characters - letters (for each nation its own); paper is used for storage (notes in notebooks, book printing, etc.);
  • numeric- encodes the quantitative measure of objects and their properties in the surrounding world using special characters - numbers (for each coding system its own); became especially important with the development of trade, economy and monetary exchange;
  • video information- a way of storing "living" pictures of the surrounding world, which appeared with the invention of cinema.

There are also kinds of information for which methods of encoding and storage have not yet been invented: tactile information, organoleptic information, etc.

Initially, information was transmitted over long distances using coded light signals, after the invention of electricity - the transmission of a coded signal in a certain way over wires, and later - using radio waves.

Remark 1

The founder of general information theory is considered to be Claude Shannon, who also laid the foundations of digital communication with his 1948 work "A Mathematical Theory of Communication", in which he was the first to substantiate the possibility of using binary code to transmit information.

The first computers were the means for processing numerical information. With the development of computer technology, PCs began to be used for storing, processing, transmitting various types of information (text, numerical, graphic, audio and video information).

Information can be stored with a PC on magnetic disks or tapes, on laser disks (CDs and DVDs) and in special non-volatile memory devices (flash memory, etc.). These methods are constantly being improved, and new information carriers are invented. All operations on information are performed by the PC's central processor.

Objects, processes, phenomena of the material or non-material world, if viewed from the point of view of their information properties, are called information objects.

A huge number of different information processes can be performed on information, including:

  • creation;
  • reception;
  • combination;
  • storage;
  • transmission;
  • copying;
  • processing;
  • search;
  • perception;
  • formalization;
  • division into parts;
  • measurement;
  • use;
  • dissemination;
  • simplification;
  • destruction;
  • memorization;
  • transformation.

Information properties

Information, like any object, has properties, the most important of which, from the point of view of informatics, are:

  • Objectivity. Objective information - existing independently of human consciousness, methods of fixing it, someone's opinion or attitude.
  • Credibility. Information is reliable if it reflects the true state of affairs. Unreliable information most often leads to misunderstanding or to wrong decisions. Obsolescence can turn reliable information into unreliable information, since it will no longer reflect the true state of affairs.
  • Completeness. Information is complete if it is sufficient for understanding and making decisions. Incomplete or redundant information can lead to a delay in decision making or an error.
  • Accuracy of information - the degree of its proximity to the real state of an object, process, phenomenon, etc.
  • The value of information depends on its importance for decision-making, problem solving and further applicability in any types of human activity.
  • Relevance. Only the timeliness of obtaining information can lead to the expected result.
  • Comprehensibility. If valuable and timely information is incomprehensible, it is likely to become useless. Information will be understandable when it is at least expressed in a language that is understandable for the recipient.
  • Availability. The information should correspond to the level of perception of the recipient. For example, the same questions are presented in different ways in school and university textbooks.
  • Brevity. Information is perceived much better if it is presented not in detail and wordy, but with an acceptable degree of conciseness, without unnecessary details. Brevity of information is irreplaceable in reference books, encyclopedias, instructions. Consistency, compactness, convenient form of presentation facilitates understanding and assimilation of information.

The word "information" comes from Latin information, which translates as clarification, presentation. In the explanatory dictionary of V.I. Dahl does not have the word "information". The term “information” has come into use in Russian since the middle of the twentieth century.

The concept of information owes its spread most of all to two scientific directions: communication theory and cybernetics. The development of communication theory resulted in information theory, founded by Claude Shannon. However, C. Shannon gave no definition of information, while at the same time defining the amount of information. Information theory is devoted to the problem of measuring information.

In cybernetics, the science founded by Norbert Wiener, the concept of information is central (see "Cybernetics"). It is believed that it was N. Wiener who introduced the concept of information into scientific use. Nevertheless, in his first book on cybernetics, N. Wiener gave no definition of information. "Information is information, not matter or energy," wrote Wiener. Thus the concept of information is, on the one hand, opposed to the concepts of matter and energy and, on the other, placed on a par with them in generality and fundamentality. Hence it is at least clear that information is something that can be attributed neither to matter nor to energy.

Information in philosophy

Philosophy deals with the understanding of information as a fundamental concept. According to one philosophical concept, information is a property of everything, of all material objects of the world. This concept of information is called attributive (information is an attribute of all material objects). Information in the world arose together with the Universe. In this sense, information is a measure of the orderliness, the structuredness, of any material system. The processes of the world's development, from the initial chaos that followed the "Big Bang" to the formation of inorganic and then organic (living) systems, are connected with the growth of information content. This content is objective, independent of human consciousness. A lump of coal contains information about events that occurred in ancient times; however, only the inquiring mind of a person can extract this information.

Another philosophical concept of information is called functional. According to the functional approach, information appeared with the emergence of life, since it is connected with the functioning of complex self-organizing systems, which include living organisms and human society. One can also put it this way: information is an attribute inherent only in living nature. It is one of the essential features that separate the living from the non-living in nature.

The third philosophical concept of information is the anthropocentric one, according to which information exists only in human consciousness, in human perception. Information activity is inherent only in humans and takes place in social systems. By creating information technology, man creates tools for his information activity.

We can say that the everyday use of the concept "information" occurs in an anthropocentric context. It is natural for any of us to perceive information as the messages people exchange. For example, the mass media are intended for the dissemination of messages and news among the population.

Information in biology

In the twentieth century the concept of information penetrated science everywhere. Information processes in living nature are studied by biology. Neurophysiology (a branch of biology) studies the mechanisms of the nervous activity of animals and humans. This science builds a model of the information processes in the organism. Information coming from outside is converted into signals of an electrochemical nature, which are transmitted from the sense organs along nerve fibres to the neurons (nerve cells) of the brain. The brain transmits control information in the form of signals of the same nature to muscle tissue, thus controlling the organs of movement. This mechanism agrees well with N. Wiener's cybernetic model (see "Cybernetics").

In another biological science - genetics, the concept of hereditary information is used, embedded in the structure of DNA molecules present in the nuclei of cells of living organisms (plants, animals). Genetics has proven that this structure is a kind of code that determines the functioning of the whole organism: its growth, development, pathologies, etc. Through DNA molecules, hereditary information is transmitted from generation to generation.

When studying informatics in basic school (the basic course), one should not delve into the complexities of the problem of defining information. The concept of information is given in a meaningful context:

Information is the meaning, the content of the messages a person receives from the outside world through the sense organs.

The concept of information is revealed through the chain:

message - meaning - information - knowledge

A person perceives messages with the help of the senses (mostly through sight and hearing). If a person understands the meaning contained in a message, then we can say that the message carries information for that person. For example, a message in an unfamiliar language contains no information for the given person, whereas a message in the native language is understandable and therefore informative. Received information stored in memory replenishes a person's knowledge. Our knowledge is systematized (interconnected) information in our memory.

When disclosing the concept of information from the standpoint of the meaningful approach, one should start from the intuitive ideas about information that children already have. It is advisable to conduct the conversation as a dialogue, asking the students questions they are able to answer. Questions can be asked, for example, in the following order.

- Where do you get the information from?

Surely you will hear in response:

From books, radio and television.

- In the morning on the radio I heard the weather forecast .

Taking up this answer, the teacher leads the students to the final conclusion:

- This means that at first you did not know what the weather would be like, but after listening to the radio you began to know. Therefore, having received information, you received new knowledge!

Thus the teacher, together with the students, arrives at the definition: information for a person is information, received from various sources, that replenishes the person's knowledge. This definition should then be consolidated with the numerous examples familiar to children.

Having established the connection between information and people's knowledge, one inevitably comes to the conclusion that information is the content of our memory, for human memory is the means of storing knowledge. It is reasonable to call such information internal, operative information that a person possesses. However, people store information not only in their own memory but also in records on paper, on magnetic media, etc. Such information can be called external (with respect to the person). For a person to use it (for example, to prepare a dish from a recipe), he must first read it, i.e. convert it into internal form, and only then perform some actions.

The question of the classification of knowledge (and therefore information) is very complex. In science, there are various approaches to it. Specialists in the field of artificial intelligence are especially involved in this issue. Within the framework of the basic course, it is enough to limit ourselves to dividing knowledge into declarative and procedural. The description of declarative knowledge can begin with the words: “I know that ...”. Description of procedural knowledge - from the words: "I know how ...". It is not difficult to give examples for both types of knowledge and invite children to come up with their own examples.

The teacher should well understand the propaedeutic significance of discussing these issues for the future acquaintance of students with the device and operation of the computer. A computer, like a person, has internal - operative - memory and external - long-term - memory. The division of knowledge into declarative and procedural in the future can be linked with the division of computer information into data - declarative information and programs - procedural information. The use of the didactic method of analogy between the information function of a person and a computer will allow students to better understand the essence of the structure and operation of a computer.

Based on the position “human knowledge is stored information,” the teacher informs the students that smells, tastes, and tactile (tactile) sensations also carry information to a person. The rationale for this is very simple: since we remember familiar smells and tastes, we recognize familiar objects by touch, it means that these sensations are stored in our memory, and therefore are information. Hence the conclusion: with the help of all his senses, a person receives information from the outside world.

Both from the substantive and from the methodological point of view it is very important to distinguish between the meanings of the concepts "information" and "data". For the representation of information in any sign system (including those used in computers) the term "data" should be used, while information is the meaning contained in the data, put into them by a person and understandable only to a person.

A computer works with data: it receives input data, processes them, and passes output data, the results, to a person. The semantic interpretation of the data is performed by the person. Nevertheless, colloquially and in the literature it is often said and written that the computer stores, processes, transmits and receives information. This is true if the computer is not separated from the person, who uses it as a tool for carrying out information processes.

1. Andreeva E.V., Bosova L.L., Falina I.N. Mathematical Foundations of Computer Science. Elective course. M.: BINOM. Knowledge Laboratory, 2005.

2. Beshenkov S.A., Rakitina E.A. Informatics. A Systematic Course. Textbook for the 10th grade. M.: Laboratory of Basic Knowledge, 2001, 57 p.

3. Wiener N. Cybernetics, or Control and Communication in the Animal and the Machine. Moscow: Soviet Radio, 1968, 201 p.

4. Informatics. Workbook in 2 volumes / Ed. I.G. Semakin, E.K. Henner. Vol. 1. M.: BINOM. Knowledge Laboratory, 2005.

5. Kuznetsov A.A., Beshenkov S.A., Rakitina E.A., Matveeva N.V., Milokhina L.V. A Continuous Course in Informatics (concept, system of modules, typical program). Informatics and Education, No. 1, 2005.

6. Mathematical Encyclopedic Dictionary. Section: "Dictionary of School Informatics". Moscow: Soviet Encyclopedia, 1988.

7. Friedland A.Ya. Informatics: Processes, Systems, Resources. M.: BINOM. Knowledge Laboratory, 2003.

Information is the main subject of study of the science of informatics.

The word "information" is intuitive for most of them, since this concept is constantly used in everyday life. It is obvious that people transfer information to each other, process it, create new information.

But what is information as a scientific concept? Can an unambiguous answer be given to this question? At present, no. The definition of the term "information" depends on the context in which it is used. When a concept cannot be defined unambiguously, it becomes almost philosophical, and each author may propose his own definition. The point is that information is a fundamental scientific concept, on a par with matter and energy. Yet information is not material; perhaps its existence should be regarded as a result of a person's conscious mental activity.

In some sciences the Universe is considered from the point of view of flows of matter and energy, but one can also look at the world in terms of information flows. For example, a biological organism, when producing one like itself, passes genetic information on to it; a person who has received information can transform it into knowledge and thereby change his consciousness somewhat.

Several definitions of information, characteristic of different sciences, can be found in the literature. Physics, for example, offers the following: information is an inalienable property of all existing elements and systems, expressing the meaning of their existence and itself existing eternally. This definition, however, leaves out human activity: creativity and invention produce new information that did not previously exist in the Universe.

Another, rather interesting approach defines information as a reflection of the diversity of the existing world. Indeed, if everything is the same, this is in effect emptiness and an absence of information. It follows that the more variety a system contains, the more information it contains.

Claude Shannon's definition: information is removed uncertainty. To clarify this, consider the following analogy: a person does not know the contents of some object, but the more he studies it, the more information he has about it and the less uncertainty remains on the subject. Can we assume that someday all the uncertainty in the Universe will be removed by man (his mind, consciousness, activity, etc.)? After all, as civilization develops, uncertainty gradually decreases, and the amount of information at mankind's disposal grows.
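
Shannon's idea can also be made quantitative: the uncertainty of a source with outcome probabilities p1, ..., pn is measured by the entropy H = -(p1·log2 p1 + ... + pn·log2 pn), and learning the outcome removes exactly that much uncertainty (in bits). A short Python illustration:

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy: the uncertainty removed by learning the outcome."""
    return sum(-p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit of uncertainty removed
print(entropy([0.99, 0.01]))  # biased coin: about 0.08 bits - almost no news
print(entropy([1.0]))         # certain outcome: 0.0 - no information at all
```

The last case also illustrates the "diversity" view above: a source with no variety carries no information.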

If we regard all of humanity over the course of its evolution as a single whole, we can see that humanity is engaged in receiving, accumulating, processing and creating new information.

We can speak of an information breakthrough at the present time, because modern equipment and technologies make it possible to process and exchange information quickly.

We can say that in a broad sense information is a reflection of the real world, and in a narrow sense it is any data that serve as an object of storage, transmission and transformation.

From a practical point of view, information is always presented as a message. An information message is associated with a source, a receiver and a communication channel.

Messages are transmitted from a source to a receiver in a material-energy form (electrical, light or sound signals, etc.). A person perceives messages through the senses; technical information receivers perceive them with the help of various measuring and recording devices. In both cases, receiving information involves a change over time of some quantity that characterizes the state of the receiver. In this sense an information message can be represented by a function x(t) that describes the change over time of the material and energy parameters of the physical medium in which the information processes take place.

The function x(t) takes real values as the time t varies. If x(t) is continuous, we speak of continuous (analog) information, whose sources are usually continuously varying natural quantities (temperature, pressure, humidity, etc.). If x(t) is discrete, the information messages divide clearly into separate elements (for example, like the words of a text).
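
The passage from a continuous message to a discrete one can be illustrated by sampling: the value of x(t) is read off at regular moments and each reading is rounded to a fixed precision. A small Python sketch (the signal and the sampling period are invented for illustration):

```python
import math

def x(t):
    """A continuous physical quantity, e.g. a slowly varying temperature."""
    return 20.0 + 5.0 * math.sin(t)

sample_period = 0.5   # seconds between readings
samples = [round(x(n * sample_period), 1) for n in range(8)]
print(samples)        # a discrete message: a finite list of rounded numbers
```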

In the modern world, information is usually processed on computers, so informatics, among other things, includes the study of computers. A computer (computing machine) is a device that transforms information by performing a sequence of operations under the control of a program.

Another very widely used concept is data. It is customarily applied to information presented in a form that allows it to be stored, transmitted or processed by technical means. Hence, along with the terms "information input", "information processing", "information storage" and "information retrieval", the terms "data input", "data processing", "data storage" and the like are used.

Information exists not only in the form of data but also in the form of knowledge. Knowledge here is a systematized set of objective facts, methods and technologies that gives a true picture of objects, processes and phenomena, i.e. specially structured information.

Knowledge can be declarative or procedural. In the first case it is information understood by a person; in the second it is the ability to solve certain problems, i.e. possession of algorithms for specific actions.

It is almost impossible to draw a clear line between information, data and knowledge.

Data and knowledge circulate as information flows in information systems. The collection, accumulation, processing, storage and use of information in such systems are carried out by means of information technologies: engineering methods of information processing implemented through automated information systems.
