
Is it true that the global network is called a web? What is the World Wide Web

The Internet occupies an ever larger place in our lives. No other human-made technology has gained such widespread popularity. The Internet is a world wide web that covers the entire globe, enveloping it in a network of communication links. It began gaining popularity back in the relatively distant 1990s. In this article, we will discuss where it came from and why it became so popular.

Internet as World Wide Web

This second name did not appear without reason. The Internet unites many users around the world; like a spider's web, it envelops the entire globe with its threads. And this is not merely a metaphor: physically, the Internet consists of wires and wireless links, the second of which are invisible to us.

But that is a lyrical digression; in fact, the Internet is associated with the World Wide Web (WWW). It spans all computers connected to the Internet. On remote servers, users store the information they need and can also communicate over the network. This name is often rendered as the World Wide Web or the Global Network.

It is based on several particularly important protocols, such as TCP/IP. Through them, the World Wide Web (WWW) carries out its work, that is, transmits and receives data.

Number of users

At the end of 2015, a study was conducted that yielded the following data: there were 3.3 billion Internet users worldwide, almost 50% of the entire population of our planet.

Such high figures were achieved thanks to the spread of 3G cellular networks and high-speed 4G. Providers also played an important role: with the mass introduction of Internet technologies, the costs of maintaining servers and manufacturing fiber-optic cables decreased. In most European countries, Internet speeds are higher than in African countries, which is explained by the technical lag of the latter and the low demand for the service.

Why is the Internet called the World Wide Web?

Paradoxical as it may seem, many users are sure that the term above and the Internet are one and the same. This deep misconception in the minds of many users is caused by the similarity of the concepts. Let's figure out what's what.

The Internet is often confused with the similar phrase "World Wide Web," which denotes a certain body of information built on top of Internet technology.

History of the World Wide Web

By the end of the 1980s, the dominance of NSFNet over ARPANET technology was finally established in the world. Oddly enough, both developments came out of the same country's scientific community. ARPANET was developed by order of the US Department of Defense: yes, the first users of the Internet were the military. NSFNet technology, by contrast, was developed independently of the military, almost out of pure enthusiasm.

It was the competition between the two developments that became the basis for their further growth and mass introduction throughout the world. The World Wide Web became available to the general public in 1991. It had to work somehow, and Tim Berners-Lee took up the development of a system for the Internet. After two years of successful work, he created the hypertext transfer protocol HTTP, the famous markup language HTML, and the URL. We need not go into details, since today we see them as ordinary links and website addresses.

Information space

First of all, it is an information space accessed through the Internet. It gives the user access to data stored on servers. To use a visual image: if the Internet is a three-dimensional cylinder, the World Wide Web is what fills it.

Through a program called a browser, the user gains access to the Internet in order to surf the Web. The Web consists of a countless number of sites hosted on servers, which are connected to computers and are responsible for storing, serving, and displaying data.

Spider webs and modern man

Currently, Homo sapiens in developed countries is almost completely integrated with the World Wide Web. We are not talking about our grandparents, or about remote villages where people do not even know the Internet exists.

Previously, a person in search of information went straight to the library. Often the needed book was not there, and one had to visit other institutions with archives. Now there is no need for such efforts.

In biology, subspecies names consist of three words, as in our full name, Homo sapiens neanderthalensis. Now we could safely add a fourth word: internetiys.

The Internet is capturing the minds of humanity

Agree, we now get almost all our information from the Internet: we have tons of information at our fingertips. Tell our ancestor about this, and he would stare eagerly at the monitor, spending all his free time in search of information.

It is the Internet that has brought humanity to a fundamentally new level; it contributes to the creation of a new culture: mixed, or multicultural. Representatives of different nations imitate and adapt to one another, as if merging their customs in one cauldron. What will the final product be?

It is especially useful for scientists: there is no longer any need to gather for meetings in a country located a thousand kilometers from yours. You can exchange experience without a personal meeting, for example through instant messengers or social media, and if an important issue needs to be discussed, this can be done via Skype.

Conclusion

The World Wide Web is a component of the Internet. Its operation is ensured thanks to storage servers, which provide information to the user upon request. The Network itself was developed thanks to scientists from the USA and their enthusiasm.

The Internet is a communication system and at the same time an information system, a medium for people to communicate. There are currently many definitions of this concept. In our opinion, one definition of the Internet that most fully characterizes the information interaction of the planet's population is: "The Internet is a complex transport and information system of mushroom-shaped (dipole) structures, in which the cap of each dipole is the brain of a person sitting at a computer, together with the computer itself, which is, as it were, an artificial extension of the brain, and the legs are, for example, the telephone network connecting the computers, or the ether through which radio waves are transmitted."

The advent of the Internet gave impetus to the development of new information technologies, leading not only to changes in the consciousness of people, but also the world as a whole. However, the worldwide computer network was not the first discovery of its kind. Today, the Internet is developing in the same way as its predecessors - telegraph, telephone and radio. However, unlike them, it combined their advantages - it became not only useful for communication between people, but also a publicly accessible means for receiving and exchanging information. It should be added that the capabilities of not only stationary, but also mobile television have already begun to be fully used on the Internet.

The history of the Internet begins around the 1960s.

The first documented description of the social interaction made possible by networking was a series of memos written by J. C. R. Licklider, discussing his concept of a "Galactic Network." The author envisioned a global network of interconnected computers through which everyone could quickly access data and programs located on any computer. In spirit, this concept is very close to the current state of the Internet.

Leonard Kleinrock published the first paper on packet-switching theory in July 1961. In it, he presented the advantages of his theory over the existing principle of data transmission, circuit switching. What is the difference between these concepts? With packet switching, no dedicated physical connection is held between the two end devices (computers). Instead, the data to be transmitted is divided into parts, and each part is given a header containing all the information needed to deliver the packet to its destination. With circuit switching, the two computers are physically connected to each other for the duration of the transmission, and the entire volume of information is transferred over this connection, which is maintained until the transfer ends, just as in analog systems based on circuit switching. In that case, the utilization of the communication channel is minimal.
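The packet-switching idea described above can be sketched in a few lines of Python. This is a toy illustration only, not any real protocol: the data is cut into parts, each part gets a header with a destination address and a sequence number, and the receiver reassembles the original even if the packets arrive out of order. All names and the example address are made up for the sketch.

```python
import random

def packetize(data: bytes, destination: str, size: int = 4):
    """Split data into packets, each carrying its own 'address label'."""
    return [
        {"dst": destination, "seq": i, "payload": data[i:i + size]}
        for i in range(0, len(data), size)
    ]

def reassemble(packets):
    """Order packets by sequence number and join their payloads."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return b"".join(p["payload"] for p in ordered)

packets = packetize(b"HELLO, WORLD", "192.0.2.7")  # example address
random.shuffle(packets)          # packets may arrive in any order...
message = reassemble(packets)    # ...but the original is restored
print(message)                   # b'HELLO, WORLD'
```

With circuit switching, by contrast, the whole channel would be reserved for this one transfer; here, the channel is free between packets to carry other subscribers' traffic.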

To test the packet-switching concept, Lawrence Roberts and Thomas Merrill connected a TX-2 computer in Massachusetts to a Q-32 computer in California using low-speed dial-up telephone lines in 1965. Thus the first ever (albeit small) non-local computer network was created. The experiment showed that time-sharing computers could successfully work together, executing programs and retrieving data on a remote machine. It also became clear that the circuit-switched telephone system was completely unsuitable for building a computer network.

In 1969, the American agency ARPA (Advanced Research Projects Agency) began research on creating an experimental packet-switching network. This network was created and named ARPANET, i.e., the network of the advanced research projects agency. A sketch of the ARPANET, consisting of four nodes (the embryo of the Internet), is shown in Fig. 6.1.

At this early stage, research was carried out both on the network infrastructure and on network applications. At the same time, work was underway on a functionally complete protocol for computer-to-computer interaction and other network software.

In December 1970, the Network Working Group (NWG), led by S. Crocker, completed work on the first version of the protocol, called the Network Control Protocol (NCP). After work was completed to implement NCP on ARPANET nodes in 1971–1972, network users were finally able to begin developing applications.

In 1972, the first application appeared - email.

In March 1972, Ray Tomlinson wrote basic programs for sending and reading electronic messages. In July of the same year, Roberts added to these programs the ability to display a list of messages, selective reading, saving to a file, forwarding, and preparing a response.

Since then, email has become the largest network application. For its time, e-mail became what the World Wide Web is today: an extremely powerful catalyst for the growth of all types of interpersonal data exchange.

In 1974, the Internet Network Working Group (INWG) introduced a universal protocol for data transmission and network interconnection, TCP/IP. This is the protocol used on the modern Internet.
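What "a universal protocol for data transmission" means in practice can be sketched with Python's standard socket module: two programs exchange bytes over TCP/IP on a single machine via the loopback address 127.0.0.1. The uppercasing "service" is just an illustrative stand-in for a real application protocol.

```python
import socket
import threading

# Server side: listen on a free port chosen by the OS (port 0).
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)

def serve_once():
    conn, _addr = server.accept()
    data = conn.recv(1024)
    conn.sendall(data.upper())   # a trivial "service": echo in uppercase
    conn.close()

threading.Thread(target=serve_once).start()

# Client side: connect over TCP/IP, send a request, read the reply.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(server.getsockname())
client.sendall(b"hello over tcp/ip")
reply = client.recv(1024)
print(reply)  # b'HELLO OVER TCP/IP'
client.close()
server.close()
```

The same socket calls work unchanged whether the other computer is in the next room or on another continent; that indifference to the underlying networks is exactly the universality the text describes.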

However, the ARPANET switched from NCP to TCP/IP only on January 1, 1983. This was a "flag day" transition, requiring simultaneous changes on all computers. It had been carefully planned by all parties involved over the previous several years and went surprisingly smoothly (it did, however, lead to the proliferation of "I Survived the TCP/IP Migration" badges). The 1983 transition from NCP to TCP/IP allowed the network to be divided into MILNET, the military network proper, and ARPANET, which was used for research purposes.

In the same year, another important event occurred: Paul Mockapetris developed the Domain Name System (DNS). This system provides a scalable, distributed mechanism for mapping hierarchical computer names (e.g., www.acm.org) to Internet addresses.

Also in 1983, a Domain Name Server was created at the University of Wisconsin. Such a server automatically, and transparently to the user, translates the textual name of a site into an IP address.
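This name-to-address translation can be observed on any modern system; in Python, the standard socket module asks the system's resolver. A real lookup of a public name such as www.acm.org needs network access, so the sketch below uses the name localhost, which resolves locally on virtually every machine.

```python
import socket

# Translate a host name into an IPv4 address, as DNS resolution does.
# "localhost" resolves without any network access.
ip = socket.gethostbyname("localhost")
print(ip)  # 127.0.0.1
```

The caller never sees the distributed machinery behind the answer, which is precisely the "transparently to the user" point made above.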

As the Internet spread beyond the United States, top-level country codes such as ru, uk, and ua appeared.

In 1985, the National Science Foundation (NSF) took part in the creation of its own network, NSFNet, which was soon connected to the Internet. Initially, NSFNet linked 5 supercomputer centers, fewer nodes than ARPANET had, and the data transmission speed in its communication channels did not exceed 56 kbit/s. Nevertheless, the creation of NSFNet was a significant contribution to the development of the Internet, as it allowed a new look at how the Internet could be used. The Foundation set the goal that every scientist and every engineer in the United States would be "connected" to a single network, and therefore began to create a network with faster channels that would unite numerous regional and local networks.

Based on ARPANET technology, the NSFNET network (the National Science Foundation NETwork) was created in 1986, with NASA and the Department of Energy directly involved in its creation. Six large research centers, equipped with the latest supercomputers and located in different regions of the United States, were connected. The main purpose of the network was to provide US research centers with access to supercomputers over an interregional backbone. The network operated at a base speed of 56 kbit/s. When creating the network, it became obvious that it was not worth even trying to connect every university and research organization directly to the centers, since laying that amount of cable was not only very expensive but practically impossible. It was therefore decided to build networks on a regional basis: in every part of the country, the institutions concerned connected with their nearest neighbors; the resulting chains were connected to the supercomputer centers through one of their nodes, and the supercomputer centers were thereby connected together. With this design, any computer could communicate with any other by passing messages through its neighbors.

One of the problems at that time was that the early networks (including ARPANET) were built purposely in the interests of a narrow circle of organizations and were to be used by a closed community of specialists; as a rule, the operation of a network was limited to that community. There was no particular need for interoperability between networks, and accordingly there was none. At the same time, alternative technologies appeared, such as XNS from Xerox, DECNet, and SNA from IBM. Therefore, under the auspices of DARPA and NSF, specialists from the Internet Engineering and Architecture Task Forces, together with members of the NSF Network Technical Advisory Group, developed "Requirements for Internet Gateways." These requirements formally guaranteed interoperability between the parts of the Internet administered by DARPA and by NSF. In addition to choosing TCP/IP as the basis for NSFNet, US federal agencies adopted and implemented a number of additional principles and rules that shaped the modern face of the Internet. Most importantly, NSFNET had a policy of "universal and equal access to the Internet": for an American university to receive NSF funding for an Internet connection, it, as the NSFNet program states, "must make that connection available to all qualified users on campus."

At first NSFNET worked quite successfully, but the time came when it could no longer cope with the growing demands. The network created for supercomputer access also let connected organizations exchange a great deal of data unrelated to supercomputers. Network users in research centers, universities, schools, and elsewhere realized that they now had access to a wealth of information and direct access to their colleagues. The flow of messages on the network grew faster and faster until it finally overloaded the computers that managed the network and the telephone lines connecting them.

In 1987, NSF awarded Merit Network Inc. a contract under which Merit, with the participation of IBM and MCI, was to manage the NSFNET core network, move it to higher-speed T-1 channels, and continue its development. The growing core network already united more than 10 nodes.

In 1990, the concepts of ARPANET, NFSNET, MILNET, etc. finally left the scene, giving way to the concept of the Internet.

The scale of the NSFNET network, combined with the quality of its protocols, meant that by 1990, when the ARPANET was finally dismantled, the TCP/IP family had supplanted or significantly displaced most other wide-area computer network protocols around the world, and IP was confidently becoming the dominant data transport of the global information infrastructure.

In 1990, the European Organization for Nuclear Research (CERN, in Geneva, Switzerland) established the largest Internet site in Europe and provided Internet access to the Old World. To help promote and facilitate the concept of distributed computing over the Internet, Tim Berners-Lee developed at CERN the technology of hypertext documents, the World Wide Web (WWW), allowing users to access any information located on computers connected to the Internet around the world.

At the core of WWW technology lie the specifications of the URL (Uniform Resource Locator), the HTTP protocol (HyperText Transfer Protocol), and the HTML language itself (HyperText Markup Language). Text can be marked up in HTML using any text editor. A page marked up in HTML is often called a Web page. To view a Web page, a client application, a Web browser, is used.
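How the URL ties these pieces together is easy to see with Python's standard urllib.parse: one string names the protocol to speak, the computer to contact, and the resource to request. The address below uses the historical CERN host mentioned later in the text; the path is illustrative.

```python
from urllib.parse import urlsplit

# Split a URL into its parts.
parts = urlsplit("http://info.cern.ch/hypertext/WWW/TheProject.html")
print(parts.scheme)  # 'http'          -> which protocol to speak
print(parts.netloc)  # 'info.cern.ch'  -> which computer (found via DNS)
print(parts.path)    # '/hypertext/WWW/TheProject.html' -> which resource
```

A browser performs exactly this decomposition before it resolves the host name and sends an HTTP request for the path.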

In 1994, the W3 Consortium was formed, bringing together scientists from different universities and companies (including Netscape and Microsoft). Since then, this committee has dealt with the standards of the Internet world. Its first step was the development of the HTML 2.0 specification, which added the ability to transfer information from the user's computer to the server using forms. The next step was the HTML 3 project, work on which began in 1995 and which first introduced CSS (Cascading Style Sheets). CSS allows text to be formatted without disrupting logical and structural markup. The HTML 3 standard was never approved; instead, HTML 3.2 was created and adopted in January 1997. Already in December 1997, the W3C adopted the HTML 4.0 standard, which distinguishes between logical and visual tags.

By 1995, the growth of the Internet showed that regulation of connectivity and funding issues could not be in the hands of NSF alone. In 1995, payments for connecting numerous private networks to the national backbone were transferred to regional networks.

The Internet has grown far beyond what it was envisioned and designed to be; it has outgrown the agencies and organizations that created it; they can no longer play a dominant role in its growth. Today it is a powerful worldwide communication network based on distributed switching elements - hubs and communication channels. Since 1983, the Internet has grown exponentially, and hardly a single detail has survived from those times - the Internet still operates based on the TCP/IP protocol suite.

If the term "Internet" was originally used to describe a network built on the Internet Protocol (IP), the word has now acquired a global meaning and is only sometimes used as a name for a set of interconnected networks. Strictly speaking, the Internet is any set of physically separate networks interconnected by the single IP protocol, which allows us to speak of them as one logical network. The rapid growth of the Internet spurred interest in the TCP/IP protocols, and as a result specialists and companies appeared who found a number of other applications for them. TCP/IP began to be used to build local area networks (LAN, Local Area Network) even when no connection to the Internet was planned. In addition, TCP/IP began to be used in corporate networks that adopted Internet technologies, including the WWW (World Wide Web), to establish an effective exchange of internal corporate information. These corporate networks are called "intranets" and may or may not be connected to the Internet.

Tim Berners-Lee, the author of the HTTP, URI/URL, and HTML technologies, is considered the inventor of the World Wide Web. In 1980, for his own use, he wrote the Enquire program, which used random associations to store data and laid the conceptual foundation for the World Wide Web. In 1989, Tim Berners-Lee proposed the global hypertext project now known as the World Wide Web. The project called for the publication of hypertext documents interconnected by hyperlinks, which would facilitate the search for and consolidation of information by scientists. To implement the project, he invented the URI, the HTTP protocol, and the HTML language: technologies without which it is no longer possible to imagine the modern Internet. Between 1991 and 1993, Berners-Lee refined the technical specifications of these standards and published them. He wrote the world's first web server, "httpd", and the world's first hypertext web browser, called "WorldWideWeb". The browser was also a WYSIWYG editor (short for What You See Is What You Get); its development began in October 1990 and was completed in December of the same year. The program ran in the NeXTStep environment and began to spread across the Internet in the summer of 1991. Berners-Lee created the world's first web site at http://info.cern.ch/ (the site is now archived); it went online on August 6, 1991. The site described what the World Wide Web was, how to set up a web server, how to use a browser, and so on. It was also the world's first Internet directory, because Tim Berners-Lee later posted and maintained there a list of links to other sites.

Since 1994, the main work on the development of the World Wide Web has been taken over by the World Wide Web Consortium (W3C), founded by Tim Berners-Lee. This Consortium is an organization that develops and implements technological standards for the Internet and the World Wide Web. The W3C's mission is to "Unleash the full potential of the World Wide Web by establishing protocols and principles to ensure the long-term development of the Web." Two other major goals of the Consortium are to ensure complete “internationalization of the Network” and to make the Network accessible to people with disabilities.

The W3C develops uniform principles and standards for the Internet (called "Recommendations", English: W3C Recommendations), which are then implemented by software and hardware manufacturers. This ensures compatibility between software and equipment from different companies, making the World Wide Web more universal and convenient. All World Wide Web Consortium Recommendations are open, that is, not protected by patents, and can be implemented by anyone without any financial contribution to the consortium.

Currently, the World Wide Web is formed by millions of web servers located around the world. A web server is a program that runs on a computer connected to a network and uses the HTTP protocol to transfer data. In its simplest form, such a program receives an HTTP request for a specific resource over the network, finds the corresponding file on the local hard drive, and sends it over the network to the requesting computer. More complex web servers are capable of generating resources dynamically in response to an HTTP request. To identify resources (often files or parts of files) on the World Wide Web, Uniform Resource Identifiers (URIs) are used. To determine the location of resources on the network, Uniform Resource Locators (URLs) are used. A URL combines URI identification with the Domain Name System (DNS): a domain name (or an IP address in numeric notation) forms the part of the URL that designates the computer (more precisely, one of its network interfaces) on which the desired web server runs.
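The request-file-response cycle described in this paragraph can be sketched with Python's standard http.server module. Everything here (the temporary directory, the page content, the loopback address) is illustrative; a real web server would serve a configured document root to clients across the network.

```python
import http.client
import http.server
import os
import tempfile
import threading

# A document root with one "web page" on the local disk.
docroot = tempfile.mkdtemp()
with open(os.path.join(docroot, "index.html"), "w") as f:
    f.write("<html><body>Hello, Web</body></html>")

# The server: receives an HTTP request for a path, returns the file.
def make_handler(*args, **kwargs):
    return http.server.SimpleHTTPRequestHandler(
        *args, directory=docroot, **kwargs)

server = http.server.HTTPServer(("127.0.0.1", 0), make_handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client (in essence, a browser) requests the resource over HTTP.
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/index.html")
response = conn.getresponse()
status = response.status            # 200 means "OK, here is the file"
body = response.read().decode()
conn.close()
server.shutdown()
print(status, body)
```

The "more complex" dynamic servers mentioned above differ only in the middle step: instead of reading a file from disk, they compute the response body on the fly.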

To view information received from a web server, a special program, a web browser, is used on the client computer. The main function of a web browser is to display hypertext. The World Wide Web is inextricably linked with the concepts of hypertext and hyperlinks: most of the information on the Web is hypertext. To facilitate the creation, storage, and display of hypertext on the World Wide Web, HTML (HyperText Markup Language), a hypertext markup language, is traditionally used. The work of marking up hypertext is called layout; markup masters are called webmasters. After HTML markup, the resulting hypertext is placed in a file; such an HTML file is the most common resource on the World Wide Web. Once an HTML file is made available through a web server, it is called a "web page," and a collection of web pages makes up a web site. Hyperlinks are added to the hypertext of web pages; they help users navigate easily between resources (files), regardless of whether the resources are located on the local computer or on a remote server. Web hyperlinks are based on URL technology.
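In an HTML file, the hyperlinks described here are simply <a> tags with an href attribute. A small sketch using Python's standard html.parser collects them from a page; the page text below is made up for the example, with the first address being the historical CERN site mentioned earlier.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href targets of all <a> hyperlink tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs
                              if name == "href")

page = ('<html><body>'
        '<a href="http://info.cern.ch/">CERN</a> '
        '<a href="/about.html">About</a>'
        '</body></html>')

collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['http://info.cern.ch/', '/about.html']
```

Note that the second link has no host part: a browser resolves such relative links against the URL of the page itself, so the same markup works on a local disk or a remote server, exactly as the paragraph states.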

In general, we can conclude that the World Wide Web rests on "three pillars": HTTP, HTML, and URL. Lately, though, HTML has begun to lose ground somewhat to more modern markup technologies: XHTML and XML. XML (eXtensible Markup Language) is positioned as a foundation for other markup languages. To improve the visual presentation of the Web, CSS technology has become widely used, allowing uniform design styles to be specified for many web pages. Another innovation worth noting is the URN (Uniform Resource Name) naming system.

A popular concept for the development of the World Wide Web is the Semantic Web, an add-on to the existing Web designed to make information posted on the network more understandable to computers. In this concept, every resource written in human language would also be provided with a description a computer can understand. The Semantic Web offers access to clearly structured information to any application, regardless of platform or programming language. Programs will be able to find the resources they need, process information, classify data, identify logical connections, draw conclusions, and even make decisions based on those conclusions. If widely adopted and implemented wisely, the Semantic Web has the potential to spark a revolution on the Internet. To create a machine-readable description of a resource on the Semantic Web, the RDF (Resource Description Framework) format is used, which is based on XML syntax and uses URIs to identify resources. Newer developments in this area are RDF Schema (RDFS) and SPARQL (SPARQL Protocol and RDF Query Language, pronounced "sparkle"), a query language for fast access to RDF data.

Currently, there are two trends in the development of the World Wide Web: the semantic web and the social web. The Semantic Web involves improving the connectivity and relevance of information on the World Wide Web through the introduction of new metadata formats. The Social Web relies on the work of organizing the information available on the Web, carried out by the Web users themselves. Within the second direction, developments that are part of the Semantic Web are actively used as tools (RSS and other web feed formats, OPML, XHTML microformats).

Internet telephony has become one of the most modern and economical forms of communication. Its birthday can be considered February 15, 1995, when VocalTec released its first softphone, a program for exchanging voice over an IP network. Microsoft then released the first version of NetMeeting in October 1996, and by 1997 Internet connections between two ordinary telephone subscribers in completely different parts of the planet had become quite common.

Why are regular long-distance and international telephone calls so expensive? Because during a conversation the subscriber occupies an entire communication channel, not only while speaking or listening to the other party, but also while silent or distracted from the conversation. This is what happens when voice is transmitted by the usual analog telephone method.

With the digital method, information can be transmitted not continuously but in separate "packets," so information from many subscribers can be sent simultaneously over one communication channel. This principle of packet transmission is similar to transporting many letters with different addresses in one mail car; after all, no one drives a separate mail car for each letter! This packing of many streams into one channel makes it possible to use existing communication channels much more efficiently and to "compress" them. At one end of the channel, information is divided into packets, each of which, like a letter, carries its own individual address. Packets from many subscribers travel over the channel interspersed. At the other end, packets with the same address are reassembled and sent to their destination. This packet principle is widely used on the Internet.

With a personal computer, a sound card, and a compatible microphone and headphones (or speakers), a subscriber can use Internet telephony to call any subscriber who has an ordinary landline telephone. During such a conversation, he pays only for using the Internet. Before using Internet telephony, the subscriber must install a special program on the computer.

You do not even need a personal computer to use Internet telephony services; an ordinary telephone with tone dialing is enough. With tone dialing, each digit goes into the line not as a series of electrical impulses, as with a rotary dial, but as a combination of tones of different frequencies. This tone mode is found in most modern telephones. To use Internet telephony from a telephone, you buy a calling card and dial a powerful central server at the number indicated on the card. The server's voice menu (in Russian or English, as you choose) then prompts you to enter the card's serial number and key using the telephone buttons, followed by the country code and the number of your future interlocutor. The server converts the analog signal into digital form and sends it to a server in another city, which converts the digital signal back into analog form and delivers it to the desired subscriber. The interlocutors talk as if over a regular telephone, although a slight delay (a fraction of a second) in the response is sometimes noticeable. Recall that, to save communication channels, voice information is transmitted in packets of digital data: your voice is divided into segments, or packets, carried using the Internet Protocol (IP).

In 2003, the Skype program was created (www.skype.com); it is completely free and requires virtually no knowledge from the user either to install or to use. It allows you to talk, with video, to interlocutors at their computers in different parts of the world. For the interlocutors to see each other, each computer must be equipped with a web camera.

Humanity has come a long way in the development of communications: from signal fires and drums to the cellular mobile phone, which allows two people located anywhere on our planet to communicate almost instantly. Despite the distance between them, the subscribers keep a feeling of personal communication.

♦ the Internet as a world community of networks;
♦ what the World Wide Web is;
♦ Web server, Web page, Web site;
♦ the hyperstructure of the WWW;
♦ the browser, a WWW client program; the problem of finding information on the Internet.

The Internet is a global community of networks

Would you like to look into the residence of the US President, the White House, or visit the Louvre, the largest art museum in the world, or find out what the weather is like in Antarctica, or learn what performances are on tonight in Moscow theaters? All this and much more can be done without leaving the desk where your personal computer stands, connected to the worldwide Internet.

The Internet unites thousands of local, industry, and regional computer networks around the world. An individual user who is not a subscriber to any of the listed networks can also connect to the Internet through the nearest hub.

All of the network services discussed above (email, teleconferences, file archives, etc.) also work on the Internet. The only difficulty that may arise is the language of communication: the language of international communication on the global network is English. Here is one more incentive to study English diligently!

What is the World Wide Web

The most interesting service provided to Internet users since 1993 has been the ability to work with the information system World Wide Web (abbreviated WWW). This phrase can be translated as "world wide web." It was work with the WWW that was meant when, at the beginning of this section, you were offered all sorts of information miracles.

It is very difficult to give an exact definition of what the WWW is. This system can be compared to a huge encyclopedia, the pages of which are scattered across computer servers connected by the Internet. To get the right information, the user must get to the corresponding encyclopedia page. Perhaps with this analogy in mind, the creators of the WWW introduced the concept of a Web page.

Web server, Web page, Web site

The Web page is the basic information unit of the WWW. It is a separate document stored on a Web server. A page has a name (similar to a page number in an encyclopedia) by which it can be accessed.

The information on a Web page can be of very different kinds: text, drawings, photographs, multimedia. Web pages carry advertising, reference information, scientific articles, the latest news, illustrated publications, art catalogs, weather forecasts and much, much more. To put it simply, Web pages have "everything."

A number of Web pages can be thematically related and form a Web site. Each site has a main page, called the home page. This is a kind of title page from which you can begin viewing the documents stored on the server. Typically, the home page contains a table of contents, the names of the sections. To go to a section, just move the mouse pointer to the section name and click the mouse button.

WWW hyperstructure

However, it is not at all necessary to view Web pages in a row, flipping through them, as in a book. The most important property of the WWW is the hypertext organization of connections between Web pages. Moreover, these links operate not only between pages on the same server, but also between different WWW servers.

Typically, keywords carrying hyperlinks are highlighted or underlined on a Web page. By clicking on such a word, you follow the link hidden behind it and move on to another document. This document may be located on another server, in another country, on another continent. Most often, the Internet user has no idea where the server he is currently communicating with is located. Figuratively speaking, in one session you can "fly" around the globe several times.
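How a browser-like program discovers the hyperlinks hidden in a page can be sketched with Python's standard html.parser module. The sample HTML fragment below is invented for the example.

```python
# Collect the targets of all <a href="..."> hyperlinks in an HTML page.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":                       # <a href="..."> marks a hyperlink
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page = ('<p>See the <a href="http://info.cern.ch/">first site</a> and '
        '<a href="/wiki/Main_page">Wikipedia</a>.</p>')
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['http://info.cern.ch/', '/wiki/Main_page']
```

Note that one link points to another server entirely while the other is local to the same site; for the reader both look the same, which is exactly the property described above.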

The role of a key for communication can be played not only by text, but also by a drawing, a photograph, or a pointer to a sound document. In this case, instead of the term “hypertext” the term “hypermedia” is used.

You can reach the same Web page in many different ways. The analogy with the pages of a book no longer works here. In a book, the pages have a certain sequence. Web pages do not have such a sequence. The transition from one page to another occurs through hyperlinks, forming a network that resembles a web. This is where the name of the system comes from.

Summarizing the above, we can give the following definition:

The World Wide Web is a hyperconnected information system distributed throughout the world, existing on the technical basis of the global Internet.

Browser is a WWW client program. The problem of searching for information on the Internet

A special program called a Web browser (from the English "browse": to inspect, to look through) helps the user navigate the "web." Using a browser, you can find the information you need in different ways. The shortest way is by the Web page's address. You type this address on the keyboard, press the Enter key, and you are taken straight to the destination.

Another way is searching. You can start from your home page and move along hyperlinks. Along the way there is a danger of taking a wrong turn, getting entangled in the "web," and ending up in a dead end. However, the browser allows you to go back any number of steps and continue the search along a different route. Such a search is similar to wandering through an unfamiliar forest (though less dangerous).

Special search programs are good assistants in navigating the WWW. They "know" everything, or almost everything, about the WWW. You only need to give such a program a set of keywords on the topic that interests you, and it will return a list of links to suitable Web documents. If the list turns out to be too long, add a few clarifying terms.
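The core of such a search program can be modeled as a simple keyword index. The documents and URLs below are invented for the example; real search engines are vastly more sophisticated, but the narrowing effect of adding clarifying terms works the same way.

```python
# Toy search engine: map each keyword to the set of documents containing it,
# then intersect those sets for a multi-word query.
from collections import defaultdict

docs = {
    "http://example.org/weather": "weather forecast for antarctica today",
    "http://example.org/louvre":  "louvre museum art collection paris",
    "http://example.org/news":    "moscow theatre news and weather",
}

index = defaultdict(set)
for url, text in docs.items():
    for word in text.split():
        index[word].add(url)

def search(*keywords):
    """Return documents containing every keyword; extra terms narrow the list."""
    results = [index[w] for w in keywords]
    return sorted(set.intersection(*results)) if results else []

print(search("weather"))            # two pages mention the weather
print(search("weather", "moscow"))  # a clarifying term shortens the list
```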

During Internet sessions, an Internet user appears to be immersed in an information space with unlimited resources. Recently, the term “cyberspace” has become widespread, which refers to the entirety of the world’s telecommunication systems and the information circulating in them.

The WWW system is developing very quickly. Already, all its resources are difficult to review. Thick reference books and catalogs are published that become outdated faster than phone books. Therefore, simultaneously with the increase in the volume of information, the search system on the World Wide Web is being improved.

Briefly about the main thing

The Internet is a worldwide global computer network.

World Wide Web - World Wide Web: a hyperconnected information system distributed throughout the world, existing on the technical basis of the global Internet.

A web page is a separate WWW document.

A Web server is a computer on the Internet that stores Web pages and the software for working with them.

A Web site is a collection of thematically related Web pages.

Hypermedia is a system of hyperlinks between multimedia documents.

A web browser is a client program for the user to work with the WWW.

Searching for the desired document on the WWW can occur: by specifying its address; by moving through a “web” of hyperconnections; by using search programs.

Cyberspace is a set of world telecommunications systems and information circulating in them.

Questions and tasks

1. What is the Internet?
2. How is the phrase “World Wide Web” translated?
3. What is WWW?
4. What information can be obtained from the WWW?
5. How is the connection between Web pages organized?
6. What is the analogy between the WWW and the web?
7. What is hypermedia?
8. What is a Web server?
9. By what methods can you find the desired page on the WWW?

I. Semakin, L. Zalogova, S. Rusakov, L. Shestakova, Computer Science, 9th grade
Submitted by readers from Internet sites


Initially, the Internet was a computer network for transmitting information, developed on the initiative of the US Department of Defense. The impetus was the first artificial Earth satellite, launched by the Soviet Union in 1957: the US military concluded that it needed an ultra-reliable communication system. The resulting network, ARPANET, did not remain secret for long and soon began to be actively used by various branches of science.

The first successful remote communication session was conducted in 1969 between Los Angeles and Stanford. In 1971, a program for sending email over the network was developed and instantly became popular. The first foreign organizations to connect were in the UK and Norway; with the laying of a transatlantic telephone cable to these countries, ARPANET became an international network.

ARPANET was perhaps the most advanced communication system, but it was not the only one. Only by 1983, when the American network had filled with its first newsgroups and bulletin boards and switched to the TCP/IP protocol, which made it possible to interconnect with other computer networks, did ARPANET become the Internet. Literally a year later, this title began gradually to pass to NSFNet, an inter-university network with greater throughput that gained 10 thousand connected computers within a year. The first Internet chat appeared in 1988, and in 1989 Tim Berners-Lee proposed the concept of the World Wide Web.

World Wide Web

In 1990, ARPANET finally lost out to NSFNet. It is worth noting that both were developed by the same scientific organizations, only the first was commissioned by the US defense services, while the second arose on their own initiative. This competition, however, led to the scientific developments and discoveries that made the World Wide Web a reality; it became publicly available in 1991. Over the next two years, Berners-Lee, who had proposed the concept, developed the HTTP protocol, the HTML language and URL identifiers, familiar to ordinary users as the Internet addresses of sites and pages.

The World Wide Web is a system that provides access to files on server computers connected to the Internet. This is partly why the concepts of the Web and the Internet are often used interchangeably today. In fact, the Internet is a communication technology, a kind of information space, and the World Wide Web fills it. This web consists of many millions of web servers: computers and systems of computers responsible for the operation of websites and pages. To access web resources (download or view them) from an ordinary computer, a browser program is used. Web and WWW are synonyms for the World Wide Web, whose users number in the billions.

When talking about the Internet, they often mean the World Wide Web. However, it is important to understand that these are not the same thing.

Structure and principles

The World Wide Web is made up of millions of Internet web servers located around the world. A web server is a computer program that runs on a computer connected to a network and uses the HTTP protocol to transfer data. In its simplest form, such a program receives an HTTP request for a specific resource over the network, finds the corresponding file on the local hard drive and sends it over the network to the requesting computer. More sophisticated web servers are capable of dynamically generating documents in response to an HTTP request using templates and scripts.
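The request-handling cycle described above can be sketched as a single function. This is a deliberately simplified model: a real web server reads requests from network sockets and files from a disk, whereas here the "disk" is a plain dictionary and the response format is a minimal HTTP/1.0 sketch.

```python
# Simplified core of a web server: parse the request line, look up the
# requested file, and build an HTTP response.

disk = {
    "/index.html": "<html><body>Home page</body></html>",
}

def handle_request(request_line: str) -> str:
    method, path, _version = request_line.split()
    if method != "GET":
        return "HTTP/1.0 501 Not Implemented\r\n\r\n"
    body = disk.get(path)
    if body is None:
        return "HTTP/1.0 404 Not Found\r\n\r\n"
    return ("HTTP/1.0 200 OK\r\n"
            f"Content-Length: {len(body)}\r\n\r\n" + body)

print(handle_request("GET /index.html HTTP/1.1").splitlines()[0])    # HTTP/1.0 200 OK
print(handle_request("GET /missing.html HTTP/1.1").splitlines()[0])  # HTTP/1.0 404 Not Found
```

The "more sophisticated" servers mentioned above differ mainly in the middle step: instead of looking up a stored file, they run a template or script to generate the body on the fly.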

To view the information received from a web server, a special program, a web browser, is used on the client computer. The main function of a web browser is to display hypertext. The World Wide Web is inextricably linked with the concepts of hypertext and hyperlinks; most of the information on the Web is hypertext.

HTML (HyperText Markup Language) is traditionally used to create, store and display hypertext on the World Wide Web. The work of marking up hypertext documents is called layout; it is done by a webmaster or by a separate markup specialist, a layout designer. After HTML markup, the resulting document is saved to a file, and such HTML files are the main type of resource on the World Wide Web. Once an HTML file is made available to a web server, it is called a "web page." A collection of web pages makes up a website.

The hypertext of web pages contains hyperlinks. Hyperlinks help World Wide Web users easily navigate between resources (files), regardless of whether the resources are located on the local computer or on a remote server. Uniform Resource Locators (URLs) are used to determine the location of resources on the World Wide Web. For example, the full URL of the main page of the Russian section of Wikipedia looks like this: http://ru.wikipedia.org/wiki/Main_page. Such URLs combine URI identification technology (Uniform Resource Identifier) and the DNS domain name system (Domain Name System). The domain name (in this case ru.wikipedia.org) in the URL designates the computer (more precisely, one of its network interfaces) that runs the code of the desired web server. The URL of the current page can usually be seen in the browser's address bar, although many modern browsers prefer to show only the domain name of the current site by default.
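Python's standard urllib.parse module can split the Wikipedia URL mentioned above into the parts described: the scheme (protocol), the domain name that DNS resolves to a server, and the path to the document on that server.

```python
# Decompose a URL into its components with the standard library.
from urllib.parse import urlparse

url = "http://ru.wikipedia.org/wiki/Main_page"
parts = urlparse(url)

print(parts.scheme)  # http              (protocol used to fetch the resource)
print(parts.netloc)  # ru.wikipedia.org  (domain name, resolved via DNS)
print(parts.path)    # /wiki/Main_page   (location of the document on the server)
```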

Technologies

To improve the visual presentation of the web, CSS technology has come into wide use; it allows uniform design styles to be set for many web pages at once. Another innovation worth noting is the URN (Uniform Resource Name) naming system for resources.

A popular concept for the development of the World Wide Web is the creation of the Semantic Web. The Semantic Web is an add-on to the existing World Wide Web, designed to make information posted on the network more comprehensible to computers. It is a concept of a network in which every resource in human language would be supplied with a description that a computer can understand. The Semantic Web opens access to clearly structured information for any application, regardless of platform and programming language. Programs will be able to find the necessary resources themselves, process information, classify data, identify logical connections, draw conclusions and even make decisions based on those conclusions. If widely adopted and implemented wisely, the Semantic Web has the potential to spark a revolution on the Internet. To create a computer-readable description of a resource, the Semantic Web uses the RDF (Resource Description Framework) format, which is based on XML syntax and uses URIs to identify resources. Newer developments in this area are RDFS (RDF Schema) and SPARQL (SPARQL Protocol and RDF Query Language, pronounced "sparkle"), a query language for fast access to RDF data.
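The core idea, resources described by machine-readable statements that programs can query, can be illustrated with a toy triple store. The URIs and the query function below are invented for the example; real systems use RDF data and the SPARQL query language rather than this simplified pattern matching.

```python
# Toy "semantic" store: facts as (subject, predicate, object) triples,
# queried by pattern, where None matches anything.

triples = [
    ("http://example.org/TimBL", "invented", "http://example.org/WWW"),
    ("http://example.org/WWW",   "runsOn",   "http://example.org/Internet"),
]

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# What did TimBL invent? The program finds the answer without understanding
# the human-language text of any page.
print(query(subject="http://example.org/TimBL", predicate="invented"))
```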

History

Main article: History of the World Wide Web

Tim Berners-Lee and, to a lesser extent, Robert Cailliau are considered the inventors of the World Wide Web. Tim Berners-Lee is the originator of the HTTP, URI/URL and HTML technologies. In 1980 he worked at the European Council for Nuclear Research (French: Conseil européen pour la recherche nucléaire, CERN) as a software consultant. It was there, in Geneva, Switzerland, that for his own needs he wrote the Enquire program, which used random associations to store data and laid the conceptual foundation for the World Wide Web.

As part of the project, Berners-Lee wrote the world's first web server, called "httpd", and the world's first hypertext web browser, called "WorldWideWeb". This browser was also a WYSIWYG editor (short for "what you see is what you get"); its development began in October 1990 and was completed in December of the same year. The program ran in the NeXTStep environment and began to spread across the Internet in the summer of 1991.

Mike Sendall had bought a NeXT cube computer at that time in order to understand the features of its architecture, and then gave it to Tim [Berners-Lee]. Thanks to the sophistication of the NeXT cube's software system, Tim wrote a prototype illustrating the main concepts of the project in a few months. It was an impressive result: the prototype offered users, among other things, such advanced capabilities as WYSIWYG browsing and authoring!... During one of our joint discussions of the project in the CERN cafeteria, Tim and I tried to find a "catchy" name for the system being created. The only thing I insisted on was that the name should not, yet again, be taken from Greek mythology. Tim suggested "World Wide Web." I immediately liked everything about this name; only, it is hard to pronounce in French.

The world's first website was put online by Berners-Lee on August 6, 1991, on the first web server, available at http://info.cern.ch/. The resource defined the concept of the World Wide Web and contained instructions for installing a web server, using a browser, and so on. This site was also the world's first Internet directory, because Tim Berners-Lee later posted and maintained a list of links to other sites there.

The first photograph to appear on the World Wide Web was of the parody filk band Les Horribles Cernettes. Tim Berners-Lee asked the band's leader for scanned photographs after the CERN Hardronic Festival.

Still, the theoretical foundations of the Web were laid much earlier than Berners-Lee's work. Back in 1945, Vannevar Bush developed the concept of Memex, a mechanical aid for "extending human memory." Memex is a device in which a person stores all his books and records (and, ideally, all of his knowledge that can be formally described) and which supplies the needed information with sufficient speed and flexibility. It is an extension and supplement to human memory. Bush also predicted comprehensive indexing of text and multimedia resources with the ability to quickly find the necessary information. The next significant step toward the World Wide Web was the creation of hypertext (a term coined by Ted Nelson in 1965).

Since 1994, the main work on developing the World Wide Web has been undertaken by the World Wide Web Consortium (W3C), founded and still led by Tim Berners-Lee. This consortium is an organization that develops and implements technological standards for the Internet and the World Wide Web. The W3C's mission is: "To unleash the full potential of the World Wide Web by creating protocols and principles that ensure the long-term development of the Web." The consortium's two other major goals are to ensure the full "internationalization of the Web" and to make the Web accessible to people with disabilities.

The W3C develops uniform principles and standards for the Internet (called "recommendations," English: W3C recommendations), which are then implemented by software and hardware manufacturers. In this way, compatibility is achieved between software products and equipment from different companies, which makes the World Wide Web more capable, universal and convenient. All recommendations of the World Wide Web Consortium are open: they are not protected by patents and can be implemented by anyone without any financial contribution to the consortium.

Development prospects

Currently, there are two directions in the development of the World Wide Web: the semantic web and the social web.

  • The Semantic Web involves improving the coherence and relevance of information on the World Wide Web through the introduction of new metadata formats.
  • The social web relies on users to organize the information available on the network.

In the second direction, developments that form part of the semantic web are actively used as tools (RSS and other web-feed formats, OPML, XHTML microformats). The partially semantic sections of Wikipedia's category tree help users navigate the information space consciously; however, the very lax requirements for subcategories give no reason to expect these areas to expand. Attempts to compile atlases of knowledge may be of interest in this regard.

There is also a popular concept Web 2.0, which summarizes several directions of development of the World Wide Web.

Ways to actively display information

Information presented online may be available:

  • read-only (“passive”);
  • for reading and adding/changing (“active”).

Methods for actively displaying information on the World Wide Web include guest books, forums, blogs, wiki projects, and content management systems.

This division is rather arbitrary: a blog or a guest book, say, can be considered a special case of a forum, which in turn is a special case of a content management system. Usually the difference lies in the purpose, approach and positioning of the particular product.

Some information on websites can also be accessed by speech. In India, testing has already begun of a system that makes the text content of pages accessible even to people who cannot read or write.

Spread

Between 2005 and 2010, the number of web users doubled, reaching two billion. According to early research in 1999, most existing websites were not indexed correctly by search engines, and the web itself was larger than expected. As of 2001, more than 550 million web documents had been created, most of which, however, lay within the invisible web. As of 2002, more than 2 billion web pages had been created; 56.4% of all Internet content was in English, followed by German (7.7%), French (5.6%) and Japanese (4.9%). According to studies conducted at the end of January 2005, more than 11.5 billion web pages in 75 different languages had been identified and indexed on the open web, and by March 2009 the number of pages had grown to 25.21 billion. On July 25, 2008, Google software engineers Jesse Alpert and Nissan Hajaj announced that Google's search engine had detected more than a trillion unique URLs.

Notes

  1. "The Web as the 'next step' (NextStep) of the personal computing revolution."
  2. LHC: The first band on the web
  3. IBM developed voice Internet
  4. Ben-Itzhak, Yuval. Infosecurity 2008 – New defense strategy in battle against e-crime, ComputerWeekly, Reed Business Information (18 April 2008). Retrieved April 20, 2008.
  5. Christey, Steve and Martin, Robert A. Vulnerability Type Distributions in CVE (version 1.1). MITRE Corporation (22 May 2007). Retrieved June 7, 2008. Archived April 15, 2013.
  6. “Symantec Internet Security Threat Report: Trends for July–December 2007 (Executive Summary)” (PDF). XIII. Symantec Corp. April 2008: 1-2 . Retrieved 11 May 2008.
  7. Google searches web's dark side, BBC News (May 11, 2007). Retrieved April 26, 2008.
  8. Security Threat Report (PDF). Sophos (Q1 2008). Retrieved April 24, 2008. Archived April 15, 2013.
  9. Security Threat Report (PDF). Sophos (July 2008). Retrieved August 24, 2008. Archived April 15, 2013.
  10. Fogie, Seth, Jeremiah Grossman, Robert Hansen, and Anton Rager. Cross Site Scripting Attacks: XSS Exploits and Defense. - Syngress, Elsevier Science & Technology, 2007. - P. 68–69, 127. - ISBN 1-59749-154-3.
  11. O'Reilly, Tim. What Is Web 2.0. pp. 4–5. O'Reilly Media (September 30, 2005). Retrieved June 4, 2008. Archived April 15, 2013.
  12. Ritchie, Paul (March 2007). “The security risks of AJAX/web 2.0 applications” (PDF). Infosecurity. Elsevier. Archived from the original (PDF) 2008-06-25 . Retrieved 6 June 2008.
  13. Berinato, Scott. Software Vulnerability Disclosure: The Chilling Effect, CSO, CXO Media (1 January 2007), p. 7. Archived April 18, 2008. Retrieved June 7, 2008.
  14. Prince, Brian. McAfee Governance, Risk and Compliance Business Unit, eWEEK, Ziff Davis Enterprise Holdings (9 April 2008). Retrieved April 25, 2008.
  15. Preston, Rob. Down To Business: It's Past Time To Elevate The Infosec Conversation, InformationWeek, United Business Media (12 April 2008). Retrieved April 25, 2008.
  16. Claburn, Thomas. RSA's Coviello Predicts Security Consolidation, InformationWeek, United Business Media (6 February 2007). Retrieved April 25, 2008.
  17. boyd, danah; Hargittai, Eszter (July 2010). "Facebook privacy settings: Who cares?". First Monday. University of Illinois at Chicago. 15 (8).
  18. Lynn, Jonathan. Internet users to exceed 2 billion…, Reuters (19 October 2010). Retrieved February 9, 2011.
  19. S. Lawrence, C.L. Giles, "Searching the World Wide Web," Science, 280(5360), 98-100, 1998.
  20. S. Lawrence, C.L. Giles, "Accessibility of Information on the Web," Nature, 400, 107-109, 1999.
  21. brightplanet.com. Retrieved July 27, 2009.
