
Transfer files of the windows server archiving system. How to better organize data backup in a small company or on a personal PC

friend November 10, 2012 at 02:21 AM

Simple backup server


One of the tasks of an enterprise's IT service (in-house or outsourced) is to keep data safe both in normal operation and in exceptional situations. Professionals turn exceptional situations into routine ones by working through them in detail in advance. To that end, about six months ago we set up a backup server for our clients' data.

Backups can be manual or automatic. Also, do not confuse backup with archiving. The task of a backup is to keep several up-to-date versions of the data for the case of "everything is lost, we need to restore it urgently". Archiving is meant to provide access to data from the past year, or the past five years. We combine the two services: typically, daily backups are kept for a month, and monthly backups are kept indefinitely. The retention period can be changed if necessary.
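The retention policy described here (daily copies for a month, monthly copies kept indefinitely) boils down to a simple pruning rule. The sketch below is illustrative, not our production code:

```python
import datetime

def backups_to_keep(dates, today, daily_days=30):
    """Given backup dates, keep every copy from the last `daily_days` days
    plus the first backup of each month (kept indefinitely)."""
    keep = set()
    monthly_seen = set()
    for d in sorted(dates):
        if (d.year, d.month) not in monthly_seen:
            monthly_seen.add((d.year, d.month))
            keep.add(d)                      # first backup of the month: keep forever
        elif (today - d).days <= daily_days:
            keep.add(d)                      # recent daily copy
    return keep
```

Everything not returned by this function is a candidate for deletion on the next cleanup run.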

Let's consider backup in the context of services.

1. Backing up 1C databases
There are two options; the first is simple and correct. The database lives on the 1C terminal server, where a scheduled script archives it with a password known only to the client and sends the archive via FTP to our backup server, which receives it and stores it securely.
If the database sits on the accountant's computer and colleagues connect to it through a shared network folder, the backup can be scheduled to run when all employees have already gone home but the computer is still on - for example, at 10 pm, with the computer shutting down afterwards. Alternatively, the backup script can be run manually when the accountant leaves. In that case, though, we advise switching to a terminal server, which we can also help set up.
A manual backup can be started at any time with a single click.
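The scheduled script amounts to "archive with a password, then upload over FTP". Here is a minimal sketch; the database path, server host, and credentials are hypothetical, and 7-Zip is assumed to be installed:

```python
import datetime
import ftplib
import subprocess

BASE_DIR = r"C:\1C\Accounting"           # hypothetical path to the 1C database folder
FTP_HOST = "backup.example.com"          # hypothetical backup server
ARCHIVE_PASSWORD = "client-only-secret"  # known only to the client

def archive_name(prefix="acc", when=None):
    """Timestamped archive name, e.g. acc_2012-11-10_2200.7z."""
    when = when or datetime.datetime.now()
    return f"{prefix}_{when:%Y-%m-%d_%H%M}.7z"

def run_backup():
    name = archive_name()
    # 7-Zip encrypts with AES-256; -mhe=on also hides the file names inside
    subprocess.run(
        ["7z", "a", f"-p{ARCHIVE_PASSWORD}", "-mhe=on", name, BASE_DIR],
        check=True,
    )
    with ftplib.FTP(FTP_HOST) as ftp:
        ftp.login("client", "ftp-password")
        with open(name, "rb") as f:
            ftp.storbinary(f"STOR {name}", f)
```

Running `run_backup()` from the task scheduler at 10 pm gives exactly the automatic variant; running it by hand gives the one-click manual copy.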

Storage safety is also thought through. The archives are accessible to the server administrator (who does not know the archive password) and to the client. The client can add new archives and read old ones, but cannot modify or delete them. Modification is blocked to rule out the unlikely but still possible case of harm (user error or a virus). Deletion is possible only by the administrator at the client's request. Data can be stored for several years; the price depends on the retention period and the size of the databases.

2. Backing up websites
Although hosting is fairly reliable, you always want backups on hand in case you need to roll back to an earlier version or urgently restore the site elsewhere. Let's put it this way: hosters do make their own backups, but those are usually kept for a short time, stored in a proprietary format, and unavailable exactly when the hoster is down. For such cases, keeping the current version of the site in a convenient format on a separate technical site is a paranoid's dream come true.
A site consists of two parts: the database and the files. We back up both. Technically, automatic site backups work as follows: every night at a set time, the backup server asks the web server for a backup; the web server archives a database dump together with the files and sends everything to the backup server.
A backup copy of the site can also be created manually. The control panel of our CMS has a magic "backup" button: press it and an archive is created and sent to the backup server. Depending on the size of the site, after 5 or 20 seconds the copy is already in a safe place, and you receive an email report on success or any error.
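The "dump plus files" step on the web server can be sketched like this; the dump command and paths are placeholders, not our actual setup:

```python
import datetime
import pathlib
import subprocess
import tarfile

def site_backup(docroot, dump_cmd, out_dir="."):
    """Archive a site's files together with a fresh database dump.

    docroot  - path to the site files
    dump_cmd - command producing a SQL dump on stdout (e.g. mysqldump ...)
    """
    stamp = datetime.date.today().isoformat()
    out = pathlib.Path(out_dir)
    dump = out / f"db_{stamp}.sql"
    dump.write_bytes(subprocess.run(dump_cmd, capture_output=True, check=True).stdout)
    archive = out / f"site_{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(docroot, arcname="files")   # site files
        tar.add(dump, arcname=dump.name)    # database dump
    return archive
```

The returned archive is what gets shipped to the backup server, whether the run was triggered by the nightly schedule or the "backup" button.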

3. Backing up servers
Quite often there is a need to back up specific servers - for example, a corporate portal, a domain controller, or version control servers for developers. All of this is possible!

4. Backing up user data
Ideally, all user data lives on a dedicated storage system and is reliably protected. But this is not always the case, and the situation is especially complicated when a company has several small offices or remote employees. In that case you can configure backup of the most important data directly from the workstations or from local file servers. If the amount of data is large - measured in gigabytes - you can set up a local backup server or, if a Windows domain is used, a file server for storing user profiles.
Cloud storage is now widely used for collaboration. It also makes sense to back up data from Google Drive: the owner can delete the data, leaving those it was shared with empty-handed. We periodically back up Google documents using the GDocBackup program and a script that archives them with a password and sends the archive to the backup server.

We have covered the main cases where data backup may be needed. Beyond these, there are of course non-standard ones, so contact us and we will help you set up backups.

What is our backup server?
Physically, the server is very easy to set up; organizing the backup system is the harder and more interesting part. The hard disks are in RAID 1 for mirroring, and we configured an email alert for when one of the disks fails. The processor is an Intel Atom - simple and cheap, since the server's only job is to store data: receive a few tens of gigabytes at night and serve a few gigabytes on request (usually during the day). The server sits in a server room, is powered through an uninterruptible power supply, and is connected to the Internet over optical fiber.
The operating system is Debian Linux, which also contributes to stability.

Any office is full of information, and it is often the company's most valuable asset. Unfortunately, people tend to remember this only when there is a real risk of losing it - and even after a failure, once part of the information has been recovered, the lesson is quickly forgotten.

Some administrators will throw up their hands and say: "What can I do? There is no budget and no understanding from management, so we have no backups. If it breaks, that's on their conscience." But that is only half the trouble, because you can break things yourself. A wrong setting, a configuration error, a cryptolocker (encrypting virus) - and the data is irretrievably lost. That is why backups are necessary. Once this is understood, we can move on to the practical part.

In this article, we'll take a look at a possible backup approach in a typical small office running on a Microsoft platform and recommend several storage hardware options. Of course, in a large office or company, things are different. There are backup storage systems, tape libraries, and expensive specialized products. And backing up a data center is both a science and an art, to which you can devote not only an article, but your whole life.

Types of data and how to back it up

File servers

For quick file recovery without resorting to backups, the shadow copy mechanism - Shadow Copies of Shared Folders - is convenient. As a rule, reserving 5-20% of the disk space on the file server itself is enough for it to work. In the snapshot schedule you can specify, for example, noon and the end of the working day. A 5% reserve holds about 14 snapshots; the actual number depends on the disk size and the rate of data change.

Backups can be made with the built-in Windows Backup tool; fairly reliable alternatives include Cobian Backup and Handy Backup. Cobian Backup is a free application that supports Unicode, FTP, compression, encryption, and incremental and differential backups. Handy Backup offers even more features, including synchronization and restoring data from copies. Here we will look at how Windows Backup works.

Be aware that only one copy of the data can be saved to a remote network folder on the storage server: the next backup job will overwrite it. And keeping a single copy of the data is risky in any case.

There is a simple and effective way around this limitation: attach a disk from the backup server over the iSCSI protocol. Windows Backup will treat this disk as local.
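Once the iSCSI disk shows up as a local volume (say, E:), the backup job is just an invocation of the built-in wbadmin utility. A sketch that builds the command line (the drive letter and paths are hypothetical):

```python
def wbadmin_command(target, include_paths):
    """Build a wbadmin invocation that writes a backup to the iSCSI-attached
    disk (which Windows sees as a local volume, e.g. E:)."""
    return [
        "wbadmin", "start", "backup",
        f"-backupTarget:{target}",
        "-include:" + ",".join(include_paths),  # comma-separated source paths
        "-quiet",                               # run without prompting
    ]
```

Feeding the result to the task scheduler each night gives the incremental block-level backups described below.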

The first backup will be roughly the size of the data being protected. Since Windows Backup backs up at the block level rather than the file level, each subsequent incremental backup takes only as many disk blocks as actually changed.

An incremental backup records only the data changed since the previous backup, whether that was a full or an incremental one. You do not copy the entire dataset every time: a full copy is made once, and afterwards only the changes are added to the chain.

A differential backup, by contrast, records everything that has changed since the last full backup and stores it separately from the full copy. For example, if you make a differential copy of a database every day, any day's state can be restored quickly from just the full copy plus that day's differential.
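The difference between the two strategies comes down to which baseline you compare against. A toy file-selection sketch (modification times are abstract numbers here, not real timestamps):

```python
def files_to_copy(mtimes, last_full, last_backup, mode):
    """Which files a backup run copies, by strategy.

    mtimes      - {filename: last modification time}
    last_full   - time of the last full backup
    last_backup - time of the most recent backup of any kind
    """
    if mode == "full":
        return set(mtimes)
    # differential: everything changed since the last FULL backup;
    # incremental: only what changed since the PREVIOUS backup
    base = last_full if mode == "differential" else last_backup
    return {name for name, mtime in mtimes.items() if mtime > base}
```

This is why differentials grow over time while incrementals stay small: the differential's baseline never moves until the next full backup.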

Windows Backup requires no additional configuration and manages the storage entirely on its own:

Automatic management of full and incremental backups. You no longer need to manage full and incremental backups. Instead, Windows Server Backup will, by default, create an incremental backup that behaves like a full backup. You can recover any item from a single backup, but the backup will only occupy space needed for an incremental backup. In addition, Windows Server Backup does not require user intervention to periodically delete older backups to free up disk space for newer backups; older backups are deleted automatically.


For backups it is advisable to allocate about twice the volume of the data actually stored. This is enough to keep daily copies with a depth of about one and a half to two months. Frequency: daily.

Microsoft SQL Servers

Microsoft SQL Server supports three types of backups:
  • Full - the entire database is copied.
  • Differential - the database pages changed since the previous full backup are copied.
  • Incremental (transaction log) - the transaction log is copied (for databases in the Full recovery model).
First, decide how often to create a full backup.
One benchmark is the duration of the backup: it must run outside business hours, since the operation puts a noticeable load on the server. If a full copy cannot be completed overnight on a weekday, schedule it for the weekend.
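For reference, the three backup types map to plain T-SQL statements. A small generator sketch (the database name and paths are made up for illustration):

```python
def backup_statement(db, kind, path):
    """Return the T-SQL statement for a full, differential, or log backup."""
    if kind == "log":
        return f"BACKUP LOG [{db}] TO DISK = N'{path}' WITH COMPRESSION;"
    options = "COMPRESSION" if kind == "full" else "DIFFERENTIAL, COMPRESSION"
    return f"BACKUP DATABASE [{db}] TO DISK = N'{path}' WITH {options};"
```

A Maintenance Plan or SQL Agent job would run the "full" statement weekly, the "differential" one daily, and the "log" one hourly, per the guidelines below.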

The second guideline is the size and duration of the differential copies. Each subsequent differential copy is larger, since it includes all changes accumulated since the full copy, and the more time has passed since the last full copy, the disproportionately longer a differential takes to create. Indeed, a full copy can read the database files sequentially, while a differential copy must read the changed pages scattered in random places.

The frequency of transaction log backups depends on how much work you can afford to lose in a crash. If you are prepared to lose one hour of work (that is, to restore the database to its state an hour ago), run a log backup once an hour. You can run it more often, but remember the load on the server. Keep in mind that database backups are only one way to protect data: if neither data loss nor downtime during recovery is acceptable, use mechanisms such as AlwaysOn and log shipping.

An important setting to enable on the server right away is backup compression, which cuts the volume of backup data almost in half. Bear in mind that when a backup starts, disk space equal to the actual size of the database (minus blank pages) is reserved for the backup file.

The recommended storage allocation is at least two full database sizes. But that is a minimum: accountants often need to keep a full copy of the database for each previous year, plus full copies for the previous reporting periods of the current year. You may also need daily copies with a depth of at least a month.

A typical schedule combines all three types of jobs; to implement it, you can create a "Maintenance Plan" that includes them.

Microsoft Exchange servers

This product supports two types of backups:
  • Full - the databases and transaction logs are copied.
  • Incremental - only the transaction logs are copied.
Regular full backups are important, since only a full backup truncates the transaction logs of mailbox databases that are not in circular logging mode.

Windows Backup supports only full Microsoft Exchange backups. To minimize the volume of stored copies, you can use an iSCSI-attached disk, as with the file server.

Virtual machines

Most backup products can back up a virtual machine with all its disks without agents inside the guest operating system. Veeam Backup & Replication performs full and incremental backups, and can also synthesize a new full copy by "rolling" the incremental backups onto the old full copy.

The free version can only make full copies, which hurts both the backup window and the amount of data transferred. The volume of backups stored on disk can be reduced by enabling Windows Deduplication: each virtual machine's copy is saved as a *.vib file on disk, and these files deduplicate quite efficiently. Our proven scheme is to create backups at night and deduplicate during the day - but it requires the paid version of the product.

Given that Windows Deduplication runs in post-processing mode, we recommend allocating at least three full virtual machine sizes of disk storage. Copy frequency depends on the server: for a web server with static content, there is no point in copying more often than once a week.

Basic hardware requirements

Disk subsystem

Backups generally place no high demands on the storage subsystem. The main write pattern is linear; a heavy random-I/O load occurs only during deduplication of the backups.

You have a choice between 2.5" SFF and 3.5" LFF drives. We see no compelling reason to choose SFF drives for a backup server: they hold less and cost more. They are indispensable when you need to squeeze more IOPS out of one server (twice as many disks means twice as many IOPS), which is why most SFF offerings are SAS drives with 10,000 rpm spindles.

The best choice for a backup server is a large SATA or SAS disk at 7,200 rpm. SAS drives do, in theory, deliver slightly more IOPS than their SATA cousins, so if the price difference is small they are preferable. For a backup server, though, the MTBF matters much more.

If you plan to use features that run workloads directly from the backup storage, its performance obviously has to be reasonably adequate for that workload: a crawling machine is often even worse than a dead one.

If you have purchased a backup software product, the size of the backups will depend both on how it lays data out on disk and on the efficiency of its built-in deduplication and compression mechanisms.

RAM and CPU

RAM and processor requirements vary by backup product.
For example, for the popular Veeam Backup & Replication they are as follows:
  • One core per concurrent backup job
    (https://helpcenter.veeam.com/backup/hyperv/limiting_tasks.html)
  • 4 GB of memory for product operation, plus 500 MB for each concurrent backup job.
In fact, each concurrent backup job uses several agents: one transfers data, one compresses, one deduplicates. Nevertheless, host performance rarely becomes the bottleneck. Note that Windows deduplication is block-level with variable block length, combined with compression.

The results of Veeam's own deduplication are quite modest; we prefer to deduplicate with Windows Server 2012 R2. If you plan to use Microsoft deduplication, budget by the following system requirements: one core and 350 MB of memory per deduplicated volume. The recommended maximum volume size is 2 TB.

For example, on our 1.5 TB disk the stored data occupies 720 GB; without deduplication it would take more than 1 TB.

Network

The minimum network interface speed is 1 Gbit/s. It is hard to find equipment that does not meet this requirement nowadays, but a switch can still let you down, so be careful which network port you use. At 100 Mbit/s, backing up 1 TB of data takes around 28 hours, which may still look acceptable - but when you need an extra copy during the working day, waiting ten times longer than necessary gets expensive.
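The 28-hour figure is easy to reproduce with a quick estimate (assuming, as a rough working number, that the link sustains about 80% of its nominal speed):

```python
def transfer_hours(data_tb, link_mbps, efficiency=0.8):
    """Estimated hours to move `data_tb` terabytes over a `link_mbps` link,
    assuming the link sustains `efficiency` of its nominal speed."""
    payload_bytes = data_tb * 1e12
    bytes_per_second = link_mbps * 1e6 / 8 * efficiency
    return payload_bytes / bytes_per_second / 3600
```

One terabyte over 100 Mbit/s comes out to roughly 28 hours; over gigabit, under 3 hours.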

You can try to increase throughput with EtherChannel or multiple IP addresses, but such configurations are harder to maintain and the resulting speed is not always what you expect.

If you use VMware virtualization and a dedicated SAN network, paid products can significantly increase the copy speed by reading data directly from VMFS volumes (SAN Transfer).

We will discuss a few subtleties when choosing a processor and memory in the chapter on choosing a server.

Simple NAS "business series"

A typical NAS is a device with proprietary firmware/operating system designed for file storage in a small office. Most modern NAS units store and serve files over SMB/FTP/HTTP/iSCSI and are configured through a friendly web interface. Manufacturers often use proprietary technologies for their RAID arrays - and that convenience comes at a price. The "business" series usually differs from home devices in its processor: instead of ARM, a more capable Intel Atom or low-end Intel Core i3 is installed.

A typical representative is the NETGEAR RN314 (estimated price without disks: 50,000 rubles).

Pros: relatively inexpensive, hot-swappable disks, proprietary software RAID.
Cons: small disk capacity (4 bays), slow, and no way to install backup software on the device itself.

Almost any NAS, even the simplest, can serve iSCSI disks. Under load, however, they do not behave well: the less memory in the device and the larger the disks, the more problems you can expect. And the access latency is high enough that such disks are fit for little besides backups - even a file server on them would crawl.

As for deduplication, Netgear itself writes that it should not be enabled for iSCSI volumes. From their article one can conclude that the method used in their hardware is very similar to Oracle ZFS - and ZFS is famous for needing huge amounts of RAM to deduplicate large data sets, RAM these modest devices lack.

On the Windows side the memory requirements are quite modest. But an iSCSI disk formatted by Windows Server is, from the NAS's point of view, a VHD file, and VHD deduplication is supported only for the VDI (Virtual Desktop Infrastructure) scenario - so for backups, validate at your own risk. And risking backups is the last thing you want.

Deduplicating the data inside Windows Backup archives themselves makes no sense: since each copy stores only the changed blocks, there is nothing left to deduplicate.

A number of these disadvantages can be mitigated by buying a slightly more powerful and capacious device, the NETGEAR ReadyNAS 516.

Six bays, an Intel Core i3, and the ability to attach up to three additional five-disk expansion modules. The problem is the price: without disks the device costs 150,000 rubles.

You can also pick a rack-mounted model at a similar price.

The speed of devices in this class is limited by their two none-too-fast gigabit network interfaces.

Advanced NAS "enterprise grade"

These devices are in effect entry-level servers with the same proprietary firmware and software RAID.

For example, Netgear RN4220S.

This 2U model supports 12 drives with up to 48 TB of raw capacity. Two PSUs improve resilience, so you will not be left without backups while a replacement is on order. Equipped with just a basic quad-core Intel Xeon E3-1225v2 at 3.2 GHz, 8 GB of RAM and two SFP+ slots for 10 Gb Ethernet, this NAS costs about 400,000 rubles without disks. That is very expensive and not very flexible, especially for a small company.

General purpose servers

A regular server is a good option if you're ready to tinker with it. Regardless of which operating system you choose - Windows or Linux - you have ample opportunity to create a configuration for your needs. You can entrust data storage to a good RAID controller with a cache, you can build a software array on Windows Storage Spaces or ZFS - the choice is yours. The backup system itself can be installed on the same server.

When choosing a form factor, a 2U server is optimal: as a rule it takes 12 LFF (3.5") or 24 SFF (2.5") disks. It has also become popular to provide two SFF slots at the rear of the chassis, which can be used for a system partition or an SSD cache.

One or two processors? Server processors can contain anywhere from 4 to the absolutely fantastic 22 cores on a single die, so for a backup server, two processors are not a vital necessity.

However, in some cases two processors cost about the same as - or even less than - one with the same total number of cores. And with only one processor installed, you may find that not all PCI-E slots work.

An example of such a limitation is described on the Intel website. Lenovo also warns that in a x3650 server with a dual-processor motherboard, in a single-processor configuration, you will get only one slot at all:

With one processor, only two fixed onboard PCIe slots (Slots 0 and 4) can be used (Slot 5 requires the second processor). An internal storage controller occupies PCIe slot 0.


It is necessary to select the number of cores that will optimally match the performance of the network and disk subsystem.

For example, with two gigabit network cards the server can at best move data in two to four streams at up to 100 MB/s (in reality a single stream rarely exceeds 50-60 MB/s). A processor with 4-6 cores is enough for that. If the server has a 10-gigabit card and the network equipment can actually deliver such a stream, our choice is at least 8-12 cores.

It is not necessary to buy a top-end processor; for our task even a modest E5 is more than enough.

When choosing RAM modules, take into account the processor's multi-channel memory operation (optimally, one module per channel) and the number of processors: as a rule, each processor gets the same number of modules.

Which server model should you choose?

If you choose among HP servers, even the entry lineup of the 2U HPE DL180 Gen9 offers configurations with a 12-drive cage. Configuring the server does not force you to puzzle over the required cables, available connectors and other subtleties that are easy to get wrong: the configuration wizard helps you do it without errors.

From IBM, the x3650 M5 suits a backup server. The TopSeller configuration 8871EAG, with only 8 disk slots, costs less if you do not need more disks; the most suitable platform is the standard model 8871D4x. Use the Standalone Solutions Configuration Tool (SSCT) to configure the server, and remember to select the correct country when starting the program.

Finally, among the products of the third manufacturer of the “big three” - Dell - we can recommend the R510 model.

Happy backup, we wish your data to be safe and sound.

Tags:

  • backup

Backing up servers and workstations lets you create backups either directly on the server or over the network, managed from a central console. Network backup tasks for servers are created by the network administrator, not by users.

Both server and workstation backups are performed by the Handy Backup client in the background, so users are not distracted from their normal work on networked computers.

Server and workstation backup architecture

To prepare a network backup with Handy Backup Server Network, the system administrator designates a computer to act as the backup server - usually the administrator's own machine. It manages all backup tasks and writes the completed backups to external storage media.

On the remote servers and workstations whose data is to be copied through the central console, Handy Backup Network Agents are installed. Network Agents have no graphical interface; they give the server access to the data stored on remote computers.

Handy Backup Server, installed on the administrator's machine, lets you manage backup, synchronization and data recovery operations on the server itself and on networked workstations.

In addition to copying files, the program backs up physical servers; Hyper-V, VMware Workstation and other virtual machines; Exchange Server, MS SQL, MySQL/MariaDB, PostgreSQL, Oracle and Lotus Notes on servers and workstations; as well as any ODBC-compatible databases on the server.

A Handy Backup Network Agent is installed on each computer whose data needs to be copied. The agent runs as a Windows service, has no graphical interface, and lets the program collect backup data from users' servers and workstations.

How network backup works

  • Network Agents contact the Handy Backup Server at regular intervals, so the server always has a list of computers available for remote backup over the network.
  • The Handy Backup Server administrator creates backup tasks for networked workstations - either separate tasks per machine or a single task for all of them.
  • The Handy Backup Server contacts each machine on the list directly and requests the list of files and folders available for backup. The administrator can select the required data or create disk images.
  • Having selected the data, the administrator sets a schedule for the tasks and configures various copy options.

Besides files and folders and disk images on network computers, you can back up databases (1C, MySQL, Oracle, etc.), MS Exchange and Lotus Notes data stored on the server itself. The program can also back up VMware, Hyper-V and other virtual machines.

Handy Backup Server Network consists of a Control Panel and Network Agents; the base package includes the Control Panel and one Network Agent for server backup (installed on the same computer as the Control Panel).


Handy Backup Server and the Network Agents run under Windows 10/8/7/Vista or Windows Server 2016/2012/2008.

User feedback on Handy Backup

"I am pleased to use Handy Backup, a product for creating and restoring copies of programs and data, and for saving data not only from one computer but from various workplaces. What attracts us is its reliable operation, friendly interface and support options. The program can work on a schedule and save and restore data on various devices."

V.I.Krysh, Lead programmer of KB "Plast-M"

Your data can be encrypted by a virus or vanish without a trace on a failing hard drive. Several hours of work on one file can be ruined by accidentally saving another document over it.

A botched update can turn the accounting database into a mess, with the contacts of all your counterparties trapped inside it. And one day a competitor may sic the authorities on you; they will confiscate your servers, paralyze the whole company and eventually bankrupt you.

The files you do not need now may well be needed tomorrow or in five years. Where are those files? Right - on an old computer, a flash drive, a reformatted removable disk...

And all of this should be stored in backups - encrypted where appropriate, on separate backup media.

How do you do this if you have a small company or a personal PC and a limited budget?

1. Backing up data on each stand-alone computer

User workstations should have shadow copies configured using standard Windows tools (in Windows 7: Computer icon > Properties > Advanced system settings > System Protection). You can enable both registry restore points on changes (checkpoints) and saving of file states on local disks. You will sacrifice some free HDD space, but nerves are worth more.

After unwanted (unintentional) changes to a folder or file, you can restore it to its previous state.

If the built-in backup cannot be used for some reason, you can turn to third-party software such as Acronis Backup and Recovery (paid) or one of the many free alternatives - there are plenty of programs in this area.

However, backing up data within the same physical disk will not save you if the disk fails. It is hard to appreciate the value of a backup when it sits in bad sectors on the HDD right next to the original data :)

Let's put it this way: system backup by standard means is a must-have. But try to duplicate the important things over the network. To do this, you can:

a) Use VDS hosting (the cheapest plan with 5 GB of space runs about 100 rubles per month).

b) Use free space on cloud services (Google Drive, iCloud, Yandex Disk, etc.). Google Drive, for example, supports recovery of previous versions of files, so even if an unintentionally changed file has already synchronized, it can always be restored.

c) If there are very few files, you can keep everything in mail by sending messages with the important files to yourself or to a dedicated mailbox. Finding such files later will be harder, but mail services provide plenty of free disk space. At one company the author maintains, most of the files encrypted by a virus were recovered from mail that had been sent to counterparties :)

2. Backing up data in a company with several (or more than 10) workstations

An ideal option for company backups is a centralized server, either inside the company (an FTP server with RAID 1) or outside it (a VDS server with an FTP service).

Storing, say, a 1C database or contracts only on Google Drive is not entirely safe: if access to the mailbox is lost or falls into the wrong hands, the company will certainly suffer. That said, the author knows individual entrepreneurs who work exactly this way - at least everything on their Google Drive is encrypted ;)

a) A server inside the company requires a one-time outlay for the file server itself (50-100 thousand rubles, depending on the required reliability). Further costs arise when hardware breaks (which is rare); also factor in electricity.

b) With external storage on a VDS, you pay once for setup by an IT outsourcing administrator (around 5 thousand rubles, depending on the number of computers to back up) and 500-900 rubles per month for the VDS hosting (depending on the amount of data). Note that this option needs a faster Internet connection: at least 5 Mbit/s upstream.

Option b) additionally solves such delicate problems as a sudden server failure, confiscation of the server by the authorities :), theft of data by employees with physical access to the server, and so on.

Below is a schematic representation of backup options for a very small enterprise of 5-30 computers.

The diagram above shows option a). Data from all servers and user workstations is copied to a file server with a fault-tolerant disk subsystem. On the one hand, we have an operational backup at hand in the form of a shadow copy; on the other, we can always recover data from the different servers if a server (or computer) physically fails.

If the company is small, the roles of web server, database server, and file server can be combined on one physical machine, and there may be no application server at all.

Alternatively, all servers can run in a virtual environment on a single physical host, with the file arrays stored on disk shelves (though the higher cost makes this more suitable for large companies).

This scheme has its drawbacks: the server needs an uninterruptible power supply, should (ideally) be kept in a dedicated room, and physical access by employees and other unauthorized persons must be restricted. The author knows a company where it is considered a tradition among departing employees to walk off with the hard disk holding the database, while the server still sits under a manager's desk :)

As for the backup settings, the author recommends backing up important data to the file server once a day, or twice a day for critical, frequently changed data.

As for software, one option is Areca (a cross-platform Java application) plus the Windows Task Scheduler. Areca generates a script with the backup parameters (destination, encryption, type and names of copies), which is then added to the Windows Task Scheduler or to cron on Unix.
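Areca generates such a script itself, but a minimal hand-rolled equivalent of a daily job with the "daily copies kept for a month" policy might look like this (paths and retention are illustrative assumptions, not from the article):

```shell
#!/bin/sh
# Sketch of a daily backup job: archive a directory with a date-stamped
# name and prune archives older than 30 days. Paths are examples only.

# backup_dir SRC DEST -- tar SRC into DEST/backup-YYYY-MM-DD.tar.gz,
# then delete archives older than 30 days.
backup_dir() {
    src=$1; dest=$2
    mkdir -p "$dest"
    tar -czf "$dest/backup-$(date +%Y-%m-%d).tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
    find "$dest" -name 'backup-*.tar.gz' -mtime +30 -delete
}

# Scheduled once (or twice) a day, e.g. via a crontab line on Unix:
#   0 22 * * * /usr/local/bin/daily_backup.sh
```

On Windows the same script (or the one Areca produces) would be registered in the Task Scheduler instead of cron.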

In the author's view, option b) is preferable, since it removes most of the company's headaches about keeping backup data safe. But there are a couple of drawbacks here too: a VDS used for backup cannot be combined with anything else. You can, of course, run your applications there (1C), but then, in addition to disk space, you will also have to pay for extra processor time and memory (and those are different amounts entirely).

Another obvious disadvantage is the dependence on your Internet connection: in the absence of a sane provider nearby, you are left with only option a).

So the second option with VDS (b):

The data flows the same way as in the first diagram (not shown in the figure), but now everything is sent over the Internet to a remote VDS. Areca encrypts the data on the user's side, and it is uploaded to the VDS in encrypted form over FTP. As the FTP server on the VDS, you can quickly set up vsftpd.
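A backup-only vsftpd setup needs only a handful of directives; a minimal sketch of `vsftpd.conf` might look like this (the values are illustrative assumptions, adjust users, certificates and paths to your VDS):

```
# Minimal vsftpd.conf sketch for a backup-only FTP service
listen=YES
anonymous_enable=NO        # only authenticated local accounts may upload
local_enable=YES
write_enable=YES           # clients must be able to upload archives
chroot_local_user=YES      # lock each account into its home directory
local_umask=077            # uploaded backups readable by the owner only
# ssl_enable=YES           # TLS protects logins, but slows large transfers
```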

It is worth considering one nuance: "Copying files using the ftp protocol with SSL or TLS significantly slows down the process, and with large amounts of data it may freeze altogether."

You just need to think through the backup policy, namely: either first collect all important data within the network on some network storage (a shared folder, for example) and then, under a single FTP account, push it to the VDS at the appointed time; or upload data from each computer at different times under different accounts. The first option is better when there are more than 5 computers. If the network is small, there is no need for separate network storage.
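The "collect first, then upload under one account" policy can be sketched as a small script that pushes every archive found in the staging folder and deletes each one only after a successful upload (the host, account and paths below are hypothetical placeholders):

```shell
#!/bin/sh
# Sketch: upload all archives from a staging folder with one command,
# removing each archive only if its upload succeeded.

# upload_archives STAGING CMD... -- run CMD on each .tar.gz in STAGING
# and delete the archive only when CMD exits successfully.
upload_archives() {
    staging=$1; shift
    for f in "$staging"/*.tar.gz; do
        [ -e "$f" ] || continue
        "$@" "$f" && rm -- "$f"
    done
}

# In production CMD could be an FTP upload, e.g. with curl
# (host and credentials are placeholders):
#   ftp_put() { curl -sS --user backupuser:secret -T "$1" \
#       ftp://backup.example.com/daily/; }
#   upload_archives /srv/backup-staging ftp_put
```

Separating the upload command from the loop also makes it easy to swap FTP for another transport later without touching the retention logic.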

How you make your backups is up to you; the options presented here are the most budget-friendly ones.


In this article I want to talk about the problem of saving data.

We have all at some point faced the need to save important information, be it photos, text documents, 1C:Enterprise databases of any configuration, or other data important to us. Many users do not want to spend the time, or do not know how, to write a small "batch file" correctly to copy their data. That is why small programs exist for backing up various kinds of data.
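For reference, the kind of small copy script the text refers to can be just a few lines; here is a sketch in POSIX shell (a Windows batch file would be the direct analogue; all paths are illustrative):

```shell
#!/bin/sh
# Sketch of a "small batch file" for manual backups: copy a folder
# (say, a 1C database directory) into a dated folder on another drive.

# copy_base SRC DESTROOT -- copy SRC into DESTROOT/<name>-<date>
copy_base() {
    src=$1; destroot=$2
    dest="$destroot/$(basename "$src")-$(date +%Y-%m-%d)"
    mkdir -p "$dest"
    cp -a "$src"/. "$dest"/
}

# Example invocation (paths are placeholders):
#   copy_base /srv/1c/base /mnt/backup
```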

I faced the need for automatic backups when I had to copy 1C:Enterprise databases every day. While looking for a convenient program that met my requirements, I found exactly what I needed: Cobian Backup 11. It turned out to have a clear, uncomplicated interface and, most importantly, it is free, which is a real plus these days.

1. Installing the program

So let's move on to installing the program. Download the installer from the official website and run the downloaded file. On the first page of the installer, select your language and click "Next".

On the next page, we are asked to read the license agreement and accept its terms. Click "Next".

Next comes a window asking where and what to install.

  • "Shadow Copy Initiator" allows files to be copied even while they are open or in use by applications. I recommend installing it, though for a one-off backup you can do without it. The Shadow Copy Initiator requires Microsoft .NET Framework 3.5 to be installed.
  • The installation script records all the parameters of the current installation so that the next installation can repeat them without your involvement.

By default, the installer offers to set the program up as a service; I recommend leaving this as is if you use it on a server or on a computer with several users. First, as a service the program runs even when no one is logged in; second, it can use network resources to store data on an FTP server, on another computer on your local network, or on network storage.

If you only need it once, or will use it rarely, choose whichever of the options suits you.

When installing the program as a service, you must specify an administrator account and password.

That's it, this completes the installation.

2. Setting up the program

Let's proceed to the main task: configuring a backup job. Run the program and select "Task" from the menu, then "New task".

The wizard for creating a new task starts. Here we set the task "Name", tick the necessary checkboxes, and select the "Copy type".

Go to the "Files" tab, then click "Add" to specify what to copy and where.

Specify the file, directory, or folder on an FTP server that you need to copy, and choose the destination from the list. Everyone should understand that saving a backup to the same hard disk as the original defeats the purpose: if the disk physically fails, restoring the information will be problematic. Saving to a different partition of the same disk only helps when the source partition is corrupted or files are deleted. So try to save to independent media: network storage, an external hard drive, a USB flash drive, a computer on the same network, or, of course, an FTP server.

Go to the "Schedule" tab. As you can see, there are settings for every occasion, and you can set them up however you like: anywhere from once a year to every day at a set time.

Go to the "Cyclicity" tab. Here you can set the priority of your job and, to save space, limit the number of full copies to keep.

Go to the "Compression" tab and specify the compression type and whether to split the archive into parts.

3. Conclusion

Overall, I really liked this program, and I think many others will too. Its clear, simple interface makes it easy to set up the tasks you may need in everyday life.

