
Game tests: Lock On: Modern Air Combat. Test results: performance comparison

Graphics cards like the nVidia GeForce 7800 GT are clearly aimed at gamers. The GT line packs all the features of the flagship 7800 GTX card at a much more attractive price point. That is why we decided to select two models based on the 7800 GT for review.

The 7800 GT is a variant of nVidia's G70 processor with the same DirectX 9.0c, OpenGL 2.0, and Pixel Shader 3.0 support, but with 20 pixel pipelines and seven vertex units, i.e. four pixel pipelines and one vertex unit fewer than the flagship model. The GT version is equipped with 256 MB of GDDR3 video memory on a 256-bit interface. The average price of GT cards in Russia is $450, though cheaper models can be found; on the world market the price of the 7800 GT has dropped to $360.

We received two graphics cards: the EVGA 7800 GT and the XFX 7800 GT. Both layouts are close to nVidia's reference design, but XFX and EVGA have taken the liberty of raising clock speeds to set their products apart from the competition. Let's take a look at what the cards offer.

The EVGA 7800 GT is a PCI Express card with a core clock of 425 MHz, up from the 400 MHz of nVidia's specification. It ships with 256 MB of 256-bit GDDR3 memory clocked at 1.07 GHz (DDR).

Like most nVidia 7800 cards, this model occupies one slot while providing sufficient cooling for the GPU and memory modules. The cooler is quite quiet even in 3D mode.

Characteristics table

General information
Card name: EVGA 7800 GT
Manufacturer: EVGA
GPU: nVidia GeForce 7800 GT
Bus standard: PCIe
Memory: 256 MB
Memory type: GDDR3
Core frequency: 445 MHz
Memory frequency: 1.07 GHz
Memory bus: 256 bit
Number of pixel pipelines: 20
Number of vertex units: 7

3D functions
DirectX support: DirectX 9.0c
Texture filtering: bilinear, trilinear, anisotropic (max. 16x)
Anti-aliasing: 2x / 4x / 8x / 16x

Outputs
VGA: No
DVI: 2x
S-Video out: Yes
HDTV TV Out (Y, Pb, Pr): Yes

Inputs
Power connector: Yes
Audio input: No
S-Video input: Yes
Composite video input: Yes

Equipment
Cooling: single-slot cooler with fan
Fan speed control: automatic
Hardware monitoring: automatic
Adapters: 2x DVI-I -> VGA
Cables: I/O cable with external block, 1x S-Video cable
Component output YPbPr (HDTV): Yes
Software: CyberLink PowerDirector 2.55 ME Pro
Games: Battlefield 2
Other: N/A

The software bundle that comes with the card is rather meager: EVGA ships just one application and one game. The application is CyberLink's PowerDirector 2.55 ME Pro CD/DVD burning software; the game is Battlefield 2, which is relatively new and therefore has real value. In addition, EVGA announced on October 20 that 7800 GT cards will ship with Call of Duty 2 once that game launches; until then, customers can register on the company's website and receive a copy when the game is released.

The bundled driver was the latest available at the time of the product's release; nVidia has since released WHQL-certified version 80 drivers, so we recommend downloading the current driver from the nVidia website.

The card comes with a full set of cables, which earns it a bonus. An adapter cable is included so that owners of older PSUs can plug two Molex connectors into it and get a 6-pin power plug.

After the summer announcements of the new-generation video cards - the GeForce 7800 GTX and GeForce 7800 GT - and the first test results, Californian NVIDIA instantly became "king of the hill" for a long time (by computer-industry standards). Canadian ATI managed a worthy answer only by November, when it finally delivered the RADEON X1800 line. It is common knowledge that for quite a long time after the announcement of any Hi-End video card, all its copies are produced at one plant (or several) on the chip developer's order, and all the other brands merely buy ready-made adapters, apply their stickers, package them and sell them under their own names. And while most manufacturers quietly put up with this state of affairs, an ambitious company like ASUS, on the contrary, strives to introduce at least something of its own into the product design, even if only the cooling system. Just remember how the company demonstrated its own motherboard design even before the official announcement of the i945P chipset in May. This has always been one of the ingredients of the company's success and reputation.

Specifications

Nothing revolutionary can be seen in the architecture of the new G70 chip (NV47 under NVIDIA's previous naming scheme). The mere fact that the new chip is pin-compatible with the old NV45 speaks volumes. The processor is manufactured on a process already proven on the junior chips of the 6xxx-generation cards. The main changes are the increased number of pixel pipelines - now 24 (6x4) - and vertex pipelines, now 8. The maximum addressable volume of GDDR3 SDRAM video memory has grown to 1 GB. Support for Shader Model 3.0, which the new generation of ATI video cards can also boast, was inherited by the G70 from its NVxx-series predecessors. Power optimizations have significantly reduced the chip's overall consumption. Hardware support for HDTV-Out was added, along with some features for the upcoming Windows Vista graphical environment - options whose practical value today tends to zero.

This graphics processor became the basis for the GeForce 7800 GTX, the senior adapter in the new line. To obtain the simplified modification - the GeForce 7800 GT, intended to capture the Hi-End market segment more completely - NVIDIA applied the classic cut-down scheme: four pixel pipelines and one vertex processor were disabled, and the operating frequencies of the core and video memory were reduced.
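To put the cut-down in numbers, here is a small Python sketch (ours, not from the original review) that computes the theoretical pixel fill rate of both chips from the pipeline counts and reference clocks quoted in this article:

```python
# Theoretical fill rate = pixel pipelines x core clock.
# Pipeline counts and clocks are the reference figures from this review;
# the helper itself is just an illustration.
def fillrate_mpix(pixel_pipes: int, core_mhz: int) -> int:
    """Theoretical pixel fill rate in MPixel/s."""
    return pixel_pipes * core_mhz

gtx = fillrate_mpix(24, 430)  # GeForce 7800 GTX: 24 pipes @ 430 MHz
gt = fillrate_mpix(20, 400)   # GeForce 7800 GT:  20 pipes @ 400 MHz

print(gtx, gt)  # 10320 8000
print(f"the GT gives up about {100 * (1 - gt / gtx):.0f}% of fill rate on paper")
```

So on paper the GT surrenders roughly a fifth of the GTX's fill rate, which matches its positioning one step below the flagship.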

Specifications compared: ATI RADEON X850 XT PE / ATI RADEON X1800 XL / NVIDIA GeForce 6800 Ultra / NVIDIA GeForce 7800 GT

GPU: R480 / R520 / NV45 / G70
Technological process, micron: 0.13 / 0.09 / 0.13 / 0.11
Number of transistors, million: 160 / 321 / 222 / 302
Core frequency, MHz: 540 / 500 / 425 / 400
Video memory type, bus width: GDDR3 SDRAM, 256 bit (all four cards)
Memory frequency (effective), MHz: 1180 / 1000 / 1100 / 1000
Peak video memory bandwidth, GB/s: 37.8 / 32.0 / 35.2 / 32.0
Pixel shader support: 2.0b / 3.0 / 3.0 / 3.0
Vertex shader support: 2.0 / 3.0 / 3.0 / 3.0
DirectX version support: 9.0b / 9.0c / 9.0c / 9.0c
Number of active pixel pipelines: 16 / 16 / 16 / 20
Number of texture units per pipeline: 1 / 1 / 1 / 1
Number of vertex processors: 6 / 8 / 6 / 7
Theoretical pixel fill rate (Fillrate), MPixel/s: 8640 / 8000 / 6800 / 8000

ASUS Extreme N7800 GT

The video card arrived for testing in a huge box.

It could easily hold two Micro ATX motherboards (laid side by side, not stacked) with all their accessories, and there would probably still be room to spare. It seems the designers of video card boxes have all come down with gigantomania (or is it megalomania?) - how else to explain the steady growth of the boxes while the boards' dimensions remain practically unchanged? Even the product's impressive bundle cannot justify it. The package contained:

  • ASUS Extreme N7800 GT graphics card;
  • VIVO adapter cord;
  • an adapter from the MOLEX power connector to a six-pin one for a video card;
  • two adapters DVI to D-Sub;
  • 5 CDs with drivers and software from ASUS, and 3 DVDs with games;
  • instructions for installing drivers and utilities;
  • case for disks.

There is probably no point discussing the practical value of the bundled discs in the countries of the former USSR - that much is clear. The included CD case looks touching, until you take it out of the bag and realize that an item of the same quality, only without the ASUS inscription, can be bought for a few dollars at the nearest flea market. Still, it is nice to see such things in a video card's standard package - this is exactly what ASUS has always been famous for. The manual, despite a mistake in the word "Russian" right on the cover, is well translated, and all the screenshots and illustrations are in color. The VIVO adapter, in addition to the composite and S-Video inputs and outputs standard for such cards, also has a component (three-connector) YPbPr output, which owners of Hi-End TVs will appreciate.

The video card makes an impression the moment you see it: the first thing you notice is its enormous size. The board is so long that ASUS engineers, fearing it would sag, installed a stainless steel plate on it to serve as a stiffener. Installing the accelerator in a case, you begin to seriously worry that its edge will press against the hard drive cage; in a medium-sized case (an INWIN S-523, for example) only a few centimeters remain to the drives.

Most of the front surface of the PCB is covered by the cooling system; only the power-circuit components remain exposed. On the reverse side you can see only a scattering of resistors, capacitors, voltage regulator chips and other power elements - of interest only to voltmodders.

The connector panel contains two DVI-I digital outputs with additional analog outputs and one nine-pin jack for connecting a VIVO adapter.

Having admired our beauty's curves, we take out a screwdriver and get to work :). Removing three inconspicuous screws frees the turbine with its plexiglass casing, topped by a thin metal cover with the ASUS inscription on the front. On the reverse side, around the impeller and along the edge of the casing, small strips of PCB carry rows of super-bright blue LEDs, which effectively illuminate both the video card itself and the space around it during operation.

After removing the turbine, it becomes clear that the cooling system for the video memory and the chip is not integral.

The GPU has its own all-copper heatsink. To remove it, you must deal with four more screws threaded into metal bushings in a plastic backplate, which serves to prevent skewing and keeps the board from bending around the GPU. The finish of the copper heatsink's base is poor: marks left by the grinding wheel's abrasive are clearly visible, and the roughness can even be felt by touch. A moderate amount of white thermal paste is applied to the processor die.

The heatsink for cooling memory chips is made of aluminum and has a more complex design. Thin ribs are welded to the base of the part covered by the turbine casing, and thick needles are made on the side without forced ventilation to increase the surface area.

To improve heat transfer from the memory chips, a suspect thermal interface is used, its quality even more questionable than that of the notorious thermal gum. It is based on a synthetic reinforcing mesh, and the substance itself resembles modeling plastic (like plasticine, only brittle).

By cooling the video memory and GPU with separate heatsinks, the ASUS developers have solved a rather pressing problem: with a one-piece system, a powerful GPU, whose temperature can exceed 80 ºC, forcibly heats the video memory chips. All the screws holding the system are fitted with coil springs that keep the threads under tension, preventing them from loosening spontaneously due to vibration.

Unfortunately, the turbine impeller spins at a constant and rather high speed, so the cooling system subjects the user to constant, considerable noise. This engineering miscalculation is a significant drawback, and it looks downright absurd given that the video card, like all high-end NVIDIA products since the FX line, has a dedicated 2D mode with reduced GPU and memory frequencies. Moreover, under maximum load the GPU temperature was only 73 ºC.

Having stripped the card, we see that its PCB design is completely identical to NVIDIA's reference one. More precisely, the board was clearly ordered by the chip developer itself and then sold to ASUS. The board is painted a pleasant dark blue. The memory chips are arranged in an even semicircle around the GPU, so the trace length to each chip is the same, which lets them run at frequencies above 1200 MHz. The maximum amount of video memory the PCB design provides for is 256 MB - exactly what is installed on the tested card. The GDDR3 chips come in FBGA packages. Unfortunately, the markings have been ground off the chips; in places you can even see their remains as individual digits engraved on the lid. Why the manufacturer needed to remove the markings is unclear.

Thus it is impossible to learn anything about the chips' manufacturer or rated operating frequency. To run at the memory frequency given in the 7800 GT specification, chips with a 2.0 ns access time are sufficient.
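The 2.0 ns figure follows directly from the rated clock: a memory chip's rated real clock is roughly the inverse of its access time, and DDR doubles the effective data rate. A quick check (our illustration, not part of the review):

```python
# Rated real clock (MHz) ~= 1000 / access time (ns); DDR signaling
# doubles the effective data rate. Illustrative arithmetic only.
def rated_clock_mhz(access_time_ns: float) -> float:
    return 1000.0 / access_time_ns

real = rated_clock_mhz(2.0)  # 2.0 ns chips -> 500 MHz real clock
effective = 2 * real         # -> 1000 MHz effective (DDR)
print(real, effective)       # 500.0 1000.0
```

That matches the 500 (1000) MHz memory frequency in the 7800 GT specification exactly.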

Almost in the very center of the board, on the front side, there is a G70 (NV47) graphics processor.

Judging by the markings on the silicon die, the chip was produced in (South, naturally) Korea in week 26 (June) of 2005. The GF-7800-GT inscription indicates that four pixel pipelines and one vertex processor are disabled in this processor. Unfortunately, the longer "tinkerers" around the world search for the method by which NVIDIA disabled the rendering pipelines, the clearer it becomes that the lock is unlikely ever to be removed. It is now reliably known that the pixel and vertex units were disabled neither by writing variables to programmable registers, nor through the BIOS, nor by laser-cutting jumpers, nor by soldering resistors to specific pins - all the protection methods that a determined user could bypass are in the past. It appears the number of working pipelines is now fixed by some previously unused mechanism immediately after chip testing. NVIDIA's developers have taken care not to hand overclockers any more gifts like the programmable registers of the first NV4x chips.

VIVO functions are implemented with a Philips 7115 chip. To be precise, the chip only provides video capture; video output support is built into the GPU itself.

Testing

A test stand with the following configuration was used:

  • motherboard: DFI LANParty NF4 SLI-DR (nForce4 SLI);
  • processor: AMD Athlon64 3500 (Venice E6), 2200 MHz (11 x 200);
  • RAM: 2 x 512 MB, DIGMA DDR400 SDRAM (SPD 3.0-4-4-8 200MHz);
  • hard drives: Seagate ST3200822AS 200 GB SATA 7200 rpm and Seagate ST3120827AS 120 GB SATA 7200 rpm;
  • power supply unit: PowerMan 350 W;
  • cooler: Spire SP779 V3-1, 3000-6000 rpm;
  • operating system: Windows XP Professional SP1.

The processor was overclocked to 2728 MHz, for which its supply voltage was raised to 1.55 V. The RAM ran at 248 MHz with 3-3-3-8 1T timings and a voltage of 3.1 V. Video card drivers were ATI Catalyst 5.11 and NVIDIA ForceWare 81.85, with NVIDIA nForce 6.66 system drivers. Besides today's heroine, the following video cards took part in the testing:

I think comments on the test results are hardly needed: even without overclocking, the ASUS Extreme N7800 GT is consistently ahead in every test in today's set. Since the Extreme N7800 GT board was made to NVIDIA's own order, the video card's performance can equally well be taken as that of the reference GeForce 7800 GT from the chip developer. The adapter easily overtakes both an SLI pair of the recent best-selling 6600 GT and the previous generation's top solution from ATI (if the reader will allow, the performance of an overclocked ATI RADEON X800 GTO and of a stock ATI RADEON X850 XT PE can be considered roughly equal).

Conclusions

The ASUS Extreme N7800 GT leaves a mixed impression. It certainly deserves praise for the bundle, traditionally rich for ASUS products. It was also nice to finally see a cooling system designed by the brand releasing the card, rather than a heatsink made by NVIDIA's contractors with a re-glued sticker. Somewhere near the zero mark on the scale of impressions sits the successful overclocking (though ASUS can take no credit for that). And in the minus column we have the noisy turbine with its uncontrolled impeller speed. Overall, the ASUS Extreme N7800 GT is a good product for gamers who can afford to spend over $400 on a video card.

Summer has crossed its zenith and is inexorably moving towards the finale. September is on the horizon. If it is still hot during the day, the nights are colder and colder. This is a clear sign of the coming autumn ... and a sharp rise in demand for all kinds of computer purchases, which include 3D accelerators.

Alas, Canadian ATI still shows no signs of life from across the ocean (it has only decided to make another present in the form of the new-old RADEON X800 GT card, but more on that next time); there are no genuinely new products from Toronto yet. Californian NVIDIA, on the other hand, having recovered at the start of the year from the grim shortage of senior GeForce 6800 models and having presented the new High-End GeForce 7800 GTX, now hastens to present its more modest brother - the GeForce 7800 GT.

That the new top accelerator would be followed by a slightly less powerful, but still High-End, accelerator was known back in June. But why did 1.5 months pass between the releases of these two members of the new Hi-End family? The GeForce 6800 GT came out just a month after the GeForce 6800 Ultra - but the GT and Ultra of the 6xxx family differ only in frequencies. Chips were in very short supply at the time, and it simply took a while to collect enough dies that could not sustain "Ultra" frequencies and release the GT on them.

With the 7xxx the situation is different: the 7800 GT differs from the GTX not only in lower frequencies but also in a reduced number of rendering pipelines. Consequently, accumulating rejects takes even longer, even allowing for the fact that mass production of the G70 began in the spring. Moreover, the G70 die turned out very well, with a good yield, so NVIDIA faced a choice: either wait a long time for enough chips to accumulate that cannot work as a GTX but can potentially handle the GT load, or "cut" perfectly good dies along the pipelines. I suspect the second option, painful as it is, is closer to the truth.

Plus marketing interests: they dictate when it is best to announce a new product.

And so, on August 11, the GeForce 7800 GT was released. The GT suffix indicates a mid-range model within the 7800 family. Logically, a plain GeForce 7800 could also appear; as for a 7800 Ultra, its place is taken by the 7800 GTX.

Let me note once again that the GeForce 7800 GTX has 24 pixel and 8 vertex pipelines, the chip runs at 430/430/470 MHz (shader unit / ROP unit / vertex unit), and cards based on it carry GDDR3 memory running at 600 (1200) MHz.

The GeForce 7800 GT has the following characteristics: 20 pixel and 7 vertex pipelines, core frequencies of 400/400/440 MHz, and memory at 500 (1000) MHz. The recommended retail price is $449 - about the level of the RADEON X850 XT, which will be the new product's main opponent. There are reports that the GeForce 6800 Ultra will drop to $399, the 6800 GT to $299, and the 6800 to $199; accordingly, the 6600 GT will cost $149, the 6600 $99, and the 6600 LE $79. I believe this will hit Canadian ATI's interests hard unless it answers with price cuts of its own - although even then no one should expect super-profits (it is no secret that the roughly 6% of sales in the Hi-End sector bring NVIDIA and ATI up to 40% of all their profit from video chip sales, whatever local sellers or marketers like to say). And bear in mind, of course, that prices will be higher at the start of sales, as is always the case with new products.

In all other respects these chips (and cards) are completely identical in their 3D and 2D capabilities, so if anyone does not yet know what the G70 (GeForce 7800 series) is, our basic article on the GeForce 7800 GTX is worth reading.

Let's move on to the card itself. Unlike the earlier basic articles, this review is based on testing not NVIDIA's reference card but a production board from Palit. There is, however, nothing to suggest Palit actually made it: before us is simply the reference card, offered by NVIDIA's partner Palit. I have said more than once that all Hi-End products from ATI and NVIDIA are initially made at the same factory on the chip makers' own orders, and partners are sold not chips but finished cards. Only later (usually about six months on) do a number of large manufacturers begin producing such cards themselves - while products based on the ATI RADEON X850 XT / PE / PRO chips never leave ATI's own production lines at all (partners buy the finished cards from ATI and sell them under their own brands).



Comparison with the reference design, front view:
Palit GeForce 7800 GT 256MB

Comparison with the reference design, rear view:
Palit GeForce 7800 GT 256MB
Reference card NVIDIA GeForce 7800 GTX
Reference card NVIDIA GeForce 6800 Ultra

It is interesting that the design of the 7800 GT is somewhat simplified relative to the 7800 GTX: in particular, the PCB accommodates only 256 MB of GDDR3 memory, and its length equals that of the 6800 Ultra, although, as the pictures show, the 7800 GT's PCB is simpler than the latter's as well. Incidentally, it would be quite possible to move the 6800 GT / Ultra to this new, cheaper design: since it supports operation at 400/1000 MHz, the 6800 GT with its 350/1000 MHz would certainly work (the G70 and NV45 chips are pin-compatible), and even the 6800 Ultra's 425/1100 MHz should pose no problem for this PCB. Moreover, an upcoming $100 price cut for these cards has already been announced.

It is also worth mentioning that the 7800 GT is equipped with a pair of DVI connectors and has contacts for connecting in an SLI tandem.

Palit GeForce 7800 GT 256MB

The cooling system was studied in detail in the basic article on the G70; I will just remind you that the central heatsink is linked by a heat pipe to a heatsink that cools the memory. The device as a whole is pleasingly slim - single-slot. The cooler is very quiet overall; only in the first seconds after start-up can you hear some noise from the turbine spinning at full speed (the noise can be judged from this video, 1.5 MB, WMV). Compared with the 7800 GTX cooler, this model has a shorter heatsink. A copper plate sits where the cooler contacts the core die.


Here is the GPU itself

Let's compare with the 7800GTX:

The chips are visually absolutely identical; no bridges or differences were found on the GPU substrate. Consequently, the four pixel and one vertex pipelines were disabled either inside the chip itself, via programmable registers, or through the BIOS. The last two methods would leave a chance of unlocking. Looking ahead: you can say goodbye to the dream of turning a GT into a GTX - to overclockers' regret, NVIDIA went with the first method. But more on that below.

Concluding the look at the card itself, note that it carries a codec providing VIVO (VideoIn, VideoOut) - that is, the product can capture an analog signal at an amateur level and record it in digital form. I would like to extract a promise from our regular author on this topic, Alexei Samsonov, that despite all his busyness he will finally study this codec and describe its capabilities in a separate article.

Equipment


Package

Installation and drivers

Testbed configurations:

  • Athlon 64 based computer (Socket 939)
    • AMD Athlon 64 4000+ processor (2400 MHz) (L2 = 1024K);
    • ASUS A8N SLI Deluxe mainboard based on NVIDIA nForce4 SLI chipset;
    • RAM 1 GB DDR SDRAM 400MHz (CAS (tCL) = 2.5; RAS to CAS delay (tRCD) = 3; Row Precharge (tRP) = 3; tRAS = 6);
    • hard drive WD Caviar SE WD1600JD 160GB SATA.
  • RADEON X850 XT PE (PowerColor RX850XT PE, PCI-E, 256MB GDDR3, 540/1180 MHz);
  • RADEON X850 XT (HIS, PCI-E, 256MB GDDR3, 520/1080 MHz);
  • GeForce 6800 Ultra (ASUS EN6800 Ultra, PCI-E, 256MB GDDR3, 425/1100 MHz);
  • GeForce 7800 GTX (ASUS, PCI-E, 256MB GDDR3, 430/430/470/1200 MHz);
  • operating system Windows XP SP2; DirectX 9.0c;
  • monitors ViewSonic P810 (21 ") and Mitsubishi Diamond Pro 2070sb (21").
  • ATI drivers 6.553 (CATALYST 5.7); NVIDIA version 77.77.

VSync is disabled.

Temperatures were monitored throughout the tests; as expected, there were no particular spikes, with core heating peaking at about 80 degrees.

Overclocking. As is known, the ROP/shader units can operate only at 432, 459, 486 or 513 MHz (higher steps exist in principle, but such chips are unlikely to be found); only the geometry block allows smooth frequency adjustment. The card could not sustain 486/486/530 MHz precisely because of the first two blocks, so stable overclocking topped out at 459/459/510 MHz. The memory limit was 1120 MHz. We can therefore conclude that the 7800 GT really is built from GTX rejects: even with 20 pipelines, these chips barely hold 460 MHz in the ROP/shader domains.
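The 27 MHz quantization means any overclock target is effectively snapped to the nearest supported step. A hypothetical helper (the step list is from the review; the function itself is ours, for illustration) shows why a 470 MHz target falls back to 459 MHz:

```python
# The ROP/shader domain of the G70 runs only at discrete steps
# (per the review: 432, 459, 486, 513 MHz). Hypothetical helper.
SUPPORTED_MHZ = (432, 459, 486, 513)

def snap_down(requested_mhz: int) -> int:
    """Highest supported ROP/shader clock not exceeding the request."""
    usable = [f for f in SUPPORTED_MHZ if f <= requested_mhz]
    return max(usable) if usable else SUPPORTED_MHZ[0]

print(snap_down(470))  # 459 -- the clock this sample actually sustained
print(snap_down(490))  # 486 -- the next step up, unstable on this card
```

In other words, there is no point chasing clocks between the steps: anything short of 486 MHz lands on 459 MHz anyway.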

So, we have a GeForce 7800 GT. As you can see, the RivaTuner utility (by A. Nikolaychuk AKA Unwinder) recognized it easily and reported the correct number of pipelines and frequencies; monitoring works too (as seen above).

Of course, I tried to unlock the disabled quad and vertex pipeline, but, as noted above, the attempt failed: although the register can be reprogrammed, the pipeline count is hardwired into the chip itself. NVIDIA has rid itself of the "rake" it had been stepping on since the NV4x days. Back with the later NV40s, word appeared that unlocking the 6800 / 6800LE had already become impossible (quite possibly those cards already carried the NV48, made by TSMC at 0.11 micron, where the pipeline count is burned in during die testing or even earlier). Nor can the NV41 / NV42 chips used in the 6800LE (8 pipelines) be unlocked to 12 pipelines any more. So, for overclockers and those hoping to buy cheap and get the power of expensive cards for free, unpleasant times are coming.

Test results: performance comparison

As a toolkit, we used:

  • Tomb Raider: Angel of Darkness v.49 (Core Design / Eidos) - DirectX 9.0, Paris5_4 demo. Testing was carried out at the maximum quality settings; only the Depth of Field PS20 effect was turned off.
  • Half-Life 2 (Valve / Sierra) - DirectX 9.0, demos ixbt01, ixbt02, ixbt03. Testing was carried out at maximum quality with the -dxlevel 90 option; the map-view presets were removed from the dxsupport.cfg file.
  • FarCry 1.3 (Crytek / UbiSoft) - DirectX 9.0, multitexturing, three demos from the Research, Pier and Regulator levels (start the game with the -DEVMODE option); all test settings at Very High.
  • DOOM III (id Software / Activision) - OpenGL, multitexturing, test settings - High Quality (ANIS8x). A sample script automates the launch, increasing speed and reducing stutter (precaching). (Don't be alarmed by the black screen after the first menu - that is normal; it lasts 5-10 seconds, then the demo starts.)
  • 3DMark05 (FutureMark) - DirectX 9.0, multitexturing, test settings - trilinear.
  • F.E.A.R. (Multiplayer beta) (Monolith / Sierra) - DirectX 9.0, multitexturing, test settings - maximum, soft shadows.
  • Splinter Cell: Chaos Theory v.1.04 (Ubisoft) - DirectX 9.0, multitexturing, test settings - maximum, Shader Model 3.0 (for NVIDIA cards) / 2.0 (for ATI cards); HDR off!
  • The Chronicles Of Riddick: Escape From Butcher Bay (Starbreeze / Vivendi) - OpenGL, multitexturing, test settings - maximum texture quality, Shader 2.0.

I express my gratitude to Rinat Dosayev (AKA 4uckall) and Alexei Ostrovsky (AKA Ducce) for writing the demos for this game, and many thanks also to Alexey Berillo (AKA Somebody Else) for his help.

First of all, it should be said that the emulated 7800 GT is practically indistinguishable from the real product in performance. As for the rivalry with ATI, things are going very well: a victory even over the X850 XT PE. Naturally, the 7800 GT falls somewhere between the 6800 Ultra and the 7800 GTX. One very important factor to note: the memory bandwidth of the 7800 GT is lower than that of the 6800 Ultra! So in applications where performance depends heavily on memory bandwidth, it may even lag behind the NV45 (though such games are gradually fading away, with more and more titles mainly loading the shader units). This particular game depends entirely on the performance of precisely those units, which is why we see the 7800 GT leading the 6800 Ultra strongly despite the lower memory bandwidth.
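The bandwidth gap mentioned above is easy to verify from the reference memory clocks: peak bandwidth is simply the effective data rate times the bus width. A back-of-the-envelope sketch (the formula is standard; the clocks are the ones quoted in this article):

```python
# Peak memory bandwidth = effective data rate (MHz) x bus width (bits) / 8.
def bandwidth_gb_s(effective_mhz: int, bus_bits: int) -> float:
    return effective_mhz * 1e6 * bus_bits / 8 / 1e9

print(bandwidth_gb_s(1000, 256))  # GeForce 7800 GT:    32.0 GB/s
print(bandwidth_gb_s(1100, 256))  # GeForce 6800 Ultra: 35.2 GB/s
```

So the 7800 GT gives up about 3 GB/s to the 6800 Ultra, which is exactly why bandwidth-bound titles can still favor the older card.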

    Game tests that heavily load vertex shaders, mixed pixel shaders 1.1 and 2.0, active multitexturing.

    FarCry, Research

    Test results: FarCry Research

    FarCry, Regulator

    Test results: FarCry Regulator

    FarCry, Pier

    Test results: FarCry Pier

We can see that the 7800 GT emulation was successful here too: the performance difference between it and the real 7800 GT does not exceed 5%. As for the rest of the comparisons, first, the modes without AA+AF should not be considered at all, since everything there depends on the CPU; second, even AA+AF cannot load these cards fully, and unfortunately AA modes above 4x differ greatly between ATI and NVIDIA. I should also say that a year of researching FarCry performance has made it clear to us that this game has become the hobbyhorse of the Canadian company, which has polished its drivers for it to a shine - ATI products have long outperformed NVIDIA products of the same price category in this test. So the 7800 GT trailing the X850 XT PE, and its rough parity with the X850 XT, is no surprise. This is fine; besides, the price factor still favors the 7800 GT (and let's not forget the HDR potential this game offers on NVIDIA cards).

Here the difference between the emulation and the real 7800 GT reached 10%. What causes it? Hard to say; I suspect the drivers. Perhaps the game "probes" the cards and adapts to them (quite possibly the demo version already "knows" about the 7800 GTX but is unaware of the 7800 GT). Still, the difference is not that large. What is obvious is that memory bandwidth has already played a cruel joke on the 7800 GT in this test: see how boldly it performs at lower resolutions, and how sharply it loses ground at 1600x1200. However, playability in F.E.A.R. at maximum quality is such that you can only really play at 1024x768, and then only on the most powerful accelerators, so the loss at 1600x1200 is purely nominal.

    This is a new test, introduced into our toolkit for the first time; the game also loads accelerators very heavily and uses Shader Model 3.0. Unfortunately, although version 1.04 of the game adds Shader Model 2.0 support, it works only on ATI cards; NVIDIA cards get only shader models 1.1 and 3.0. Therefore, we had to compare products running different shader models, but with HDR off (HDR is also incompatible with AA).

    The situation is similar to the previous one: at high resolutions the 7800 GT is clearly limited by memory bandwidth. On the whole, though, the picture is more encouraging for the new product than above: even the X850 XT PE is practically left behind (except at 1600x1200 with AA + AF).

    Game tests that heavily load both vertex shaders and pixel shaders 2.0

    Half-Life2: ixbt01 demo

    The results are very similar to what we saw in TR:AoD, where almost everything depends on the power of the shader units.

    To sum up: the Palit GeForce 7800 GT 256MB is a very interesting product. The card demonstrated excellent speed, outperforming even the more expensive RADEON X850 XT PE, not to mention the GeForce 6800 Ultra and X850 XT. Further praise is superfluous; readers can see that the card earns its price, and price cuts are already on the horizon.

    A fly in the ointment for overclockers and lovers of freebies is that the disabled pipelines cannot be unlocked, but the card already justifies its cost 100% as is, and 150% if overclocked (just kidding :).

    Also worth noting are the compact size and quiet cooler. The 2D quality of this sample is simply excellent: at 1600x1200 and 100 Hz the clarity is top-notch (with the monitor specified in the testbed configuration).

    Therefore, if prices are not distorted and the product reaches the market at prices at least equal to the X850 XT PE, success is assured. Let me emphasize once again that at launch prices are never at the manufacturer's recommended level; they are always higher. So if we see these cards in August at $470-490 retail, that is fine; even now the 7800 GTX is still priced above its recommended level. And don't forget the support for Shader Model 3.0, which ATI's competing cards lack; this factor is becoming ever more relevant.

    We can congratulate NVIDIA on another successful novelty that the market should like. And we await price cuts on previous products, which will make them even more attractive in their price categories.

    A more complete comparison of this and other classes of video cards can be found in our 3Digest.

    Based on the results of this study, the NVIDIA GeForce 7800 GT 256MB receives an award in the "Original Design" nomination (for August), for creating a compact design with a quiet cooler for such a high-performance card. This is the first time a chipmaker has received this award!

    Processor for testing provided by the company

    Thanks to ZEON, and personally to Alexey Lebedev, for the provided video card

NVIDIA GeForce 7300 GS.

Specifications

GeForce 7300 GS

Processor codename G72
Core frequency in 3D mode 550 MHz
Chip manufacturing process 90 nm
Number of pipelines 4
Number of texture units (TMU) per pipeline 1
Maximum number of textures applied per pass 4
Pixel shader execution units 4
Maximum scene fill rate, million pixels / texels per second 2200/2200
Hardware T&L +
Number of vertex shader execution units 3
Geometry block performance, mln. vertices per sec. 413
Supported memory type DDR2 / DDR3
Bit depth (width) of the memory interface 64 bit
Maximum supported video memory 512 Mb
Memory frequency (DDR / 2) 405 MHz
Memory bandwidth 6.5 Gb / s
Split memory controller +
Shared memory cache +
Z-buffer compression +
Early hidden pixel clipping (HSR) +
Fast Z-buffer clearing +
Color information compression +
Fast color buffer clearing +
Normal map compression +
NVIDIA SLI support -
Supported computer interface PCI Express, 16x
TurboCache support +
Supported Vertex Shader version 3.0
Supported Pixel Shader version 3.0
Displacement mapping +
Environment cube maps +
Texture compression support +
Hardware support for paletted textures -
W-buffer support -
EMBM support +
UltraShadow support 2.0
Anisotropic filtering (AF) 2, 4, 8
Maximum color depth per channel 32
Extended dynamic range (HDR) rendering support +
Supported DirectX version 9.0+
Supported OpenGL version 2.0
Integrated RAMDAC frequency 2x400 MHz
Maximum resolution 2048 x 1536 x 85 Hz
Integrated TV-out support +
Integrated DVI support +
Integrated HDTV support +
MPEG4 / WMV decoding +
No additional power required +
Passive cooling possible +
Thermal monitoring +

GeForce Go 7400 -7800

Parameters GeForce Go 7800 GeForce Go 7600 GeForce Go 7400
Memory frequency, MHz 1100 1000 900
Core frequency, MHz 450 450 450
Processor codename G73 G73 G73
Memory size, Mb 256 128 64
Number of pipelines 8 pixel and 5 vertex (all models)
TMU (texture units) 8 8 8
ROP (raster operations) 8 8 8
Technology support HD PureVideo, Shader Model 3.0, PowerMizer, CineFX 4.0, HDR

GeForce 7800 Specifications

Specifications 7800 GS 7800 GT 7800 GTX 7800 GTX 512
Processor codename G70 G70 G70 G70
PCI Device ID 091 090 092
Core frequency in 3D mode, MHz 375 400* 430* 550
Chip manufacturing process, μm 0.11 0.11 0.11 0.11
Number of pipelines 16 20 24 24
Number of texture units (TMU) per pipeline 1 1 1 1
Maximum number of textures applied per pass 16 16 16 16
Pixel shader execution units 16 20 24 24
Maximum scene fill rate, million pixels / texels per second 3000/6000 6400/8000 6880/10320 8800/13200
Hardware T&L + + + +
Number of vertex shader execution units 6 7 8 8
Geometry block performance, mln. vertices per sec. 562 700 860 1100
Supported memory type DDR3 DDR3 DDR3 DDR3
Bit depth (width) of the memory interface, bit 256 256 256 256
Maximum supported video memory, Mb 512 512 512 512
Memory frequency (DDR / 2), MHz 600 500 600 850
Memory bandwidth, Gb / s 38.4 32.0 38.4 54.4
Split memory controller + + + +
Shared memory cache + + + +
Z-buffer compression + + + +
Early hidden pixel clipping (HSR) + + + +
Fast Z-buffer clearing + + + +
Color information compression + + + +
Fast color buffer clearing + + + +
Normal map compression + + + +
NVIDIA SLI support + + + +
Supported computer interface AGP 3.0 8x / PCI Express 16x / PCI Express 16x / PCI Express 16x
TurboCache support - - - -
Supported Vertex Shader version 3.0 3.0 3.0 3.0
Supported Pixel Shader version 3.0 3.0 3.0 3.0
Displacement mapping + + + +
Environment cube maps + + + +
Texture compression support + + + +
Hardware support for paletted textures - - - -
W-buffer support - - - -
EMBM support + + + +
UltraShadow support 2.0 2.0 2.0 2.0
Anisotropic filtering (AF) 2, 4, 8 2, 4, 8 2, 4, 8 2, 4, 8
Full-screen anti-aliasing (FSAA) multisampling modes 2x, Quincunx, 4x, 4xS, 6xS, 8x, 16x (all models)
Maximum color depth per channel 32 32 32 32
Extended dynamic range (HDR) rendering support + + + +
Supported DirectX version 9.0+ 9.0+ 9.0+ 9.0+
Supported OpenGL version 2.0 2.0 2.0 2.0
Integrated RAMDAC frequency, MHz 2x400 2x400 2x400 2x400
Maximum resolution 2048 x 1536 x 85 Hz (all models)
Integrated TV-out support + + + +
Integrated DVI support + + + +
Integrated HDTV support + + + +
MPEG4 / WMV decoding + + + +
No additional power required - - - -
Passive cooling possible - - - -
Thermal monitoring Yes Yes Yes Yes

*The geometry unit operates at a frequency increased by 40 MHz.
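The fill-rate and bandwidth rows in the table above follow directly from the clock speeds and unit counts, so they can be sanity-checked with a few lines of arithmetic (a rough sketch of peak figures; real-world throughput is lower):

```python
# Sanity-check the 7800 spec table: peak texel fill rate and memory bandwidth
# are simple products of clocks and unit counts / bus width.

def texel_fillrate_mtex(tmus: int, core_mhz: int) -> int:
    """Peak texture fill rate in megatexels/s: one texel per TMU per clock."""
    return tmus * core_mhz

def mem_bandwidth_gbs(bus_bits: int, effective_mhz: int) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x effective DDR clock."""
    return bus_bits / 8 * effective_mhz / 1000

# 7800 GT: 20 TMUs at 400 MHz, 256-bit GDDR3 at 1000 MHz effective
assert texel_fillrate_mtex(20, 400) == 8000   # 8000 Mtex/s, as in the table
assert mem_bandwidth_gbs(256, 1000) == 32.0   # 32.0 GB/s

# 7800 GTX: 24 TMUs at 430 MHz, memory at 1200 MHz effective
assert texel_fillrate_mtex(24, 430) == 10320
assert mem_bandwidth_gbs(256, 1200) == 38.4

# 7800 GTX 512: 24 TMUs at 550 MHz, memory at 1700 MHz effective
assert texel_fillrate_mtex(24, 550) == 13200
assert mem_bandwidth_gbs(256, 1700) == 54.4
```

The same two formulas reproduce every bandwidth and texel-fill-rate entry in the table, which is a quick way to spot transcription errors in such specification lists.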


GeForce 7800 GT from Albatron, AOpen, MSI

Manufacturer Model Core MHz Memory MHz
Albatron 7800 GT 430 1200
AOpen Aeolus 7800 GT-DVD256 / DVDC256 430 1100
ASUStek Computer Extreme N 7800 GT 400 500
Chaintech computer AE78GT 400 1000
Gigabyte Technology GV-NX78T256V-B 400 500
Micro-Star International (MSI) NX 7800 GT 430 1000

All models are equipped with 256MB GDDR3 memory (except ASUS), a 400MHz RAMDAC (max. 2048 x 1536), DVI-out and VIVO.

Leadtek WinFast PX7800 GT Extreme and WinFast PX7800 GT
Leadtek Research has announced the high performance WinFast PX7800 GT Extreme and WinFast PX7800 GT graphics cards based on NVIDIA's GeForce 7800GT GPU.

The WinFast PX7800 GT series offers high image quality and roughly double the performance over the previous generation of video adapters.

Second-generation NVIDIA SLI technology allows you to combine the power of two GPUs in one PC for maximum frame rates and high-quality 3D rendering.

To support this already high performance, Leadtek has developed an advanced cooling system with a below-average noise level. Leadtek Advance Cooling will be available in September.

With VIVO function, the WinFast PX7800 GT series supports HDTV output, which allows you to connect a high definition TV to your PC to watch video on your TV screen. This feature, paired with video capture software, will allow you to record video from VHS, VCD, DVD, camcorder or other sources on your PC, as well as directly to disc (CD / DVD).

Leadtek includes popular video editing programs such as Video Studio 8, DVD Movie Factory 3 SE, and PowerDVD 6.0 software DVD player. The WinFast PX7800 GT series also comes with two games: "Splinter Cell Chaos Theory" and "Prince of Persia: Warrior Within".

WinFast PX7800 GT Extreme and WinFast PX7800 GT specifications:

Model PX7800 GT / PX7800 GT Extreme
GPU NVIDIA GeForce 7800 GT
Bus PCI-Express x16
Memory (size, type) 256 MB DDR3
Memory interface 256-bit
Memory bandwidth, GB / s 32
Core / memory frequency, MHz 400/500 450/525
RAMDAC, MHz 400 400
Connectors, outputs D-SUB / HDTV / DVI-I

The 7800 GT graphics processor operates at 400 MHz; the GDDR3 memory, naturally, has a 256-bit interface and runs at 1000 MHz. The scene fill rate is 8 gigatexels/s, the vertex unit processes 700 million vertices/s, and the memory bandwidth is 32 GB/s.

An interesting fact: the 1.6 ns memory, rated for 1200 MHz by specification, is underclocked on GeForce 7800 GT series cards.

ASUS on GeForce 7800 GTX EN7800GTX
The ASUS EN7800GTX video card has a GPU frequency raised relative to the reference design - 470 MHz; the DDR3 memory runs at the standard frequency for these cards - 1.2 GHz. The EN7800GTX features an effective cooling system of ASUS's own design.

In addition to the standard features inherent in all G70-based cards, the video card offers exclusive technologies and functions from ASUS and new games in the bundle.

Splendid Video Enhancement is a technology built into ASUS graphics card drivers that automatically optimizes picture quality by visibly increasing color intensity.

GameFace Messenger - This technology from ASUS will allow the user to invite other players to join the game over the network. You just need to register - GameFace Messenger technology is as easy to use as messaging services.

GameLiveShow is another new feature on ASUS graphics cards that allows gamers to share their experiences on the Internet. Other players can watch the professionals play online.

GameReplay technology allows you to record episodes of the game in MPEG4 format, and players can watch all games to account for their mistakes and prepare for the next competition. In addition, the recorded files can be used as screensavers or online diary entries.

Games and software included: Project Snow Blind, XPand Rally, Joint Operations, ASUS DVD, Media & Show, Power Director.

Specifications:

Model EN7800GTX / 2DHTV / 256M
GPU GeForce 7800 GTX
Video memory 256 MB DDR3 (1.6 ns)
Graphics core frequency 470 MHz
Memory frequency 1.2 GHz (600 MHz DDR3)
RAMDAC 400 MHz
Bus PCIe
Memory interface 256-bit
Maximum resolution 2048x1536
VGA out via DVI-VGA adapter
TV-out video input and video output (VIVO) with HDTV support
DVI out dual DVI-I
2nd VGA out yes
Adapters and cables included VIVO cable, DVI adapters, power cord

24.06.2005 PROLINK Microsystems releases a video adapter based on NVIDIA's flagship chip - the PixelView GeForce 7800 GTX.
Specifications:

  • NVIDIA technologies: CineFX 4.0, Intellisample 4.0, PureVideo and SLI mode
  • Microsoft DirectX 9.0 Support (Shader Model 3.0)
  • Longhorn OS Support
  • 64-bit texture filtering
  • Supports HDTV, WMV9 1080p format

Specification:

Bus type PCI Express
Memory interface 256-bit
Memory bandwidth, GB / s 38.4
Scene fill rate, billion pixels / s 10.32
Vertex unit, million vertices / s 860
Number of pixel pipelines 24
RAMDAC, MHz 400

The PixelView GeForce 7800 GTX video adapter is available now; its part number is PV-N70GXE (256JV).

Aeolus 7800GTX by AOpen

23/06/2005 AOpen releases a card based on the latest nVidia GeForce 7800GTX chip - the Aeolus 7800GTX-DVD256. The card is optimized for 64-bit systems; Longhorn compatibility is expected. The novelty, of course, supports SLI technology and is equipped with 256 MB of GDDR3; the launch date is not yet known. The game Second Sight will be included in the bundle.
Main characteristics of the chip:
- manufacturing process: 0.11 microns;
- chip / memory frequency 430MHz / 1200MHz;
- the number of transistors: 302 million;
- memory bus: 256 bit;
- memory bandwidth: 38.4 GB / s;
- used memory: GDDR3;
- number of pixel pipelines: 24;
- number of vertex shaders: 8;
- two built-in RAMDACs of 400 MHz each;
- interface: PCI Express x16;
- support for API DirectX 9.0c and OpenGL 2.0; support for CineFX 4.0 technologies, IntelliSample 4.0 and 64-bit texture filtering.

Just last week we examined and tested two new budget-segment video cards, the NVIDIA GeForce 6600 DDR2 and the ATI Radeon X1300 Pro. Let me remind you that the card based on the NVIDIA chip won a complete and unconditional victory there. Today's article is devoted to a battle of video cards that, while not Top-End, are still quite expensive, falling in the retail range of 400 to 500 US dollars: the ATI Radeon X1800 XL and the NVIDIA GeForce 7800 GT. Since GeForce 7800 GT series cards are already well known to us from the articles on the LeadTek WinFast PX7800 GT TDH MyVIVO 256 Mb and the Gainward PowerPack! Ultra 3400 / PCX Golden Sample 256 Mb, the review part of this article covers only the new product on the ATI chip - the Sapphire Radeon X1800 XL 256 Mb. But first, I suggest you look at the technical characteristics of the Radeon X1800 XL in comparison with a reasonably fast previous-generation ATI card, the Radeon X800 XL, and the Radeon X1800 XL's current rival, the NVIDIA GeForce 7800 GT.

Technical characteristics of ATI Radeon X1800 XL versus ATI Radeon X800 XL and NVIDIA GeForce 7800 GT

Specifications ATI Radeon X800 XL ATI Radeon X1800 XL GeForce 7800 GT
Graphics chip R430 R520 G70 (NV47)
Technological process, microns 0.11 0.09 0.11
Number of transistors, million 160 321 302
Operating frequencies (GPU / memory), MHz 400/1000 500/1000 400/1000 (GPU at 275 MHz in 2D mode)
Memory size, Mb 256 / 512 256 256
Memory type and bus width GDDR-3, 256 Bit GDDR-3 (4), 256 Bit GDDR-3, 256 Bit
Interface PCI-Express x16 PCI-Express x16 PCI-Express x16
Number of pixel pipelines, pcs. 16 16 20
TMU per pipeline, pcs. 1 1 1
Number of vertex processors, pcs. 6 8 7
Pixel Shaders version support 2.0b 3 3
Vertex Shaders version support 2.0b 3 3
Theoretical fillrate (Fillrate), Mpix./s 6400 8000 8000
Memory bandwidth, Gb / s 31.3 31.3 31.3
DirectX version support 9 9.0c 9.0c
Power consumption in nominal operating mode, W ~100 ~80
Power supply requirements, W ~350 ~350 ~ 350 (~ 550 for SLI)
Dimensions of the reference design video card, mm (L x H x T) 205 x 100 x 15 205 x 100 x 15
Connectors 2 x DVI, TV-Out, HDTV-Out, VIVO support 2 x DVI, TV-Out, HDTV-Out, SLI, HDCP support
Estimated cost * at the time of publication of the article, US dollars
* - according to www.price.ru data.

Meet Sapphire Radeon X1800 XL 256 Mb

Sapphire is the official partner of ATI in the production of video cards and motherboards based on chips from the Canadian company, so it is not surprising that products of the new Radeon X1000 (X1K) line are starting to appear on the market under this brand. For testing, we were provided with a Sapphire Radeon X1800 XL video card in a cardboard box, made in light colors:

The front side of the package depicts an alien creature, this time with scissor-like fingers and a formidable gaze. Besides the name of the video card, the front side carries: an ATI certificate; the amount of video memory; information about the two digital outputs, the TV-output and VIVO support; a short list of the software supplied with the Sapphire Radeon X1800 XL; a Sapphire Select sticker with a list of games; and information on the card's support for CrossFire technology.

On the back of the box is a large photo of the board, and in addition to everything that is listed on the front side, the frequency of the video memory is indicated:

Completing the picture is a list of components, information on new graphics chip technologies and a list of awards that Sapphire has received from various print and online publications for its products since 2002.

As it turned out, inside a colored box made of glossy cardboard there is another one, made simpler, in which, in specially designated compartments, there are a video card and various accessories:

You have already looked at the delivery set, it remains only to list what is included in it:

  • splitter adapter for HDTV and VIVO;
  • S-Video cable;
  • HDTV cable and adapter;
  • one 15 pin DVI / D-Sub adapter;
  • cable for connecting additional power to the video card;
  • CD with drivers and proprietary utility triXX;
  • CD with a licensed copy of PowerDVD 6;
  • CD with a licensed copy of Power Director;
  • DVD "Sapphire Select";
  • user manual in six languages for installation and operation of the video card.

Contrary to the opinion prevailing in our forum that the Radeon X1800 XT and XL are very large, I will say that the dimensions of the Sapphire Radeon X1800 XL 256 Mb tested today match those of the reference GeForce 7800 GTX (GT), at 205 x 100 x 15 mm (L x H x T). In addition, thanks to the compact cooling system, when installed in the motherboard's PCI-Express slot the card does not block the adjacent PCI slot, its edge does not extend beyond the motherboard, and it does not press against the hard drive cage.

The Sapphire Radeon X1800 XL video card is built on a red PCB:

There is a slightly crooked sticker on the cooling system cover bearing the alien face and the Sapphire logo. Under the sticker are Ruby and the ATI logo.

The rear part of the board's front side carries a narrow vertical aluminum radiator covering the power circuitry of the video card. A six-pin connector for additional power is also located there.

Even without removing the cooling system, but just looking at the reverse side of the board, you realize that all the video memory chips are located on the front side. The reverse side of the video card got practically nothing:

The cruciform metal plate is equipped with plastic pads designed to prevent the chip from being skewed or chipped.

The video card is equipped with two digital (DVI-I) outputs and a TV-out:

Removing the video card cooling system:

Eight GDDR-3 memory chips are fanned out around the diagonally rotated graphics processor. This arrangement of the memory chips is similar to the approach used on currently produced video cards based on NVIDIA GeForce 7800 GTX (GT) series chips; it is needed to ensure stable operation of GDDR-3 memory at frequencies of 1000-1200 MHz and higher.

To the left of the GPU is the ATI Rage Theater chip responsible for capturing and processing video images. Judging by the marking, the chip was released in Taiwan on the 38th week of this year.

The R520 GPU, also Taiwanese, is slightly older - its release date falls on the 36th week of 2005:

The nominal frequency of the chip is 500 MHz; its technical characteristics, I think, you have already seen in the table above.

The card's 256 Mb of video memory is made up of eight standard GDDR-3 chips with a nominal access time of 1.4 ns:

The default video memory frequency is 1000 MHz, which is noticeably lower than the theoretical ~1400 MHz. The memory is manufactured by Samsung; the chip marking is K4J55323QG-BC14, and its specifications can always be downloaded from the official website (PDF, 1.42 Mb).
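The "theoretical ~1400 MHz" figure follows directly from the access time printed in the marking: a 1.4 ns cycle corresponds to roughly 714 MHz real clock, i.e. about 1400 MHz effective for DDR. A quick sketch of that arithmetic:

```python
# Rated memory speed from access time: f = 1/t, doubled for DDR transfers.
access_time_ns = 1.4                 # Samsung K4J55323QG-BC14, per the chip marking

real_mhz = 1000 / access_time_ns     # 1 / 1.4 ns, expressed in MHz
effective_mhz = 2 * real_mhz         # DDR transfers data twice per clock

print(round(real_mhz))               # ~714 MHz real clock
print(round(effective_mhz))          # ~1429 MHz effective, the "theoretical ~1400 MHz"
```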

Consider the cooling system for the Sapphire Radeon X1800 XL 256 Mb video card:

It is based on a copper radiator with two copper U-shaped heat pipes that remove heat from the base of the radiator and distribute it evenly over its fins:

From above, the entire structure is covered by an aluminum casing, and on the left edge there is a small straight-blade turbine designed to drive air between the radiator fins, thus cooling the entire structure, including the video card memory. I must say that the memory, through rather thick (about 1.5 mm) thermal spacers, contacts only the aluminum casing.

Turning on the system unit for the first time with the Radeon X1800 XL installed, I initially thought I had swapped my quiet Thermaltake Big Typhoon for a GlacialTech Igloo 7300 Pro - the card's turbine is that loud. A couple of seconds after start-up the noise dies down and becomes inaudible against the background of the system unit, but 10-15 minutes into testing or a game it becomes unbearable again. Interestingly, the turbine speed is regulated in abrupt steps, and by ear there are only two of them: very quiet and very loud. Unfortunately, the efficiency of the stock Sapphire Radeon X1800 XL 256 Mb cooling system could not be evaluated due to the lack of programs for monitoring GPU and memory temperatures on the new Radeon X1000 chips (the proprietary triXX utility does not recognize the new Sapphire card either). However, a program for overclocking is already available.

With the turbine of the cooling system manually set to maximum speed, the Sapphire Radeon X1800 XL was overclocked to 560/1470 MHz (graphics processor / video memory):

GPU overclocking turned out to be negligible: most likely, the higher-frequency chips are binned for the Radeon X1800 XT, and what remains goes into Radeon X1800 XL production. Hopefully ATI will refine the process over time and the yield of fast GPUs will increase. Unlike the GPU, the memory was frankly pleasing: a 47% frequency increase without a voltmod or a software voltage bump is a rare sight when overclocking video cards. Then again, with a nominal access time of 1.4 ns, the video memory practically begs to be overclocked.
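The 47% memory headroom (and the far smaller GPU gain) is simply the overclocked frequency expressed relative to the nominal one; as a sketch:

```python
# Overclocking headroom as a percentage of the nominal frequency.
def gain_pct(nominal_mhz: int, overclocked_mhz: int) -> float:
    return (overclocked_mhz - nominal_mhz) / nominal_mhz * 100

# Sapphire Radeon X1800 XL: 500/1000 MHz nominal -> 560/1470 MHz overclocked
print(round(gain_pct(500, 560)))    # 12, the modest GPU gain
print(round(gain_pct(1000, 1470)))  # 47, the memory headroom noted above
```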

The GPU and memory voltages were not raised during overclocking. As I said above, in the absence of monitoring programs such an experiment on a loaned card is risky to a degree; besides, extreme enthusiasts have to be left something to do.

Testbed configuration, testing methodology, drivers, benchmarks and games

All tests were carried out in a closed case of the system unit of the following configuration:

  • Motherboard: ASUS A8N-SLI rev. 1.02 (nForce 4 SLI), Socket 939, BIOS v.1013.
  • Processor: AMD Athlon 64 3000+ 1800 MHz, 512 Kb, Cool & Quiet Disable (Venice, E3).
  • Cooling system: Thermaltake Big Typhoon (CL-P0114), ~ 1350 RPM, ~ 16 dBA.
  • Thermal interface: KPT-8 (OOO "Himtek").
  • RAM: 2 x 512 Mb PC3200 Corsair TWINXP1024-3200C2.
  • Disk subsystem: SATA 200 Gb Seagate Barracuda 7200.8 (3200826AS) 7200 RPM, 8 Mb.
  • Drive: DVD ± R / RW & CD-RW TSST SD-R5372.
  • Case: ATX ASUS ASCOT 6AR2-B Black & Silver + 420 W power supply (Thermaltake W0009) + two 120 mm Sharkoon Luminous Blue LED fans (~ 1000 RPM, ~ 21 dBA).
  • Monitor: LCD DELL 1800 / 1FP UltraSharp (1280x1024, DVI, 60 Hz).

The AMD Athlon 64 3000+ 1800 MHz processor of the E3 stepping was overclocked to 2790 MHz when the voltage was increased from the nominal 1.4 V to the maximum allowable motherboard 1.55 V:

During overclocking, the Corsair RAM operated at 199 (398) MHz with 2-2-2-8-1T timings at 2.7 V:

Windows XP Professional Edition Service Pack 2 was chosen as the operating system, installed on the first 8 Gb partition of the hard disk. All benchmarks and games were installed on the third partition, 65 Gb in size. To minimize outside influence on the results during the video card speed tests, all services except four necessary ones were disabled, and no extra programs were installed (a "clean" tray). The system was tuned for maximum performance.

Testing was carried out with NVIDIA nForce 6.70 system drivers and the DirectX 9.0c libraries (April 2005 release). ATI Catalyst 5.10a and NVIDIA ForceWare 81.85 were used as video card drivers, with trilinear and anisotropic filtering optimizations activated. Vertical synchronization (VSync) was forcibly disabled in the ForceWare control panel and via ATI Tray Tools. In addition, Catalyst A.I. was set to "High" mode in the Catalyst drivers.

Video card performance was tested at two resolutions, 1024x768 and 1280x1024 (1280x960 in two games), in the following driver settings modes:

  • "Quality" - driver settings at "Quality", AF off, AA off.
  • "Quality + AF16x" - driver settings at "Quality", AF 16x, AA off.
  • "Quality + AF16x + AA4x" - driver settings at "Quality", AF 16x, AA 4x.

The set of synthetic, semi-synthetic benchmarks and games, with the exception of three gaming applications, remained the same:

  • 3DMark 2003- build 3.6.0, 1024x768, default settings.
  • 3DMark 2005- build 1.2.0, 1024x768, default settings.
  • Aquamark 3- tests were carried out only in 1024x768 resolution.
  • Far Cry(DirectX 9.0c) - game version 1.33 (build 1337), "Volcano" level, the eponymous demo recording from Ubisoft, maximum detail, two consecutive benchmark runs, "Geometry Instancing" and "Normal-maps compression" technologies activated.
  • Half-life 2(DirectX 9.0c) - game version 1.0.1.0, for testing a demo record "d1_canals_09" that is quite demanding on the performance of the video card was selected and maximum settings graphics in the game itself (presets for different types of cards have been removed in the dxsupport.cfg file).
  • The Chronicles Of Riddick: Escape From Butcher Bay(OpenGL) - game version 1.0.0.1, maximum graphics quality, Shader 2.0, demo "ducche".
  • Quake 4(OpenGL) - game version 1.0.0.0 build 2147, own recorded demo at the "Hangar Perimeter" level, detailed graphics in the game - "Ultra Quality", double demo recording to minimize the dependence of the results on the speed of the hard disk.
  • F.E.A.R.(DirectX 9.0c) - game version 1.01, built-in benchmark, all testing settings are set to "Maximum", Soft Shadows = On.

The changes are the exclusion of DOOM 3, since Quake 4, which uses a similar graphics engine, was added to the testing, and the exclusion of Battlefield 2, since the benchmark in that game is a 3D camera flight over the battlefield and has little in common with real gameplay. In addition, Battlefield 2 results are poorly repeatable, and even on the weakest card in today's testing - the Radeon X800 XL - the benchmark reported about 270 frames per second (1280x1024, AF 16x, AA 4x). Such results, in my opinion, are not at all indicative for our purposes.

Performance of ATI Radeon X1800 XL in nominal operating mode and under overclocking compared to NVIDIA GeForce 7800 GT and ATI Radeon X800 XL

In addition to ATI Radeon X1800 XL and NVIDIA GeForce 7800 GT, one more video card will take part in today's testing - ATI Radeon X800 XL.

Why, for comparison with the Radeon X1800 XL, did I choose the Radeon X800 XL rather than the Radeon X850 XT PE, the fastest graphics card of the previous generation? The point is not only the matching indexes of the new and old ATI cards, but above all the positioning of these cards within the old line and the new Radeon X1K line. Among the ATI X8XX series, the top model to this day is the Radeon X850 XT PE with frequencies of 540/1180 MHz, while the Radeon X800 XL runs at much lower GPU and memory frequencies - 400/1000 MHz. Drawing parallels with the new line of ATI chips, we find that the Radeon X1800 XT at 625/1500 MHz is the Hi-End part replacing the Radeon X850 XT PE, while the Radeon X1800 XL at 500/1000 MHz is the direct replacement for the Radeon X800 XL. That is why the Radeon X800 XL was chosen for comparing the old and new ATI lines. Of course, the price aspect should not be forgotten either: the difference in cost between the Radeon X850 XT PE and the Radeon X800 XL with PCI-Express interface and 256 Mb of memory today reaches 130-150 US dollars, while a gap of only 50 US dollars was announced between the Radeon X1800 XL and the XT version (449 and 499 US dollars respectively, for 256 Mb cards). What the situation will be once these cards appear in retail in volume, we can only guess.

Given that the Radeon X1800 XL and GeForce 7800 GT were tested both at nominal frequencies and overclocked, why not overclock the Radeon X800 XL's GPU and memory as well? In my opinion, it is more interesting to run the Radeon X800 XL at frequencies identical to the Radeon X1800 XL's - 500/1000 MHz - in order to see, so to speak, the "pure" difference in performance between the new and old generations of video cards.

You already know about the results of the Sapphire Radeon X1800 XL 256 Mb overclocking, so it remains to add that the reference NVIDIA GeForce 7800 GT 256 Mb from the nominal frequencies of 400/1000 MHz was overclocked to 459 MHz in the graphics processor and 1175 MHz in the video memory:

Let's look at the results of synthetic and semi-synthetic benchmarks.

3DMark 2003

The advantage of the Radeon X1800 XL over its predecessor, the Radeon X800 XL, is obvious and reaches 28% even at equal frequencies. The comparison of the new ATI card with the NVIDIA GeForce 7800 GT is more interesting. In modes that load the video card less, without full-screen anti-aliasing, the GeForce 7800 GT is ahead; enabling anti-aliasing makes the Radeon X1800 XL the leader both at nominal frequencies and overclocked. The advantage is not large, but it is there.

3DMark 2005

But in 3DMark 2005, the Radeon X1800 XL outperforms today's competitors in all modes. While the difference in speed between the GeForce 7800 GT and the Radeon X1800 XL is not that great, the Radeon X800 XL's lag behind the latter is very significant, reaching almost 60%. But this is all pure synthetics; let's look at the results of a benchmark built on the engine of a real game, Aquanox - Aquamark 3.

Aquamark 3

Obviously, video card performance in Aquamark 3 is limited by the speed of the central processor. Nevertheless, the GeForce 7800 GT is slightly ahead of its rival, the Radeon X1800 XL. Now let's move on to the game test results.

Far cry

With your permission, I will not comment on the results obtained at 1024x768, as they are too processor-dependent even on an AMD Athlon 64 3000+ @ 2.8 GHz; testing at 1280x1024 is, in my opinion, the most indicative. When the Radeon X800 XL and Radeon X1800 XL are compared at the same frequencies, it becomes obvious that the full power of the new graphics core shows itself in modes with full-screen anti-aliasing, where the Radeon X1800 XL's advantage over the Radeon X800 XL reaches 16% - not that much, given the difference in the cards' cost. If we pit the new ATI card against the GeForce 7800 GT, we observe exactly the same tendency as in 3DMark 2003: a slight lag in light modes and victory with AA 4x.

Half-Life 2

Overall, Half-Life 2 shows parity between the ATI Radeon X1800 XL and the NVIDIA GeForce 7800 GT: depending on the quality mode, the advantage swings from one card to the other and back. It is more interesting to see how much faster the new R520 GPU is than the old R480: the Radeon X800 XL lags behind the newcomer by 38% at 1280x1024 with maximum-level anisotropic filtering and full-screen anti-aliasing.

The Chronicles Of Riddick: Escape From Butcher Bay

A complete defeat for ATI, and a minimal speed difference between the Radeon X800 XL and Radeon X1800 XL. In my opinion, the reason lies not so much in this game being optimized for NVIDIA chips as in refinements to the ForceWare drivers, since such a large NVIDIA advantage in The Chronicles Of Riddick: Escape From Butcher Bay was not observed before.

The games discussed above, absolute hits though they are, already belong to the past. Numerous add-ons for Far Cry and Half-Life 2 have prolonged their popularity, but time moves on: it is time to give way to new titles. Call of Duty 2, Quake 4, and F.E.A.R. have taken their place, and we will now look at the test results in the last two of these.

Quake 4

Quake 4, having inherited the DOOM 3 engine, remains partial to graphics cards based on NVIDIA chips. The GeForce 7800 GT leads, and by a significant margin, while the speed difference between the Radeon X800 XL and Radeon X1800 XL running at the same frequencies is minimal.

F.E.A.R.

In F.E.A.R., however, the gap between the new-generation ATI GPU and the old one is quite significant, reaching 29% in some modes. It would seem that all that remains is to add that the NVIDIA GeForce 7800 GT at stock clocks outperforms even an overclocked Radeon X1800 XL, but...

As often happens, after all the tests had been run and the review cards returned, news broke that Catalyst drivers, including the just-released Catalyst 5.11, contain a bug causing a performance drop in F.E.A.R. Strictly speaking, it is hard to call it a bug. Catalyst drivers (like ForceWare) contain optimizations for various games and benchmarks, and in the case of F.E.A.R. it turned out that the "optimizations" that worked in the demo version of the game cause a performance drop in the final F.E.A.R. release. By some estimates, the drop reaches 25-30%. To verify this and disable the optimizations, it is enough to rename the executable file fear.exe to, for example, fears.exe. Since the Sapphire Radeon X1800 XL had already been returned by then, the tests with the renamed file were run only on the Radeon X850 XT PE, and here is what happened:

The gain, as they say, is plain to see. It should be added that the bug has not been fixed even in the latest Catalyst 5.11 release. We await the New Year's release of the Catalyst drivers and a speed boost for the new line of ATI-based graphics cards not only in F.E.A.R. (without quality degradation, of course).
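The workaround itself is trivial: the driver's per-game "optimization" profile keys on the executable's file name, so renaming the file makes the profile stop matching. A minimal sketch of the idea (the directory and file here are stand-ins for illustration, not the real install path):

```python
import os
import tempfile

# Stand-in for the game's install directory (hypothetical, for the demo).
demo_dir = os.path.join(tempfile.gettempdir(), "fear_demo")
os.makedirs(demo_dir, exist_ok=True)

# Stand-in for the real game binary.
old_path = os.path.join(demo_dir, "fear.exe")
new_path = os.path.join(demo_dir, "fears.exe")
open(old_path, "wb").close()

# The rename: the driver profile looks for "fear.exe", so after this
# the game-specific "optimizations" no longer apply.
os.rename(old_path, new_path)

print(os.path.exists(new_path), os.path.exists(old_path))  # → True False
```

Launching the renamed executable then runs the game without the game-specific driver profile.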

In conclusion, I think you will be interested in the table of speed differences between the Radeon X1800 XL 256 MB and the Radeon X800 XL, both running at 500/1000 MHz:

Benchmarks & Games              | Resolution | Quality | Quality + AF 16x | Quality + AF 16x + AA 4x
Aquamark 3                      | 1024x768   | 3.10%   | 3.30%            | 5.10%
3DMark 2003                     |            | 12.40%  | 18.90%           | 28.40%
3DMark 2005                     |            | 24.20%  | 51.10%           | 59.50%
Far Cry                         | 1024x768   | -1.00%  | 3.10%            | 17.80%
                                | 1280x1024  | 0.50%   | 7.00%            | 15.90%
Half-Life 2                     | 1024x768   | 1.90%   | 28.70%           | 33.40%
                                | 1280x1024  | 7.60%   | 36.20%           | 38.30%
The Chronicles Of Riddick: EFBB | 1024x768   | 5.90%   | 5.70%            | -3.10%
                                | 1280x1024  | 7.10%   | 7.10%            | -5.30%
Quake 4                         | 1024x768   | 1.80%   | 2.70%            | -1.70%
                                | 1280x1024  | 6.40%   | 7.00%            | -3.30%
F.E.A.R.                        | 1024x768   | 29.00%  | 29.00%           | 11.10%
                                | 1280x1024  | 19.00%  | 19.00%           | 17.60%
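The percentages in the table are the usual relative gain, (new − old) / old × 100. A minimal sketch of the computation with made-up frame rates (the fps numbers below are illustrative, not measured results):

```python
def percent_gain(fps_new: float, fps_old: float) -> float:
    """Relative speed difference, as used in the table above."""
    return (fps_new - fps_old) / fps_old * 100.0

# Illustrative numbers only -- not actual benchmark results.
x800_xl_fps = 62.0   # hypothetical Radeon X800 XL result
x1800_xl_fps = 80.0  # hypothetical Radeon X1800 XL result
print(f"{percent_gain(x1800_xl_fps, x800_xl_fps):.2f}%")  # → 29.03%
```

A negative value, as in some Riddick and Quake 4 rows, simply means the Radeon X800 XL was the faster card in that mode.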

All is lost or is there still a chance?

So what do we end up with? In games originally developed with NVIDIA-based graphics cards in mind, the Radeon X1800 XL loses, sometimes quite significantly, to its competitor, the GeForce 7800 GT. In other games we see rough parity, and in modes with full-screen anti-aliasing, an advantage for the Radeon X1800 XL. Of course, ATI still has a trump card up its sleeve: Catalyst drivers optimized for the new line of graphics cards. The important thing is that this card does not turn out to be a cheat that degrades image quality in games and synthetic benchmarks. NVIDIA, in my opinion, holds two more significant advantages: the GeForce 7800 GT has been freely available for four months (and at a price lower than the one announced for the Radeon X1800 XL), and setting up SLI does not require hunting for a "special" card, as CrossFire does; all you need is to buy a second GeForce 7800 GT.

ATI understands perfectly well that time is now on NVIDIA's side, and even if it manages to saturate the market with the new cards by the New Year, in light of the factors above it is far from obvious that buyers will prefer the new ATI products. It is therefore quite logical that all efforts are now focused on revising the new ATI R580 graphics chip and launching it into mass production.

According to the latest information, which appeared while this article was being prepared, ATI has decided to cut the recommended price of the Radeon X1800 XL from $449 to $399. This move should most likely help boost demand for Radeon X1800 XL cards and strengthen their competition with the GeForce 7800 GT.

It remains to check the balance of forces in the middle price range, where a "battle" between the ATI Radeon X1600 XT and the NVIDIA GeForce 6800 GS awaits us.
