
Built-in Intel HD Graphics. Comparison of mobile and desktop video cards from Intel, AMD and Nvidia

The evolution of Intel graphics | Intel enters the GPU race

In the GPU world, AMD and Nvidia take center stage in terms of performance and the attention their products receive. But although these companies are famous for their technology, neither of them is actually the largest supplier of GPUs. That title belongs to Intel. The corporation has tried to compete with AMD and Nvidia on performance and has occasionally even released full-fledged video cards, but its real strength lies in integrating graphics into its chipsets and processors. As a result, Intel GPUs are present in most modern computers. Because of the limitations of integrated solutions, the company's graphics modules have tended to offer entry-level performance, yet the most recent developments are noticeably more impressive: some even outperform entry-level discrete graphics cards from AMD and Nvidia. Intel HD Graphics may still trail other GPUs, but the days of the GMA 950 and its predecessors are clearly over.

The evolution of Intel graphics | Intel's first dedicated GPU: i740 (1998)

In 1998, Intel released its first graphics card, the i740, codenamed "Auburn". It ran at a clock frequency of 220 MHz and carried a relatively small 2 to 8 MB of video memory, while comparable cards of the time typically shipped with 8 to 32 MB. The card supported DirectX 5.0 and OpenGL 1.1. To get around the lack of on-board memory, Intel planned to exploit a feature of the AGP interface that allowed the card to use the computer's RAM: the i740 used its local memory as a frame buffer and stored all textures in system RAM. Since the company did not have to pay for expensive memory chips, it could sell the i740 cheaper than its competitors. Unfortunately, this GPU ran into a number of difficulties. System RAM could not be accessed as quickly as local video memory, which hurt performance, and the approach also slowed the CPU, which was left with less memory bandwidth to work with. Immature drivers hurt the card further, and image quality was questionable due to the slow RAMDAC (digital-to-analog converter). Ultimately, the i740 was a complete failure. Intel tried to salvage the situation by convincing motherboard manufacturers to bundle the card with 440BX-based platforms, but this did not succeed either.

The evolution of Intel graphics | i752 graphics chip and 81x series chipsets (1999)

After the failure of the i740, Intel developed and briefly sold a second video card, the i752 "Portola", released in very limited quantities. Around the same time, Intel began integrating its graphics core into chipsets such as the i810 ("Whitney") and i815 ("Solano"). These GPUs were built into the northbridge, becoming Intel's first integrated graphics processors. Their performance depended on two factors: the speed of the RAM, which was often linked to the FSB and thus to the processor, and the speed of the CPU itself. At the time, Intel used 66, 100, or 133 MHz FSB configurations along with SDRAM, giving the system a maximum throughput of 533, 800, or 1066 MB/s, respectively. Although this bandwidth was shared with the processor, the iGPU never had access to the entire channel. Motherboard manufacturers could add 4 MB of dedicated video memory on their boards, connected directly to the GPU via AGP 4x and providing an additional 1066 MB/s.
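As a back-of-the-envelope check of the bandwidth figures above: peak throughput of a single-data-rate 64-bit bus is simply clock times bus width. A quick sketch (the function name is ours, not Intel's):

```python
def peak_bandwidth_mb_s(fsb_mhz: float, bus_width_bytes: int = 8) -> int:
    """Peak throughput of a single-data-rate bus: clock x bus width."""
    return round(fsb_mhz * bus_width_bytes)

# The three FSB speeds of the i810/i815 era (66.6, 100 and 133.3 MHz)
for fsb in (66.6, 100.0, 133.3):
    print(f"{fsb:6.1f} MHz FSB -> {peak_bandwidth_mb_s(fsb)} MB/s")
```

This reproduces the 533, 800 and 1066 MB/s figures quoted above; the dedicated-memory AGP 4x link works out the same way (66.6 MHz x 4 transfers per clock x 4 bytes of a 32-bit bus ≈ 1066 MB/s).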

The performance of these iGPUs was poor. Worse, because of the integrated graphics the i810 chipset lacked an AGP interface, limiting upgrades to slow PCI-based video cards. The i815 chipset offered an AGP port alongside the iGPU, but installing a discrete graphics card disabled the integrated one. As a result, these graphics solutions were aimed at entry-level budget PCs.

The evolution of Intel graphics | Intel Extreme Graphics (2001)

In 2001, Intel launched a new Extreme Graphics family that was closely related to the previous generation, including two pixel pipelines and limited MPEG-2 hardware acceleration. Software API support was almost identical to the i815 chipset, although OpenGL support was expanded to API version 1.3.

The performance of the Intel Extreme Graphics iGPU depended heavily on the chipset, memory, and CPU. The first implementation appeared in the Intel i830 (Almador) chipset family, developed for the Pentium III-M. These chipsets still used aging SDRAM, which limited maximum bandwidth to 1066 MB/s, just like the earlier GPUs. The clock speed on Almador chipsets was reduced from 230 MHz (i815) to 166 MHz to save power and reduce heat.

A desktop version followed in 2002 in the i845 (Brookdale) chipsets, designed for Pentium 4 processors. Their iGPU also ran at a lower clock speed than the i815's (200 MHz), but could use either SDRAM or DDR memory. Thanks to the faster CPUs paired with the i845, even SDRAM-based systems outperformed the i815 models despite the lower GPU clock, and versions using DDR RAM pushed performance further still. These integrated solutions could not catch Nvidia's GeForce 2 Ultra, which was already over a year old at the time, but they were adequate for light gaming.

The evolution of Intel graphics | Intel Extreme Graphics 2 (2003)

Intel reused the dual-pixel-pipeline graphics chip in the Extreme Graphics 2 family released in 2003, and again offered two versions of the GPU. The mobile version appeared first, in the i852 and i855 chipsets designed for the Pentium M; these chips ran at 133 or 266 MHz, depending on the OEM's choice. The second variant was used in the i865 (Springdale) chipsets for the Pentium 4. That 266 MHz GPU was paired with faster DDR memory running at up to 400 MHz, giving it higher bandwidth than previous iGPUs.

While performance improved markedly over the older Intel Extreme Graphics line, the graphics demands of games increased as well. As a result, these chips could only deliver acceptable frame rates in older games.

The evolution of Intel graphics | GMA 900 (2004)

In 2004, Intel ended the Extreme Graphics line, retiring the dual-pixel-pipeline core that had been used in all of its previous GPUs. For the next few years, Intel would market its graphics under the name Graphics Media Accelerator (GMA). The first of the series was the GMA 900 GPU, integrated into the i915 (Grantsdale/Alviso) chipset family. It supported DirectX 9.0 and had four pixel pipelines, but it lacked vertex shaders; those calculations were done on the CPU. The GPU ran at 333 MHz, or 133 MHz in low-power systems, and worked with both DDR and DDR2 memory. Regardless of the configuration, however, performance was relatively poor.

Some manufacturers made special expansion cards to complement the GMA 900 to add DVI output.

The evolution of Intel graphics | GMA 950: Pentium 4 and Atom (2005)

The GMA 950 GPU, integrated into the Intel i945 chipsets (Lakeport and Calistoga), enjoyed a relatively long life cycle: these chipsets worked with Pentium 4, Core Duo, Core 2 Duo and Atom processors. The architecture, however, was almost identical to the GMA 900 and inherited many of its shortcomings, including the lack of vertex shaders. The core received minor software compatibility improvements and support for DirectX 9.0c, an important update because it enabled Aero support in Windows Vista. Thanks to the increased clock (400 MHz) and support for faster processors and memory, performance increased slightly. Mobile versions of the GPU could also run at 166 MHz to save power and reduce heat.

The evolution of Intel graphics | GMA 3000, 3100 and 3150 (2006)

In 2006, Intel rebranded its graphics again, starting with the GMA 3000. This was a significant improvement over the old GMA 950 in terms of performance and technology. The previous generation was limited to four pixel pipelines without vertex shaders. Meanwhile, the new GMA 3000 included eight multi-purpose EU execution units capable of performing multiple tasks, including vertex computation and pixel processing. Intel increased the clock speed to 667 MHz, giving the GMA 3000 a noticeable boost in speed over the GMA 950.

After the GMA 3000 premiered, Intel added two more graphics chips to the family: the GMA 3100 and 3150. Even though they came after the GMA 3000, both GPUs were actually more similar to the GMA 950. They only had four pixel pipelines and relied on a central processor for processing vertices. The reuse of the GMA 950 after rebranding it as the GMA 3100 and 3150 allowed Intel to offer multiple products. Previously, Intel had focused its efforts on just one GPU in its lineup.

The evolution of Intel graphics | GMA X3000 (2006)

After the GMA 3000, Intel changed the name again, introducing its fourth generation of GPUs. The GMA X3000, however, was almost identical to the GMA 3000 and included only minor changes. The main difference was the amount of memory that could be used: the GMA 3000 could allocate only 256 MB of system memory for graphics, while the GMA X3000 raised this figure to 384 MB. Intel also expanded video codec support in the GMA X3000 to include full MPEG-2 acceleration and partial VC-1 acceleration.

Around the same time, Intel introduced the GMA X3100 and GMA X3500. Essentially, these were upgraded GMA X3000 chips with support for Shader Model 4.0, allowing them to work with newer APIs such as DirectX 10. The GMA X3100 ran at a lower clock speed than the other versions, since it was designed for mobile platforms.

The evolution of Intel graphics | Latest GMA (2008)

After the X3000, Intel developed only one series of chipsets with integrated graphics. The Intel GMA 4500 family consisted of four models, all of them using the same architecture with 10 execution units. Three versions of the GPU were released for desktop chipsets. The slowest of them was the GMA 4500 with a frequency of 533 MHz. The other two, GMA X4500 and X4500HD, ran at 800 MHz. The main difference between the X4500HD and the X4500 was the use of full VC-1 and AVC hardware acceleration.

The mobile version of the graphics chip was called GMA X4500MHD and operated at a frequency of 400 MHz or 533 MHz. Similar to the X4500HD, the X4500MHD supported full VC-1 and AVC hardware acceleration.

The evolution of Intel graphics | Larrabee (2009)

In 2009, Intel made another attempt to enter the video card market with Larrabee. Realizing that its main advantage was its deep understanding of the x86 architecture, Intel wanted to build a GPU around the x86 instruction set. Instead of designing from scratch, Intel based Larrabee on the original Pentium core, modified to serve as the scalar unit inside the GPU. The old architecture was significantly reworked, gaining new instructions and Hyper-Threading technology to increase performance. Although Larrabee's Hyper-Threading was similar to that in conventional Intel processors, Larrabee could run four threads per core instead of two.

To handle vertices, Intel created an unusually large 512-bit floating point unit, made up of 16 separate elements that can operate as a single component or as stand-alone units. This FPU theoretically had more than 10 times the throughput of similar Nvidia chips at the time.

Ultimately, the Larrabee initiative was canceled, although Intel continues to develop the technology.

The evolution of Intel graphics | First generation Intel HD Graphics (2010)

Intel introduced the HD Graphics line in 2010 to regain the ground that the GMA family had lost. The HD Graphics core in the first generation Core i3, i5 and i7 processors was similar to the GMA 4500, with the exception of two additional execution units. Clock speeds remained roughly the same, starting at 166 MHz on low-power mobile systems and settling at 900 MHz on higher-end desktop CPUs. Although the 32nm processor and 45nm GMCH were not fully integrated on a single silicon die, both components were contained within the processor package. This reduced latency between the memory controller inside the GMCH and the CPU. API support has not changed significantly since GMA, although overall performance has increased by more than 50 percent.

The evolution of Intel graphics | Sandy Bridge: second generation Intel HD Graphics (2011)

In Sandy Bridge, Intel HD Graphics took another step forward in performance. Instead of two separate dies under the hood, Intel combined everything onto a single die, further reducing latency between components. Intel also expanded the graphics chip's functionality, adding Quick Sync technology to speed up transcoding and a more efficient video decoder. API support grew only to DirectX 10.1 and OpenGL 3.1, but the clock frequency increased significantly, now ranging from 350 to 1350 MHz.

With a broader feature set, Intel decided to segment its chip line. Low-end models received the plain HD label (a GT1 core with six EUs and a limited video decoder), mid-level solutions were called HD 2000 (the same GT1 with six EUs but a full-featured encoding/decoding unit), and the top chips were called HD 3000 (a GT2 core with 12 EUs plus all the benefits of Quick Sync).

The evolution of Intel graphics | Xeon Phi (2012)

While the Larrabee concept was focused more on gaming, the company saw its future in compute-intensive applications and in 2012 released the Xeon Phi coprocessor. One of the first models, the Xeon Phi 5110P, contained 60 x86 cores with large 512-bit vector computation units clocked at roughly 1 GHz. At that speed they could deliver more than 1 TFLOPS of processing power while consuming around 225 watts.
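The 1 TFLOPS claim is easy to sanity-check: a 512-bit vector unit holds eight 64-bit doubles, and a fused multiply-add counts as two FLOPs per lane. A rough sketch (the ~1.05 GHz clock used for the arithmetic is our assumption):

```python
def peak_dp_gflops(cores: int, simd_bits: int, clock_ghz: float) -> float:
    """Peak double-precision GFLOPS assuming one FMA vector op per cycle."""
    lanes = simd_bits // 64               # double-precision lanes per vector unit
    return cores * lanes * 2 * clock_ghz  # FMA = 2 FLOPs per lane per cycle

# Xeon Phi 5110P: 60 cores with 512-bit vector units
print(peak_dp_gflops(60, 512, 1.053))  # just over 1000 GFLOPS, i.e. ~1 TFLOPS
```

Even at a flat 1.0 GHz the same formula gives 960 GFLOPS, which is why the part sits right at the 1 TFLOPS mark.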

Thanks to its high compute performance relative to power consumption, the Xeon Phi 31S1P was used to build the Tianhe-2 supercomputer in 2013, still considered one of the fastest supercomputers in the world today.

The evolution of Intel graphics | Ivy Bridge: Intel HD 4000 (2012)

With the advent of Ivy Bridge, Intel reworked its graphics architecture. As with Sandy Bridge, the Ivy Bridge graphics core shipped in three versions: HD (GT1 with six EUs and a limited encoding/decoding unit), HD 2500 (GT1 with six EUs and a full-featured encoding/decoding unit) and HD 4000 (GT2 with 16 EUs and a full-featured encoding/decoding unit). The HD 4000 ran at 1150 MHz, lower than the Intel HD 3000's peak, but with four additional execution units it was significantly faster than its predecessor; the average speed increase in Skyrim was 33.9 percent. Part of the gain came from the improved architecture, which moved to Shader Model 5.0 for the first time and added support for DirectX 11.0 and OpenCL 1.2.

The performance of Intel Quick Sync technology has also increased significantly. Transcoding H.264 video files from one format to another was twice as fast. Hardware video acceleration has also been improved and the Intel HD 4000 is technically capable of decoding multiple 4K video streams simultaneously.

The evolution of Intel graphics | Intel expands its graphics line with Haswell chips (2013)

Architecturally, the HD Graphics core in Haswell is similar to the one in Ivy Bridge and can be considered an extension of it. To get more performance out of Haswell GPUs, Intel used brute force: the company put ten execution units into Haswell's GT1 instead of the previous generation's six. Full video decoding was enabled, but accelerated encoding and Quick Sync were disabled. In addition, Intel further diversified its GPU range. The GT2 version with 20 EUs was used in three different graphics cores, the HD Graphics 4200, 4400 and 4600, which differed mainly in clock speed.

Intel also introduced a higher-end GPU called GT3, which contained 40 execution units and delivered significantly higher performance. Processors with the GT3 core were sold under the HD Graphics 5000 and 5100 brands. A rarer GT3e version, the Intel Iris Pro 5200, added 128 MB of eDRAM to the processor package and was the first incarnation of the Intel Iris Pro family. Although the Iris Pro 5200 was faster than solutions without the extra eDRAM, its market impact was limited, since the GPU appeared in only a few top processors.

The low-power version of the Haswell iGPU had only four EUs and was used in Intel Atom processors code-named Bay Trail. With the addition of the high-performance GT3 and the power-efficient Bay Trail part, the Haswell iGPU family grew to eight different models; by comparison, the Sandy Bridge and Ivy Bridge generations had only three versions each.

The evolution of Intel graphics | Broadwell (2014)

In Broadwell, Intel again reworked the iGPU to scale more efficiently. In the new architecture the execution units were organized into subslices of eight, which made adding EUs even easier, since Intel could simply duplicate subslices. The GT1 version contained two subslices (although only 12 EUs were active). The next four products, HD Graphics 5300, 5500, 5600 and P5700, used the GT2 chip with 24 EUs (though some versions had only 23 EUs active).

The faster GT3 and GT3e cores each contained 48 EUs and were used in the HD Graphics 6000, Iris Graphics 6100, Iris Pro Graphics 6200 and Iris Pro Graphics P6300. Like Haswell Iris Graphics chips, models in the Broadwell Iris Graphics line included a GT3e graphics core with 128 MB of internal eDRAM. Each group of eight execution units had 64 KB of shared cache memory. These GPUs supported DirectX 12, OpenGL 4.4 and OpenCL 2.0.

The evolution of Intel graphics | Skylake (2015)

The latest iteration of Intel integrated graphics appears in processors based on the Skylake architecture. These graphics chips are close to the Broadwell iGPUs, with the same architecture and, in almost all models, the same number of EUs. The main changes concerned naming: Intel moved to the HD Graphics 5xx scheme. The entry-level GPUs, called simply HD Graphics and HD Graphics 510, used a GT1 die with 12 EUs, while HD Graphics 515, 520, 530 and P530 use the GT2 chip with 24 EUs.

Starting with Skylake, Intel further separated the Iris and Iris Pro product lines. The Iris 540 and 550 come with 48 execution units in a GT3e chip. It is not yet clear what core the Iris Pro 580 will use, but it will contain a total of 72 EUs and will likely be significantly faster than the Iris Pro 6200 in Broadwell CPUs. The exact eDRAM amounts are not fully known, but Intel will likely continue to differentiate Iris and Iris Pro graphics by performance level: the Iris 540 will have only 64 MB of eDRAM, half that of the Broadwell GT3e. As for the Iris Pro and the Iris 550, Intel has not yet announced exact specifications.

Now let's turn to the integrated graphics built into the Haswell processor line. The performance of the Intel HD Graphics 4600 is comparable to video cards like the Nvidia GeForce GT 630M. Moreover, Intel's integrated graphics can execute up to 16 operations per execution unit per clock, which gives it an edge over the GeForce in that respect.

Characteristics and comparison with GeForce GT 630

A peak-performance calculation comparing the HD 4600 and the GeForce favors one chip or the other depending on the metric. In rasterization speed, for example:

HD 4600: 2.5 Mpix/sec
GeForce GT 630: 3.2 Mpix/sec
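Peak shader throughput can be estimated the same way for both chips: units x FLOPs per unit per clock x clock. The per-unit figures below are our assumptions, not vendor specs (Haswell EUs are commonly credited with 16 FLOPs per clock from two 4-wide FMA pipes; a 96-core Fermi-based GT 630 does 2 FLOPs per core at its ~1.62 GHz shader clock), so treat the results as rough estimates:

```python
def peak_gflops(units: int, flops_per_unit_per_clock: int, clock_ghz: float) -> float:
    """Theoretical peak: units x per-unit FLOPs per clock x clock speed."""
    return units * flops_per_unit_per_clock * clock_ghz

# HD 4600: 20 EUs, ~16 FLOPs/clock each, up to 1.35 GHz Turbo (assumed figures)
print(f"HD 4600: ~{peak_gflops(20, 16, 1.35):.0f} GFLOPS")
# GeForce GT 630 (assumed Fermi variant): 96 cores, 2 FLOPs/clock, ~1.62 GHz
print(f"GT 630:  ~{peak_gflops(96, 2, 1.62):.0f} GFLOPS")
```

On paper the iGPU leads in raw FLOPS, which is exactly why the rasterization numbers above, where the GeForce wins, still matter for the final verdict.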

From this we can conclude that the GeForce is still a serious rival for Intel. The HD 4600 uses twenty execution units, which improves performance by about 20 percent compared to the HD 4000. The base graphics clock of the HD 4600 is 400 MHz, but the core supports Turbo Boost and, depending on the task, can ramp up to 1350 MHz. The main characteristics of the Intel HD 4600 look like this:

In addition, this chip has an improved video decoder with 4K support, as well as support for Shader Model 5.0, OpenCL 1.2 and OpenGL 4.0.

Intel HD 4600 in games

With the above in mind, let's try to work out which games will run on this chip. At its core, the Intel HD Graphics 4600 is not a new invention but a gradually evolving design whose lineage dates back to early 2010. The chip has grown from a weak budget option into one that can compete with inexpensive video cards that have their own memory. Where integrated graphics used to be good only for web surfing and video playback, it is now possible to play certain games. In theory, the Intel HD Graphics 4600 can handle even modern titles thanks to its DirectX 11.1 support, but as an integrated GPU it cannot handle everything. Below are several games with results from multiple tests; for a clearer comparison, the Intel HD Graphics 4400 was also tested.

Aliens vs. Predator

High quality settings:
HD 4400: 10.2 fps
HD 4600: 13.6 fps
Unplayable.

At 800x480 resolution:
HD 4400: 69.3 fps
HD 4600: 103.4 fps
Quite playable.

Batman: Arkham Asylum GOTY Edition

High quality settings:
HD 4400: 26.6 fps
HD 4600: 41.2 fps
Playable in principle, but the game will clearly lag.

At 800x480 resolution:
HD 4400: 105.2 fps
HD 4600: 196.3 fps
The game runs very well.

Crysis: Warhead x64

High quality settings:
HD 4400: 9.8 fps
HD 4600: 14.6 fps
Completely unplayable.

At 720x480 resolution:
HD 4400: 106.0 fps
HD 4600: 156.8 fps
Plays smoothly and comfortably.

F1 2010

High quality settings:
HD 4400: 12.5 fps
HD 4600: 15.1 fps
Completely unplayable.

At 720x480 resolution:
HD 4400: 33.9 fps
HD 4600: 50.9 fps
Playable, though lag is possible.

Far Cry 2

High quality settings:
HD 4400: 17.1 fps
HD 4600: 27.2 fps
Playable in principle.

At 800x480 resolution:
HD 4400: 42.6 fps
HD 4600: 89.8 fps
Plays smoothly and comfortably.

Metro 2033

High quality settings:
HD 4400: 6.5 fps
HD 4600: 9.8 fps
Unplayable, no matter how hard you try.

At low settings and a resolution of no more than 1024x768:
HD 4400: 24.4 fps
HD 4600: 46.5 fps
Worth a try, but lag is quite likely.

Video cards are not needed

The test results allow us to say with confidence that the new graphics chip represents a huge leap over the previous-generation HD 4000 core: the average gap across all tests was almost 40 percent. By successfully competing with budget discrete cards like the GeForce GT 630, Intel's new integrated graphics makes buying such cards pointless, since their performance is roughly equal. It can also hold its own against the newest cheap video cards, whose performance, despite incomparably higher power consumption, falls in the same range, if not lower. Another important detail: this graphics core is available both in the Core i7-4770K and in the more affordable Core i5.

"Why is this integration needed? Give us more cores, megahertz and cache!" the average computer user exclaims. Indeed, when a computer uses a discrete video card, there seems to be no need for integrated graphics. Yet the fact is that today a central processor without built-in video is harder to find than one with it. Such platforms do exist: LGA2011-v3 for Intel chips and AM3+ for AMD ones, but in both cases these are top-end solutions, and you pay accordingly. Mainstream platforms, such as Intel LGA1151/1150 and AMD FM2+, are universally stocked with processors that have integrated graphics. Built-in video is indispensable in laptops, if only because mobile computers last longer on battery in 2D mode. On desktops, integrated video is useful in office builds and so-called HTPCs: first, we save on components; second, we save again on power consumption. Recently, however, AMD and Intel have begun seriously claiming that their integrated graphics are good for everything, gaming included. That is what we will check.

We play modern games on the graphics built into the processor

300% increase

Graphics integrated into the processor (an iGPU) first appeared in Intel's Clarkdale chips (the first generation of the Core architecture) in 2010. Integrated into the processor, that is, an important distinction, since the concept of "embedded video" took shape much earlier: Intel had offered it back in 1999 with the 810 chipset for the Pentium II/III. In Clarkdale, the integrated HD Graphics video was implemented as a separate die under the processor's heat spreader; the graphics were manufactured on the then-aging 45-nanometer process, while the main compute die used 32-nanometer production. The first Intel processors in which the HD Graphics block settled onto a single die with the other components were the Sandy Bridge chips.

Intel Clarkdale - the first processor with integrated graphics

Since then, on-chip graphics for mainstream LGA115* platforms has become the de facto standard. Generations Ivy Bridge, Haswell, Broadwell, Skylake - all have integrated video.

Graphics integrated into the processor appeared 6 years ago

In contrast to the compute side, integrated graphics in Intel's solutions have progressed noticeably. The HD Graphics 3000 in Sandy Bridge K-series desktop processors had 12 execution units; the HD Graphics 4000 in Ivy Bridge had 16; the HD Graphics 4600 in Haswell had 20; and the HD Graphics 530 in Skylake has 24. The frequencies of both the GPU itself and the RAM keep increasing as well. As a result, integrated video performance grew 3-4 times over four years, and on top of that there is the much more powerful Iris Pro series used in certain Intel processors. A 300 percent gain over four generations is not 5 percent per year.
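To put that last sentence in numbers: quadrupling over four generations corresponds to an average compound growth of about 41 percent per generation, nowhere near 5 percent. A quick check:

```python
def per_period_growth(total_factor: float, periods: int) -> float:
    """Average compound growth per period for a given total speed-up."""
    return total_factor ** (1 / periods) - 1

# ~4x integrated-graphics speed-up across four generations (per the text)
print(f"{per_period_growth(4.0, 4):.1%} per generation")  # 41.4% per generation
```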

Intel Integrated Graphics Performance

In-processor graphics is one segment where Intel has to chase AMD: in most cases AMD's solutions are faster. There is nothing surprising in this, since AMD develops powerful gaming video cards, and the integrated graphics in its desktop processors use the same architecture and the same building blocks: GCN (Graphics Core Next) on 28 nanometers.

AMD's hybrid chips debuted in 2011. The Llano family was the first to combine integrated graphics and compute on a single die. AMD's marketers understood that competing with Intel on Intel's terms would not work, so they introduced the term APU (Accelerated Processing Unit, a processor with a video accelerator), although the company had been nurturing the idea since 2006. After Llano came three more generations of hybrids: Trinity, Richland and Kaveri (Godavari). As I said, the integrated video in modern chips is architecturally no different from the graphics used in discrete Radeon 3D accelerators. As a result, in the 2015-2016 chips half of the transistor budget goes to the iGPU.

Modern integrated graphics take up half the usable CPU space

Most interesting of all, APU development influenced the future of game consoles: the PlayStation 4 and Xbox One both use AMD chips with eight Jaguar cores and GCN-based graphics. Below is a table with their characteristics. The Radeon R7 is the most powerful integrated video AMD has to date; the block is used in AMD A10 hybrid processors. The Radeon R7 360 is an entry-level discrete video card that, by my reckoning, still qualifies as a gaming card in 2016. As you can see, modern integrated graphics is not far behind the low-end adapter in terms of specifications. Nor can it be said that the graphics in game consoles have outstanding characteristics.

The very appearance of processors with integrated graphics in many cases removes the need to buy an entry-level discrete adapter. Today, however, integrated video from AMD and Intel is encroaching on sacred ground: the gaming segment. For example, there is a quad-core Core i7-6770HQ (2.6/3.5 GHz) processor based on the Skylake architecture. It uses integrated Iris Pro 580 graphics plus 128 MB of eDRAM acting as a fourth-level cache. The integrated video has 72 execution units running at 950 MHz, making it more powerful than the Iris Pro 6200 with its 48 execution units. As a result, the Iris Pro 580 turns out to be faster than discrete video cards such as the Radeon R7 360 and GeForce GTX 750, and in some cases even challenges the GeForce GTX 750 Ti and Radeon R7 370. Imagine what will happen when AMD moves its APUs to a 16-nanometer process and both manufacturers start pairing integrated graphics with HBM/HMC memory.

Intel Skull Canyon - a compact computer with the most powerful integrated graphics

Testing

To test modern integrated graphics, I took four processors, two each from AMD and Intel, all with different iGPUs. The AMD A8 hybrids (and the A10-7700K) carry Radeon R7 video with 384 unified processors; the older A10 series has 128 more, and the flagship also runs at a higher frequency. There is also the A6 series, whose graphics potential is modest, since it uses an integrated Radeon R5 with 256 unified processors; I did not consider it for games in Full HD.

AMD A10 and Intel Broadwell processors have the most powerful integrated graphics

As for Intel, the most popular Skylake Core i3/i5/i7 chips for the LGA1151 platform use the HD Graphics 530 module. It contains 24 execution units: 4 more than the HD Graphics 4600 (Haswell), but 24 fewer than the Iris Pro 6200 (Broadwell). For the test I used the entry-level quad-core processor, the Core i5-6400.

Model: AMD A8-7670K | AMD A10-7890K | Intel Core i5-6400 | Intel Core i5-5675C
Process: 28 nm | 28 nm | 14 nm | 14 nm
Generation: Kaveri (Godavari) | Kaveri (Godavari) | Skylake | Broadwell
Platform: FM2+ | FM2+ | LGA1151 | LGA1150
Cores/threads: 4/4 | 4/4 | 4/4 | 4/4
Clock frequency: 3.6 (3.9) GHz | 4.1 (4.3) GHz | 2.7 (3.3) GHz | 3.1 (3.6) GHz
Level 3 cache: none | none | 6 MB | 4 MB
Integrated graphics: Radeon R7, 757 MHz | Radeon R7, 866 MHz | HD Graphics 530, 950 MHz | Iris Pro 6200, 1100 MHz
Memory controller: DDR3-2133, dual channel | DDR3-2133, dual channel | DDR4-2133 / DDR3L-1333/1600, dual channel | DDR3-1600, dual channel
TDP: 95 W | 95 W | 65 W | 65 W
Price: 7,000 rub. | 11,500 rub. | 13,000 rub. | 20,000 rub.

Below are the configurations of all the test benches. When it comes to integrated video performance, the choice of RAM deserves particular attention, since it largely determines how many FPS the integrated graphics will ultimately deliver. In my case, DDR3/DDR4 kits running at an effective frequency of 2400 MHz were used.
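Since an iGPU has no memory of its own, all of its texture and frame-buffer traffic goes over the system RAM bus, which is why the kit choice matters. Peak dual-channel bandwidth is transfer rate x 8 bytes per channel x 2 channels; a small sketch (decimal gigabytes, function name ours):

```python
def dual_channel_gb_s(mt_per_s: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Peak DDR bandwidth: transfers/s x bus width x channels, in decimal GB/s."""
    return mt_per_s * bus_bytes * channels / 1000

print(dual_channel_gb_s(2400))  # 38.4 GB/s for a dual-channel DDR3/DDR4-2400 kit
print(dual_channel_gb_s(1866))  # 29.856 GB/s for a dual-channel DDR3-1866 kit
```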

Test benches

№1:
Processors: AMD A8-7670K, AMD A10-7890K;
Motherboard: ASUS CROSSBLADE Ranger;
RAM: DDR3-2400 (11-13-13-35), 2x 8 GB.

№2:
Processor: Intel Core i5-6400;
Motherboard: ASUS Z170 PRO GAMING;
RAM: DDR4-2400 (14-14-14-36), 2x 8 GB.

№3:
Processor: Intel Core i5-5675C;
Motherboard: ASRock Z97 Fatal1ty Performance;
RAM: DDR3-2400 (11-13-13-35), 2x 8 GB.

№4:
Processor: AMD FX-4300;
Motherboard: ASUS 970 PRO GAMING/AURA;
Video card: NVIDIA GeForce GTX 750 Ti;
RAM: DDR3-1866 (11-13-13-35), 2x 8 GB.

Common to all benches:
Operating system: Windows 10 Pro x64;
Monitor: LG 31MU97;
AMD driver: 16.4.1 Hotfix;
Intel driver: 15.40.64.4404;
NVIDIA driver: 364.72.

RAM support for AMD Kaveri processors

These kits were not chosen at random. Officially, the integrated memory controller of Kaveri processors supports DDR3-2133, but motherboards based on the A88X chipset (thanks to an additional divider) also support DDR3-2400. Intel chips paired with the flagship Z170/Z97 Express logic likewise work with faster memory, and there are noticeably more presets in the BIOS. As for the test benches, the LGA1151 platform used a dual-channel Kingston Savage HX428C14SB2K2/16 kit, which overclocked to 3000 MHz without any problems. The other systems used ADATA AX3U2400W8G11-DGV memory.

Selecting RAM

A little experiment. With Core i3/i5/i7 processors for the LGA1151 platform, using faster memory to accelerate the graphics is not always rational. For example, for the Core i5-6400 (HD Graphics 530), swapping the DDR4-2400 kit for DDR4-3000 in Bioshock Infinite yielded a gain of only 1.3 FPS. In other words, at the graphics quality settings I chose, performance was limited by the graphics subsystem itself.
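To put the DDR4-2400 vs DDR4-3000 comparison into numbers, the theoretical peak bandwidth of the kits can be estimated with a few lines. This is a rough sketch under my own assumptions (a 64-bit, 8-byte bus per channel and two channels), not a measurement from the article:

```python
# Theoretical peak bandwidth of a dual-channel DDR3/DDR4 kit.
# Assumption: 64-bit (8-byte) bus per channel, two channels populated.

def peak_bandwidth_gbs(mts: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Peak bandwidth in GB/s for a kit rated at `mts` megatransfers/s."""
    return mts * bus_bytes * channels / 1000

for kit in (1866, 2133, 2400, 3000):
    print(f"DDR-{kit}: {peak_bandwidth_gbs(kit):.1f} GB/s")
# DDR-2400 works out to 38.4 GB/s and DDR-3000 to 48.0 GB/s, a 25% jump
# that translated into only ~1.3 FPS here because the bottleneck was the
# GPU's execution resources, not memory bandwidth.
```

The arithmetic shows why a healthy bandwidth increase can still produce an almost flat FPS result when the graphics pipeline itself is saturated.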

Dependence of the performance of the integrated graphics of an Intel processor on the frequency of RAM

The situation looks better with AMD hybrid processors. Raising the RAM speed gives a more impressive FPS boost: going from 1866 to 2400 MHz adds 2-4 frames per second. So I consider using RAM with an effective frequency of 2400 MHz in all test benches a rational choice, and one closer to real-world builds.

Dependence of the performance of the integrated graphics of an AMD processor on the frequency of RAM

We will judge the performance of integrated graphics based on the results of thirteen gaming applications, which I have roughly divided into four categories. The first includes popular but undemanding PC hits. Millions play them, so such games (World of Tanks, World of Warcraft, League of Legends, Minecraft and the like) have no right to be demanding, and we can expect a comfortable FPS level at high graphics quality settings in Full HD resolution. The remaining categories are simply split into three time periods: games of 2013/14, 2015 and 2016.

Integrated graphics performance depends on RAM frequency

The graphics quality was selected individually for each program. For undemanding games these are mainly high settings; for the other applications (with the exception of Bioshock Infinite, Battlefield 4 and DiRT Rally) the quality is low. Even so, the integrated graphics will be tested in Full HD resolution. Screenshots showing all the graphics quality settings are collected in the gallery of the same name. We will consider 25 FPS playable.
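The 25 FPS cutoff used throughout the tests can be expressed as a trivial predicate. The game names and frame rates below are made-up placeholders to show the usage, not measured results from this article:

```python
PLAYABLE_FPS = 25  # the playability threshold adopted in this test

def verdict(avg_fps: float) -> str:
    """Classify an average frame rate against the 25 FPS threshold."""
    return "playable" if avg_fps >= PLAYABLE_FPS else "unplayable"

# Hypothetical sample numbers purely for illustration.
for game, fps in {"Game A": 48.0, "Game B": 17.5}.items():
    print(f"{game}: {fps} FPS -> {verdict(fps)}")
```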

Undemanding games:
  • Dota 2 - high;
  • Diablo III - high;
  • StarCraft II - high.

2013/14 games:
  • Bioshock Infinite - medium;
  • Battlefield 4 - medium;
  • Far Cry 4 - low;
  • DiRT Rally - high.

2015 games:
  • Fallout 4 - low;
  • GTA V - standard;
  • "The Witcher 3: Wild Hunt" - low.

2016 games:
  • Rise of the Tomb Raider - low;
  • Need for Speed - low;
  • XCOM 2 - low.

HD

The main purpose of this testing is to study the performance of integrated processor graphics in Full HD resolution, but first let's warm up at the lower HD. The iGPUs Radeon R7 (in both the A8 and the A10) and Iris Pro 6200 felt quite comfortable in these conditions. But the HD Graphics 530 with its 24 execution units in some cases produced a completely unplayable picture: in five games out of thirteen, since in Rise of the Tomb Raider, Far Cry 4, The Witcher 3: Wild Hunt, Need for Speed and XCOM 2 there is simply no room left to lower the graphics quality. Clearly, in Full HD the integrated video of the Skylake chip will be a complete failure.

HD Graphics 530 already struggles at 720p resolution

The Radeon R7 graphics in the A8-7670K failed in three games, the Iris Pro 6200 in two, and the A10-7890K's integrated graphics in just one.

Test results in 1280x720 pixels resolution

Interestingly, there are games in which the integrated video of the Core i5-5675C seriously outperforms the Radeon R7: for example, Diablo III, StarCraft II, Battlefield 4 and GTA V. At this low resolution the advantage comes not only from its 48 execution units, but also from greater CPU dependence (where Intel's cores are stronger) and the presence of the fourth-level eDRAM cache. At the same time, the A10-7890K beat its opponent in the more demanding Rise of the Tomb Raider, Far Cry 4, The Witcher 3 and DiRT Rally: the GCN architecture performs well in modern (and not so modern) hits.

Intel processors, like those of their competitors, include integrated (built-in) graphics. This lets you avoid buying an expensive video card if you do not need one. Integrated graphics is also useful in laptops, where it saves battery power: the discrete GPU is engaged only in demanding applications, and the rest of the time it is powered down while the processor's graphics core drives the display.

Introduction

Special attention is paid to the choice of integrated graphics in two cases:

  • you are not going to buy a separate adapter, since you do not need high performance from your desktop PC;
  • you are choosing a laptop, where the integrated graphics handles most or all of the rendering work.

Basically, it is these two situations that make people pay special attention to integrated graphics.

Here, as in our other articles, chips produced before 2010 will not be considered. That means we will only touch on Intel HD Graphics, Iris Graphics and Iris Pro Graphics.

Why integrated graphics is installed even in powerful gaming processors remains an open question: such chips are used only in conjunction with a powerful video card, which even the strongest integrated graphics cannot hold a candle to. Most likely this comes down to the high cost of retooling the production line: the cores of many chips are identical and are assembled in almost the same way, and nobody is going to change the process for the sake of a couple of models. Dropping the graphics would free up transistor budget for extra CPU performance, but the price of such chips would also rise.

Everyone knows that AMD's integrated graphics are more powerful than Intel's. Most likely this is because AMD started thinking earlier about creating hybrid "stones" (processors with a video core). If you want to know about the markings and lineups of all AMD graphics (including integrated), a similar article is available at the link.

Interesting fact: the PS4 has processor-integrated graphics, rather than a separate graphics chip.

Classification

A common mistake: integrated graphics does not necessarily mean a graphics core built into the processor. Integrated graphics is any graphics built into either the motherboard or the processor.

Thus, integrated graphics are divided into:

  • Shared memory graphics - the graphics core is built into the processor and uses RAM instead of dedicated video memory. Such chips offer low power consumption, heat dissipation and cost, but their 3D performance cannot match the other types.
  • Discrete graphics - a separate chip on the motherboard with its own memory; generally faster than the previous type.
  • Hybrid graphics are a combination of the two previous types.

Now it is clear that Intel chips use shared memory graphics.

Generations

Intel HD Graphics first appeared in Westmere processors (but there were integrated graphics before that).

To determine the performance of a video processor, each generation must be considered separately. The best way to determine performance is to look at the number of execution units and their frequency.
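The "execution units times frequency" rule of thumb can be turned into a back-of-envelope peak-throughput estimate. The clocks below come from the spec table earlier in the article; the EU counts (24 for HD Graphics 530, 48 for Iris Pro 6200) and the figure of 16 FP32 operations per EU per clock (two SIMD-4 FPUs with FMA) are my own assumptions for recent Intel EU designs, so treat the result as a rough index, not a benchmark:

```python
# Rough peak-FP32 estimate: EU count x clock x FLOPs per EU per clock.
# Assumption: 16 FP32 ops per EU per clock (2x SIMD-4 FPUs with FMA).
# Real performance also depends on bandwidth, caches and drivers.

FLOPS_PER_EU_PER_CLOCK = 16

def peak_gflops(eu_count: int, clock_mhz: int) -> float:
    """Theoretical peak FP32 throughput in GFLOPS."""
    return eu_count * clock_mhz * FLOPS_PER_EU_PER_CLOCK / 1000

gpus = {
    "HD Graphics 530 (24 EU, 950 MHz)":   (24, 950),
    "Iris Pro 6200 (48 EU, 1100 MHz)":    (48, 1100),
}
for name, (eu, mhz) in gpus.items():
    print(f"{name}: ~{peak_gflops(eu, mhz):.0f} GFLOPS")
```

Even this crude index shows the roughly 2.3x gap between the two Intel iGPUs compared in the tests above.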

This is how things stand with graphics generations:

Generations of integrated graphics by numbers

| Generation | Microarchitectures | Regular models | Powerful models |
|---|---|---|---|
| 5 | Westmere | HD* | - |
| 6 | Sandy Bridge | HD*/2000/3000 | - |
| 7 | Ivy Bridge | HD*/2500/4000 | - |
| 7 | Haswell/Bay Trail | HD*/4200-5000 | Iris* 5100 / Iris Pro* 5200 |
| 8 | Broadwell/Braswell/Cherry Trail | HD*/5300-6000 | Iris* 6100 / Iris Pro* 6200 |
| 9 | Skylake/Braswell/Cherry Trail | HD* 510-530/40x | Iris* 540/550 / Iris Pro* 580 |

Where * stands for "Graphics".

If you are interested in the microarchitectures themselves, you can read about them at the link.

The letter P in the graphics name means it belongs to a Xeon processor (server chips).

Every generation before Skylake has a plain HD Graphics model, but these models differ from generation to generation. After Westmere, plain HD Graphics is installed only in Pentium and Celeron chips. The HD Graphics in mobile Atom, Celeron and Pentium processors, built on the mobile microarchitecture, should be distinguished separately.

Until recently, mobile architectures used identically named HD Graphics models across different microarchitectures. Graphics of different generations differ in performance, and the generation is usually indicated in brackets, for example Intel HD Graphics (Bay Trail). Now, with the release of the 8th generation of integrated graphics, the names differ too: this is how HD Graphics 400 and 405, which differ in performance, are told apart.

Within one generation, performance increases with the model number, which is logical.

With the Haswell generation, slightly different chip markings began to apply.

New marking with Haswell

First digit:

  • 4 – Haswell
  • 5 – Broadwell

But there are exceptions to this rule, and in a few lines below we will explain everything.

The remaining numbers have the following meaning:

* - means the thousands digit is one higher than the generation digit (for example, the GT3 graphics of Haswell are numbered 5000/5100/5200)

The GT3e features additional eDRAM cache, which increases memory speed.

But with the Skylake generation, the classification has changed again. The distribution of models by performance can be seen in one of the previous tables.

Relationship between processor markings and integrated graphics

These letters in processor markings relate to the integrated graphics:

  • P – means disabled video core
  • C – enhanced integrated graphics for LGA
  • R – enhanced integrated graphics for BGA (nettops)
  • H – enhanced integrated graphics in mobile processors (Iris Pro)

How to compare video chips

Comparing them by eye is quite difficult, so we recommend the comparison table at the link, where you can find information about all integrated Intel solutions, along with a performance rating of video adapters and their benchmark results. To find out which graphics a given processor carries, go to the Intel website, find your processor using the filters, and look at the "Processor Graphics" column.

Conclusion

We hope this material has helped you understand integrated graphics, especially Intel's, and will help you choose a processor for your computer. If you have questions, first look through the "Introduction" section, and if they remain, you are welcome in the comments!

You have probably come across this built-in graphics card in laptops. However, this name hides several chips that were built into a variety of processors, and the first of the line did not even sit on the processor die itself.

Before Intel HD Graphics, we mostly dealt with the Intel Graphics Media Accelerator (GMA), which you can read about at the link. Those graphics were not at all suitable for games, which of course did not suit many users. And since users want built-in graphics they can play on, manufacturers try to give it to them. So Intel HD Graphics was released.

It first appeared in 2010 together with mobile Westmere processors. The CPUs were already manufactured on a 32-nm process, but the graphics, which was not yet built into the processor die itself and sat on a separate die in the same package, was made on a 45-nm process. Its graphics performance (3D rendering in particular) was 70% higher than that of the Intel GMA.

In 2011 the Sandy Bridge platform appeared. Here the graphics chip was already built into the processor and manufactured on a 32-nm process. For Core processors it was Intel HD Graphics 2000/3000, while Pentium and Celeron got plain Intel HD Graphics. This was the second generation of Intel HD.

The third generation appeared together with Ivy Bridge in 2012. Celeron and Pentium got the option without an index, plain Intel HD Graphics, while Core received Intel HD Graphics 2500 and 4000. It supported DirectX 11, which matters for games. This graphics core was also used as the integrated graphics of other processor lines.

The Haswell line showed us a fairly large number of Intel HD variants. The simplest one, without an index, as usual went to Pentium and Celeron, while the indexed versions went to the other processors. We will cover those in a separate article.

In the meantime, let's take a closer look at the plain Intel HD Graphics built into Haswell processors. It includes 10 execution units and 40 shader processors. In performance it is roughly equal to the AMD Radeon HD 6450, which, as you can see, is a pretty decent result; it also outperforms the NVIDIA GeForce GT 620.

There is also support for DirectX 11.1, Shader Model 5.0, OpenCL 1.2 and OpenGL 4.0, plus a decoder for high-definition 4K video. The clock speed varies depending on the processor model; the exact frequency (both base and turbo) can be found at the very end, column 7.

What games can you play on this video card? At minimum settings, almost anything except the most modern titles; Dota 2 is quite suitable. If this particular video card is installed in your laptop, please share what games you play. This material concerns the plain Intel HD Graphics without an index; recall that in Haswell it is found in Pentium and Celeron processors.
