Published on January 28, 2005, 00:00

Graphics multiprocessing, or NVIDIA's SLI as built by Gigabyte

A motherboard that supports SLI, combined with two video cards, costs more than a single-card solution of similar performance. Still, if you are set on buying an SLI-enabled computer right now, Gigabyte's products are a well-justified choice.

Theory

The performance of graphics accelerators has long been a subject of heated debate. Some say it is already more than anyone needs; others argue that a real leap in graphics quality would require many times the performance of today's top products. Both are right, since the application area of consumer graphics accelerators is still limited to computer games. Many people don't play games at all, and if they do, the Minesweeper bundled with Windows suits them fine. Others want a photorealistic show on the screen, such that even on close inspection you couldn't tell a real film from a mathematically created illusion. But don't forget that a graphics accelerator is merely a tool for creating illusions; the true illusionists are the game developers, who try to bring the game world as close to reality as possible.

Writing a game takes a long time. Half-Life 2 and Doom 3, for example, were in development for 3-4 years, which is normal. Just think: a game that takes you a month to complete was written over several years. And if a new 3D action title runs well on your current computer, that is actually odd, because by that logic it wouldn't hurt to add an extra half-gigabyte of memory and do a minor upgrade anyway. Development started several years ago, when today's level of hardware existed only in dreams, like something out of a science-fiction novel. The conclusion: developers need hardware slightly ahead of its time, and two accelerators harnessed together - SLI, in short - is just the right way to get it.

Two accelerators working in tandem are useful for the consumer market as well. Imagine you want to buy an expensive graphics accelerator so that all of today's games run smoothly, without any jitter. You go to the shop and buy one accelerator for about $600. In a year or two your system's capacity is no longer enough, although the other components still don't need an upgrade. You go to the shop and buy a second graphics board, identical to the one already installed in your system but now two or three times cheaper, and you once again have an up-to-date video system. Take it on trust, this is entirely plausible: a GeForce4 Ti4200 that cost $200-300 two years ago is available today at $70. Simply put, SLI makes the video subsystem of your PC more scalable.

As you may have guessed, this review is devoted to the SLI video system or, more precisely, to its latest generation. To grasp the whole idea, we have to go back to the cradle of 3D gaming accelerators, to 1995, when 3dfx's first graphics accelerator, Voodoo Graphics, came into being. That accelerator was made up of two chips, PixelFX and TexelFX, both running at 50 MHz. PixelFX managed the frame buffer, whereas TexelFX did the job of texture mapping. In 1998 came the 3dfx Voodoo 2. That accelerator already carried two TexelFX 2 chips running at 90 MHz, which were in fact prototypes of the pixel pipelines inside modern graphics processors. It was with 3dfx that the concept of multi-texturing appeared, i.e. applying two textures to an object in a single pass. One of the main features of Voodoo 2 was the ability to operate in a coupled mode: ScanLine Interleaving, or SLI for short. Two Voodoo 2 cards were installed into two neighboring slots and connected with a cable for synchronization. Such a system operated on the following principle: the first card handled the even lines of the frame, the other the odd lines. The graphics data from the slave card was then fed via an external cable to the master card, where the even and odd lines were merged in the frame buffer and the image was sent to the screen. A system like that gave a performance boost the competitors simply could not match. Later, 3dfx developed a new chip for SLI systems, the VSA-100 (Voodoo Scalable Architecture). It allowed up to 32 graphics processors to run simultaneously, which Quantum3D implemented in actual cards. There, each graphics processor handled successive bands of frame lines whose height varied dynamically from 1 to 128 lines depending on scene complexity; the pieces were then consolidated and displayed as a finished frame.
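
To make the principle concrete, here is a minimal sketch in Python (with invented function names; an illustration, not 3dfx's actual logic) of how scanline interleaving can be modeled: each card renders every second line, and the master merges both sets in its frame buffer.

    # Toy model of Voodoo 2-style ScanLine Interleaving (illustrative only).
    def render_line(card_id: int, y: int, width: int) -> list:
        # Stand-in for one card rasterizing a single scanline.
        return [(card_id, y, x) for x in range(width)]

    def sli_frame(width: int, height: int) -> list:
        frame = [None] * height
        for y in range(height):
            card = 0 if y % 2 == 0 else 1   # card 0: even lines, card 1: odd lines
            frame[y] = render_line(card, y, width)
        # In hardware, the slave's lines travel over the ribbon cable to the
        # master, which interleaves them in its frame buffer before scan-out.
        return frame

    frame = sli_frame(640, 480)
    assert frame[0][0][0] == 0 and frame[1][0][0] == 1   # even/odd split holds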

Unfortunately, 3dfx no longer exists, but its ideas live on inside NVIDIA, which inherited most of its legacy: NVIDIA acquired 3dfx in early 2001. Now, four years after that event, we see SLI technology coming back to the graphics accelerator market, this time from NVIDIA. Malicious gossip has it that 3dfx met its demise over this very idea and went broke together with its SLI technology, and that the same fate awaits NVIDIA; but nobody has serious grounds for such claims, so they shouldn't be taken seriously.

Indeed, NVIDIA reused the SLI abbreviation in its new products, since the name is firmly associated with two video cards fitted inside a single PC. That, however, is where the similarity with 3dfx's technology ends. Even the abbreviation itself has acquired an entirely different meaning and now stands for Scalable Link Interface. Let's sort out how it works.

Today, NVIDIA's SLI technology can operate in one of two ways.

First – PGC (Parallel Graphics Configuration): the frame is cut into two horizontal parts whose sizes vary dynamically. The upper part is handled by the first (master, or primary) accelerator, to which the monitor is connected, the bottom part by the second (slave, or secondary) one. Initially each accelerator gets 50% of the frame area, but if the upper part of the screen is simpler and gets processed faster than the bottom one, its area is enlarged so that the load on the two accelerators stays balanced, and vice versa. The reason lies in how unevenly graphics are distributed across the screen: the upper part usually holds nothing but the sky, which is essentially one big texture, plus large polygons of distant objects, whereas the bottom part carries most of the scene detail. Wicked3D was the first to try a similar technology, but it had a serious shortcoming: the areas of the two parts did not vary dynamically even when the two accelerators sat in slots of different speed, AGP and PCI. As a result, the system ran no faster than the PCI card handling its half of the frame. NVIDIA's technology is free of that flaw. Still, questions remain. For example, if a shaded surface is cut in half and the shader needs to pass data from the bottom part to the upper one, how will the two parts communicate, given that a shader executes inside a single graphics processor? As a workaround, the shader has to be executed on both accelerators simultaneously, with the unneeded pieces simply discarded. It follows that in complex applications the accelerators end up doing a fair amount of redundant work.
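
The balancing idea is easy to model. Below is a toy sketch; the function name, the gain factor and the clamping limits are our own assumptions, not NVIDIA's driver algorithm. After each frame, the split point shifts toward whichever half took longer to render.

    # Toy model of dynamic split-frame balancing; the gain factor and limits
    # are assumptions for illustration, not NVIDIA's real driver logic.
    def rebalance(split: float, t_top: float, t_bottom: float,
                  gain: float = 0.05) -> float:
        """split is the fraction of frame height given to the primary card."""
        if t_top > t_bottom:
            split -= gain * split            # top half was slower: shrink it
        else:
            split += gain * (1.0 - split)    # bottom half was slower: grow the top
        return min(max(split, 0.1), 0.9)     # keep both cards busy

    split = 0.5
    for t_top, t_bottom in [(16.0, 10.0), (14.5, 10.5), (13.0, 11.5)]:
        split = rebalance(split, t_top, t_bottom)
        print(f"new split point: {split:.3f}")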

Second – AFR (Alternate Frame Rendering): each accelerator renders complete frames, and the resulting sequence is stitched together in the frame buffer. This technique had already been tried by ATI, NVIDIA's competitor, under the name Rage Fury MAXX, but it never gained wide acceptance due to inadequate marketing. With this approach, both (identical) accelerators are always loaded equally, because each successive frame is rendered entirely by a single accelerator.
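
The dispatch logic is trivial to model (again a toy sketch with assumed names): even-numbered frames go to one GPU, odd-numbered frames to the other, and the results are presented in order.

    # Toy model of Alternate Frame Rendering: whole frames alternate between GPUs.
    def render_frame(gpu_id: int, frame_no: int) -> str:
        return f"frame {frame_no} rendered by GPU {gpu_id}"  # stand-in for real work

    def afr(frames: int):
        for n in range(frames):
            yield render_frame(n % 2, n)     # GPU 0 takes even frames, GPU 1 odd

    for line in afr(4):
        print(line)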

Either way, there will be no doubled performance. Whereas PGC tries to minimize duplicated work, AFR relies on plain parallel execution: all the shaders and the rest are processed independently on each accelerator. Add the overheads inherent in SLI operation, such as synchronization and data transfers, and the speed boost, although real, will certainly fall short of twofold.
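
As a back-of-the-envelope illustration (the 15% overhead share is an assumed figure, purely for the sake of the example), the expected gain looks like this:

    # Rough SLI speedup estimate: ideal 2x scaling minus a fixed overhead share
    # for synchronization and inter-card transfers (the 15% figure is assumed).
    def sli_speedup(gpus: int = 2, overhead: float = 0.15) -> float:
        return gpus * (1.0 - overhead)

    print(f"estimated speedup: {sli_speedup():.2f}x")   # ~1.70x rather than 2.00x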

The physical implementation of SLI became possible thanks to the PCI Express bus. Nothing prevented creating two PCI Express slots to take two video cards, and the high bus bandwidth avoids a bottleneck at this point: AGP or PCI could comfortably feed one accelerator, but two would be too much for a single slot. Two identical NVIDIA accelerators are fitted into two PCI Express 16x slots and connected from above with a U-shaped bridge for synchronization. This bridge comes bundled with motherboards that support the SLI mode.

In pure theory, any NVIDIA GeForce 6 series chip for PCI Express could run in SLI mode, since every one of them contains the interfacing module. Even so, don't wait for GeForce 6200 cards with SLI support: its absence there is dictated by marketing rather than engineering. Nor should you expect an early option to pair two different video cards based on different chips. From a technology standpoint, though, the goal is feasible, so even if NVIDIA declines to add such a feature to its drivers, skilled enthusiasts will make it happen.

Motherboards with two PCI Express x16 slots are a separate issue. To date, only two products offer such functionality. The first is Intel's server chipset E7525 Tumwater for Xeon processors, awfully expensive and entirely unsuitable for gaming. The second, slowly emerging onto the market, is NVIDIA's N-Force 4 for Socket 939 AMD Athlon 64 and Athlon FX processors. Only its top version, N-Force 4 SLI, supports two PCI Express slots; the two cheaper models lack this functionality, but more about that later. In the near future we can also expect NVIDIA's N-Force 5 for Intel processors with SLI support.

But theory is no substitute for hands-on experience, so let's move smoothly on to practice: real-world tests should show what SLI is really worth.

Practice

To start with, let's describe the hardware we used for the tests and add a few notes. From Gigabyte we received a kit for assembling an SLI system: a GA-K8NXP-SLI motherboard (N-Force 4 SLI) and two Gigabyte GeForce 6600GT video cards; AMD kindly provided an AMD Athlon 64 4000+. For the tests we also took two of NVIDIA's reference boards based on the GeForce 6800GT, plus one card from NVIDIA's competitor, ATI's X850XT. Now let's look at the major components in detail, starting with the processor.

Processor

If you have finally decided to assemble a system based on SLI technology, be well aware that without a powerful CPU a powerful video subsystem will be absolutely useless.

AMD Athlon 64 4000+

In all sincerity, even an Athlon 64 4000+ is not enough; for the right balance you would want, say, an AMD Athlon 64 5500+ or even a 6000+. It's a pity they don't exist yet, but once they appear they will be the most suitable choice for SLI. The thing is, all modern graphics applications lean heavily on the CPU, and in some games that dependence is the main performance constraint. However powerful a video card you install, you won't exceed a certain FPS if the processor falls short; all you can do is load the video subsystem by improving image quality through higher resolution, anti-aliasing and anisotropic filtering, since these features don't depend on the CPU. As you may have noticed, the AMD Athlon 64 4000+ is AMD's best model to date, so we can observe how CPU-bound SLI remains even with the most advanced CPU available.
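
The intuition fits into a one-line model, a deliberate simplification with made-up numbers: the frame rate is capped by the slower of the two stages, and only the GPU stage is affected by SLI or by quality settings.

    # Simplified frame-rate model: the CPU prepares frames, the GPU draws them,
    # and the pipeline runs at the pace of the slower stage. Numbers are made up.
    def fps(cpu_fps: float, gpu_fps: float) -> float:
        return min(cpu_fps, gpu_fps)

    print(fps(cpu_fps=90, gpu_fps=120))         # 90: CPU-bound, SLI won't help
    print(fps(cpu_fps=90, gpu_fps=120 * 1.7))   # still 90: more GPU changes nothing
    print(fps(cpu_fps=90, gpu_fps=60))          # 60: GPU-bound (1600x1200 + AA/AF)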

GA-K8NXP-SLI Motherboard

Gigabyte's motherboards have always been regarded among the best on the market of cutting-edge technologies, thanks to their combination of superb specifications and moderate prices. Of course, moderate prices apply only to economy-class products, which the motherboard we received for testing is not. The GA-K8NXP-SLI is one of the first of the coming models based on the N-Force 4 chipset, announced by NVIDIA in the fall of 2004 and available to motherboard makers in three versions:

Feature                                  N-Force 4   N-Force 4 Ultra   N-Force 4 SLI
Fast S-ATA                               no          yes               yes
Hardware firewall (Active Armor)         no          yes               yes
Two PCI Express 16x slots for SLI        no          no                yes
Price                                    ~$120       ~$150             ~$200

As you can see, only the most expensive version of the chipset supports two PCI Express 16x slots, so only it is suitable for building an SLI system; hence our motherboard is built on the N-Force 4 SLI chipset. A reservation is in order here: N-Force 4 works with a dual-channel memory controller and therefore supports only Socket 939 AMD Athlon 64 processors, in which the memory controller is integrated into the core. It hardly makes sense, then, to expect N-Force 4 motherboards for Socket 754.

But back to the motherboard. The GA-K8NXP-SLI ships in a double box of the kind used for bundled products.

GA-K8NXP-SLI Motherboard

In one part of the box is the motherboard with its standard package, and in the other a set of bonus items, which are optional since they add considerably to the price of the whole package.

GA-K8NXP-SLI motherboard - Wi-Fi and DPS modules

The motherboard's own package bundle is of little interest, being absolutely standard, but among the additional items are the DPS (Dual Power System) voltage-stabilizer module and a small board giving access to Wi-Fi 802.11g networks. Wi-Fi needs no explanation, so I'd better dwell on the DPS. This module provides extra stabilization of the CPU voltage; its circuits duplicate the stabilizers already installed on the motherboard, improving stability with powerful and overclocked processors that draw a lot of power.

The board itself is of standard ATX form factor. On the right is the DPS connector. There are four slots for memory modules; for dual-channel operation, two DDR modules must be installed in slots of different colors, for example the first module in the first slot and the second in the third. The Socket 939 processor socket sits at the center of the board, with no large components nearby to hinder installation of a cooling system; the cooler, in turn, mounts on a plastic frame fastened around the socket's perimeter.

GA-K8NXP-SLI


Here are the motherboard's brief specifications:

Manufacturer and name: Gigabyte GA-K8NXP-SLI
Supported processors: AMD Athlon 64, Socket 939
North bridge: N-Force 4 SLI
Supported memory: 4 GB PC3200 DDR400 (4 slots, 1 GB each)
Memory controller: dual-channel
Hard disk controller: 2x Ultra ATA 133/100
RAID controller 1: based on ITE IT8712, 4x S-ATA
RAID controller 2: Silicon Image SiI 3114, 4x S-ATA
Expansion slots: 2x PCI Express 16x, 2x PCI Express 1x, 2x PCI (rev. 2.3)
Integrated network: Marvell 8053 10/100/1000 and CICADA 8201 10/100/1000
Integrated audio: Realtek ALC850 AC'97 codec, 8 channels
Onboard I/O headers: 6x USB, 3x IEEE1394, 2x PS/2
Rear panel outputs: 2x PS/2, 4x USB, 2x SPDIF coaxial, 6x MiniJack, 2x TP Gigabit Ethernet, 1x COM, 1x LPT
Fan connectors on the motherboard: 3
Price: not yet available in Russia; preliminary data suggest about $300

As you can see, the motherboard's functionality gives nothing to complain about, except for the lack of optical outputs on the integrated audio.

GA-K8NXP-SLI motherboard - ports

We can now move on to the motherboard's SLI features. For normal SLI operation you must tell the motherboard that you intend to enable the mode; otherwise the multi-GPU system will not be detected by the video driver, and you will simply get two independent video cards with the ability to connect four monitors. On the GA-K8NXP-SLI this switching is done with NVIDIA's reference solution: a "mega-jumper" in the form of a small board resembling a SO-DIMM notebook memory module. To select the operating mode, the board must be inserted into its connector the required way around; want SLI - remove the board, turn it over and put it back. Insert it carefully, though: if it is skewed or seated loosely, you will be chasing what look like driver bugs, with the driver refusing to enter SLI mode. Also note that the mode cannot be changed with a video card in the second slot, as the card blocks access to the jumper board.

GA-K8NXP-SLI motherboard - the SLI jumper

Once the jumper is set to SLI mode, you can safely fit the pair of video cards into the PCI Express connectors and cap them from above with the U-shaped bridge that synchronizes the two boards. Unassuming as it is, the bridge is quite an interesting device, and the interest lies mostly in how hard it is to get hold of.

GA-K8NXP-SLI - the SLI connector

For a long time nobody could decide what the bridges should ship with: video cards, motherboards, or perhaps be sold separately for extra money. The choice fell on the motherboard, so all boards based on the N-Force 4 SLI chipset will now ship with a synchronization bridge. That is the right choice, since it frees motherboard makers from a standardized placement of the two PCI Express 16x slots: the bridge length can simply be varied instead.

SLI connector

On the other hand, there is a substantial drawback: makers of other chipsets with two PCI Express slots (Intel's Tumwater) won't be able to offer SLI. Still, we can state for sure that if SLI becomes widespread, the bridge shortage will be solved one way or another; a contraption like that can even be soldered together from ordinary PCI connectors.

Through the motherboard's BIOS you can select which connector's video card is initialized as the primary one.

Any new technology inevitably brings along a heap of unresolved bugs and issues, which is why the GA-K8NXP-SLI badly needs a newer BIOS version to settle them. The issues are not numerous, and all of them are minor: the numbering of the S-ATA ports is mixed up, the BIOS hangs while detecting HDD devices, and disabling one of the IDE channels produces no effect.

BIOS hangs up

On the other hand, the BIOS made a good showing in that it allows assigning interrupts to almost every device, so those conflicts were resolved as far as possible. For instance, the two video cards became sole owners of their interrupt lines: IRQ 10 was allotted to the first card and IRQ 11 to the second.

Nevertheless, an SLI system does not end with the motherboard alone, so let's move on to the video cards taking part in our tests.

Video cards

Gigabyte provided two NVIDIA GeForce 6600GT video cards as the test platform. These boards are aimed at the mainstream sector, i.e. the price range below $200. On our market the Gigabyte GeForce 6600GT for PCI Express sells for $170-190, which is affordable to many. Don't forget, however, that you need two boards, so a 6600GT-based SLI kit will cost somewhere within 350-400 dollars.

Gigabyte GeForce 6600GT

The board ships in Gigabyte's standard package of glossy outer and white inner cardboard.

Gigabyte GeForce 6600GT - package

The package bundle includes an adapter for TV connection (component video included), a DVI-I to D-Sub adapter, documentation, and CDs with games and drivers. The bundled games deserve a mention and are not bad at all: Joint Operations and Thief: Deadly Shadows. Amusingly, when buying an SLI kit you get two copies of the entire bundle, which is, to put it mildly, useless. Letting our imagination run a little, we can expect products aimed specifically at SLI, with two cards in a single retail package.

Gigabyte GeForce 6600GT - package bundle

To evaluate the card, let's look at its specifications:

Gigabyte GeForce 6600GT specifications
Model: GV-NX66T128D
GPU: NV43
Chip frequency: 500 MHz
Memory: 128 MB GDDR3
Memory speed: 500 (1000 effective) MHz
Memory bus width: 128 bit
Interface: PCI Express 16x
RAMDAC: 2x 400 MHz
Vertex pipelines: 3
Pixel pipelines: 8
API: DirectX 9.0c, OpenGL 1.5
Cooling system control: N/A
Additional power supply: not required
Monitor connectors: D-Sub and DVI
TV-Out: yes
TV-In: no
Price: $170-190

Judging by the specifications, this board is half of a 6800GT or 6800 Ultra, but with its own clock speeds. Our task here, however, is not to measure its performance as a standalone product. According to Gigabyte, this board runs in SLI mode on their GA-K8NXP-SLI motherboard without issue and is recommended for SLI systems. In other words, we should treat the "GA-K8NXP-SLI + 2x GV-NX66T128D" kit as Gigabyte's packaged proposition for building an SLI system.

The board follows NVIDIA's reference design for the GeForce 6600GT, on Gigabyte's trademark blue PCB.

Gigabyte GeForce 6600GT - top view, without the heatsink

Gigabyte GeForce 6600GT - rear view

The four memory chips sit along the upper part of the board at an angle, two on each side of the graphics processor.

Samsung memory modules are marked K4J55323QF-GC20

The Samsung memory chips are marked K4J55323QF-GC20. Judging by the marking, they are rated at 2 ns access time, which corresponds to roughly 500 MHz. Note that these are the slowest chips of the K4J55323QF series; there are even 600 MHz and 700 MHz parts, so the installed 500 MHz chips may well have a good safety margin. On the other hand, the memory subsystem already runs at 500 MHz, so there is no gap between the rated and actual frequency. We know these chips from other Gigabyte boards: they are installed on the Gigabyte X800XT for PCI Express and the Gigabyte 6800GT for AGP. In our experience they have a solid safety margin and can easily be overclocked by an extra 50 MHz, which brings the effective frequency to about 1100 MHz. The graphics processor was made in the 40th week of 2004, i.e. early October, and is of stepping A2; the GPU runs at 500 MHz and supports Shader Model 3.0.
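
The conversion from access time to clock rate is simple arithmetic; the doubling to an "effective" rate is the usual DDR convention:

    # Access time -> clock rate for the K4J55323QF-GC20 chips: f = 1 / t.
    # GDDR3 transfers data on both clock edges, so the effective rate is 2x.
    t_access = 2e-9                        # 2 ns, from the -GC20 marking
    clock_mhz = 1 / t_access / 1e6         # 500 MHz real clock
    effective_mhz = 2 * clock_mhz          # 1000 MHz effective
    print(clock_mhz, effective_mhz)        # 500.0 1000.0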

Assembly and launch of SLI

SLI system after assembly

This is how the SLI system looks after assembly. In its standard configuration, putting such a computer together poses no problem, as no fine tuning is required. After you install the two video cards, power on the computer and load the operating system, the video driver detects both boards, sends an image to all available outputs, and displays a prompt saying the multi-GPU mode can be enabled.

On the video driver's control tab you can select the "enable multiprocessing mode" option. The option becomes accessible only if the installed video boards fully match one another, i.e. are completely identical; if the driver detects the slightest difference, it blocks the multi-GPU mode from starting. Notably, any differences or faults the driver finds show up exactly at the moment of SLI start-up, in the form of the mode's failure to activate. If all is well, the computer asks for a reboot when you select the option, and after the reboot SLI mode is enabled. By the way, once SLI is on, only one monitor can be connected, and only to the primary video card: of the four monitor outputs, only one works in SLI mode. If you use two monitors for your desktop, only one will stay operative; the images return to the additional monitors only when SLI is disabled.

The driver tab has quite an interesting option: displaying the SLI operation graph. In the current driver version it looks like a green line separating the screen into two parts, with two vertical bars at its left edge. In the PGC (Parallel Graphics Configuration) mode described above, it works as follows: the horizontal strip floats at exactly the point where the job is divided between the two accelerators, with the image above the strip handled by the primary accelerator and everything below it by the secondary, while the vertical bars on the left stay static and show nothing. In the second SLI mode, AFR (Alternate Frame Rendering), the horizontal line divides the screen precisely in half, and the bars on the left indicate the load on each card's graphics processor.

The driver also lets you choose how a 3D application is processed, though this option does not work quite correctly. The driver can detect a launched application and enable the mode best suited for it: for example, PGC for FarCry and AFR for 3DMark05. The user can override the presets, but the manufacturer recommends against it.

Notes on the PSU

If NVIDIA cards consume a lot of power even singly, what happens when two are installed? Right, nothing good: the consumption only grows. With two Gigabyte GeForce 6600GTs, a Codegen 480W PSU coped with the load well, but after we installed the two NVIDIA GeForce 6800GT cards things turned ugly: an unpleasant whine rose from the transformers, and we decided not to push our luck and powered everything off. With a PowerMan 520W PSU the problems vanished. The conclusion: if the PSU in your system is below 500W, running SLI may prove quite a gamble, the bet being what burns out first, the PSU or everything else.
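
Before building such a system, it is worth sketching a rough power budget. All wattages below are assumed ballpark figures for 2005-era parts, not our measurements:

    # Rough SLI power budget; every wattage is an assumed ballpark figure
    # for 2005-era parts, not a measurement.
    draw_watts = {
        "CPU (Athlon 64 4000+)": 100,
        "motherboard + memory": 60,
        "drives and fans": 40,
        "GeForce 6800GT": 80,              # per card
    }
    gpus = 2
    total = sum(draw_watts.values()) + (gpus - 1) * draw_watts["GeForce 6800GT"]
    reserve = 1.4                          # headroom for peaks and PSU aging
    print(f"estimated load: {total} W, recommended PSU: {total * reserve:.0f} W")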

Opponents

For the performance comparison we used the video cards below. The first rivals, and the other main participants of the contest, were NVIDIA's reference GeForce 6800GT cards running in SLI mode.

GeForce 6800GT in the SLI mode

Many would argue that these are the right basis for describing SLI operation, but things are not as simple as they seem. That was indeed the original plan, but during the tests an amusing problem surfaced: serious glitches in the operation of this sweet couple. Each board ran fine on its own, but as a pair they produced an image like this:

There are evident problems with image synchronization - the SLI system is running in the AFR mode, so the whole screen is rippling

This photo was taken in a corner of Codecreatures. The image synchronization problems are evident: the SLI system is running in AFR mode, so the whole screen ripples. In PGC mode only the bottom part rippled, which points to problems passing the image from the secondary card. We swapped the boards' places and tried every method of influence we could, including swapping in a more powerful PSU, yet the bug persisted. In our view the problem may lie with either the video cards or the motherboard, and can quite probably be cured by flashing a newer BIOS into one component or the other; tellingly, with Gigabyte's 6600GT we found no such issues. Nevertheless, we decided it does not affect the functioning of the graphics processors, which do their job well enough, so speed measurements in this situation remain valid. We also wanted to use the NVIDIA GeForce 6800 Ultra, but unfortunately those cards could not be obtained within the time allotted for testing; in fact, we couldn't find them even later, as they remain in acute shortage.

As another contender against NVIDIA's cards we used ATI's latest solution, the X850XT PE. This video card has plenty of fight in it and is quite capable of springing unpleasant surprises on the SLI system.

ATI X850XT PE

Well, let's move smoothly on to the benchmarks.

Benchmarking

Our testing methodology is laid out in full in a separate article devoted to it. A reservation first: throughout this part of the review we will refer to that methodology article, since we see little sense in describing in one place the purpose of every test, the testing methods, and the bottlenecks common to all modern graphics accelerators regardless of make and model. The methodology article is continuously updated and extended. Still, the major points are worth outlining.

Our test bench comprises the following components:
• Processor: AMD Athlon 64 4000+;
• Motherboard: Gigabyte GA-K8NXP-SLI (N-Force 4 SLI);
• Hard disk: Hitachi DeskStar S-ATA, 7200 rpm, 250 GB;
• PSU: PowerMan 520W.

Our test lab thanks AMD, Gigabyte, NVIDIA, and PatriArch for the hardware provided for the tests.

We performed benchmarking with the following programs:

  1. 3DMark 2001 SE v330 (DirectX 8.1)
  2. 3DMark05 (DirectX 9.0c)
  3. Quake 3: Arena (OpenGL)
  4. Codecreatures Benchmark Pro (DirectX 8.1)
  5. Doom 3 (OpenGL)
  6. Serious Sam 2 (OpenGL)
  7. Tomb Raider: Angel of Darkness (DirectX 9.0b)
  8. HALO (DirectX 9.0b)
  9. Aquamark (DirectX 9.0b)
  10. FarCry (DirectX 9.0b)
  11. SPECviewperf 8 (OpenGL)
  12. Return to Castle Wolfenstein (OpenGL)
  13. Half-Life 2 (DirectX 9.0c)

As you can see, EPIC Games' Unreal Tournament 2004 and Unreal II are not taking part in our benchmarking session. They flatly refused to start in SLI mode; or rather, they did start, but only one video card worked in PGC mode, and on exit a blue screen of death appeared, complaining about interrupt problems that had not existed before.

In all modes, all the cards demonstrated excellent 2D image quality; no artifacts or issues were found on any of them. Keep in mind, though, that this parameter strongly depends on the monitor, the cable, and the quality of all the wiring.

The color scheme is the same in all the graphs below: blue stands for the Gigabyte GeForce 6600GT, red for the Gigabyte GeForce 6600GT in SLI mode, green for the NVIDIA GeForce 6800GT, yellow for the NVIDIA GeForce 6800GT SLI, and violet for the ATI X850XT PE.

3DMark 2001 SE

Old as it is, this benchmark does its job quite well and indicates the performance level to expect in most DirectX 8.1 games.

3DMark 2001 SE

Judging by the graph, victory went to ATI's product, which overtook NVIDIA's double-headed monsters. On the other hand, the scores in these tests are simply over the top: the FPS is more than enough for comfortable gameplay on any of the cards, and you wouldn't notice the difference. The near-identical results across all resolutions come from a strong CPU dependence. Let's complicate the task by enabling 4X FSAA and 8X AF.

3DMark 2001 SE

The picture has changed slightly, and victory went to the NVIDIA GeForce 6800GT SLI, or more precisely to the pair of NVIDIA GeForce 6800GT accelerators. Evidently this pair remains CPU-bound even in this mode. Amusingly, the Gigabyte GeForce 6600GT in SLI mode outruns a single NVIDIA GeForce 6800GT, and only loses ground as the resolution rises.

3DMark 2001 SE

Wow, what a test: everyone looks as good as can be, with a telling drop in speed across resolutions. This is a definite victory for the NVIDIA GeForce 6800GT SLI. Note that on average SLI gives the NVIDIA GeForce 6800GT cards a 1.5x speed boost, and the Gigabyte GeForce 6600GT cards the same. The ATI X850XT PE is a serious competitor to the NVIDIA GeForce 6800GT SLI. Let's complicate the task.

3DMark 2001 SE

With FSAA + AF enabled, the NVIDIA GeForce 6800GT SLI proved the undisputed leader; the performance boost from SLI is somewhere within 70-80%. The Gigabyte GeForce 6600GT in SLI mode follows close on the heels of the NVIDIA GeForce 6800GT and even overtakes it at high resolutions.

Codecult Codecreatures

This test is very similar to the Nature scene of 3DMark 01 SE above.

Codecult Codecreatures

In this benchmark one trait catches the eye: the Gigabyte GeForce 6600GT in SLI mode shows no performance difference whatsoever from the same card without SLI. The answer is simple: SLI mode failed to start on the Gigabyte GeForce 6600GT cards for an unknown reason, while everything ran fine on the NVIDIA GeForce 6800GT in SLI mode, even if the gain from SLI proved modest. Meanwhile the ATI X850XT PE loses everywhere except at the top resolution. Now for the anti-aliasing tests.

Codecult Codecreatures

Again the Gigabyte GeForce 6600GT failed to start in SLI mode; worse, its result was lower than without SLI. This time, though, the NVIDIA GeForce 6800GT SLI beat the ATI X850XT PE. A pattern emerges: the higher the load, the more comfortable the NVIDIA GeForce 6800GT SLI feels, and the wider its gap over ATI's X850XT PE.

Quake 3

Quake 3

The result is amusing, interesting, and telling. As we know, Quake 3 is a very old and therefore CPU-bound game. As the results show, every card running in SLI mode lost to the same card in single mode. Why? The answer is simple: the SLI system spends CPU resources on its own management, which costs performance. But closer to the top resolutions the Gigabyte GeForce 6600GT runs out of raw power, stops being limited by the processor, and here we finally see a speed boost from SLI. Note that for almost all the cards the result barely varies with resolution, except for the aforementioned Gigabyte GeForce 6600GT. If we are right, the picture with AF enabled should change as well.

Quake 3

Unfortunately, the picture hasn't changed, since even the enabled image-quality features did not load the cards much. As the resolution rose, however, our expectations came true and we saw a realistic alignment of forces, with the ATI X850XT PE the winner.

RTCW

This benchmark is almost a complete replica of the previous one; we included it as further proof of the SLI system's failure in CPU-bound applications.

Return to Castle Wolfenstein

Wow: the NVIDIA GeForce 6800GT without SLI easily ripped apart all the others, and the Gigabyte GeForce 6600GT made a very interesting showing at low resolutions. Clearly, without SLI the NVIDIA cards are less CPU-limited in this test. SLI simply suppressed the performance of the NVIDIA GeForce 6800GT SLI and threw it back to the level of the ATI X850XT PE, for which RTCW has never been the happiest hunting ground. Let's complicate the task.

Return to Castle Wolfenstein

Just as expected. Return to Castle Wolfenstein is a much more CPU-bound game than Q3A, whose engine it is built on. Even with FSAA and AF enabled the situation hasn't changed, although the Gigabyte GeForce 6600GT in SLI mode surged ahead, since a single Gigabyte GeForce 6600GT lacks the power to cope with high resolutions.

Serious Sam

This gaming benchmark, like the previous ones, is based on OpenGL and uses a feature set of the DirectX 8.1 level.

Serious Sam

Incidentally, this benchmark has always favored NVIDIA cards, just as Tomb Raider is biased toward ATI. The picture resembles RTCW in the anti-aliasing mode: at low resolutions NVIDIA's single-card configurations win thanks to the CPU ceiling, but as resolutions climb the Gigabyte GeForce 6600GT starts lagging as the weakest, with the rest holding their positions. We enable FSAA and AF and look at the results:

Serious Sam

The picture grows clearer. The single Gigabyte GeForce 6600GT wins at the two lowest resolutions, then the situation changes and the SLI system shows its worth. The same happens with the NVIDIA GeForce 6800GT, but there the benefit of SLI is felt only at 1600x1200, and it is negligible. The ATI X850XT PE won at all the upper resolutions.

Tomb Raider Angel of Darkness

We now move on to games that demand a bit more from the graphics accelerator and are therefore less CPU-bound.

Tomb Raider Angel of Darkness

An amusing fact: as mentioned earlier, this test has never been NVIDIA's happiest, yet it is precisely here that SLI showed significant efficiency. At the two lowest resolutions everything is CPU-limited except the Gigabyte GeForce 6600GT; the higher the resolution, the weaker the CPU dependence. The impression is that the NVIDIA GeForce 6800GT in SLI mode escaped the CPU ceiling only at the top resolutions, where it beat the ATI X850XT PE. Note how big the difference between the SLI and non-SLI systems is at 1280x1024. Adding AA should extend that tendency to the lower resolutions as well.

Tomb Raider Angel of Darkness

And so it goes: the demonstrative SLI performance now extends down to 1024x768. At the top resolution the ATI X850XT PE dropped almost to the level of the single NVIDIA GeForce 6800GT, a card from a different price segment.

HALO

The game refused to start in AA and AF modes and failed to run with ATI's X850XT PE. All resolutions showed a CPU limit; only the single Gigabyte GeForce 6600GT showed a speed drop with resolution, which is the case where an SLI system actually has work to do. At low resolutions the SLI configurations lost to the single ones, which once again proves the CPU dependence.

FarCry

Now let's look at SLI in the tests that are most demanding of graphics resources. The first game, and the one fondest of a powerful CPU, is FarCry.

HALO

We see a complete and obvious win for the ATI X850XT PE, the least CPU-limited card in this test, while all the others kept the same figures from the lowest resolution to the top. My goodness: so many tests, and even an AMD Athlon 64 4000+ isn't enough for them. Again the SLI systems lost to single accelerators at low and middle resolutions; only the Gigabyte GeForce 6600GT in SLI mode made a good showing at 1600x1200. Adding AA and AF should make the graph more contrasting.

FarCry

And so it is. The lower resolutions repeat the previous graph; at the higher ones SLI beats the rest, which was to be proved, since the efficiency of SLI systems shows only where raw accelerator power runs short. By the way, the advantage of the NVIDIA GeForce 6800GT in SLI mode over the ATI X850XT PE is quite evident.

Doom3

Doom3

Absolutely strange, but the Gigabyte GeForce 6600GT in SLI mode failed to operate here: the SLI mode simply wouldn't start, and that's that. At the lower resolutions all the cards beg for CPU power. At the higher resolutions, the NVIDIA GeForce 6800GT in SLI mode scored a complete and indisputable victory over everyone. The ATI X850XT PE behaved rather poorly, as was to be expected. The Gigabyte GeForce 6600GT runs decently but could do much better. We add AA and AF.

Doom3

Here the ATI X850XT PE rehabilitated itself, showing performance comparable to the NVIDIA GeForce 6800GT in SLI mode. The Gigabyte GeForce 6600GT displays a perfectly smooth and predictable performance drop.

Half-Life 2

While the previous game favors NVIDIA products, this recently released masterpiece is crazy about ATI cards. Such is marketing.

Half-Life 2

Here we see a complete victory of ATI's product over all competitors: even the NVIDIA GeForce 6800GT in SLI found it very hard to compete with the ATI X850XT PE. Nevertheless, there are no flies on the efficiency of SLI, which at high resolutions gave the NVIDIA GeForce 6800GT SLI an almost twofold boost over the single card; things are exactly the same with the Gigabyte GeForce 6600GT in SLI mode. Let's complicate the task.

Half-Life 2

That hasn't changed the alignment of forces, although at the top resolution the NVIDIA GeForce 6800GT in SLI mode did edge ahead of the ATI X850XT PE. The efficiency of the Gigabyte GeForce 6600GT in SLI mode is visible at all resolutions and grows as the resolution climbs.

Aquamark

Now we are moving to specialized benchmarks.

Aquamark

The benefit for the Gigabyte GeForce 6600GT in SLI mode is clearly visible at all resolutions. The NVIDIA GeForce 6800GT, even without SLI, ran into the CPU ceiling at the bottom of the range, but toward the top the effect of SLI becomes more visible. The ATI X850XT PE slightly trailed the NVIDIA GeForce 6800GT in SLI mode at all resolutions except the lowest. By the way, the Gigabyte GeForce 6600GT in SLI mode runs faster than a single NVIDIA GeForce 6800GT. Now for the tests with AA and AF enabled.

Aquamark

Now the efficiency of the NVIDIA GeForce 6800GT in SLI mode is felt at every resolution, and so is the ATI X850XT PE's lag behind the leader. The advantage of the Gigabyte GeForce 6600GT in SLI mode over the NVIDIA GeForce 6800GT has become more vivid and can now be considered quite significant.

3DMark05 - Marks

3DMark05 - Marks

There it is, the triumph of NVIDIA's SLI technology. A pity it doesn't show this well in real games. The ATI X850XT PE runs a bit faster than the NVIDIA GeForce 6800GT without SLI and the Gigabyte GeForce 6600GT in SLI mode, while the NVIDIA GeForce 6800GT in SLI mode simply takes a commanding lead with truly unsurpassed results. The benefit of SLI is visible at all resolutions for both SLI systems, and its efficiency is simply amazing: at 1600x1200 it amounts to 93%. Enabling anti-aliasing should widen the difference even further.
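
For reference, the efficiency figure is simply the SLI score's gain over the single-card score; the scores below are invented placeholders, since the text quotes only the percentage:

    # SLI efficiency = SLI score / single-card score - 1.
    # The scores below are invented placeholders; the article gives only the 93%.
    def sli_efficiency(sli_score: float, single_score: float) -> float:
        return sli_score / single_score - 1.0

    print(f"{sli_efficiency(5790, 3000):.0%}")   # 93%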

3DMark05 - Marks

And so it roughly is. The NVIDIA GeForce 6800GT in SLI mode is the absolute leader, second place goes to the ATI X850XT PE, and third to the Gigabyte GeForce 6600GT in SLI mode. The speed falls perfectly uniformly with resolution, which says there is no CPU dependence here at all.

But that is merely an aggregate index, the 3DMark score; it is worth glancing briefly at each test of the suite separately. We include the graphs below for the sake of a complete report, though we see little sense in describing them all in detail.

3DMark05 – First gaming test

3DMark05 – Second gaming test

3DMark05 – Third gaming test

3DMark05 – Synthetics

SPECviewperf 8

Finally, a performance test in professional applications.

SPECviewperf 8

As we can see, SLI slightly reduces speed in all the professional applications, and the results for the SLI and non-SLI systems are largely identical. Moreover, there isn't even much difference between the Gigabyte GeForce 6600GT and the NVIDIA GeForce 6800GT. For now, then, SLI is of use only as a game accelerator.

Final Words

What a complex world this is. We thought it would be enough to fit two boards into a PC and reap a twofold performance boost, but it turns out you must also think hard about where that boost will appear and why. The efficiency of NVIDIA's SLI technology itself is beyond reproach. The problem arises under heavy load, when the CPU can no longer keep up with the power of the graphics accelerators, and that problem keeps getting more acute: improving graphics accelerators further is increasingly pointless without the processing capacity to balance them. Judge for yourselves: with the AMD Athlon 64 4000+, most tests without AA and AF were CPU-bound for the participating cards. Moreover, enabling AA and AF did not so much resolve the situation as merely let us glimpse the accelerators' own performance at 1280x1024 and 1600x1200. Even the newest, most graphically advanced games proved CPU-bound, to say nothing of the previous generation. On top of that, the SLI system itself consumes CPU resources and thereby handicaps itself against single-card solutions, whether built on the same components or on those of third parties such as ATI. In a word, whether to use SLI is up to you, but you will see its fruits only in games that thirst for graphics accelerator power without placing much load on the CPU; otherwise you would be better off with one card, not two.

The SLI technology itself is soundly conceived, and a great future is probably in store for it, provided, of course, that NVIDIA avoids blunders like those of 3dfx and doesn't stake everything on it; for now, for economic reasons, SLI is still far from ideal, and very few can afford it. The price of an SLI-capable motherboard combined with two video cards stings even more than that of a single-card solution of similar performance. The technology also doesn't yet run stably and still suffers from flaws, though in our view they are solvable at the driver level.

Nevertheless, we can state that Gigabyte's motherboard together with its video cards ran almost trouble-free, apart from the failure to enable SLI mode in some games. Each component on its own ran very stably and gave no cause for complaint. In any case, if you are set on buying an SLI-enabled computer right now, Gigabyte's products are a more than justified choice.

Author: Dmitriy Zinovyev