Category: Video Cards
Reviewed by: Zertz   
Reviewed on: December 12, 2008
Price: $252 x 2


For any gamer, getting the most performance out of the video card is vital to a pleasant gaming experience. That is probably why the most expensive component hiding inside the case is, more often than not, the graphics card. Obviously, not everyone can afford to spend the same kind of money on one, which is why both ATI and nVidia offer a range of models from budget parts to ridiculously expensive cards. For gamers who always want more out of their setup, pairing two cards is the only route to absolute top of the line performance, which is exactly why both companies offer the ability to combine the power of multiple cards. As you are probably aware, nVidia's iteration of this feature is called SLI, which now stands for Scalable Link Interface; the cards are linked together using a bridge connected between them. See where I'm going with this?

Recently, we reviewed the XFX GTX260 Black Edition and were quite impressed by the performance it delivered. Today, I am going over not one, but two of XFX's slightly lower clocked "XXX" versions of the GTX260. This one has its GT200 core clocked at 640MHz, just 26MHz shy of the praised Black Edition, while the GDDR3 memory runs at the exact same 2.3GHz.


Closer Look:


The box, or should I say boxes, are pretty impressive, to say the least. The huge GTX logo is really catchy, with light rays emitting from it and some kind of creature in the background keeping a close eye on you. The model name, 260, is clearly labeled right under it, although it no longer says anything about having 216 cores the way the boxes from a short while back did, since all current cards ship with the beefier core. Of course, this card supports SLI technology in both dual and triple card configurations. Also bundled with it is a very popular game, Call of Duty 4. Hopefully the latest iteration of the famous series, World at War, will be included in the near future. Nevertheless, it's an awesome game and it's free!








Once the cardboard cover is slid out of the way, XFX's usual flashy green box shows up, which really is a good thing considering it's not only good to look at but, more importantly, keeps the card safe and sound. I attempted to get things moving by shaking the box a bit, but quickly abandoned that pursuit once I realized nothing in there was going to budge. Inside, I found the door hanger as well as a couple of manuals and discs. After moving those out of the way, along with the one inch thick foam, the card itself finally shows its face. It is packaged in an anti-static bag that sits snugly in another thick layer of foam. Finally, at the bottom of the box, I found DVI to VGA and S-Video to Component adapters, a Molex to PCI-E power adapter, and an SPDIF cable. That last cable is used for the audio pass-through feature; more on that later.




Here's the full bundle that you'll get with this card: the driver disc (although newer drivers are available online), Call of Duty 4, a quick install guide and a brief manual. XFX's door hanger is also somewhat useful, since it has the card's unique serial number on the back along with a place to note the user name and password you create while registering the card, or cards in this case. This way, you don't have to pull the cards back out of the computer once you realize you forgot to note the serial numbers for registration.



Let's keep moving and take a deeper look at the card itself.

Closer Look:

This GTX260 is kept cool, if that is possible at all, by nVidia's reference design. Even though this is an overclocked version, there is absolutely no way to differentiate it from other models, since the sticker covering the cooler is the same. A little customization would have been nice to see, although it's not like you have to stare at them all day long, and it still looks pretty good. The whole card is covered by the housing from top to bottom, which keeps it safe from any damage that could occur.




On the front side, this GTX260 has a pair of dual-link DVI outputs as well as an S-Video output, which can be converted to component using the bundled adapter. Nothing out of the ordinary here, but the majority of people only use a single DVI port anyway. The back side of the card is relatively boring, with nothing in particular to show except an opening that lets cool air come in. Since a lot of people have a fan blowing toward the video card, it really helps keep the temperature of those 1.4 billion transistors under control.



The SLI connectors are hidden and protected by a plastic cover that can simply be clipped off. Since there is a pair of them, you can make three of these cards work together on a Tri-SLI capable motherboard, meaning either the 790i or X58 chipsets. The card is powered by two 6-pin PCI-E connectors, which are located on the side. Users with short cases will enjoy this arrangement, though it can make routing cables a bit awkward. Finally, that small connector on the right is the audio pass-through input, which lets the card output sound over a DVI to HDMI adapter using the supplied SPDIF cable.



It is now time to get XFX's GTX260 ready for testing.

Closer Look:

As you are probably aware, installing a video card is relatively simple, but there's more to it than just inserting it into a free PCI-E slot. The first thing you want to do is, I must admit, boring, but still very important: registering the card with XFX using the serial number found on the card or on the back of the door hanger. This has to be done within thirty days of purchase, so you'd better do it now, before you start gaming and entirely forget about it. XFX offers a lifetime warranty, so it's definitely worth the effort and a couple minutes of your time. Once that is done, you can take a glance at the manual and install the drivers.




















Just in case you don't have Adobe Reader installed, as is the case when you're running a fresh install, XFX includes a copy of the popular PDF reader on the disc, along with an electronic version of the installation manual. It's worth reading through the manual before installing the new hardware to avoid any missteps that could lead to mediocre performance.



Drivers are not just important, but vital to exploiting the full capabilities of your newly acquired graphics card; otherwise, it will fall back on the default Windows drivers, which are designed to provide compatibility with almost any card but very limited functionality. XFX bundles a disc with the latest drivers available at the time the card was packed, but with new driver releases coming out every so often, it is generally a good idea to check online for the latest ones.



If the default level of performance or visual quality is not up to your expectations, you can use nVidia's Control Panel to adjust some options. There are a bunch of different tabs where adjustments can be made, though I will only take you through a few of the ones I find most important to the gamers among us. The Manage 3D Settings tab allows the Global settings (applied in all situations) to be tailored to your liking, whether you want to gear the system toward performance or quality, and it holds some game specific settings as well. There are also pre-made profiles that can be adjusted to activate all the eye candy or to minimize the settings for higher framerates, although that most likely won't be a problem with the pair of cards on the bench today. The choice is all yours. This is also where you activate SLI.



Before moving on to testing, let's just take a quick look at XFX's specifications.


Manufacturing Process 65 nm
Number of Transistors 1.4 Billion
Graphics Clock (Including dispatch, texture units, and ROP units) 640 MHz
Processor Clock (Processor Cores) 1,400 MHz
Stream Processors 216
Memory Clock (Clock rate / Data rate) 1,150 MHz / 2,300 MHz
Memory Interface 448 bit
Total Memory Bandwidth 128.8 GB/s
Memory Size 896 MB
ROPs 28
Texture Filtering Units 72
Texture Filtering Rate 47.95 GigaTexels/sec
HDCP Support Yes
HDMI Support Yes (Using DVI-to-HDMI adaptor)
Connectors 2 x Dual-Link DVI-I 1 x 7-pin HDTV Out
Bus Technology PCI Express 2.0
Form Factor Dual Slot
Power Connectors 2 x 6-pin
Max Board Power 182 watts
GPU Thermal Threshold 105 C
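A couple of the figures in this table follow directly from the others. As a quick sanity check (just a sketch of the arithmetic, not anything XFX provides), the total memory bandwidth can be derived from the 448-bit interface and the GDDR3 data rate:

```python
# Sanity-check the spec table: memory bandwidth from bus width and data rate.
bus_width_bits = 448                 # memory interface width from the table
clock_rate_mhz = 1150                # base GDDR3 memory clock
data_rate_mhz = clock_rate_mhz * 2   # GDDR3 is double data rate -> 2,300 MHz

# bytes per second = (bus width in bytes) * effective transfers per second
bandwidth_gbs = (bus_width_bits / 8) * data_rate_mhz * 1e6 / 1e9

print(f"{bandwidth_gbs:.1f} GB/s")   # 128.8 GB/s, matching the table
```

The same doubling explains the "Clock rate / Data rate" entry above: 1,150 MHz of actual clock yields 2,300 MHz of effective transfers.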




At OverclockersClub.com, we use a set of benchmarks to stress the graphics card. We will use a series of newer gaming benchmarks, as well as some that are more seasoned, to show how well the pair of XXX cards from XFX compares to some of the other enthusiast video cards on the market. I will be using both single and multiple GPU models to compare the performance of the XXX GTX260s from XFX. All driver settings and clock speeds will be left at factory defaults for both the CPU and GPU in an effort to minimize or eliminate any variables that could impact the results. The test system used in this review is listed below. After testing the cards at stock speeds, I'll overclock them to see what kind of performance can be gained. All testing is done with the default settings in the respective control panels, as well as default settings in the BIOS of the motherboard used in this test. For this round of testing, our drivers have been updated to 177.79 for the nVidia cards and Catalyst 8.8 for the ATI video cards. The exception is the Far Cry 2 testing, where the nVidia driver used is 180.43 and the ATI driver is the Far Cry 2 hotfix driver.


Comparison Video Cards:



Overclocked settings:

Overclocking the XXX GTX260s above their stock settings would not, on its own, result in a really measurable performance increase at the tested resolutions. Part of the reason is that the games are CPU bound by the otherwise capable Q9450; at 2.66GHz, it just does not have the power to show what these cards are capable of. For that reason, I will deviate slightly from my normal testing regimen and bump the processor to 3.2GHz, the equivalent speed of the Core 2 Extreme QX9770, for a little additional power to see what the XFX GTX260 XXX cards have when overclocked and run in an SLI configuration. For factory overclocked cards, they still have a bit more headroom available to the overclocker and gaming enthusiast: almost another 100MHz on the GPU cores and almost 75MHz on the memory. Both are pretty significant increases over the factory overclock, let alone the stock numbers before XFX put the screws to them. The bump in CPU speed, together with the overclock on the cards, made for some significant gains in performance. In World in Conflict at 1920x1200 there was an increase of 9 FPS, and Far Cry 2 saw 14 FPS over the stock speeds on the CPU and GPU cores. The performance increase in WIC is almost 20%, and in Far Cry 2 a 15% bonus is there for the taking. If you are an overclocker, you will definitely want to put some additional clock speed to work to get the most from these cards. They do well without it, but the additional power sure makes a difference.
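The percentages quoted throughout this review are simple relative gains over a stock baseline. A minimal sketch of the arithmetic (the 45 FPS baseline below is an illustrative assumption, not a number from our charts; the review quotes only the 9 FPS delta for World in Conflict at 1920x1200):

```python
# How the scaling figures in this review are computed: the relative gain
# of an overclocked (or SLI) result over a stock baseline, in percent.

def percent_gain(stock_fps: float, oc_fps: float) -> float:
    """Return the performance gain of oc_fps over stock_fps, in percent."""
    return (oc_fps - stock_fps) / stock_fps * 100

# Illustrative only: a 9 FPS gain on an assumed 45 FPS baseline is 20%,
# consistent with the "almost 20%" increase quoted for WIC above.
print(f"{percent_gain(45, 54):.0f}%")
```

The same helper covers the SLI scaling comparisons later on, where the baseline is a single card at stock instead of the stock SLI pair.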



  1. Crysis
  2. Knights of the Sea
  3. BioShock
  4. Call of Duty 4
  5. World in Conflict
  6. Far Cry 2
  7. Company of Heroes: Opposing Fronts
  8. 3DMark 06 Professional
  9. 3DMark Vantage



Crysis has been out for quite some time now. In that time there has yet to be a single or multi GPU setup that can fully showcase the graphics performance of the game. Will two GTX 260s be the combination to do it? The Crysis single player demo includes both CPU and GPU benchmarks to test the performance of your processor and video card.





















At low resolutions, Crysis simply does not benefit enough from the second GPU to make it worth the investment. Things start to become interesting at 1680x1050. However, once I jumped to 1920x1200 the stock SLI setup came in 30% faster than the single Black Edition GTX260, which isn't bad but still miles from perfect scaling. With the processor set at 3.2GHz and higher clocks on the cards, this performance advantage steps up to 70%! At those speeds, it also keeps its rival, the 4870X2, at bay, leading by a mere 7% at stock but by nearly 40% when overclocked.



PT Boats: Knights of the Sea is a new DX10 title that features its own proprietary graphics engine currently in development. The game is a combination of Real Time Strategy and Simulation. You have the ability to control the entire crew or just a single member. Play as the German, Russian or Allied navies, and prove your mettle on the open seas.


Video Settings:


















SLI scaling, and Crossfire scaling as well, in Knights of the Sea ranges from poor to nonexistent. Obviously, I am running into processor limitations, which would explain why the scores really start to fly with the Core 2 Quad clocked higher. XFX's pair of GTX260s can only equal the GTX280, but they manage to keep a 30% performance lead over the ATI 4870X2. Once again, overclocking the processor leads to a significant improvement, allowing them to cruise to the top of our graphs and gain another 20% on the red team's dual GPU card. Things don't look quite as good when you compare the overclocked SLI setup to a single GTX260 at stock: at the very best, SLI scores 18% higher.



BioShock is one of the creepier games out in the wild: the building of a perfect Utopian society undersea gone horribly wrong, its inhabitants driven mad by the introduction of tonics and genetic modifications. Now Rapture is just a shadow of its former glory, with little girls looting the dead of what little they have left while being shadowed by guardians known as "Big Daddies." It is a demanding game that will make your hardware scream for mercy. This First Person Shooter allows for countless combinations of weapons and modifications to provide a unique experience each time it is played. The environment as well as the story line will wrap you up for hours on end.


Video Settings:



















Low resolutions really don't do this SLI setup any justice, as the higher clocked single GTX260 and the GTX280 keep a tight fight going for the lead. Things quickly turn around for today's main attraction at 1920x1200, where the 260s take a 34% and 50% performance lead over the single GTX at stock and overclocked settings, respectively. The 4870X2 still trails by a couple of frames per second, nothing significant enough, especially when dealing with over a hundred FPS, to declare a clear winner.


Call of Duty 4: Modern Warfare is the successor to the Call of Duty crown. This iteration of the game is fought in many of the world's hot spots with modern armaments and firepower. You can play as either a US Marine or British SAS trooper. Since this game does not feature an in-game test, I will run through a section of the game and measure average FPS using Fraps 2.9.3.


Video Settings:

















This is one of those SLI friendly games, especially at the highest tested settings, where the pair of 260s outperforms the single card by over 60%. With the setup overclocked, this number climbs to an impressive 82%. The 4870X2 is still being beaten, this time by nearly 16% and 30% at stock and overclocked settings, respectively. Although this setup consistently beats ATI's dual GPU card, scaling isn't nearly as good at lower resolutions.



Released last year, World in Conflict is a Real Time Strategy game that simulates the all-out war the world hopes never comes. The difference in this RTS game is that it is not the typical "generate wealth and build" type of game. Instead, you advance by conquering your foe, with limited opportunities to replenish your troops.


Video Settings:

















Today's powerful SLI test bench clearly maxes out this benchmark, scoring basically the same at every single resolution. This means that even with the quad core overclocked to 3.2GHz, the processor is still holding the cards back. Scaling is still somewhat decent, 20% and 40% at stock and overclocked settings, and the setup still comes out ahead of the 4870X2. So while it does beat every other card by a good margin, XFX's GTX260s need even more processing power before they can really start to fly.


Far Cry 2:

While I have not run all of the cards in the comparison list through the Far Cry 2 benchmark, I will add the XFX GTX260 XXX cards in an SLI configuration as well as an HD4870X2 to the list of cards I have run through the testing. Our testing is not yet complete but you can look for Far Cry 2 to become part of the OverclockersClub benchmark suite. The settings used are just a few steps below the maximum in-game settings and offer a good blend of performance vs. visual quality.

















Far Cry 2 falls into a very similar situation to World in Conflict: the SLI setup shows improvements over a single 260 and dominates every other card, except Zotac's overclocked GTX280, which manages to beat it by a slight margin at lower resolutions. However, the GTX260s can't quite spread their wings the way they would want to, because I once again ran into a processor bottleneck. This goes to show how much power these cards can deliver: no matter how high the resolution, they still cruised to 78 and over 90 frames per second at stock and overclocked settings, respectively.


Company of Heroes: Opposing Fronts is the latest chapter in the Company of Heroes series. The scene is WWII. The mission is Operation Market Garden, the first allied attempt to break into the Third Reich. Play as the British or Germans. This Real Time Strategy game is brought to us by Relic Entertainment.


Video Settings:



















The twin GTX260s still stand tall in the charts, scoring 26% higher at stock and almost 60% higher when overclocked than XFX's own Black Edition of the same card. They also keep a solid lead over the red team's flagship card, as well as every other model, especially when both the processor and the cards are overclocked.


3DMark06 is one of those benchmarks that always comes up when a bragging contest breaks out. 3DMark06 presents a severe test for many of today's hardware components. Let's see how this setup fares. The settings we will use are listed below.



















Performance in 3DMark06 is not so impressive: scaling is poor, and the setup gets beaten by both of ATI's dual GPU cards, the 4870X2 and the 4850X2. While the stock scores aren't much to look at, with some overclocking I was able to get them to jump to over 17,000 at each resolution, enough to beat the competition by a fair margin.


Featuring all-new game tests, this benchmark is for use with Vista based systems. "There are two all-new CPU tests that have been designed around a new 'Physics and Artificial Intelligence-related computation.' CPU test two offers support for physics related hardware." There are four preset levels that correspond to specific resolutions, from 'Entry' at 1024x768 up to 'Extreme' at 1920x1200. Of course, each preset can be modified for any number of user designed tests. For our testing, I will use the four presets at their default settings.

















Except at the "Entry" preset, 3DMark Vantage seems to enjoy having a couple of cards at its disposal. The GTX260s easily win at every available setting, and the scaling is nearly as good as it gets. At stock settings, the SLI setup is 56% faster than a single overclocked GTX260, and overclocking grants an additional 14% improvement, making it nearly 80% faster than a single card. While they were consistently beaten by the red dual GPU cards in the previous 3DMark, this time around the green team comes out the clear winner.


This pair of GTX260s from XFX is nothing short of impressive. They feature an awesome looking heatsink, and the bundle is all you need to get started: a couple of adapters as well as the award winning Call of Duty 4. This is also the latest version of the 260, equipped with 216 processing cores and a core clock of 640MHz, 64MHz over the standard edition. That's better than nothing, but since I just cannot settle for factory settings, I, of course, overclocked these cards as far as they would go. I was able to take the cores to 732MHz, which is pretty decent headroom. Obviously, the GTX260 is a very good performer, but as you have seen, a pair of them in SLI performs even better. The twins won every single benchmark in our suite except Futuremark's 3DMark06, which, for some reason, has always performed better on ATI video cards. Saying they destroyed every other card wouldn't be entirely true, though. At resolutions under 1920x1200, performance wasn't that far above competing cards and scaling was often poor, especially at stock settings. However, once I started benching at 1920x1200, while the other cards' performance started dipping, this setup stayed up there. They always delivered very playable framerates, and that's to be expected considering the money this setup will take out of your pockets. A pair of these will likely cost a bit less than a 4870X2 and provide a performance increase anywhere from 5% to 10% in most games.

Now let's get through the cons, because, yes, there are some. Price is somewhat of an issue, although if you are even considering buying more than one video card, money is most likely not a huge problem. Graphics cards are the hottest running components in a modern enthusiast computer, so when you put two of them side by side, things tend to get hot. They climbed to about 80 degrees Celsius, which, although not life threatening, isn't anywhere near cool. Something that has always been an issue with multiple cards working together is scaling. Most of us are aware that adding a second card won't double performance for various reasons, but it also won't necessarily increase it by a steady percentage. Comparing these XXX cards to the slightly higher clocked Black Edition led to anywhere between a 0% and, at best, with both the video cards and the processor overclocked, an 80% increase.

Overall, this was really an awesome pair of cards to have on the test bench. If you can afford it, this is definitely a great choice as far as performance goes. Be aware, though, that this kind of graphics processing power needs a very powerful processor to show off everything it has under the hood, since even with a 3.2GHz quad core, some games were still bottlenecked at a set framerate. Gaming at high resolution with all the eye candy turned on was a tremendously pleasant experience that simply can't be matched by a single GPU.