Gigabyte GTX 560 OC Review

Category: Video Cards
Reviewed by: RHKCommander959   
Reviewed on: May 17, 2011
Price: $199

Introduction:

NVIDIA is back at the naming game with the new GTX 560, a card aimed squarely at the heart of the performance segment of NVIDIA's new generation. Note that the Ti moniker on the earlier GTX 560 Ti now carries real weight, since it separates that card from this plain GTX 560. It is unclear whether NVIDIA will do the same with the GTX 550 Ti, although the possibility exists. To head off some confusion, the GTX 560 is superior to the GTX 550 Ti in CUDA core count and therefore performance. The design is very similar to the GF104-based GTX 460, so it should be interesting to see how the two compare, as the GTX 560 is built on GF114 silicon. NVIDIA has stated that this card is not the replacement for the GTX 460; the designs are very similar, with the GTX 560 clocked faster on both core and memory. NVIDIA is marketing this card as the perfect upgrade for users with older graphics cards, and according to the Steam hardware survey many gamers are indeed still running aging 9800 GTs.

The GTX 560 under review today, at product launch, is from Gigabyte. It carries a 20 MHz factory overclock on the core, bringing it to 830 MHz, while the memory is stock at 4008 MHz on a 256-bit memory bus, just like the GTX 560 Ti. The heat sink cooling the card is Gigabyte's WindForce 2x anti-turbulence cooling solution, which uses two 100mm fans and four 6mm heat pipes. The fans are inclined slightly to help push air toward the exhaust grill and out of the case; recirculating exhaust air is a common problem with open, impeller-based fan designs like this one. Each 100mm fan can push up to 30.5 CFM while operating quietly under PWM fan control. Gigabyte has chosen not to solder the heat pipes, claiming that solder actually hurts thermal conduction compared to simply clamping the fins to the pipes. Air is a worse thermal conductor than metal, however, and in general cheaper heat sinks skip the solder to save money. There is also the possibility that the disabled CUDA cores could be re-enabled, depending on whether they were physically disabled or merely switched off in the BIOS. If the core could be unlocked, the card would effectively become a GTX 560 Ti!

 

Closer Look:

The front of the box has a robotic eyeball looking outward, and the color scheme is mostly blues and golds with a few small exceptions. The top carries two NVIDIA stickers, one marking Gigabyte as an authorized board partner and the other noting that the card is 3D Vision ready. The Gigabyte logo found on all sides of the box is reflective and changes color depending on the angle of the light. The lower portion has three stickers showcasing features of the card. The first is a 3-year warranty covering Canada, the USA, and Latin America. Gigabyte's Ultra Durable technology also extends to this video card, meaning a thicker 2 oz copper PCB, Japanese solid capacitors, ferrite core chokes, and MOSFETs. The chokes should not buzz like cheap ones do, efficiency is increased across the board (no pun intended), and life span benefits from the higher-end components. The card is cooled by a system dubbed WindForce 2x anti-turbulence cooling, which uses two low-RPM (under 2000 RPM) 100mm fans set at a very slight angle to help guide air out of the exhaust grill, with four heat pipes touching the GPU heat spreader directly to carry heat from the core to the fins. Some air still escapes and recirculates around the case, but at least the flow is directed outward more than on most heat sink assemblies. The back of the box goes into further detail on the WindForce cooler, showing that the goal was quiet operation with strong cooling, and explains the Ultra Durable VGA components in much greater depth: Gigabyte claims 10-30% better overclocking capability, 10-30% lower power switching loss, and GPU temperatures 5-10% lower. The main features are repeated at the bottom in nine languages.

The top flap of the box is similar to the front and back, recycling the same details. The look is nice and clean while describing the card adequately from every side. The other flap carries some barcodes along with the same information from the first flap. Both sides are again similar, with one showing a few features in grey boxes. The box looks good overall; the back has some minor grammar issues, but on the whole it shows quality.

Opening the outer packaging reveals a smooth, glossy cardboard box. Inside, the driver CD is tucked into the manual at the top, with foam protecting the card underneath. To the right is a space holding the hardware accessories: two double-Molex to 6-pin PCIe power adapters, one Mini HDMI to HDMI adapter, and one DVI to D-Sub VGA adapter. As always, the card is protected by an anti-static bag! Inside the bag, the card is further protected by caps on the outputs and the SLI connector, while the PCI Express slot contacts and power connections are left unguarded.

 

 

Time to get a look at the card itself!

Closer Look:

The Gigabyte GTX 560 OC is built with components similar to those used on the company's motherboards, a design Gigabyte calls Ultra Durable VGA. More copper is built into the PCB layers for better conductivity, and higher quality capacitors, MOSFETs, and chokes give the card a longer life expectancy while reducing temperatures and power consumption. The chokes should not emit the squeal that can be heard from the cheap chokes used on many vanilla cards (the bane of most gamers). With 336 CUDA cores built on TSMC's 40nm node, the GTX 560 should perform similarly to the GTX 460 before it while sharing a very similar design with the GTX 560 Ti; the core is most likely the same silicon with some functionality disabled. The reference core speed has been increased to 810 MHz from the 675 MHz of the GTX 460. The memory layout is also familiar: again 1GB of GDDR5 on a 256-bit bus, with the memory clock raised from 900 MHz to 1002 MHz. Performance will very likely be similar, considering that the GTX 460 already had more in common with the GTX 500 line than with the rest of the GTX 400s. All things considered, it is a little odd for NVIDIA to say this isn't a replacement for the GTX 460, unless the plan is to keep the GTX 460 around just as the GTS 450 has been kept for the lower end.
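For context, those clock bumps work out to a tidy generational uplift on paper:

(810 − 675) / 675 ≈ 20% on the core and (1002 − 900) / 900 ≈ 11% on the memory, before Gigabyte's extra 20 MHz factory overclock is counted.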

The heat sink uses two large transparent 100mm impellers, angled slightly to help blow air from the back of the card toward the front. The directional flow helps because the fans are not fighting each other in the middle, which is the usual problem when two fans are mounted flat and in line. Heat is still dumped into the case, but some of it exits through the grill. The design cools the GPU far more effectively, and far more quietly, than reference blower-style coolers. The PWM fans are rated for a maximum of 2000 RPM and so should be very quiet in operation. The heat sink uses four 6mm direct-contact heat pipes to pull heat off the GPU core and spread it quickly into the fins. Gigabyte claims that by not using solder here, heat transfer is improved because there are fewer materials for the heat to pass through. The catch is that air is a worse thermal conductor than solder, and press-fitted fins inevitably leave small gaps between the fins and the heat pipes. The back of the card is free of any major electronics; most new cards put all of their memory on the GPU side these days. Four spring-loaded screws hold the heat sink to the GPU core, and near the 6-pin connectors a small aluminum heat sink, attached with push-pins, cools the MOSFETs.
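For a rough sense of why those air gaps matter, approximate room-temperature thermal conductivities (textbook ballpark figures, not measurements from this card) are:

copper ≈ 400 W/m·K, aluminum ≈ 235 W/m·K, tin-lead solder ≈ 50 W/m·K, thermal grease ≈ 1 to 10 W/m·K, still air ≈ 0.03 W/m·K

Any air left between pipe and fin costs far more than an extra solder layer would add, so a tight press fit with good paste coverage is what makes the solderless approach viable.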

The outputs are two DVI-I ports and one Mini HDMI port. Audio for the Mini HDMI output is routed through the PCB; in the early days of NVIDIA HDMI support, a separate S/PDIF cable had to be run from the motherboard or sound card to the video card to get audio to the port. With the bundled D-Sub VGA adapter, virtually any display on the market can be connected. NVIDIA hasn't yet deemed DisplayPort worth the jump, since the standard hasn't caught on very well. The rear of the card has two 6-pin power connectors; the growing trend is to mount them on the side of the card, but the Gigabyte version has them facing out the back, with the fan header right next to them. (Editor's note: I personally prefer rear-mounted power ports, as the wires are easier to hide.)

 

 

The WindForce logo is placed onto the side of the plastic fan shroud so that when the card is installed, people can see it through a case window or open side panel. The slight fan angle can be made out in these two photos. The design should still only take up two slots. The fan angle would help the card breathe in SLI configurations although care should be taken to make sure the adjacent card's mounting screws don't catch an impeller!

 

 

The four direct-contact heat pipes cover the GPU core area very well and sit nearly flush with the rest of the base. The gaps between the heat pipes are considerable, though, and should be packed with thermal paste when the cooler is re-installed; from the factory, the grooves and heat sink base were already liberally covered in paste. The fans are large enough to overhang the heat sink, which should help cool nearby motherboard components such as chipset heat sinks. The PCB has plenty of extra holes for different cooling options, and with six holes surrounding the GPU core, serious mounting pressure could be applied for extreme cooling such as DICE or LN2. Most of the Ultra Durable VGA components can be seen with the heat sink removed: all of the chokes wear stickers reading "Metal Choke," the MOSFETs hide under the small aluminum heat sink for thermal relief, and the solid Japanese capacitors sit nearby. The memory is produced by Hynix, model H5GQ1H24AFR-T2C, rated for up to 5.0 GHz operation at 1.5V but clocked here at 4008 MHz. The GF114 core is protected by an integrated heat spreader (IHS), which guards the die against damage from improper heat sink installation at the cost of some thermal transfer. A common misconception among enthusiasts is that the whole IHS must be coated in thermal paste for proper heat transfer; only the area above the core really needs paste, and the temperature difference is negligible when it is applied properly. A bare die, on the other hand, definitely needs complete coverage. Some enthusiasts opt to remove the IHS and either leave it off or replace the paste underneath, though removing it can lead to heat sink mounting issues.
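On paper the Hynix rating leaves a comfortable margin: 5000 MHz rated ÷ 4008 MHz stock ≈ 1.25, or roughly 25% of headroom, which fits nicely with the 1125 MHz (4500 MHz effective) memory overclock reached later in this review.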

 

 

Move on to catch a glimpse of the specifications and features!

Specifications:

Graphics Engine: GeForce GTX 560
Bus Standard: PCI Express 2.0 x16
Memory Type: 1GB GDDR5
Memory Interface: 256-bit
Core Clock Speed (MHz): 830
Memory Clock Speed (MHz): 4008
Memory Bandwidth (GB/sec): 128.3
CUDA Cores: 336
DVI Output: 2
D-SUB Output: 1 (via DVI to D-Sub adapter)
HDMI Output: 1 (via Mini HDMI to HDMI dongle)
Mini HDMI Output: 1
VIVO (Video-in/out): N/A
HDTV Support: Yes
HDCP Support: Yes
Dual-link DVI: Yes
Display Output (Max Resolution): 2560x1600
RAMDAC: 400 MHz
DirectX Version Support: 11
OpenGL Version Support: 4.1
SLI Support: Yes
3-way SLI: N/A
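The bandwidth figure follows directly from the memory specifications above:

4008 MHz (effective) × 256 bits ÷ 8 bits per byte = 128,256 MB/s ≈ 128.3 GB/s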

 

Features:

All Information courtesy of Gigabyte

Testing:

Testing consists of running Aliens vs. Predator, Metro 2033, Crysis Warhead, Call of Duty: Modern Warfare 2, Just Cause 2, Unigine Heaven Benchmark 2.1, Battlefield: Bad Company 2, 3DMark 11 Professional, 3DMark Vantage, and temperature and power consumption measurements. Three common resolutions are used for all of the game tests with 4xAA and 16xAF, while the 3DMark tests use their preset runs instead. After a pass through all of the tests, the card is overclocked to roughly its maximum stable speeds and tested again. Settings stay the same for each card tested so the results can be compared, and all testing is done on similar hardware running 64-bit Windows 7. The charts are organized from best to worst performance.

 

Comparison Video Cards:

 

 

Overclocking:

I set out hoping to overclock this card well, considering that it is factory overclocked already and that the upgraded PCB components and heat sink suggested it should have headroom. Without any voltage adjustment available, the card pushed to 910 MHz on the core and 1125 MHz on the memory using MSI Afterburner V2.20 Beta 2. 920 MHz and higher would crash after a few test runs, but 910 MHz was rock solid; 1135 MHz on the memory was unstable, while backing off to 1125 MHz made it stable. Gigabyte includes software called Easy Boost on the driver CD with the same functionality, but again no voltage control. The program also has some handy extras such as BIOS backup and flashing and version information. Overclocking was a simple matter of trial and error: stepping the core up until the card started freezing, backing off to the last stable speed, and then repeating the process for the memory. Temperatures were great and the card remained silent compared to the case fans. With voltage modification I'm sure this card could have gone a good deal further; stock voltage was only 0.975V on the core.
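For anyone curious how that trial-and-error search looks when scripted, here is a minimal sketch. It does not talk to Afterburner, Easy Boost, or the card at all; apply_clock and run_stability_test are hypothetical stand-ins for whatever overclocking tool and stress test you actually drive, so treat it as an outline of the search rather than a working utility.

```python
# A minimal sketch of the trial-and-error search used in this review: raise a
# clock in small steps until the stress test fails, then fall back to the last
# stable value. apply_clock and run_stability_test are hypothetical stand-ins;
# nothing here talks to MSI Afterburner, Easy Boost, or real hardware.

def find_max_stable(start_mhz, step_mhz, apply_clock, run_stability_test):
    """Return the highest clock (MHz) that passed the stability test."""
    clock, last_stable = start_mhz, None
    while True:
        apply_clock(clock)              # set the core (or memory) clock
        if not run_stability_test():    # e.g. loop a benchmark for 30 minutes
            break                       # first failure ends the search
        last_stable = clock
        clock += step_mhz
    if last_stable is not None:
        apply_clock(last_stable)        # leave the card at its best stable clock
    return last_stable


if __name__ == "__main__":
    # Dummy stand-ins so the sketch runs on its own; they mimic a card that,
    # like this review sample, is stable up to 910 MHz on the core.
    state = {"core": 830}

    def set_core(mhz):
        state["core"] = mhz
        print(f"core -> {mhz} MHz")

    best = find_max_stable(830, 10, set_core, lambda: state["core"] <= 910)
    print("highest stable core clock:", best)  # prints 910
```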

 

 

Maximum Clock Speeds:

Testing for the maximum clock speeds consists of looping Crysis Warhead and Unigine 2.5 for 30 minutes each to see where the clock speeds fail when pushed. If a clock setting fails, it is adjusted and the tests are re-run until the card passes the full hour of testing.

   

 

  1. Aliens vs. Predator
  2. Metro 2033
  3. Crysis Warhead
  4. Call of Duty: Modern Warfare 2
  5. Just Cause 2
  6. Unigine Heaven Benchmark 2.1
  7. Battlefield: Bad Company 2
  8. 3DMark 11 Professional
  9. 3DMark Vantage
  10. Temperature
  11. Power Consumption

Aliens vs. Predator, developed by Rebellion Developments, is a science fiction first-person shooter and a remake of the studio's 1999 game. It is based on the two popular sci-fi franchises and lets you play the single-player campaign as one of three species: the Alien, the Predator, or the Human Colonial Marine. The game uses Rebellion's Asura engine, which supports dynamic lighting, Shader Model 3.0, soft particle systems, and physics. To test this game I will be using the Aliens vs. Predator benchmark tool with the settings listed below; all DirectX 11 features are enabled.

 

Settings

Higher = Better

 

With Aliens vs. Predator, the GTX 560 tied or beat the Sapphire HD 6870 in 5 out of 6 tests! Performance was almost identical to the GTX 460.

Testing:

Part first-person shooter, part survival horror, Metro 2033 is based on the novel of the same name, written by Russian author Dmitry Glukhovsky. You play as Artyom in a post-apocalyptic Moscow, where you'll spend most of your time traversing the metro system, with occasional trips to the surface. Despite the dark atmosphere and bleak future for mankind, the visuals are anything but bleak. Powered by the 4A Engine, with support for DirectX 11, NVIDIA PhysX and NVIDIA 3D Vision, the tunnels are extremely varied — in your travels, you'll come across human outposts, bandit settlements, and even half-eaten corpses. Ensuring you feel all the tension, there is no map and no health meter. Get lost without enough gas mask filters and adrenaline shots and you may soon wind up as one of those half-eaten corpses — chewed up by some horrifying manner of irradiated beast that hides in the shadows just waiting for some hapless soul to wander by.

 

Settings:

Higher = Better

 

The GTX 560 and GTX 460 score similarly again. The card was slightly behind the 6850 in Metro 2033.

Testing:

Crysis Warhead is a standalone expansion pack that runs parallel to the storyline of the original Crysis. As Sergeant "Psycho" Sykes, you have a secret mission to accomplish on the far side of the island, contending with EMP blasts and aliens along the way as you hunt down the KPA chief. The game uses an enhanced version of CryEngine 2.

Settings

Higher = Better

 

The card beats the 6850 from XFX with Crysis Warhead. Performance was similar to the 5800 cards, usually falling right between the two.

Testing:

Call of Duty: Modern Warfare 2 is an iteration of the venerable first person shooter series, Call of Duty. Despite its long, successful pedigree, the game is not without substantial criticism and controversy, especially on the PC. Aside from the extremely short campaign and lack of innovation, the PC version's reception was also marred by its lack of support for user-run dedicated servers, which means no user-created maps, no mods, and no customized game modes. You're also limited to 18-player matches instead of the 64-player matches that were possible in Call of Duty 4: Modern Warfare. Despite all this, the game has been well received and the in-house IW 4.0 engine renders the maps in gorgeous detail, making it a perfect candidate for OCC benchmarking. You start off the single player missions playing as Private Allen and jump right into a serious firefight. This is the point where testing will begin. Testing will be done using actual game play with FPS measured by Fraps.
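As an aside for anyone reproducing this kind of Fraps-based testing, the arithmetic for turning frame times into FPS figures is simple. The snippet below assumes a plain CSV containing a single column of frame times in milliseconds; that layout and the file name are assumptions made for illustration, not Fraps' documented log format.

```python
# Minimal sketch: turn a column of frame times (in milliseconds) into FPS
# figures. The single-column CSV layout and the file name are assumptions for
# illustration only, not Fraps' documented output format.
import csv

def fps_summary(path):
    with open(path, newline="") as f:
        frametimes_ms = [float(row[0]) for row in csv.reader(f) if row]
    total_seconds = sum(frametimes_ms) / 1000.0
    average_fps = len(frametimes_ms) / total_seconds   # frames over elapsed time
    worst_fps = 1000.0 / max(frametimes_ms)            # slowest single frame
    return average_fps, worst_fps

if __name__ == "__main__":
    avg, worst = fps_summary("frametimes.csv")         # hypothetical file name
    print(f"average: {avg:.1f} FPS, worst frame: {worst:.1f} FPS")
```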

Settings

Higher = Better

 

Performance fell between the 6870 and 6850. NVIDIA's claims that this card is superior to the 6850 are holding true for the most part.

Testing:

Just Cause 2 is a third-person shooter that takes place on the fictional island of Panau in Southeast Asia. In this sequel to 2006's Just Cause, you return as Agent Rico Rodriguez to overthrow an evil dictator and confront your former boss. When you don't feel like following the main story line, you're free to roam the island, pulling off crazy stunts and causing massive destruction in your wake, all beautifully rendered by the Avalanche Engine 2.0. In the end, that's what the game basically boils down to — crazy stunts and blowing things up. In fact, blowing things up and wreaking havoc is actually necessary to unlock new missions and items.

 

Settings

Higher = Better

 

With Just Cause 2, the GTX 560 was able to keep up with many of the higher-end cards. Overclocking gave it a moderate boost, bringing it close to the performance of the GTX 480.

Testing:

Unigine Heaven Benchmark 2.1 is a DirectX 11 GPU benchmark based on the Unigine engine. What sets the Heaven Benchmark apart is its use of hardware tessellation, available in three modes: Moderate, Normal, and Extreme. Although tessellation requires a video card with DirectX 11 support and Windows Vista/7, the Heaven Benchmark also supports DirectX 9, DirectX 10 and OpenGL. Visually, it features beautiful floating islands that hold a tiny village and extremely detailed architecture.

 

Settings

Higher = Better

 

Performance was the same between the Gigabyte card and the GTX 460 in Unigine. The 6850 was far behind. Overclocking gave a good boost in all three resolutions!

Testing:

Battlefield: Bad Company 2 is a first-person shooter developed by EA Digital Illusions CE (DICE) and published by Electronic Arts for Windows, PlayStation 3, and Xbox 360. The game is part of the Battlefield franchise and uses the Frostbite 1.5 engine, which allows for destructible environments. You can play the single-player campaign or take on multiplayer with five different game modes. Released in March 2010, it has so far sold in excess of six million copies.

Settings

Higher = Better

 

The GTX 560 was able to outscore the 6970 in all three resolutions! This shows that Bad Company 2 leans more towards the NVIDIA side for performance. The 6800 cards are nowhere to be seen.

Testing:

3DMark 11 is the next installment from Futuremark in the 3DMark series, with Vantage as its predecessor. The name signals that this benchmark is built for Microsoft DirectX 11 and, by coincidence, it also matches the upcoming year, which was the naming scheme of some prior versions of 3DMark anyway. 3DMark 11 is designed solely for DirectX 11, so Windows Vista or 7 and a DirectX 11 graphics card are required to run it. The Basic Edition allows unlimited free tests in Performance mode, whereas Vantage only allowed a single test run. The Advanced Edition costs $19.95 and unlocks nearly all of the benchmark's features, while the Professional Edition runs $995.00 and is mainly suited to corporate use. The benchmark contains six tests: four aimed purely at graphics, one for physics handling, and one that combines graphics and physics. The open source Bullet physics library is used for the simulations, and although it is not as mainstream as Havok or PhysX, it is still a popular choice.

The new benchmark comes with two demos that can be watched, both based on the tests but, unlike the tests, containing basic audio. The first demo, titled "Deep Sea," has a few vessels exploring what looks to be a sunken U-boat. The second, "High Temple," resembles South American tribal ruins, with statues and the occasional vehicle scattered around. The demos are simple in that they have no story; they are really just a demonstration of what the testing will look like. The vehicles carry the logos of sponsors MSI and Antec on their sides, sponsorships that help keep the Basic Edition free. The four graphics tests are slight variants of the demos. I will use the three benchmark preset levels to test the performance of each card, as the presets can be run with the free version and therefore allow results to be compared beyond a custom set of test parameters.

 

Settings

Higher = Better

 

Performance is close between the GTX 460 and GTX 560 in 3DMark 11, with the 460 winning in the early presets and losing in the later ones. The 6850 trails in all of the tests as well.

Testing:

Featuring all-new game tests, this benchmark is for use with Vista-based systems. "There are two all-new CPU tests that have been designed around a new 'Physics and Artificial Intelligence-related computation.' CPU test two offers support for physics related hardware." There are four preset levels that correspond to specific resolutions, from "Entry" at 1024 x 768 up to "Extreme" at 1920 x 1200. Each preset can, of course, be modified to create any number of user-designed tests. For this testing, I will use the four presets at their default settings.

 

Settings

Higher = Better

 

The closest competition to the GTX 560 from AMD is the 6870 in Vantage. The older 5870 outperforms the card by a solid margin. As the resolution climbs, the scores get closer to the GTX 460's.

Testing:

Temperature testing will be accomplished by loading the video card to 100% using MSI Kombustor, paired with MSI's Afterburner overclocking utility for temperature monitoring. I will be using the stability test at a resolution of 1920 x 1200 with 8xAA, run for 15 minutes to ensure the maximum thermal threshold is reached. The fan speed is left under the control of the driver package and the video card's BIOS for the stock load test, and set to 100% for the overclocked load test to show the best possible cooling scenario. The idle test is a 20-minute cool-down, with fan speeds left on automatic for the stock testing and bumped to 100% for the overclocked idle and load testing. For load testing the GTX 500 series, I will use Crysis Warhead running at 2560 x 1600 on the Gamer setting with 8xAA, looping the Avalanche benchmark scenario, as I have found this to load a video card nearly as heavily as Kombustor. This works around the current limiting of the GTX 500 series, which kicks in when the card detects programs, such as Kombustor, that put an unrealistic load on the GPU.
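For reference, pulling the peak temperature out of a monitoring log is only a few lines of work. The sketch below assumes a simple CSV export with a header row and a gpu_temp column; that layout and the file name are illustrative assumptions, not Afterburner's actual log format.

```python
# Sketch: pull the peak GPU temperature out of a monitoring log. The CSV layout
# (header row with a gpu_temp column) and the file name are illustrative
# assumptions, not the actual format written by any particular tool.
import csv

def peak_temperature(path, column="gpu_temp"):
    with open(path, newline="") as f:
        temps = [float(row[column]) for row in csv.DictReader(f) if row.get(column)]
    return max(temps)

if __name__ == "__main__":
    print("peak load temperature:", peak_temperature("gpu_monitor_log.csv"), "°C")
```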

Settings

Lower = Better

 

The Gigabyte GTX 560 managed the best temperatures of the pack in two of the four results: stock load and overclocked idle. The card came close at stock idle and did very well at overclocked load, too. These temperatures are fantastic.

Testing:

Power consumption of the system will be measured at both idle and load, recording the peak power draw of the system with each video card installed. I will use MSI Kombustor to load the GPU for a 15-minute test and take the peak system draw as the result for maximum load. The idle result is measured after 15 minutes of inactivity on the system. For load testing the GTX 500 series, I will once again use Crysis Warhead run at 2560 x 1600 on the Gamer setting with 8xAA, looping the Avalanche benchmark scenario.
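The wall-meter readings reduce to a couple of figures per card. The sketch below, using made-up sample readings, shows that reduction, including the load-minus-idle delta that gives a rough idea of how much the graphics card itself adds to system draw.

```python
# Sketch: reduce two series of wall-meter readings (watts) to the figures used
# in the charts: settled idle draw, peak draw under load, and the difference,
# which roughly attributes the extra draw to the graphics card. Sample readings
# below are made up purely for illustration.

def summarize_power(idle_watts, load_watts):
    idle = min(idle_watts)     # settled idle figure
    peak = max(load_watts)     # peak system draw during the 15-minute load test
    return idle, peak, peak - idle

if __name__ == "__main__":
    idle_samples = [148, 146, 145, 147]
    load_samples = [305, 318, 322, 319]
    idle, peak, delta = summarize_power(idle_samples, load_samples)
    print(f"idle: {idle} W, peak load: {peak} W, load-minus-idle delta: ~{delta} W")
```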

Settings

Lower = Better

 

The GTX 500 series is far more efficient than the GTX 400 series was, and the Gigabyte card had some of the lowest power usage numbers here. Its low overclocked consumption, however, was partly down to its relatively small overclock.

Conclusion:

The Gigabyte GTX 560 OC provided a solid gaming experience while remaining affordable. The whole time it was in use, the card stayed quiet; the 100mm fans were inaudible next to the case fans even at their full speed of roughly 2000 RPM. Overclocking results weren't groundbreaking, but the card did do well: the core was already factory overclocked 20 MHz to 830 MHz and ran stable at 910 MHz, while the memory overclocked from 1002 MHz to a stable 1125 MHz. Temperatures were some of the best recorded, with help from the WindForce heat sink! Power consumption numbers were among the lowest of the cards tested, thanks to NVIDIA's design and Gigabyte's Ultra Durable VGA components. I couldn't discern any choke squeal, so the Ultra Durable ferrite chokes definitely did their job. With some voltage modification, this card should easily find a bit more headroom. The Gigabyte Easy Boost software is handy for overclocking, fan speed control, and BIOS tools; users can back up and flash the BIOS with the push of a button!

The only real con is the card's overclocking capability relative to its cooling: there was plenty of thermal headroom, but the card was held back by the lack of voltage control. Overclocking is always a gamble, and this shouldn't be held against the card too much considering it did decently; it was designed for quiet stability, not hardcore overclocking. The results work out to roughly a 10% gain on the core over the factory clock (about 12% over the 810 MHz reference speed) and 12% on the memory.
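Worked out from the clocks reported earlier in the review:

core: (910 − 830) / 830 ≈ 9.6% over the factory clock, or (910 − 810) / 810 ≈ 12.3% over the 810 MHz reference; memory: (1125 − 1002) / 1002 ≈ 12.3%.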

The card performed very well, but with the considerably cheaper GTX 460 still in stock it will be interesting to see how it sells. Performance was great and everything worked perfectly, so if the price is right this card makes a great choice!

 

Pros:

 

Cons: