
[nextpage title=”Introduction”]

The GeForce GTX 560 Ti is the latest addition to NVIDIA's GeForce GTX 500 series, arriving with a very affordable USD 250 price tag and competing directly with the Radeon HD 6870. Which one is the better buy? Check it out.

We are reviewing the reference model for the GeForce GTX 560 Ti, provided by NVIDIA. When a new graphics processor is released, "manufacturers" (called "partners" by NVIDIA) typically don't actually manufacture video cards using the new chip; they buy the cards already assembled from NVIDIA and only add their own sticker, box, and product manual. Therefore, the video card you will see in this review is exactly the same one you will find on the market under several different brands. In order to differentiate themselves, partners may later release models with a different cooling solution and/or a factory overclock.

In the table below we compare the main specs of the video cards included in our review. They are all DirectX 11 parts.

Note that the effective memory clock of the GeForce GTX 560 Ti is actually 4,008 MHz; we rounded it to 4 GHz in the table below.

| Video Card | Core Clock | Shader Clock | Memory Clock (Real) | Memory Clock (Effective) | Memory Interface | Memory Transfer Rate | Memory | Shaders | Price |
|---|---|---|---|---|---|---|---|---|---|
| GeForce GTX 560 Ti | 822 MHz | 1,644 MHz | 2 GHz | 4 GHz | 256-bit | 128.3 GB/s | 1 GB GDDR5 | 384 | USD 250 |
| GeForce GTX 570 | 732 MHz | 1,464 MHz | 1.9 GHz | 3.8 GHz | 320-bit | 152 GB/s | 1.28 GB GDDR5 | 480 | USD 350 |
| Radeon HD 5870 | 850 MHz | 850 MHz | 2.4 GHz | 4.8 GHz | 256-bit | 153.6 GB/s | 1 GB GDDR5 | 1,600 | USD 270 – 290 |
| Radeon HD 6870 | 900 MHz | 900 MHz | 2.1 GHz | 4.2 GHz | 256-bit | 134.4 GB/s | 1 GB GDDR5 | 1,120 | USD 220 – 240 |
| Radeon HD 6950 | 800 MHz | 800 MHz | 2.5 GHz | 5 GHz | 256-bit | 160 GB/s | 2 GB GDDR5 | 1,408 | USD 300 |
| Radeon HD 6970 | 880 MHz | 880 MHz | 2.75 GHz | 5.5 GHz | 256-bit | 176 GB/s | 2 GB GDDR5 | 1,536 | USD 370 |

Prices were researched at Newegg.com on the day we published this review.
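
Since each card's memory transfer rate follows directly from its effective memory clock and bus width (bandwidth = effective clock x interface width ÷ 8), here is a minimal Python sketch, written by us for illustration rather than taken from any vendor tool, that reproduces the "Memory Transfer Rate" column:

```python
# Reproduce the "Memory Transfer Rate" column of the spec table:
# bandwidth (GB/s) = effective memory clock (MHz) x bus width (bits) / 8 / 1,000

cards = {
    # name: (effective memory clock in MHz, memory interface in bits)
    "GeForce GTX 560 Ti": (4008, 256),  # 4,008 MHz is the exact clock noted above
    "GeForce GTX 570": (3800, 320),
    "Radeon HD 6870": (4200, 256),
}

for name, (clock_mhz, bus_bits) in cards.items():
    bandwidth_gbs = clock_mhz * bus_bits / 8 / 1000  # bits -> bytes, MB/s -> GB/s
    print(f"{name}: {bandwidth_gbs:.1f} GB/s")

# Output matches the table:
# GeForce GTX 560 Ti: 128.3 GB/s
# GeForce GTX 570: 152.0 GB/s
# Radeon HD 6870: 134.4 GB/s
```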

You can compare the specs of these video cards with other video cards by taking a look at our AMD ATI Chips Comparison Table and NVIDIA Chips Comparison Table tutorials.   

Now let’s take an in-depth look at the NVIDIA reference model for the GeForce GTX 560 Ti.

[nextpage title=”The NVIDIA GeForce GTX 560 Ti”]

Below we have an overall look at the NVIDIA reference model for the GeForce GTX 560 Ti. It requires two six-pin auxiliary power connectors.

Figure 1: NVIDIA GeForce GTX 560 Ti

Figure 2: NVIDIA GeForce GTX 560 Ti

This video card has one mini HDMI and two DVI-D connectors.

Figure 3: Video connectors

[nextpage title=”The NVIDIA GeForce GTX 560 Ti (Cont’d)”]

In Figure 4, you can see the video card with its cooler removed and, in Figure 5, a close-up of the voltage regulator circuit.

Figure 4: Video card with the cooler removed

The voltage regulator circuit uses solid capacitors, ferrite-core coils (which give the regulator higher efficiency, since they have lower energy losses than iron-core coils), and low RDS(on) MOSFET transistors (which also mean higher efficiency).

Figure 5: Voltage regulator circuit

The GPU heatsink can be seen in Figures 6 and 7. It has a copper base, three 6 mm heatpipes, aluminum fins, and an 80 mm fan.

Figure 6: The GPU heatsink

Figure 7: The GPU heatsink

The NVIDIA GeForce GTX 560 Ti uses eight 1 Gbit GDDR5 chips, giving it 1 GB of video memory (1 Gbit x 8 = 1,024 MB = 1 GB). Each chip is connected to the GPU through a 32-bit data lane, giving the card its 256-bit memory interface (32 bits x 8 = 256).

The chips used are Samsung K4G10325FE-HC04 parts, which support clocks of up to 2.5 GHz (5 GHz DDR). Since on this video card the memory is accessed at 2 GHz (4 GHz DDR), there is still a huge 25% margin for you to increase the memory clock rate while keeping the chips within the maximum they support. Of course, you can always try to overclock the memory chips above their specs.

Figure 8: Memory chips
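
For readers who like to see the numbers worked out, below is a small Python sketch (our own illustration, not one of the tools used in the review) of the capacity, bus-width, and overclocking-headroom arithmetic described above:

```python
# Memory capacity, bus width, and overclocking headroom of the GTX 560 Ti,
# computed from the chip-level figures given above.

chips = 8
chip_density_gbit = 1      # each Samsung K4G10325FE-HC04 chip stores 1 Gbit
lane_width_bits = 32       # each chip talks to the GPU over a 32-bit data lane

total_memory_gb = chips * chip_density_gbit / 8   # 8 Gbit = 1 GB
bus_width_bits = chips * lane_width_bits

rated_clock_ghz = 2.5      # maximum the chips support (5 GHz DDR)
actual_clock_ghz = 2.0     # clock actually used on this card (4 GHz DDR)
headroom = (rated_clock_ghz - actual_clock_ghz) / actual_clock_ghz

print(f"{total_memory_gb:.0f} GB over a {bus_width_bits}-bit bus; "
      f"{headroom:.0%} overclocking headroom within spec")
# -> 1 GB over a 256-bit bus; 25% overclocking headroom within spec
```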

[nextpage title=”How We Tested”]

During our benchmarking sessions, we used the configuration listed below; the only variable between sessions was the video card being tested.

Hardware Configuration

Software Configuration

  • Windows 7 Ultimate 64-bit
  • Video resolution: 2560×1600 @ 60 Hz

Driver Versions

  • AMD/ATI video driver version: Catalyst 10.12 beta
  • NVIDIA video driver version: 263.09
  • NVIDIA video driver version: 266.58 (GeForce GTX 560 Ti)
  • Intel INF driver version: 9.1.2.1008

Software Used

Error Margin

We adopted a 3% error margin. Thus, differences below 3% cannot be considered relevant. In other words, products with a performance difference below 3% should be considered as having similar performance.
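
To make the rule concrete, here is a minimal Python sketch (a hypothetical helper, written just for this explanation) of how we apply the 3% margin when reading the tables that follow:

```python
# How the 3% error margin is applied when reading the result tables.

ERROR_MARGIN = 0.03

def compare(card_fps: float, baseline_fps: float) -> str:
    """Compare a card's result against the GeForce GTX 560 Ti baseline."""
    diff = (card_fps - baseline_fps) / baseline_fps
    if abs(diff) < ERROR_MARGIN:
        return "technical tie"
    direction = "faster" if diff > 0 else "slower"
    return f"{abs(diff):.0%} {direction}"

# Radeon HD 6870 (123.5 FPS) vs. GeForce GTX 560 Ti (125.4 FPS),
# taken from the Call of Duty 4 results at 1920x1200 later in this review:
print(compare(123.5, 125.4))  # -> "technical tie" (the gap is only about 1.5%)
```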

[nextpage title=”Call of Duty 4″]

Call of Duty 4 is a DirectX 9 game implementing high-dynamic-range (HDR) lighting and its own physics engine, which is used to calculate how objects interact. For example, when you shoot an object, what exactly happens when the bullet hits it? Will it break? Will it move? Will the bullet bounce back? This gives the user a more realistic experience.

To get accurate results, we had to disable the game's 80 FPS limit. To do this, enter the command "/seta com_maxfps 1000" (without the quotes) into the console (` key). The limit can be set to any number greater than 200.

We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, maxing out all image quality controls (i.e., everything was set to the maximum values in the Graphics and Texture menus). We used the game's internal benchmarking feature, running a demo provided by NVIDIA called "wetwork." We are making this demo available for download here if you want to run your own benchmarks. We ran the demo five times, and the results below are the average number of frames per second (FPS) achieved by each video card.

| Call of Duty 4 – Maximum | 1680×1050 | Difference |
|---|---|---|
| Radeon HD 6970 | 177.4 | 20% |
| GeForce GTX 570 | 169.0 | 14% |
| Radeon HD 6950 | 155.6 | 5% |
| Radeon HD 5870 | 150.3 | 1% |
| GeForce GTX 560 Ti | 148.2 | – |
| Radeon HD 6870 | 142.4 | -4% |

| Call of Duty 4 – Maximum | 1920×1200 | Difference |
|---|---|---|
| Radeon HD 6970 | 162.3 | 29% |
| GeForce GTX 570 | 144.6 | 15% |
| Radeon HD 5870 | 130.8 | 4% |
| Radeon HD 6950 | 130.4 | 4% |
| GeForce GTX 560 Ti | 125.4 | – |
| Radeon HD 6870 | 123.5 | -2% |

| Call of Duty 4 – Maximum | 2560×1600 | Difference |
|---|---|---|
| Radeon HD 6970 | 108.4 | 27% |
| GeForce GTX 570 | 100.4 | 18% |
| Radeon HD 6950 | 92.2 | 8% |
| Radeon HD 5870 | 91.8 | 7% |
| Radeon HD 6870 | 87.3 | 2% |
| GeForce GTX 560 Ti | 85.4 | – |

[nextpage title=”Crysis Warhead”]

Crysis Warhead is a DirectX 10 game based on the same engine as the original Crysis, but optimized (it runs under DirectX 9.0c when installed on Windows XP).

We used the HardwareOC Crysis Warhead Benchmark Tool to collect the data for this test. We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, all at very high image quality (but with no anti-aliasing and no anisotropic filtering) and using the Airfield demo. The results below are the number of frames per second achieved by each video card.

| Crysis Warhead – Very High | 1680×1050 | Difference |
|---|---|---|
| GeForce GTX 570 | 44 | 16% |
| Radeon HD 6970 | 39 | 3% |
| GeForce GTX 560 Ti | 38 | – |
| Radeon HD 6950 | 36 | -5% |
| Radeon HD 5870 | 34 | -11% |
| Radeon HD 6870 | 32 | -16% |

| Crysis Warhead – Very High | 1920×1200 | Difference |
|---|---|---|
| GeForce GTX 570 | 38 | 12% |
| Radeon HD 6970 | 34 | 0% |
| GeForce GTX 560 Ti | 31 | – |
| Radeon HD 6950 | 31 | 0% |
| Radeon HD 5870 | 30 | -3% |
| Radeon HD 6870 | 27 | -13% |

| Crysis Warhead – Very High | 2560×1600 | Difference |
|---|---|---|
| Radeon HD 6970 | 24 | 20% |
| GeForce GTX 570 | 24 | 20% |
| Radeon HD 6950 | 21 | 5% |
| Radeon HD 5870 | 21 | 5% |
| GeForce GTX 560 Ti | 20 | – |
| Radeon HD 6870 | 18 | -10% |

[nextpage title=”Far Cry 2″]

Far Cry 2 is based on an entirely new game engine called Dunia, which runs under DirectX 10 when played on Windows Vista with a DirectX 10-compatible video card.

We used the benchmarking utility that comes with this game, setting image quality to Ultra High (with 8x anti-aliasing) and running the "Ranch Long" demo three times. The results below are expressed in frames per second and are the arithmetic average of the three results collected.

| Far Cry 2 – Ultra High – AAx8 | 1680×1050 | Difference |
|---|---|---|
| GeForce GTX 570 | 99.1 | 17% |
| GeForce GTX 560 Ti | 84.6 | – |
| Radeon HD 6970 | 81.9 | -3% |
| Radeon HD 6950 | 78.4 | -7% |
| Radeon HD 5870 | 74.4 | -12% |
| Radeon HD 6870 | 70.6 | -17% |

| Far Cry 2 – Ultra High – AAx8 | 1920×1200 | Difference |
|---|---|---|
| GeForce GTX 570 | 84.7 | 8% |
| GeForce GTX 560 Ti | 78.3 | – |
| Radeon HD 6970 | 74.3 | -5% |
| Radeon HD 6950 | 70.7 | -10% |
| Radeon HD 6870 | 70.6 | -10% |
| Radeon HD 5870 | 65.6 | -16% |

| Far Cry 2 – Ultra High – AAx8 | 2560×1600 | Difference |
|---|---|---|
| Radeon HD 6970 | 55.4 | 8% |
| GeForce GTX 570 | 55.2 | 8% |
| GeForce GTX 560 Ti | 51.3 | – |
| Radeon HD 6950 | 50.4 | -2% |
| Radeon HD 5870 | 44.2 | -14% |
| Radeon HD 6870 | 42.4 | -17% |

[nextpage title=”Aliens vs. Predator”]

Aliens vs. Predator is a DirectX 11 game that makes full use of tessellation and advanced shadow rendering. We used the Aliens vs. Predator Benchmark Tool developed by Rebellion. This program reads its configuration from a text file (our configuration files can be found here). We ran this program at 1680×1050, 1920×1200, and 2560×1600 resolutions, with very high settings, 16x anisotropic filtering and 4x anti-aliasing.

| Aliens vs. Predator – Very High – AAx4, AFx16 | 1680×1050 | Difference |
|---|---|---|
| Radeon HD 6970 | 47.9 | 29% |
| GeForce GTX 570 | 43.3 | 17% |
| Radeon HD 6950 | 42.1 | 13% |
| Radeon HD 5870 | 37.7 | 2% |
| GeForce GTX 560 Ti | 37.1 | – |
| Radeon HD 6870 | 31.4 | -15% |

| Aliens vs. Predator – Very High – AAx4, AFx16 | 1920×1200 | Difference |
|---|---|---|
| Radeon HD 6970 | 39.6 | 31% |
| GeForce GTX 570 | 35.2 | 17% |
| Radeon HD 6950 | 35.1 | 16% |
| Radeon HD 5870 | 30.8 | 2% |
| GeForce GTX 560 Ti | 30.2 | – |
| Radeon HD 6870 | 25.6 | -15% |

| Aliens vs. Predator – Very High – AAx4, AFx16 | 2560×1600 | Difference |
|---|---|---|
| Radeon HD 6970 | 24.6 | 33% |
| GeForce GTX 570 | 22.0 | 19% |
| Radeon HD 6950 | 21.7 | 17% |
| Radeon HD 5870 | 19.0 | 3% |
| GeForce GTX 560 Ti | 18.5 | – |
| Radeon HD 6870 | 15.8 | -15% |

[nextpage title=”Lost Planet 2″]

Lost Planet 2 is a game that uses several DirectX 11 features, like tessellation (to round out the edges of polygonal models), displacement maps (added to the tessellated mesh to bring out fine-grained details), DirectCompute soft-body simulation (to make the "boss" monsters more realistic), and DirectCompute wave simulation (to make the physics of water surfaces more realistic: when you move, or when gunshots and explosions hit the water, it moves accordingly).

We tested the video cards using Lost Planet 2's internal benchmarking feature, choosing "Benchmark A" (we know "Benchmark B" is the one recommended for reviewing video cards, but, at least with us, its results were inconsistent). We set graphics at "high," anti-aliasing at "4x," and DX11 features at "full." The results below are the number of frames per second generated by each video card.

| Lost Planet 2 – High – AAx4 | 1680×1050 | Difference |
|---|---|---|
| GeForce GTX 570 | 61.30 | 27% |
| GeForce GTX 560 Ti | 48.10 | – |
| Radeon HD 6970 | 45.20 | -6% |
| Radeon HD 6950 | 40.20 | -16% |
| Radeon HD 6870 | 35.70 | -26% |
| Radeon HD 5870 | 31.10 | -35% |

| Lost Planet 2 – High – AAx4 | 1920×1200 | Difference |
|---|---|---|
| GeForce GTX 570 | 54.20 | 29% |
| GeForce GTX 560 Ti | 42.00 | – |
| Radeon HD 6970 | 41.70 | -1% |
| Radeon HD 6950 | 33.60 | -20% |
| Radeon HD 6870 | 30.60 | -27% |
| Radeon HD 5870 | 27.80 | -34% |

| Lost Planet 2 – High – AAx4 | 2560×1600 | Difference |
|---|---|---|
| Radeon HD 6970 | 37.85 | 44% |
| GeForce GTX 570 | 35.50 | 35% |
| Radeon HD 6950 | 27.40 | 5% |
| GeForce GTX 560 Ti | 26.20 | – |
| Radeon HD 6870 | 23.90 | -9% |
| Radeon HD 5870 | 23.80 | -9% |

[nextpage title=”3DMark 11 Professional”]

3DMark 11 Professional measures Shader Model 5.0 (i.e., DirectX 11) performance. We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, running only the four graphics tests and skipping the other available tests. We used two image quality profiles for each resolution, "Performance" and "Extreme," both at their default settings. The results compared below are the "GPU Score" achieved by each video card.

| 3DMark 11 – Performance | 1680×1050 | Difference |
|---|---|---|
| Radeon HD 6970 | 3424 | 27% |
| GeForce GTX 570 | 3285 | 22% |
| Radeon HD 6950 | 3023 | 12% |
| Radeon HD 5870 | 2814 | 5% |
| Radeon HD 6870 | 2745 | 2% |
| GeForce GTX 560 Ti | 2690 | – |

| 3DMark 11 – Performance | 1920×1200 | Difference |
|---|---|---|
| Radeon HD 6970 | 2641 | 30% |
| GeForce GTX 570 | 2466 | 21% |
| Radeon HD 6950 | 2334 | 15% |
| Radeon HD 5870 | 2208 | 9% |
| Radeon HD 6870 | 2148 | 6% |
| GeForce GTX 560 Ti | 2034 | – |

| 3DMark 11 – Performance | 2560×1600 | Difference |
|---|---|---|
| Radeon HD 6970 | 1573 | 34% |
| GeForce GTX 570 | 1414 | 20% |
| Radeon HD 6950 | 1383 | 18% |
| Radeon HD 5870 | 1352 | 15% |
| Radeon HD 6870 | 1287 | 9% |
| GeForce GTX 560 Ti | 1176 | – |

| 3DMark 11 – Extreme | 1680×1050 | Difference |
|---|---|---|
| Radeon HD 6970 | 2071 | 28% |
| GeForce GTX 570 | 1932 | 19% |
| Radeon HD 6950 | 1765 | 9% |
| Radeon HD 5870 | 1702 | 5% |
| Radeon HD 6870 | 1668 | 3% |
| GeForce GTX 560 Ti | 1624 | – |

| 3DMark 11 – Extreme | 1920×1200 | Difference |
|---|---|---|
| Radeon HD 6970 | 1611 | 28% |
| GeForce GTX 570 | 1507 | 19% |
| Radeon HD 6950 | 1415 | 12% |
| Radeon HD 5870 | 1380 | 9% |
| Radeon HD 6870 | 1314 | 4% |
| GeForce GTX 560 Ti | 1263 | – |

| 3DMark 11 – Extreme | 2560×1600 | Difference |
|---|---|---|
| Radeon HD 6970 | 1005 | 32% |
| GeForce GTX 570 | 910 | 20% |
| Radeon HD 6950 | 882 | 16% |
| Radeon HD 5870 | 875 | 15% |
| Radeon HD 6870 | 824 | 9% |
| GeForce GTX 560 Ti | 759 | – |

[nextpage title=”Conclusions”]

In most scenarios the new GeForce GTX 560 Ti crushed its main competitor, the Radeon HD 6870: it was between 10% and 37% faster on Lost Planet 2, between 17% and 18% faster on Aliens vs. Predator, between 11% and 21% faster on Far Cry 2, and between 11% and 19% faster on Crysis Warhead. On Call of Duty 4 we saw a technical tie, with the GeForce GTX 560 Ti being 4% faster at 1680×1050 but both cards achieving the same performance at 1920×1200 and 2560×1600. On 3DMark 11 at 1680×1050 (both Performance and Extreme profiles) both cards achieved the same performance level, but at higher resolutions the Radeon HD 6870 was up to 9% faster.
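
A note on reading those percentages: the result tables express each card relative to the GeForce GTX 560 Ti, while this conclusion quotes how much faster the GTX 560 Ti is relative to the Radeon HD 6870, so the figures differ (e.g., the 6870 being 26% slower means the 560 Ti is about 35% faster). Here is a short Python sketch of the conversion, using the Lost Planet 2 scores from the tables:

```python
# The result tables list each card relative to the GTX 560 Ti; the
# conclusion inverts that and asks how much faster the GTX 560 Ti is
# than the Radeon HD 6870. Lost Planet 2 scores from the tables above:

lost_planet_2 = {  # resolution: (GTX 560 Ti FPS, Radeon HD 6870 FPS)
    "1680x1050": (48.10, 35.70),
    "1920x1200": (42.00, 30.60),
    "2560x1600": (26.20, 23.90),
}

for resolution, (gtx_fps, radeon_fps) in lost_planet_2.items():
    advantage = (gtx_fps - radeon_fps) / radeon_fps
    print(f"{resolution}: GeForce GTX 560 Ti is {advantage:.0%} faster")

# -> 1680x1050: GeForce GTX 560 Ti is 35% faster
# -> 1920x1200: GeForce GTX 560 Ti is 37% faster
# -> 2560x1600: GeForce GTX 560 Ti is 10% faster
```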

It is true that you can find the Radeon HD 6870 for between USD 10 and USD 30 less than the GeForce GTX 560 Ti, but based on our results we believe that paying a little bit more for the new GeForce GTX 560 Ti is well worth it.