The GeForce GTX 280, which is being released today, is NVIDIA's new high-end graphics processing unit. In this review we compare its performance with that of the other high-end video cards we had available: the GeForce 9800 GX2, the GeForce 9800 GTX, the Radeon HD 3870 X2, and the Radeon HD 3870. Is this new video card really the fastest on the market today? Check it out.
We posted a separate article explaining in detail the new architecture used in the GeForce GTX 200 series, so please read that article if you want to learn what is under the hood.
Let’s first take an overall look at the physical aspects of the GeForce GTX 280.
The GeForce GTX 280 requires one 6-pin and one 8-pin auxiliary power connector, as you can see in Figure 4. The maximum power consumption of this beast is 236 W.
[nextpage title=”Introduction (Cont’d)”]
As you can see, the new GeForce GTX 280 has an external look similar to that of the GeForce 9800 GX2 and the GeForce 9800 GTX; see Figures 5 and 6.
We removed the video card's cooler to take a look at the board; see Figures 7 and 8.
As you can see in Figure 7, the graphics processing unit (GPU) is really big. We show a close-up in Figure 9, where you can see its codename, “G200-300.”
In Figure 10, you can see the massive cooler used by this video card. It uses a big copper base connected to the cooler’s aluminum fins by copper heat-pipes.
[nextpage title=”More Details”]
To make it easier to compare the new GeForce GTX 280 with the other video cards included in this review, we compiled the table below with the main specs of these cards. If you want to compare the specs of the new GTX 280 to any video card not included in the table below, just take a look at our NVIDIA Chips Comparison Table and our AMD/ATI Chips Comparison Table.
Just for the record, the GeForce GTX 280 reference model uses 16 Hynix H5RS5223CFR N2C GDDR3 chips, which officially support clock frequencies of up to 1,200 MHz. On this video card they run at 1,107 MHz, so there is a tight 8.4% headroom for overclocking the memories while still keeping them within their specs. Of course, you can always try pushing the memories above their specs, but there is no guarantee the overclocking will work.
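The headroom figure above is just the gap between the chips' rated clock and the clock they actually run at; a quick sketch of the arithmetic, using the values from the review:

```python
# In-spec overclocking headroom of the GTX 280's GDDR3 memory:
# how far the stock clock can rise before exceeding the chips' rating.
rated_clock = 1200  # MHz, official rating of the Hynix H5RS5223CFR N2C chips
stock_clock = 1107  # MHz, memory clock on the reference board

headroom = (rated_clock / stock_clock - 1) * 100
print(f"In-spec overclocking headroom: {headroom:.1f}%")  # → 8.4%
```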
|GPU||Core Clock||Shader Clock||Stream Processors||Memory Clock||Memory Interface||Memory Transfer Rate||Memory||Price|
|GeForce GTX 280||602 MHz||1,296 MHz||240||1,107 MHz||512-bit||141.7 GB/s||1 GB GDDR3||USD 649|
|GeForce 9800 GX2||600 MHz||1,500 MHz||128||1,000 MHz||256-bit||64 GB/s||1 GB GDDR3||USD 470 – 550|
|GeForce 9800 GTX||675 MHz||1,688 MHz||128||1,100 MHz||256-bit||70.4 GB/s||512 MB GDDR3||USD 270 – 355|
|Sapphire Atomic HD 3870 X2||857 MHz||857 MHz||320||927 MHz||256-bit||59.33 GB/s||1 GB GDDR3||–|
|Radeon HD 3870||776 MHz||776 MHz||320||1,125 MHz||256-bit||72 GB/s||512 MB GDDR4||USD 150 – 200|
Some important observations regarding this table:
- All these video cards are DirectX 10 (Shader 4.0) parts.
- The memory clocks listed are the real memory clocks. Memory clocks are often advertised as double the figures presented, numbers known as the “DDR clock.”
- The GeForce 9800 GX2 and the Radeon HD 3870 X2 have two GPUs each. The numbers in the table represent only one of the chips.
- All video cards included in our review were running at the chip manufacturer’s default clock configuration (i.e., no overclocking), except the Sapphire Atomic HD 3870 X2. The official core clock for the Radeon HD 3870 X2 is 825 MHz, while the official memory clock is 900 MHz, so this card was a little bit overclocked. We couldn’t reduce these clocks to their reference values, and since we had no other Radeon HD 3870 X2 available, we included this video card anyway. That is why this is the only video card for which we are disclosing the vendor and exact model.
- Prices were researched at Newegg.com one day before this review was published. The price for the GeForce GTX 280 is the manufacturer’s suggested retail price (MSRP) set by NVIDIA. We couldn’t find the Sapphire Atomic HD 3870 X2 for sale; this model will be more expensive than cards from other vendors based on the same GPU because it features water cooling. Just to give you an idea, prices for the regular Radeon HD 3870 X2 range between USD 315 and USD 405.
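As a side note, the “Memory Transfer Rate” column in the table follows directly from the memory clock and bus width; a minimal sketch of the arithmetic (the factor of 2 is the double transfer per clock cycle that gives GDDR3/GDDR4 their “DDR clock”):

```python
# Peak memory bandwidth = real clock x 2 (two transfers per cycle)
# x bus width in bytes, converted from MB/s to GB/s.
def bandwidth_gbs(real_clock_mhz: float, bus_width_bits: int) -> float:
    return real_clock_mhz * 2 * (bus_width_bits / 8) / 1000

print(f"GeForce GTX 280:  {bandwidth_gbs(1107, 512):.1f} GB/s")  # → 141.7 GB/s
print(f"GeForce 9800 GTX: {bandwidth_gbs(1100, 256):.1f} GB/s")  # → 70.4 GB/s
```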
We faced an overheating situation during our tests, and we’d like to take some time now to explain exactly what happened. During our Quake 4 benchmarking the system was very unstable. The first thing that came to mind was that the card was overheating, and touching it we could tell it was hotter than we expected. Inspecting the problem further, it turned out the overheating wasn’t being caused by the video card itself, but by the fan attached on top of the north bridge chip on our nForce 790i motherboard, which was blowing hot air onto the video card. This is clearly a design flaw of this particular motherboard. We solved the issue by removing the north bridge fan and adding a 120 mm fan on top of our system. So if you buy this card, make sure your case is well ventilated and that the north bridge cooler isn’t blowing hot air onto your video card. You can see the problem in Figure 11.
[nextpage title=”How We Tested”]
During our benchmarking sessions, we used the configuration listed below. Between our benchmarking sessions the only variable was the video card being tested.
- CPU: Core 2 Extreme QX9770 (3.2 GHz, 1,600 MHz FSB, 12 MB L2 memory cache).
- Motherboard: EVGA nForce 790i Ultra SLI (P05 BIOS)
- Memory: Crucial Ballistix PC3-16000 2 GB kit (BL2KIT12864BE2009), running at 2,000 MHz with 9-9-9-28 timings.
- Hard disk drive: Western Digital VelociRaptor WD3000GLFS (300 GB, SATA-300, 10,000 rpm, 16 MB cache).
- Video monitor: Samsung SyncMaster 305T (30” LCD, 2560×1600).
- Power supply: OCZ EliteXStream 1,000 W.
- CPU Cooler: Thermaltake TMG i1
- Optical Drive: LG GSA-H54N
- Desktop video resolution: 2560×1600 @ 60 Hz
- Windows Vista Ultimate 32-bit
- Service Pack 1
- nForce driver version: 15.17
- AMD/ATI video driver version: Catalyst 8.5
- NVIDIA video driver version: 175.16
- NVIDIA video driver version: 177.34 (GeForce GTX 280)
- 3DMark06 Professional 1.1.0 + October 2007 Hotfix
- 3DMark Vantage Professional 1.0.1
- Call of Duty 4 – Patch 1.6
- Crysis – Patch 1.2.1 + HardwareOC Crysis Benchmark Tool
- Half-Life 2: Episode Two – Patch June 9th 2008 + HardwareOC Half-Life 2 Episode Two Benchmark Tool
- Quake 4 – Patch 1.4.2
Resolutions and Image Quality Settings
Since we were comparing very high-end video cards, we ran all our tests at three 16:10 widescreen resolutions: 1680×1050, 1920×1200, and 2560×1600. Whenever possible we ran the programs and games under two scenarios at each resolution, first with low image quality settings and then with the image quality settings maxed out. The exact configuration used will be described together with the results of each individual test.
We adopted a 3% error margin; thus, differences below 3% cannot be considered relevant. In other words, products with a performance difference below 3% should be considered as having similar performance.
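The 3% rule can be expressed as a small helper (compare is our own hypothetical name, not part of any benchmarking tool):

```python
# Apply the review's 3% error margin: scores closer than 3% are a tie.
def compare(score_a: float, score_b: float, margin: float = 0.03) -> str:
    diff = abs(score_a - score_b) / min(score_a, score_b)
    if diff < margin:
        return "similar performance"
    return "A is faster" if score_a > score_b else "B is faster"

print(compare(106.2, 105.3))  # 0.85% apart → "similar performance"
print(compare(16260, 14904))  # 9.1% apart  → "A is faster"
```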
[nextpage title=”3DMark06 Professional”]
3DMark06 measures Shader 3.0 (i.e., DirectX 9.0c) performance. We ran this software at three 16:10 widescreen resolutions (1680×1050, 1920×1200, and 2560×1600), first with no image quality enhancements enabled (results we call “low” in the charts and tables below), and then with 4x anti-aliasing and 16x anisotropic filtering enabled (“high”). See the results below.
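As far as we can tell from the tables, the “Difference” column compares each card against the GeForce GTX 280 baseline and is always quoted as a positive percentage; a sketch of that reading (difference is a hypothetical helper, not something from 3DMark itself):

```python
# Percentage gap to the GTX 280 baseline score, always positive:
# faster cards are quoted relative to the baseline, slower cards inverted.
def difference(score: float, baseline: float) -> float:
    if score >= baseline:
        return (score / baseline - 1) * 100
    return (baseline / score - 1) * 100

gtx280 = 14904  # 3DMark06, 1680x1050, "low"
print(f"{difference(16260, gtx280):.2f}%")  # HD 3870 X2 → 9.10%
print(f"{difference(12759, gtx280):.2f}%")  # 9800 GTX  → 16.81%
```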
|3DMark06 Professional 1.1.0 – 1680×1050 – Low||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||16260||9.10%|
|GeForce 9800 GX2||15623||4.82%|
|GeForce GTX 280||14904|
|GeForce 9800 GTX||12759||16.81%|
|Radeon HD 3870||10694||39.37%|
|3DMark06 Professional 1.1.0 – 1920×1200 – Low||Score||Difference|
|GeForce 9800 GX2||15547||9.37%|
|Sapphire Atomic Radeon HD 3870 X2||15489||8.96%|
|GeForce GTX 280||14215|
|GeForce 9800 GTX||11631||22.22%|
|Radeon HD 3870||9454||50.36%|
|3DMark06 Professional 1.1.0 – 2560×1600 – Low||Score||Difference|
|GeForce 9800 GX2||13015||10.62%|
|Sapphire Atomic Radeon HD 3870 X2||12315||4.67%|
|GeForce GTX 280||11766|
|GeForce 9800 GTX||8743||34.58%|
|Radeon HD 3870||6823||72.45%|
|3DMark06 Professional 1.1.0 – 1680×1050 – High||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||16260||33.75%|
|GeForce 9800 GX2||13900||14.34%|
|GeForce GTX 280||12157|
|GeForce 9800 GTX||8981||35.36%|
|Radeon HD 3870||6915||75.81%|
|3DMark06 Professional 1.1.0 – 1920×1200 – High||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||15489||40.92%|
|GeForce 9800 GX2||12213||11.12%|
|GeForce GTX 280||10991|
|GeForce 9800 GTX||7811||40.71%|
|Radeon HD 3870||6114||79.77%|
|3DMark06 Professional 1.1.0 – 2560×1600 – High||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||12315||41.49%|
|GeForce 9800 GX2||9829||12.93%|
|GeForce GTX 280||8704|
|GeForce 9800 GTX||5774||50.74%|
|Radeon HD 3870||4319||101.53%|
[nextpage title=”3DMark Vantage Professional”]3DMark Vantage is the latest addition to the 3DMark series, measuring Shader 4.0 (i.e., DirectX 10) performance and supporting PhysX, a programming interface developed by Ageia (now part of NVIDIA) to transfer physics calculations from the system CPU to the video card GPU in order to increase performance. Mechanical physics is the basis for calculations about the interaction of objects. For example, if you shoot, what exactly will happen to the object when the bullet hits it? Will it break? Will it move? Will the bullet bounce back?
We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600. First we used the “Performance” profile, and then we used the “Extreme” profile (basically enabling anti-aliasing at 4x, anisotropic filtering at 16x, and putting all detail settings at their maximum or “extreme” value). The combination of the 2560×1600 resolution with the extreme settings didn’t produce reliable results according to the program, so we aren’t going to include them here. The results being compared are the “GPU Scores” achieved by each video card.
|3DMark Vantage Professional 1.0.1 – 1680×1050 – Performance||Score||Difference|
|GeForce GTX 280||7695|
|GeForce 9800 GX2||6990||10.09%|
|Sapphire Atomic Radeon HD 3870 X2||5651||36.17%|
|GeForce 9800 GTX||3805||102.23%|
|Radeon HD 3870||2977||158.48%|
|3DMark Vantage Professional 1.0.1 – 1920×1200 – Performance||Score||Difference|
|GeForce GTX 280||6106|
|GeForce 9800 GX2||5379||13.52%|
|Sapphire Atomic Radeon HD 3870 X2||4336||40.82%|
|GeForce 9800 GTX||2891||111.21%|
|Radeon HD 3870||2269||169.11%|
|3DMark Vantage Professional 1.0.1 – 2560×1600 – Performance||Score||Difference|
|GeForce GTX 280||3549|
|GeForce 9800 GX2||2910||21.96%|
|Sapphire Atomic Radeon HD 3870 X2||2382||48.99%|
|GeForce 9800 GTX||1557||127.94%|
|Radeon HD 3870||1244||185.29%|
|3DMark Vantage Professional 1.0.1 – 1680×1050 – Extreme||Score||Difference|
|GeForce GTX 280||6005|
|GeForce 9800 GX2||4858||23.61%|
|Sapphire Atomic Radeon HD 3870 X2||3567||68.35%|
|GeForce 9800 GTX||2703||122.16%|
|Radeon HD 3870||1855||223.72%|
|3DMark Vantage Professional 1.0.1 – 1920×1200 – Extreme||Score||Difference|
|GeForce GTX 280||4732|
|GeForce 9800 GX2||3508||34.89%|
|Sapphire Atomic Radeon HD 3870 X2||2669||77.29%|
|GeForce 9800 GTX||2038||132.19%|
|Radeon HD 3870||1439||228.84%|
[nextpage title=”Call of Duty 4″]
Call of Duty 4 is a DirectX 9 game implementing high-dynamic range (HDR) and its own physics engine, which is used to calculate how objects interact. For example, if you shoot, what exactly will happen to the object when the bullet hits it? Will it break? Will it move? Will the bullet bounce back? It gives a more realistic experience to the user.
We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, maxing out all image quality controls (i.e., everything was set to the maximum values in the Graphics and Texture menus). We used the game’s internal benchmarking feature, running a demo provided by NVIDIA called “wetwork.” We are making this demo available for download here if you want to run your own benchmarks. The game was updated to version 1.6.
|Call of Duty 4 – 1680×1050 – Maximum||Score||Difference|
|GeForce 9800 GX2||106.2||0.85%|
|GeForce GTX 280||105.3|
|Sapphire Atomic Radeon HD 3870 X2||75.7||39.10%|
|GeForce 9800 GTX||69.1||52.39%|
|Radeon HD 3870||43.0||144.88%|
|Call of Duty 4 – 1920×1200 – Maximum||Score||Difference|
|GeForce 9800 GX2||94.5||3.05%|
|GeForce GTX 280||91.7|
|Sapphire Atomic Radeon HD 3870 X2||61.3||49.59%|
|GeForce 9800 GTX||57.7||58.93%|
|Radeon HD 3870||35.4||159.04%|
|Call of Duty 4 – 2560×1600 – Maximum||Score||Difference|
|GeForce 9800 GX2||64.8||0%|
|GeForce GTX 280||64.8|
|Sapphire Atomic Radeon HD 3870 X2||40.6||59.61%|
|GeForce 9800 GTX||38.3||69.19%|
|Radeon HD 3870||22.4||189.29%|
[nextpage title=”Crysis”]
Crysis is a very heavy DirectX 10 game. We updated it to version 1.2.1 and used the HOC Crysis Benchmarking Utility to help us collect data. Since we don’t think the default demo based on the island map stresses the video card the way we want, we used the HOC core demo that comes with the abovementioned utility. We ran this demo at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, first with image quality set to “low” and then with image quality set to “high.” Since all video cards achieved fewer than 10 frames per second at 2560×1600 with image details set to “high,” we are not including this test, as the results aren’t reliable. We ran each test twice and discarded the first result, because the first run usually achieves a lower score than subsequent runs, since the game loses time loading files. The results are below, in frames per second (FPS).
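The run protocol described above (two runs, keep only the warmed-up one) can be sketched as follows; run_benchmark here is a hypothetical stand-in for launching the HOC demo and reading back the FPS result:

```python
# Run a benchmark several times and keep only the last result,
# discarding warm-up runs slowed down by file loading.
def measure(run_benchmark, runs: int = 2) -> float:
    results = [run_benchmark() for _ in range(runs)]
    return results[-1]

# Simulated runs: the first is slower because the game loads files.
fake_runs = iter([78, 95])
print(measure(lambda: next(fake_runs)))  # → 95
```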
|Crysis 1.2.1 – 1680×1050 – Low||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||125||0%|
|GeForce GTX 280||125|
|GeForce 9800 GTX||84||48.81%|
|GeForce 9800 GX2||75||66.67%|
|Radeon HD 3870||71||76.06%|
|Crysis 1.2.1 – 1920×1200 – Low||Score||Difference|
|GeForce GTX 280||115|
|Sapphire Atomic Radeon HD 3870 X2||108||6.48%|
|GeForce 9800 GTX||69||66.67%|
|GeForce 9800 GX2||63||82.54%|
|Radeon HD 3870||58||98.28%|
|Crysis 1.2.1 – 2560×1600 – Low||Score||Difference|
|GeForce GTX 280||95|
|Sapphire Atomic Radeon HD 3870 X2||71||33.80%|
|GeForce 9800 GTX||44||115.91%|
|GeForce 9800 GX2||42||126.19%|
|Radeon HD 3870||35||171.43%|
|Crysis 1.2.1 – 1680×1050 – High||Score||Difference|
|GeForce GTX 280||42|
|GeForce 9800 GTX||29||44.83%|
|Sapphire Atomic Radeon HD 3870 X2||26||61.54%|
|GeForce 9800 GX2||25||68.00%|
|Radeon HD 3870||19||121.05%|
|Crysis 1.2.1 – 1920×1200 – High||Score||Difference|
|GeForce GTX 280||34|
|GeForce 9800 GTX||22||54.55%|
|GeForce 9800 GX2||21||61.90%|
|Sapphire Atomic Radeon HD 3870 X2||20||70.00%|
|Radeon HD 3870||16||112.50%|
[nextpage title=”Half-Life 2: Episode Two”]
Half-Life 2 is a popular franchise, and we benchmarked the video cards using Episode Two with the aid of the HOC Half-Life 2 Episode Two benchmarking utility, running the “HOC Demo 1” demo provided by this program. We ran the game at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, under two scenarios: first with quality set to maximum, bilinear filtering, and anti-aliasing set to 0x, a configuration we call “low” in the charts and tables below; then maxing out the image quality settings, enabling 16x anisotropic filtering and 16xQ CSAA anti-aliasing, which we call “high.” We updated the game with the June 9th, 2008 patch.
|Half-Life 2: Episode Two – 1680×1050 – Low||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||160.4||2.62%|
|GeForce GTX 280||156.3|
|GeForce 9800 GTX||153.8||1.63%|
|Radeon HD 3870||145.7||7.28%|
|GeForce 9800 GX2||136.8||14.25%|
|Half-Life 2: Episode Two – 1920×1200 – Low||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||156.7||0.26%|
|GeForce GTX 280||156.3|
|GeForce 9800 GTX||146.9||6.40%|
|GeForce 9800 GX2||135.2||15.61%|
|Radeon HD 3870||120.1||30.14%|
|Half-Life 2: Episode Two – 2560×1600 – Low||Score||Difference|
|GeForce GTX 280||145.1|
|GeForce 9800 GX2||130.6||11.10%|
|Sapphire Atomic Radeon HD 3870 X2||129.7||11.87%|
|GeForce 9800 GTX||107.9||34.48%|
|Radeon HD 3870||72.8||99.31%|
|Half-Life 2: Episode Two – 1680×1050 – High||Score||Difference|
|GeForce 9800 GTX||137.9||54.42%|
|Sapphire Atomic Radeon HD 3870 X2||126.1||41.21%|
|GeForce 9800 GX2||125.4||40.43%|
|GeForce GTX 280||89.3|
|Radeon HD 3870||68.3||30.7%|
|Half-Life 2: Episode Two – 1920×1200 – High||Score||Difference|
|GeForce 9800 GTX||116.3||65.43%|
|GeForce 9800 GX2||111.1||58.04%|
|Sapphire Atomic Radeon HD 3870 X2||106.5||51.49%|
|GeForce GTX 280||70.3|
|Radeon HD 3870||56.8||23.77%|
|Half-Life 2: Episode Two – 2560×1600 – High||Score||Difference|
|GeForce 9800 GTX||71.3||100.85%|
|Sapphire Atomic Radeon HD 3870 X2||50.6||42.54%|
|GeForce 9800 GX2||37.5||5.63%|
|GeForce GTX 280||35.5|
|Radeon HD 3870||34.9||1.72%|
[nextpage title=”Quake 4″]
We upgraded Quake 4 to version 1.4.2 and ran its multiplayer demo id_perftest with the SMP option enabled (which allows Quake 4 to recognize and use more than one CPU) at the same three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, first with image quality settings configured at “low” and then at “ultra.” You can check the results below, given in frames per second.
|Quake 4 – 1680×1050 – Low||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||285.30||6.14%|
|GeForce GTX 280||268.80|
|Radeon HD 3870||227.75||18.02%|
|GeForce 9800 GTX||225.52||19.19%|
|GeForce 9800 GX2||220.48||21.92%|
|Quake 4 – 1920×1200 – Low||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||266.23||12.85%|
|GeForce GTX 280||235.92|
|Radeon HD 3870||188.40||25.22%|
|GeForce 9800 GX2||174.06||35.54%|
|GeForce 9800 GTX||158.87||48.50%|
|Quake 4 – 2560×1600 – Low||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||197.82||17.18%|
|GeForce GTX 280||168.81|
|Radeon HD 3870||116.01||45.51%|
|GeForce 9800 GTX||114.34||47.64%|
|GeForce 9800 GX2||100.07||68.69%|
|Quake 4 – 1680×1050 – Ultra||Score||Difference|
|GeForce GTX 280||246.39|
|Sapphire Atomic Radeon HD 3870 X2||237.98||3.53%|
|GeForce 9800 GX2||218.80||12.61%|
|GeForce 9800 GTX||194.65||26.58%|
|Radeon HD 3870||167.26||47.31%|
|Quake 4 – 1920×1200 – Ultra||Score||Difference|
|GeForce GTX 280||224.44|
|Sapphire Atomic Radeon HD 3870 X2||218.62||2.66%|
|GeForce 9800 GX2||158.35||41.74%|
|GeForce 9800 GTX||158.18||41.89%|
|Radeon HD 3870||144.80||55.00%|
|Quake 4 – 2560×1600 – Ultra||Score||Difference|
|Sapphire Atomic Radeon HD 3870 X2||177.36||5.30%|
|GeForce GTX 280||168.43|
|GeForce 9800 GTX||102.04||65.06%|
|GeForce 9800 GX2||94.68||77.89%|
|Radeon HD 3870||94.40||78.42%|
[nextpage title=”Conclusions”]
The new GeForce GTX 280 is a true very high-end card that will bring an impressive performance boost if you are a serious gamer playing the latest games at very high resolutions (i.e., 1680×1050 and above) with all image quality controls maxed out.
In Crysis with image quality settings configured at “high,” the GeForce GTX 280 was between 62% and 70% faster than the Sapphire Atomic HD 3870 X2 and between 62% and 68% faster than the GeForce 9800 GX2. In this same game with image quality settings configured at “low,” the GTX 280 was between 67% and 126% faster than the 9800 GX2; here, however, the GTX 280 and the Sapphire Atomic HD 3870 X2 achieved the same performance level, with the only significant difference seen at 2560×1600, where the GTX 280 was 34% faster than the card from Sapphire.
In Call of Duty 4, with all image quality settings at their maximum values, the GeForce GTX 280 achieved the same performance level as the GeForce 9800 GX2, while being between 39% and 60% faster than the Sapphire Atomic HD 3870 X2.
In Quake 4 with image quality set at “ultra,” the GTX 280 was between 13% and 78% faster than the 9800 GX2, but achieved the same performance level as the Sapphire Atomic HD 3870 X2, with the Sapphire model being 5% faster at 2560×1600.
In the new 3DMark Vantage with image quality set at “Performance,” the GeForce GTX 280 was between 10% and 22% faster than the GeForce 9800 GX2 and between 36% and 49% faster than the Sapphire Atomic HD 3870 X2. When we increased image quality settings to “Extreme,” the GTX 280 was between 24% and 35% faster than the GX2 and between 68% and 77% faster than the X2.
In Half-Life 2: Episode Two, with image quality settings maxed out, the GeForce 9800 GX2 was between 6% and 58% faster and the Sapphire Atomic HD 3870 X2 between 41% and 51% faster than the GTX 280. With image quality at its lowest settings, the GTX 280 achieved the same performance level as the Sapphire Atomic HD 3870 X2 (being 12% faster than that card at 2560×1600) and was between 11% and 16% faster than the GX2.
But on 3DMark06 both GX2 and X2 were faster than GTX 280. With low image quality settings the performance difference wasn’t enormous, with X2 being between 5% and 9% faster and with GX2 being between 5% and 11% faster. When we increased image quality settings the performance difference increased, with GX2 being between 11% and 14% faster and Sapphire Atomic HD 3870 X2 being between 34% and 41% faster than GeForce GTX 280.
As you can see, the GeForce GTX 280 wasn’t the fastest card in every test, especially when we ran an older simulation (3DMark06). Many can argue that the best way to benchmark video cards is with real games, not simulations. This is true, but 3DMark is a very interesting tool nevertheless. Also, the GeForce 9800 GX2 and the Sapphire Atomic HD 3870 X2 were faster than the GeForce GTX 280 in Half-Life 2: Episode Two when we set image quality settings to their maximum. This may indicate that the GTX 280 is optimized for DirectX 10 games.
Many could also criticize our methodology, saying that we didn’t include enough games. This is valid constructive criticism, and we know we could have added more games. Unfortunately, we couldn’t, due to the little time we had to collect the data, analyze it, and write this review in time to post it today, the day the embargo on this video card is lifted.
As we mentioned, the GeForce GTX 280 is clearly targeted at gamers who play the latest titles at very high resolutions and crank up image quality settings.
The GeForce GTX 280 arrives with a sour USD 650 MSRP. Unless you are rich, we give you two options: wait for the price to come down or buy another video card. Both the GeForce 9800 GX2 and the Sapphire Atomic HD 3870 X2 are good options if you don’t have USD 650 or the patience to wait for the GeForce GTX 280’s price to drop, and at the same time want a very high-end video card.
We are now even more curious to see the performance of GeForce GTX 260, as it will reach the market costing “only” USD 400. We hope to get one very soon. Stay tuned!