The Radeon HD 5700 series brings the first mid-range DirectX 11 video cards to the market. Let's now look at the performance of the Radeon HD 5770 from XFX and see if it is a good buy.
The Radeon HD 5770 runs internally at 850 MHz and features 800 processors – the same count found on the Radeon HD 4850 and Radeon HD 4870 – accessing its memory through a 128-bit interface, half the width found on the Radeon HD 4850 and Radeon HD 4870. The Radeon HD 5770, like the Radeon HD 4870 and the members of the Radeon HD 5800 family, uses GDDR5 memory chips, which are capable of four data transfers per clock cycle instead of just two. This allows the memory, which is accessed at 1.2 GHz, to perform as if it were accessed at 4.8 GHz, providing a maximum theoretical transfer rate of 76.8 GB/s. To compare these specs to those of other graphics chips, please take a look at our AMD ATI Chips Comparison Table and NVIDIA Chips Comparison Table.
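The 76.8 GB/s figure follows directly from the interface width and the effective clock. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
# Theoretical memory bandwidth from the review's own numbers.
bus_width_bits = 128        # memory interface width
real_clock_hz = 1.2e9       # clock at which the chips are actually accessed
transfers_per_clock = 4     # GDDR5 moves four data words per clock cycle

effective_clock_hz = real_clock_hz * transfers_per_clock  # the "4.8 GHz" figure
bandwidth_gbs = effective_clock_hz * (bus_width_bits / 8) / 1e9  # bytes/s -> GB/s

print(effective_clock_hz / 1e9)  # 4.8
print(bandwidth_gbs)             # 76.8
```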
The main competitor to the Radeon HD 5770 is the GeForce GTX 260 with 216 cores ("GeForce GTX 260/216"). In our review we will be mainly comparing the performance of these two video cards. We also included the Radeon HD 4850 and Radeon HD 4870, plus the new Radeon HD 5750 and the GeForce GTS 250, which should make for a very interesting comparison.
Now let's talk specifically about the Radeon HD 5770 from XFX. As you may be aware, XFX – which, by the way, was known as "Pine" many years ago – was for years one of the leading NVIDIA partners, until a while ago they decided they shouldn't manufacture only NVIDIA-based video cards.
This model, also known as HD-577A-ZNFC, comes with 1 GB of memory, two DVI outputs, one HDMI output, and one DisplayPort output, following the reference model from AMD. This video card allows you to use up to three video monitors at the same time as a single desktop, a feature known as "Eyefinity." But there is a catch: the third monitor must use the DisplayPort connector, which is still not popular.
[nextpage title=”Introduction (Cont’d)”]
We removed the video card cooler to take a look. As you can see in Figure 5, the cooler has a copper base and copper fins. The cooler does not touch the memory chips.
With the cooler removed, you can see that almost all capacitors are solid, which is a terrific feature as they don't leak (the ones that are not solid are from Elcon, a Chinese manufacturer). The memory chips on the component side have passive heatsinks, which is not the case for the memory chips located on the solder side of the card. The transistors of the voltage regulator circuit don't have heatsinks. In Figure 6 you can also see that this video card requires only one six-pin auxiliary power connector. A power supply with at least 450 W is the minimum recommended for this video card.
The XFX Radeon HD 5770 uses eight 1 Gbit Hynix H5GQ1H24AFR-T2C GDDR5 chips, making up its 1 GB of memory (1 Gbit x 8 = 8 Gbit = 1 GB). These chips have a maximum transfer rate of 5 Gbps ("T2C" marking), which is equivalent to a 5 GHz GDDR5 clock or a 1.25 GHz (5 GHz / 4) real clock. Since on this video card the memory runs at 1.2 GHz, there is a tiny 4.2% headroom for you to overclock the memory while keeping it within its specifications. Of course, you can always try pushing it above its specs. In Figure 7 we provide a close-up of the GDDR5 memory chips.
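The 4.2% headroom figure is just the ratio between the chips' rated real clock and the clock the card actually uses:

```python
# Overclocking headroom on the Hynix chips, from the numbers above.
rated_transfer_gbps = 5.0                        # "T2C" marking: 5 Gbps per pin
rated_real_clock_ghz = rated_transfer_gbps / 4   # GDDR5: 4 transfers/clock -> 1.25 GHz
actual_real_clock_ghz = 1.2                      # clock used on this card

headroom_pct = (rated_real_clock_ghz / actual_real_clock_ghz - 1) * 100
print(round(headroom_pct, 1))  # 4.2
```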
This video card comes with one free game, Battle Forge. Instead of coming with the installation media, the card comes with instructions on how to download and activate the game with the included serial number.
In Figure 8, you can see all accessories that come with this video card: a quick installation guide, an installation guide, driver CD, the Battle Forge flyer, a power adapter for converting two peripheral power plugs into one six-pin video card power connector, a DVI-to-VGA adapter, a CrossFire bridge and a “do not disturb” sign.
[nextpage title=”Main Specifications”]
The main features of the XFX Radeon HD 5770 are:
- Graphics chip: ATI Radeon HD 5770 running at 850 MHz.
- Memory: 1 GB GDDR5 memory (128-bit interface) from Hynix (H5GQ1H24AFR-T2C), running at 1.2 GHz (“4.8 GHz”).
- Bus type: PCI Express x16 2.0.
- Connectors: Two DVI, one HDMI and one DisplayPort. Support for up to three video monitors at the same time, but the third monitor must be installed on the DisplayPort connector.
- Video Capture (VIVO): No.
- Cables and adapters that come with this board: Standard 4-pin peripheral power plug to 6-pin PCI Express auxiliary power plug (PEG) adapter, DVI-to-VGA adapter, CrossFire bridge.
- Number of CDs/DVDs that come with this board: One.
- Games that come with this board: Battle Forge.
- Programs that come with this board: None.
- More information: https://www.xfxforce.com
- Average price in the US*: USD 175.00
* Researched at Newegg.com on the day we published this review.
[nextpage title=”How We Tested”]
During our benchmarking sessions, we used the configuration listed below. Between our benchmarking sessions the only variable was the video card being tested.
- CPU: Core i7 Extreme 965 (3.2 GHz, 8 MB L3 memory cache).
- Motherboard: ASUS P6T Deluxe OC Palm Edition (1611 BIOS)
- Memory: three 1 GB Qimonda IMSH1GU03A1F1C-10F memory modules (DDR3-1066/PC3-8500, CL7)
- Hard disk drive: Western Digital VelociRaptor WD3000GLFS (300 GB, SATA-300, 10,000 rpm, 16 MB cache)
- Video monitor: Samsung SyncMaster 305T (30” LCD, 2560×1600)
- Power supply required: Topower PowerBird 900 W
- CPU Cooler: Intel stock
- Optical Drive: LG GSA-H54N
- Windows Vista Ultimate 32-bit
- Service Pack 2
- Video resolution: 2560×1600 @ 60 Hz
- Intel Inf driver version: 126.96.36.1990
- AMD/ATI video driver version: 8.660.0.0
- NVIDIA video driver version: 190.62
- 3DMark Vantage Professional 1.0.1
- Call of Duty 4 – Patch 1.7
- Crysis Warhead – Patch 1.1 + HOC Bench Crysis Warhead Benchmark Tool 1.1.1
- Fallout 3 – Patch 1.7
- Far Cry 2 – Patch 1.3
- Unigine Tropics 1.2
We adopted a 3% error margin; thus, differences below 3% cannot be considered relevant. In other words, products with a performance difference below 3% should be considered as having similar performance.
[nextpage title=”3DMark Vantage Professional”]
3DMark Vantage measures Shader 4.0 (i.e., DirectX 10) performance and supports PhysX, a programming interface developed by Ageia (now part of NVIDIA) to transfer physics calculations from the system CPU to the video card GPU in order to increase performance. Mechanical physics is the basis for calculations about the interaction of objects. For example, if you shoot, what exactly will happen to the object when the bullet hits it? Will it break? Will it move? Will the bullet bounce back? Note that since we are considering only the GPU score provided by this program, physics calculations are not taken into account.
We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600. First we used the “Performance” profile, and then we used the “Extreme” profile (basically enabling anti-aliasing at 4x, anisotropic filtering at 16x, and putting all detail settings at their maximum or “extreme” values). The results being compared are the “GPU Scores” achieved by each video card.
Radeon HD 4850 and Radeon HD 4870 didn’t produce a reliable score for the “Extreme” profile under 2560×1600, so these particular results should not be considered.
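In the tables that follow, the "Difference" column always expresses how much faster the quicker of the two results is relative to the slower one, with the Radeon HD 5770 as the baseline. A small sketch reproducing the column and our 3% tie rule (scores taken from the first 3DMark Vantage table):

```python
def percent_difference(a, b):
    # How much faster the quicker result is relative to the slower one.
    return (max(a, b) / min(a, b) - 1) * 100

def similar(a, b, margin=3.0):
    # Per our methodology, differences below the 3% error margin count as a tie.
    return percent_difference(a, b) < margin

baseline = 6831  # Radeon HD 5770 GPU score, 1680x1050 "Performance"
print(round(percent_difference(7167, baseline), 2))  # 4.92 (GeForce GTX 260/216)
print(round(percent_difference(5477, baseline), 2))  # 24.72 (Radeon HD 4850)
print(similar(7135, baseline))                       # False: 4.45% is above the margin
```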
| 3DMark Vantage – 1680×1050 – Performance | GPU Score | Difference |
|---|---|---|
| GeForce GTX 260/216 | 7167 | 4.92% |
| Radeon HD 4870 | 7135 | 4.45% |
| Radeon HD 5770 | 6831 | – |
| Radeon HD 4850 | 5477 | 24.72% |
| Radeon HD 5750 | 5427 | 25.87% |
| GeForce GTS 250 | 5148 | 32.69% |

| 3DMark Vantage – 1920×1200 – Performance | GPU Score | Difference |
|---|---|---|
| GeForce GTX 260/216 | 6860 | 27.79% |
| Radeon HD 4870 | 5598 | 4.28% |
| Radeon HD 5770 | 5368 | – |
| Radeon HD 5750 | 4306 | 24.66% |
| Radeon HD 4850 | 4252 | 26.25% |
| GeForce GTS 250 | 3893 | 37.89% |

| 3DMark Vantage – 2560×1600 – Performance | GPU Score | Difference |
|---|---|---|
| GeForce GTX 260/216 | 3840 | 25.98% |
| Radeon HD 4870 | 3230 | 5.97% |
| Radeon HD 5770 | 3048 | – |
| Radeon HD 5750 | 2486 | 22.61% |
| Radeon HD 4850 | 2380 | 28.07% |
| GeForce GTS 250 | 2149 | 41.83% |

| 3DMark Vantage – 1680×1050 – Extreme | Score | Difference |
|---|---|---|
| GeForce GTX 260/216 | 5615 | 9.90% |
| Radeon HD 5770 | 5109 | – |
| Radeon HD 4870 | 5093 | 0.31% |
| Radeon HD 5750 | 4071 | 25.50% |
| GeForce GTS 250 | 4000 | 27.73% |
| Radeon HD 4850 | 3941 | 29.64% |

| 3DMark Vantage – 1920×1200 – Extreme | Score | Difference |
|---|---|---|
| GeForce GTX 260/216 | 5240 | 27.62% |
| Radeon HD 5770 | 4106 | – |
| Radeon HD 4870 | 4040 | 1.63% |
| Radeon HD 5750 | 3259 | 25.99% |
| Radeon HD 4850 | 3119 | 31.64% |
| GeForce GTS 250 | 3073 | 33.62% |

| 3DMark Vantage – 2560×1600 – Extreme | Score | Difference |
|---|---|---|
| GeForce GTX 260/216 | 3020 | 24.95% |
| Radeon HD 5770 | 2417 | – |
| Radeon HD 5750 | 1907 | 26.74% |
| GeForce GTS 250 | 1729 | 39.79% |
| Radeon HD 4870 | 518 | 366.60% |
| Radeon HD 4850 | 402 | 501.24% |
[nextpage title=”Call of Duty 4″]
Call of Duty 4 is a DirectX 9 game implementing high-dynamic range (HDR) lighting and its own physics engine, which is used to calculate how objects interact. For example, if you shoot, what exactly will happen to the object when the bullet hits it? Will it break? Will it move? Will the bullet bounce back? This gives a more realistic experience to the user.
We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, maxing out all image quality controls (i.e., everything was set to the maximum values on the Graphics and Texture menus). We used the game's internal benchmarking feature, running a demo provided by NVIDIA called "wetwork." You can download this demo here if you want to run your own benchmarks. The game was updated to version 1.7. The results below are the average number of frames per second (FPS) achieved by each card.
| Call of Duty 4 – 1680×1050 – Maximum | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 86.44 | 8.24% |
| Radeon HD 4870 | 83.80 | 4.93% |
| GeForce GTS 250 | 82.34 | 3.11% |
| Radeon HD 5770 | 79.86 | – |
| Radeon HD 5750 | 74.08 | 7.80% |
| Radeon HD 4850 | 69.54 | 14.84% |

| Call of Duty 4 – 1920×1200 – Maximum | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 83.50 | 19.05% |
| GeForce GTS 250 | 75.42 | 7.53% |
| Radeon HD 4870 | 73.60 | 4.93% |
| Radeon HD 5770 | 70.14 | – |
| Radeon HD 5750 | 62.40 | 12.40% |
| Radeon HD 4850 | 57.18 | 22.67% |

| Call of Duty 4 – 2560×1600 – Maximum | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 64.36 | 39.67% |
| GeForce GTS 250 | 49.60 | 7.64% |
| Radeon HD 4870 | 47.40 | 2.86% |
| Radeon HD 5770 | 46.08 | – |
| Radeon HD 5750 | 40.82 | 12.89% |
| Radeon HD 4850 | 36.26 | 27.08% |
[nextpage title=”Crysis Warhead”]
Crysis Warhead is a DirectX 10 game based on the same engine as the original Crysis, but optimized (it runs under DirectX 9.0c when installed on Windows XP). We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, maximizing image quality (16x anti-aliasing, 16x anisotropic filtering) and using the Airfield demo. The results below are the number of frames per second achieved by each video card.
| Crysis Warhead – 1680×1050 | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 18.0 | 5.88% |
| Radeon HD 5770 | 17.0 | – |
| GeForce GTS 250 | 14.0 | 21.43% |
| Radeon HD 5750 | 13.0 | 30.77% |
| Radeon HD 4850 | 12.0 | 41.67% |
| Radeon HD 4870 | 11.5 | 47.83% |

| Crysis Warhead – 1920×1200 | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 15 | 15.38% |
| Radeon HD 5770 | 13 | – |
| Radeon HD 5750 | 11 | 18.18% |
| GeForce GTS 250 | 10 | 30.00% |
| Radeon HD 4870 | 10 | 30.00% |
| Radeon HD 4850 | 8 | 62.50% |

| Crysis Warhead – 2560×1600 | FPS | Difference |
|---|---|---|
| Radeon HD 5770 | 8 | – |
| Radeon HD 5750 | 7 | 14.29% |
| GeForce GTX 260/216 | 5 | 60.00% |
| GeForce GTS 250 | 4 | 100.00% |
| Radeon HD 4870 | 1 | 700.00% |
| Radeon HD 4850 | 1 | 700.00% |
[nextpage title=”Fallout 3″]
Fallout 3 is based on the same engine used by The Elder Scrolls IV: Oblivion, and it is a DirectX 9.0c (Shader 3.0) game. We configured the game with "ultra" image quality settings, maxing out all image quality controls, at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600. To measure performance, we used the FRAPS utility on an outdoor scene in God mode, running through enemy fire, triggering post-processing effects, and ending with a big explosion in front of Dupont Circle.
| Fallout 3 – 1680×1050 | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 77.31 | 1.15% |
| Radeon HD 5770 | 76.43 | – |
| Radeon HD 4870 | 75.32 | 1.47% |
| GeForce GTS 250 | 74.33 | 2.83% |
| Radeon HD 5750 | 71.28 | 7.23% |
| Radeon HD 4850 | 69.36 | 10.19% |

| Fallout 3 – 1920×1200 | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 77.30 | 8.95% |
| Radeon HD 4870 | 73.45 | 3.52% |
| Radeon HD 5770 | 70.95 | – |
| GeForce GTS 250 | 68.75 | 3.20% |
| Radeon HD 5750 | 66.12 | 7.30% |
| Radeon HD 4850 | 59.56 | 19.12% |

| Fallout 3 – 2560×1600 | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 61.49 | 14.06% |
| Radeon HD 5770 | 53.91 | – |
| Radeon HD 5750 | 46.83 | 15.12% |
| GeForce GTS 250 | 46.09 | 16.97% |
| Radeon HD 4870 | 44.32 | 21.64% |
| Radeon HD 4850 | 33.25 | 62.14% |
[nextpage title=”Far Cry 2″]
Far Cry 2 is based on an entirely new game engine called Dunia, which is DirectX 10 when played under Windows Vista with a DirectX 10-compatible video card. We used the benchmarking utility that comes with this game, setting image quality to the maximum allowed and running the “Ranch Long” demo three times. The results below are expressed in frames per second and are an arithmetic average of the three results collected.
Radeon HD 4850 and Radeon HD 4870 could not run Far Cry 2 at 2560×1600 with all image quality settings maxed out.
| Far Cry 2 – 1680×1050 | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 38.92 | 41.53% |
| GeForce GTS 250 | 31.07 | 12.98% |
| Radeon HD 5770 | 27.5 | – |
| Radeon HD 5750 | 24.51 | 12.20% |
| Radeon HD 4870 | 11.29 | 143.58% |
| Radeon HD 4850 | 9.8 | 180.61% |

| Far Cry 2 – 1920×1200 | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 33.82 | 38.89% |
| GeForce GTS 250 | 26.23 | 7.72% |
| Radeon HD 5770 | 24.35 | – |
| Radeon HD 5750 | 21.77 | 11.85% |
| Radeon HD 4870 | 8.94 | 172.37% |
| Radeon HD 4850 | 7.38 | 229.95% |

| Far Cry 2 – 2560×1600 | FPS | Difference |
|---|---|---|
| GeForce GTX 260/216 | 18.61 | 28.43% |
| GeForce GTS 250 | 17.15 | 18.36% |
| Radeon HD 5770 | 14.49 | – |
| Radeon HD 5750 | 12.30 | 17.80% |
[nextpage title=”Unigine Tropics”]
Unigine is a 3D engine used by some games and simulations. The developer provides two demos for this engine, Tropics and Sanctuary. We ran the Tropics benchmarking module under DirectX 9 mode at full screen with image quality settings maxed out. The results below are the number of frames per second achieved by each video card.
| Tropics – 1680×1050 | FPS | Difference |
|---|---|---|
| Radeon HD 4870 | 42.4 | 3.67% |
| Radeon HD 5770 | 40.9 | – |
| Radeon HD 5750 | 34.1 | 19.94% |
| Radeon HD 4850 | 32.5 | 25.85% |
| GeForce GTX 260/216 | 31.4 | 30.25% |
| GeForce GTS 250 | 26.7 | 53.18% |

| Tropics – 1920×1200 | FPS | Difference |
|---|---|---|
| Radeon HD 5770 | 34.5 | – |
| Radeon HD 5750 | 28.8 | 19.79% |
| GeForce GTX 260/216 | 28.5 | 21.05% |
| Radeon HD 4850 | 27.4 | 25.91% |
| Radeon HD 4870 | 24.6 | 40.24% |
| GeForce GTS 250 | 21.5 | 60.47% |
[nextpage title="Conclusions"]
During our tests, the Radeon HD 5770 proved to be between 7% and 31% faster than its "economy" version, the Radeon HD 5750, depending on the resolution, game, and image quality settings.
The Radeon HD 5770 was also faster than the Radeon HD 4870 in scenarios where high performance is needed, especially in Crysis Warhead (30% to 48% faster) and Far Cry 2 (144% to 172% faster) with image quality settings maxed out. In the other scenarios, both cards achieved the same performance level, or the Radeon HD 4870 was a tiny bit faster (up to 6%).
When the Radeon HD 4870 was originally released it cost USD 300, which makes the new Radeon HD 5770 comparatively a real bargain at USD 175.
The problem is that its main competitor, the GeForce GTX 260/216, left the Radeon HD 5770 in the dust. Only in Unigine Tropics was the Radeon HD 5770 faster than the GeForce GTX 260/216; in all other games and simulations the GeForce GTX 260/216 was faster, especially if you like to crank up image quality settings (the only exception was Fallout 3 at 1680×1050 with image quality settings maxed out, where both cards achieved the same performance level).
Therefore, we can't recommend the Radeon HD 5770 – unless, of course, you are looking for a DirectX 11-ready card with a DisplayPort output. If you are looking for a mid-range video card with a good cost/benefit ratio, the GeForce GTX 260/216 remains our pick.