Hardware Secrets


XFX Radeon HD 5750 Video Card Review

[nextpage title=”Introduction”]

The Radeon HD 5700 series comprises the first mid-range DirectX 11 video cards to arrive on the market. Let’s look at the performance of the Radeon HD 5750 from XFX and see if it is a good buy.

Radeon HD 5750 runs internally at 700 MHz and features 720 processors – a little fewer than the 800 processors found on Radeon HD 4850 – and accesses its memory through a 128-bit interface, which is half the width found on Radeon HD 4850. Radeon HD 5750, like Radeon HD 4870 and Radeon HD 5870, uses GDDR5 memory chips, which are capable of performing four data transfers per clock cycle instead of just two. This makes the memories, which are accessed at 1,150 MHz, perform as if they were accessed at 4.6 GHz, providing a maximum theoretical transfer rate of 73.6 GB/s. To compare these specs to those of other graphics chips, please take a look at our AMD ATI Chips Comparison Table and NVIDIA Chips Comparison Table.
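As a sanity check, the effective clock and bandwidth figures above follow from simple arithmetic. Here is a quick sketch (the clock and bus width come from the text; the formula is the standard memory-bandwidth calculation, using decimal gigabytes):

```python
# GDDR5 bandwidth sketch for Radeon HD 5750 (figures from the text above)
real_clock_hz = 1_150_000_000   # memory accessed at 1,150 MHz
transfers_per_clock = 4         # GDDR5 performs four data transfers per clock cycle
bus_width_bits = 128            # 128-bit memory interface

effective_clock_hz = real_clock_hz * transfers_per_clock
bandwidth_bytes = effective_clock_hz * bus_width_bits // 8

print(effective_clock_hz / 1e9)  # 4.6 (GHz, the "as if" effective clock)
print(bandwidth_bytes / 1e9)     # 73.6 (GB/s maximum theoretical transfer rate)
```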

The main competitor to Radeon HD 5750 is GeForce GTS 250. In our review we will be mainly comparing the performance of these two video cards. We also included Radeon HD 4850 and Radeon HD 4870, plus the new Radeon HD 5770 and GeForce GTX 260/216, which should make for a very interesting comparison.

Now let’s talk specifically about the Radeon HD 5750 from XFX. As you may be aware, XFX – which, by the way, many years ago was known as “Pine” – was for years one of the leading NVIDIA partners, and a while ago they decided to stop manufacturing only NVIDIA-based video cards.

This model, also known as HD-575X-ZNFC, comes with 1 GB of memory, two DVI outputs, one HDMI output, and one DisplayPort output, following the reference model from AMD. This video card allows you to use up to three video monitors at the same time as a single desktop, a feature known as “Eyefinity.” But there is a catch: the third monitor must use the DisplayPort connector, which is still not popular.

Figure 1: XFX Radeon HD 5750.

Figure 2: XFX Radeon HD 5750.

Figure 3: XFX Radeon HD 5750.

Figure 4: Connectors.

[nextpage title=”Introduction (Cont’d)”]

We removed the video card cooler to take a look. As you can see in Figure 5, the cooler is very simple, using an aluminum base. It doesn’t touch the memory chips, which have no heatsinks on them.

Figure 5: Video card cooler.

With the cooler removed, you can see that almost all capacitors are solid, which is a terrific feature, as solid capacitors don’t leak (the ones that are not solid are from Elcon, a Chinese manufacturer). Like the memory chips, the transistors of the voltage regulator don’t have a heatsink.

Figure 6: XFX Radeon HD 5750 with its cooler removed.

XFX Radeon HD 5750 uses eight 1 Gbit Hynix H5GQ1H24AFR-T2C GDDR5 chips, making up its 1 GB of memory (1 Gbit x 8 = 1 GB). These chips support a maximum transfer rate of 5 Gbps (indicated by the “T2C” marking), which is equivalent to a 5 GHz GDDR5 clock or a 1.25 GHz (5 GHz / 4) real clock. Since on this video card the memory runs at 1.15 GHz, there is an 8.7% headroom for you to overclock the memories while keeping them inside their specifications. Of course, you can always try pushing them above their specs. In Figure 7 we provide a close-up of the GDDR5 memory chips.
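The capacity and headroom math above can be sketched as follows (the variable names are ours, just for illustration; all figures come from the text):

```python
# Memory capacity and overclocking headroom for the figures in the text
chips = 8
chip_capacity_gbit = 1                            # 1 Gbit per Hynix chip
capacity_gbyte = chips * chip_capacity_gbit / 8   # eight bits per byte

rated_real_clock_mhz = 5000 / 4   # 5 Gbps rating -> 1,250 MHz real clock
stock_real_clock_mhz = 1150       # clock actually used on this card
headroom_pct = (rated_real_clock_mhz / stock_real_clock_mhz - 1) * 100

print(capacity_gbyte)           # 1.0 (GB)
print(round(headroom_pct, 1))   # 8.7 (% headroom within spec)
```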

Figure 7: GDDR5 memory chips.

In Figure 8, you can see all the accessories that come with this video card: basically a quick installation guide, a driver CD, a power adapter for converting two peripheral power plugs into one six-pin video card power connector, and a “do not disturb” sign.

Figure 8: Accessories.

[nextpage title=”Main Specifications”]

The main features of the XFX Radeon HD 5750 are:

  • Graphics chip: ATI Radeon HD 5750 running at 700 MHz.
  • Memory: 1 GB GDDR5 memory (128-bit interface) from Hynix (H5GQ1H24AFR-T2C), running at 1,150 MHz (“4.6 GHz”).
  • Bus type: PCI Express x16 2.0.
  • Connectors: Two DVI, one HDMI and one DisplayPort. Support for up to three video monitors at the same time, but the third monitor must be installed on the DisplayPort connector.
  • Video Capture (VIVO): No.
  • Cables and adapters that come with this board: Standard 4-pin peripheral power plug to 6-pin PCI Express auxiliary power plug (PEG) adapter.
  • Number of CDs/DVDs that come with this board: One.
  • Games that come with this board: None.
  • Programs that come with this board: None.
  • More information: https://www.xfxforce.com
  • Average price in the US*: USD 140.00

* Researched at Newegg.com on the day we published this review.

[nextpage title=”How We Tested”]

During our benchmarking sessions, we used the configuration listed below. Between our benchmarking sessions the only variable was the video card being tested.

Hardware Configuration

  • CPU: Core i7 Extreme 965 (3.2 GHz, 8 MB L3 memory cache).
  • Motherboard: ASUS P6T Deluxe OC Palm Edition (1611 BIOS)
  • Memories: 3x 1 GB Qimonda IMSH1GU03A1F1C-10F memory modules (DDR3-1066/PC3-8500, CL7)
  • Hard disk drive: Western Digital VelociRaptor WD3000GLFS (300 GB, SATA-300, 10,000 rpm, 16 MB cache)
  • Video monitor: Samsung SyncMaster 305T (30” LCD, 2560×1600)
  • Power supply required: Topower PowerBird 900 W
  • CPU Cooler: Intel stock
  • Optical Drive: LG GSA-H54N

Software Configuration

  • Windows Vista Ultimate 32-bit
  • Service Pack 2
  • Video resolution: 2560×1600 @ 60 Hz

Driver Versions

  • Intel Inf driver version: 9.1.1.1020
  • AMD/ATI video driver version: 8.660.0.0
  • NVIDIA video driver version: 190.62 (8.16.11.9062)

Software Used

  • 3DMark Vantage Professional 1.0.1
  • Call of Duty 4 – Patch 1.7
  • Crysis Warhead – Patch 1.1 + HOC Bench Crysis Warhead Benchmark Tool 1.1.1
  • Fallout 3 – Patch 1.7
  • Far Cry 2 – Patch 1.3
  • Unigine Tropics 1.2

Error Margin

We adopted a 3% error margin; thus, differences below 3% cannot be considered relevant. In other words, products with a performance difference below 3% should be considered as having similar performance.
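The “Difference” columns in the result tables, together with the 3% margin, can be expressed as a small helper (a sketch; the function names are ours, not a tool used in the review):

```python
def percent_difference(faster: float, slower: float) -> float:
    """Return how much faster the higher score is, in percent."""
    return (faster / slower - 1) * 100

def is_relevant(score_a: float, score_b: float, margin_pct: float = 3.0) -> bool:
    """Apply the error margin: differences below 3% count as similar performance."""
    hi, lo = max(score_a, score_b), min(score_a, score_b)
    return percent_difference(hi, lo) >= margin_pct

# Example using the 3DMark Vantage Performance 1680x1050 GPU scores:
print(round(percent_difference(5477, 5427), 2))  # 0.92 -> inside the margin
print(is_relevant(5477, 5427))                   # False (similar performance)
```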

[nextpage title=”3DMark Vantage Professional”]

3DMark Vantage measures Shader 4.0 (i.e., DirectX 10) performance and supports PhysX, a programming interface developed by Ageia (now part of NVIDIA) to transfer physics calculations from the system CPU to the video card GPU in order to increase performance. Mechanical physics is the basis for calculations about the interaction of objects. For example, if you shoot, what exactly will happen to the object when the bullet hits it? Will it break? Will it move? Will the bullet bounce back? Note that since we are considering only the GPU score provided by this program, physics calculations are not taken into account.

We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600. First we used the “Performance” profile, and then we used the “Extreme” profile (basically enabling anti-aliasing at 4x, anisotropic filtering at 16x, and putting all detail settings at their maximum or “extreme” values). The results being compared are the “GPU Scores” achieved by each video card.

Radeon HD 4850 and Radeon HD 4870 didn’t produce a reliable score for the “Extreme” profile under 2560×1600, so these particular results should not be considered.


3DMark Vantage – Performance – 1680×1050 GPU Score Difference
GeForce GTX 260/216 7167 32.06%
Radeon HD 4870 7135 31.47%
Radeon HD 5770 6831 25.87%
Radeon HD 4850 5477 0.92%
Radeon HD 5750 5427  
GeForce GTS 250 5148 5.42%


3DMark Vantage – Performance – 1920×1200 GPU Score Difference
GeForce GTX 260/216 6860 59.31%
Radeon HD 4870 5598 30.00%
Radeon HD 5770 5368 24.66%
Radeon HD 5750 4306  
Radeon HD 4850 4252 1.27%
GeForce GTS 250 3893 10.61%


3DMark Vantage – Performance – 2560×1600 GPU Score Difference
GeForce GTX 260/216 3840 54.47%
Radeon HD 4870 3230 29.93%
Radeon HD 5770 3048 22.61%
Radeon HD 5750 2486  
Radeon HD 4850 2380 4.45%
GeForce GTS 250 2149 15.68%


3DMark Vantage – Extreme – 1680×1050 GPU Score Difference
GeForce GTX 260/216 5615 37.93%
Radeon HD 5770 5109 25.50%
Radeon HD 4870 5093 25.10%
Radeon HD 5750 4071  
GeForce GTS 250 4000 1.77%
Radeon HD 4850 3941 3.30%


3DMark Vantage – Extreme – 1920×1200 GPU Score Difference
GeForce GTX 260/216 5240 60.79%
Radeon HD 5770 4106 25.99%
Radeon HD 4870 4040 23.96%
Radeon HD 5750 3259  
Radeon HD 4850 3119 4.49%
GeForce GTS 250 3073 6.05%


3DMark Vantage – Extreme – 2560×1600 GPU Score Difference
GeForce GTX 260/216 3020 58.36%
Radeon HD 5770 2417 26.74%
Radeon HD 5750 1907  
GeForce GTS 250 1729 10.29%
Radeon HD 4870 518 268.15%
Radeon HD 4850 402 374.38%

[nextpage title=”Call of Duty 4″]

Call of Duty 4 is a DirectX 9 game implementing high-dynamic range (HDR) lighting and its own physics engine, which is used to calculate how objects interact. For example, if you shoot, what exactly will happen to the object when the bullet hits it? Will it break? Will it move? Will the bullet bounce back? This gives a more realistic experience to the user.

We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, maxing out all image quality controls (i.e., everything was set to its maximum value in the Graphics and Texture menus). We used the game’s internal benchmarking feature, running a demo provided by NVIDIA called “wetwork.” We are making this demo available for download here in case you want to run your own benchmarks. The game was updated to version 1.7. The results below are the average number of frames per second (FPS) achieved by each card.


Call of Duty 4 – 1680×1050 FPS Difference
GeForce GTX 260/216 86.44 16.68%
Radeon HD 4870 83.80 13.12%
GeForce GTS 250 82.34 11.15%
Radeon HD 5770 79.86 7.80%
Radeon HD 5750 74.08  
Radeon HD 4850 69.54 6.53%


Call of Duty 4 – 1920×1200 FPS Difference
GeForce GTX 260/216 83.50 33.81%
GeForce GTS 250 75.42 20.87%
Radeon HD 4870 73.60 17.95%
Radeon HD 5770 70.14 12.40%
Radeon HD 5750 62.40  
Radeon HD 4850 57.18 9.13%


Call of Duty 4 – 2560×1600 FPS Difference
GeForce GTX 260/216 64.36 57.67%
GeForce GTS 250 49.60 21.51%
Radeon HD 4870 47.40 16.12%
Radeon HD 5770 46.08 12.89%
Radeon HD 5750 40.82  
Radeon HD 4850 36.26 12.58%

[nextpage title=”Crysis Warhead”]

Crysis Warhead is a DirectX 10 game based on the same engine as the original Crysis, but optimized (it runs under DirectX 9.0c when installed on Windows XP). We ran this program at three 16:10 widescreen resolutions, 1680×1050, 1920×1200, and 2560×1600, maximizing image quality (16x anti-aliasing, 16x anisotropic filtering) and using the Airfield demo. The results below are the number of frames per second achieved by each video card.


Crysis Warhead – 1680×1050 FPS Difference
GeForce GTX 260/216 18.0 38.46%
Radeon HD 5770 17.0 30.77%
GeForce GTS 250 14.0 7.69%
Radeon HD 5750 13.0  
Radeon HD 4850 12.0 8.33%
Radeon HD 4870 11.5 13.04%


Crysis Warhead – 1920×1200 FPS Difference
GeForce GTX 260/216 15 36.36%
Radeon HD 5770 13 18.18%
Radeon HD 5750 11  
GeForce GTS 250 10 10.00%
Radeon HD 4870 10 10.00%
Radeon HD 4850 8 37.50%


Crysis Warhead – 2560×1600 FPS Difference
Radeon HD 5770 8 14.29%
Radeon HD 5750 7  
GeForce GTX 260/216 5 40.00%
GeForce GTS 250 4 75.00%
Radeon HD 4850 1 600.00%
Radeon HD 4870 1 600.00%

[nextpage title=”Fallout 3″]

Fallout 3 is based on the same engine used by The Elder Scrolls IV: Oblivion, and it is a DirectX 9.0c (Shader 3.0) game. We configured the game with “ultra” image quality settings, maxing out all image quality controls, at three 16:10 widescreen resolutions: 1680×1050, 1920×1200, and 2560×1600. To measure performance, we used the FRAPS utility while running an outdoor scene in God mode, running through enemy fire, triggering post-processing effects, and ending with a big explosion in front of Dupont Circle.


Fallout 3 – 1680×1050 FPS Difference
GeForce GTX 260/216 77.31 8.46%
Radeon HD 5770 76.43 7.23%
Radeon HD 4870 75.32 5.67%
GeForce GTS 250 74.33 4.28%
Radeon HD 5750 71.28  
Radeon HD 4850 69.36 2.77%


Fallout 3 – 1920×1200 FPS Difference
GeForce GTX 260/216 77.30 16.91%
Radeon HD 4870 73.45 11.09%
Radeon HD 5770 70.95 7.30%
GeForce GTS 250 68.75 3.98%
Radeon HD 5750 66.12  
Radeon HD 4850 59.56 11.01%


Fallout 3 – 2560×1600 FPS Difference
GeForce GTX 260/216 61.49 31.30%
Radeon HD 5770 53.91 15.12%
Radeon HD 5750 46.83  
GeForce GTS 250 46.09 1.61%
Radeon HD 4870 44.32 5.66%
Radeon HD 4850 33.25 40.84%

[nextpage title=”Far Cry 2″]

Far Cry 2 is based on an entirely new game engine called Dunia, which is DirectX 10 when played under Windows Vista with a DirectX 10-compatible video card. We used the benchmarking utility that comes with this game, setting image quality to the maximum allowed and running the “Ranch Long” demo three times. The results below are expressed in frames per second and are an arithmetic average of the three results collected.

Radeon HD 4850 and Radeon HD 4870 could not run Far Cry 2 at 2560×1600 with all image quality settings maxed out.


Far Cry 2 – 1680×1050 FPS Difference
GeForce GTX 260/216 38.92 58.79%
GeForce GTS 250 31.07 26.76%
Radeon HD 5770 27.50 12.20%
Radeon HD 5750 24.51  
Radeon HD 4870 11.29 117.09%
Radeon HD 4850 9.80 150.10%


Far Cry 2 – 1920×1200 FPS Difference
GeForce GTX 260/216 33.82 55.35%
GeForce GTS 250 26.23 20.49%
Radeon HD 5770 24.35 11.85%
Radeon HD 5750 21.77  
Radeon HD 4870 8.94 143.51%
Radeon HD 4850 7.38 194.99%


Far Cry 2 – 2560×1600 FPS Difference
GeForce GTX 260/216 18.61 51.30%
GeForce GTS 250 17.15 39.43%
Radeon HD 5770 14.49 17.80%
Radeon HD 5750 12.30  

[nextpage title=”Unigine Tropics”]

Unigine is a 3D engine used by some games and simulations. The developer provides two demos for this engine, Tropics and Sanctuary. We ran the Tropics benchmarking module under DirectX 9 mode at full screen with image quality settings maxed out. The results below are the number of frames per second achieved by each video card.


Tropics – 1680×1050 FPS Difference
Radeon HD 4870 42.4 24.34%
Radeon HD 5770 40.9 19.94%
Radeon HD 5750 34.1  
Radeon HD 4850 32.5 4.92%
GeForce GTX 260/216 31.4 8.60%
GeForce GTS 250 26.7 27.72%


Tropics – 1920×1200 FPS Difference
Radeon HD 5770 34.5 19.79%
Radeon HD 5750 28.8  
GeForce GTX 260/216 28.5 1.05%
Radeon HD 4850 27.4 5.11%
Radeon HD 4870 24.6 17.07%
GeForce GTS 250 21.5 33.95%

[nextpage title=”Conclusions”]

The good news about Radeon HD 5750 is that in the worst-case scenario it achieved the same performance as Radeon HD 4850, being in most situations between 4% and 13% faster than this previous-generation video card. In situations where high performance is needed, in particular in Far Cry 2 with image quality settings maxed out, Radeon HD 5750 achieved more than double the performance of Radeon HD 4850. And certain configurations that Radeon HD 4850 can’t run (2560×1600 with image quality settings maxed out in 3DMark Vantage and Far Cry 2), the new Radeon HD 5750 can. The amazing thing is that when Radeon HD 4850 was originally released it cost around USD 200, while the new Radeon HD 5750 costs only USD 140, so you can get similar or higher performance plus DirectX 11 support at a lower cost.

The problem is that in most of our tests the main competitor from NVIDIA, GeForce GTS 250, was faster. Only in 3DMark Vantage (up to 10%) and in Unigine Tropics (between 28% and 34%) was Radeon HD 5750 faster than GeForce GTS 250. For example, in Call of Duty 4 GeForce GTS 250 was between 11% and 22% faster, in Crysis Warhead it was 8% faster, and in Far Cry 2 it was between 20% and 39% faster. In Fallout 3 the difference between the two cards was too small to matter. Keep in mind that in all these games we cranked up all image quality settings to their maximum.

Radeon HD 5750 is definitely a good replacement for Radeon HD 4850, as explained, but it may fall behind its main competitor, GeForce GTS 250, depending on the resolution and image quality settings at which you like to play. Of the six programs we ran, GeForce GTS 250 won three, Radeon HD 5750 won two, and they tied in one. Since we can’t run all the games available on the market and since our results do not point out a clear winner, we have to call a technical tie.

We are giving this product our Silver Award instead of our Golden Award because, as explained, Radeon HD 5750 isn’t faster than its main competitor in all situations.
