NVIDIA is launching today its new high-end 3D graphics chip, GeForce 7800 GTX, which is based on a totally new core (called CineFX 4.0). Instead of using the “old” GeForce 6800 core and simply raising its clock, NVIDIA decided to redesign the way instructions are processed inside the chip. According to NVIDIA, this led to a performance increase of up to 50% in GeForce 7800 GTX compared to a GeForce 6800 Ultra running at the same clock rate – according to the manufacturer, a single GeForce 7800 GTX is faster than two GeForce 6800 Ultra cards in SLI mode.
GeForce 7800 GTX main specs are:
- 24 pixel shader pipelines (against 16 in GeForce 6800 Ultra) and eight vertex shader pipelines (against six in GeForce 6800 Ultra).
- Each one of the pipelines was redesigned, allowing the execution of up to eight MAD (multiply-add) operations per clock cycle – MAD is the most common operation in 3D games. Because of that, the chip can achieve an FP processing performance of 165 Gflops, against 66 Gflops in Radeon X850 XT Platinum Edition and 54 Gflops in GeForce 6800 Ultra.
- New anti-aliasing mode called “Transparency Anti-Aliasing” which enhances the image quality in 3D games.
- Image enhancement features for 2D video (de-interlacing, inverse 3:2 and 2:2 pull-down) collectively called PureVideo by NVIDIA.
- According to NVIDIA, GeForce 7800 GTX offers up to 60% improvement in HDR (High Dynamic Range) over GeForce 6800 Ultra. The HDR feature is used to enhance lighting, and we will explain more about it in a moment.
- Power consumption slightly lower than GeForce 6800 Ultra: GeForce 7800 GTX consumes between 100 W and 110 W, while GeForce 6800 Ultra consumes between 110 W and 120 W. If you consider that GeForce 7800 GTX has 36% more transistors than GeForce 6800 Ultra (302 million vs. 222 million) – which would normally lead to a power consumption increase –, this new chip is more power efficient.
- Requires a 350 W power supply or a 500 W power supply when running in SLI configuration.
- PCI Express x16 bus.
- GeForce 7800 GTX runs at 430 MHz with 256 MB 256-bit GDDR3 memory running at 1.2 GHz and a suggested retail price in the US of USD 599.
- Cheaper models based on the same core will probably be launched, but so far NVIDIA hasn’t announced their names or specs.
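The 165 Gflops figure quoted in the spec list is consistent with a simple back-of-the-envelope calculation (our own arithmetic, not NVIDIA’s official derivation): 24 pipelines, each issuing eight MAD operations per clock, at the 430 MHz core clock, with each MAD counting as two floating-point operations:

```python
# Sanity check of the 165 Gflops figure (our own estimate, not
# NVIDIA's published derivation).
pipelines = 24          # pixel shader pipelines
mads_per_clock = 8      # MAD operations per pipeline per cycle
flops_per_mad = 2       # one multiply + one add
clock_hz = 430e6        # 430 MHz core clock

gflops = pipelines * mads_per_clock * flops_per_mad * clock_hz / 1e9
print(f"{gflops:.1f} Gflops")  # → 165.1 Gflops
```

The numbers line up almost exactly, which suggests the quoted figure refers to the pixel shader pipelines alone, at the stock clock.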
Let’s now see in depth how this new chip works.
GeForce 7800 GTX architecture can be seen in Figure 2.
The big blocks in the center are the pixel shader pipelines. As you can see, there are six blocks with four pipelines each, for a total of 24 pixel shader pipelines. Below them you can see a reference to an L2 texture cache, which we will talk about in a moment. At the top you will find eight blocks, which are the eight vertex shader pipelines of this chip.
In Figure 3, you can see the block diagram of one vertex shader pipeline, and in Figure 4 you can see the block diagram of one pixel shader pipeline. As you can see, both the vertex shader and pixel shader pipelines have access to an L2 texture cache, which increases the chip’s performance, and the pixel shader pipeline also has access to an L1 texture cache.
On the GeForce 7800 series, more computational power was put inside the pixel shaders. On the GeForce 6800 series, the pixel shader pipeline consisted of two shader units with a texture unit between them. Each shader unit could execute four operations per pixel, for a total of eight operations per pixel. On the GeForce 7800 GTX architecture, the texture unit was moved to the side of the first shader unit (instead of sitting between the units), and the performance of each shader unit was increased to 10 operations per pixel, for a total of 20 operations per pixel – a 150% performance increase over the GeForce 6800 architecture. Impressive.
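The per-pipeline numbers above check out arithmetically:

```python
# Per-pipeline shader throughput, as described in the text.
ops_6800 = 2 * 4    # two shader units, four ops per pixel each
ops_7800 = 2 * 10   # two shader units, ten ops per pixel each

increase = (ops_7800 - ops_6800) / ops_6800 * 100
print(ops_6800, ops_7800, f"{increase:.0f}%")  # → 8 20 150%
```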
[nextpage title=”3D Image Quality Enhancements”]
Two main new features were changed on GeForce 7800 GTX in order to increase the image quality: High Dynamic Range (HDR) and Transparency Anti-Aliasing.
High Dynamic Range (HDR)
The problem with lighting is that in the real world lights have unlimited brightness: the human eye can perceive a contrast ratio of roughly 10^14:1 (14 orders of magnitude), but a video card using a standard 32-bit integer buffer is only capable of reproducing 255:1 (about 2.4 orders of magnitude), because it uses only 8 bits to store each video component (R, G, B and alpha). GeForce 7800 GTX, in order to increase video quality, uses a 128-bit register for HDR, with 32 bits for each video component. You can see the difference in Figure 5.
GeForce 7800 GTX uses its floating-point engine to produce HDR, rather than the integer hardware used by other video cards.
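The two contrast ratios mentioned above can be put on the same scale by taking the base-10 logarithm (a quick illustrative calculation of our own, not something from NVIDIA’s documentation):

```python
import math

# Dynamic range expressed in orders of magnitude (log10 of the
# contrast ratio), for the two cases discussed in the text.
eye = math.log10(1e14)   # human eye, roughly 10^14:1
int8 = math.log10(255)   # 8 bits per component, 255:1

print(f"eye: {eye:.1f}, 8-bit: {int8:.1f} orders of magnitude")
# → eye: 14.0, 8-bit: 2.4 orders of magnitude
```

This gap is why an 8-bit integer buffer clips bright highlights and crushes shadows, and why a floating-point HDR buffer helps.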
The best way to describe the new Transparency Anti-Aliasing mode is to compare the standard anti-aliasing with this new feature.
[nextpage title=”2D Image Quality Enhancements”]
Collectively called PureVideo by NVIDIA, the 2D image enhancements were added to improve 2D video quality, basically correcting interlacing and telecine artifacts.
Videos originally targeted at TVs are interlaced, because that’s the way TVs work. In interlacing, each video frame has only half of the total lines available. Video monitors used by computers use non-interlaced scanning (a.k.a. progressive scanning), which is capable of showing all lines available per frame, so when reproducing this kind of video on your computer, you can see it doesn’t have the best possible quality. So GeForce 7800 GTX has a de-interlacing engine that creates the missing lines of each video frame, thus improving 2D video quality.
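To illustrate the idea of creating the missing lines, here is a minimal sketch of line-interpolation (“bob”) de-interlacing, one of the simplest techniques a de-interlacing engine can use – the GPU’s actual algorithm is more sophisticated and not publicly documented, so treat this only as a conceptual model:

```python
# A field holds only the even (or odd) scanlines of a frame; the
# missing lines are reconstructed by averaging the lines above and
# below. Purely illustrative -- not NVIDIA's algorithm.

def deinterlace_bob(field):
    """field: list of scanlines, each a list of pixel values."""
    frame = []
    for i, line in enumerate(field):
        frame.append(line)
        nxt = field[i + 1] if i + 1 < len(field) else line
        # interpolate the missing scanline between two field lines
        frame.append([(a + b) / 2 for a, b in zip(line, nxt)])
    return frame

field = [[0, 0], [10, 10]]     # two scanlines from one field
print(deinterlace_bob(field))
# → [[0, 0], [5.0, 5.0], [10, 10], [10.0, 10.0]]
```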
When movies are converted to video, another problem arises. Movies are shot at 24 frames per second, while videos on TV should be played at 30 frames per second. So the movie must go through a process called telecine, which creates the six frames per second that are missing. Sometimes, however, this process is not done very well and you can see that the image quality isn’t optimal. GeForce 7800 GTX offers two inverse telecine features to correct this problem, called inverse 3:2 and 2:2 pull-down.
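The standard 3:2 pulldown cadence that telecine uses can be sketched in a few lines: every four film frames (24 fps) become ten video fields, i.e. five video frames (30 fps), by alternately holding each film frame for three fields and then two. Inverse telecine detects this repeating pattern and discards the duplicates to recover the original 24 fps frames.

```python
# Sketch of the 3:2 pulldown cadence (the pattern inverse telecine
# has to detect and undo). Illustrative only.

def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2   # alternate 3 fields, 2 fields
        fields.extend([frame] * repeats)
    return fields

film = ["A", "B", "C", "D"]   # four film frames (1/6 second at 24 fps)
fields = pulldown_32(film)
print(fields)
# → ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
print(len(fields) // 2)       # → 5 video frames (1/6 second at 30 fps)
```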
Read our tutorial Enabling 2D Enhancements on GeForce 6 and 7 Series to learn how to enable these features on your video card – if you have a GeForce 6 or 7 series, of course.