Today we'll take a first look at NVIDIA's flagship GPU of the current generation, the mighty GeForce GTX Titan. The review includes tests in a solid number of games and a comparison with other top graphics adapters.

GeForce GTX Titan is made for enthusiasts who will use it to build multi-GPU rigs, especially as the new card enables 3-way SLI in a standard enclosure. The Titan even fits into small form factor (SFF) gaming PCs.

The new graphics adapter is based on the highly anticipated NVIDIA GK110 chip, which was expected to become NVIDIA's flagship GPU. However, the company believed that its mid-to-upper class GK104 would be enough to challenge AMD. And it was: GeForce GTX 660 Ti, GTX 670, and GTX 680 were on a par with Radeons.

Later, AMD released its fastest single-chip solution, Radeon HD 7970 GHz Edition, so NVIDIA was finally forced to make the move. GK110 came to the gaming hardware market right from the Tesla segment.

GK110 GPU specifications

Codenamed GK110

28nm process technology

7.1 billion transistors

Unified architecture with an array of processors for stream processing of vertices, pixels, etc.

Hardware support for the DirectX 11 API, including Shader Model 5.0, geometry and compute shaders, as well as tessellation

384-bit memory bus, 6 independent 64-bit controllers, support for GDDR5

876 MHz average boost clock rate (the standard is 836 MHz)

15 Stream Multiprocessors, including 2880 scalar ALUs for FP32 and 960 scalar ALUs for FP64 computing (according to the IEEE 754-2008 standard)

240 texture addressing and filtering units supporting FP16 and FP32 precision in textures, as well as trilinear and anisotropic filtering for all texture formats

6 wide ROPs (48 pixels) supporting antialiasing up to 32x, also with FP16 and FP32 frame buffers; each unit features an array of configurable ALUs and handles Z generation and comparison, MSAA, and blending

Integrated support for RAMDAC, 2 x Dual Link DVI, HDMI, and DisplayPort

Integrated support for 4 simultaneous displays (2 x Dual Link DVI, HDMI 1.4a, DisplayPort 1.2)

Support for PCI Express 3.0

GeForce GTX Titan reference specifications

876MHz average boost clock rate (the standard is 836 MHz)

2688 universal processors

224 TMUs, 48 blending units

6008 (4 x 1502) MHz effective memory clock rate

GDDR5 memory, 384-bit bus

6GB VRAM

288.4 GB/s memory bandwidth

4.5/1.3 TFLOPS calculating performance (FP32/FP64)

40.1 Gpixel/s theoretical peak fillrate

187.3 Gtexel/s theoretical texture fetch

2 x Dual Link DVI-I, Mini HDMI, DisplayPort 1.2

PCI Express 3.0

Up to 250W power consumption (8-pin and 6-pin power connectors)

Dual-slot solution

MSRP of $999
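The headline numbers in this spec list follow from a few simple multiplications. Here is a quick sanity check in Python; the formulas are the standard theoretical-peak calculations (not NVIDIA's official methodology), and the 2 FLOPs per core per clock assumes fused multiply-add at the 836 MHz base clock:

```python
# Sanity check of the GTX Titan reference specs above (base clock assumed,
# since the quoted peaks match 836 MHz rather than the boost clock).

base_clock_mhz = 836        # base core clock
cuda_cores = 2688           # universal (FP32) processors
tmus = 224                  # texture units
rops = 48                   # raster output (blending) units
mem_effective_mhz = 6008    # GDDR5 effective clock (4 x 1502)
bus_width_bits = 384

# Memory bandwidth: effective clock x bus width in bytes
bandwidth_gb_s = mem_effective_mhz * 1e6 * (bus_width_bits // 8) / 1e9

# FP32 throughput: 2 FLOPs per core per clock (fused multiply-add)
fp32_tflops = cuda_cores * 2 * base_clock_mhz * 1e6 / 1e12

# Pixel fillrate and texture fetch rate: units x clock
fillrate_gpix = rops * base_clock_mhz / 1e3
texrate_gtex = tmus * base_clock_mhz / 1e3

print(f"{bandwidth_gb_s:.1f} GB/s, {fp32_tflops:.1f} TFLOPS, "
      f"{fillrate_gpix:.1f} Gpixel/s, {texrate_gtex:.1f} Gtexel/s")
# 288.4 GB/s, 4.5 TFLOPS, 40.1 Gpixel/s, 187.3 Gtexel/s
```

All four results match the table, which suggests NVIDIA quotes its peaks at the base clock.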

NVIDIA's flagship video adapter received a name instead of a numerical index. From a marketing standpoint this looks reasonable: the GPU is used in the world's fastest supercomputer, Titan (as well as in NVIDIA Tesla K20X), so the name Titan will surely be associated with high-end hardware.

GeForce GTX Titan doesn't replace any of NVIDIA's video cards. The other top solution, GTX 690, is made for quite different use: to get maximum fps ignoring the drawbacks of multi-chip AFR-rendering. GeForce GTX Titan is more multipurpose: it has larger memory size with fast access time, winning at high resolutions. Moreover, GTX Titan is smaller, more silent and less power-hungry, thus fitting more types of configurations. Finally, it offers extra functionality: GPU Boost 2.0 and full-speed double-precision computation.

As for AMD, it simply offers no counterpart as yet: in the dual-GPU segment Radeon HD 7990 was only recently introduced, and in single-GPU solutions the company doesn't even seem to try. There are a number of third-party solutions, e.g. ASUS Ares II, but that is a real dual-GPU monster with no fewer than three 8-pin supplementary power connectors.

Thanks to its 384-bit memory bus, GTX Titan can have 3 or 6 GB of memory. No surprise NVIDIA chose 6 GB, thus beating all the competitors. As a result, in any demanding applications at any quality and antialiasing level GTX Titan won't suffer from the lack of memory.

Design

The length of GeForce GTX Titan is 10.5" (267 mm); it requires supplementary power through a single 6-pin and a single 8-pin power connector. The set of outputs includes 2 x Dual-Link DVI, HDMI and DisplayPort.

GeForce GTX Titan looks similar to GeForce GTX 690. The PCB has an aluminum cover with a plastic window, which gives a view of the vapor chamber and the dual-slot aluminum heatsink.

The GeForce GTX logo on the card's edge features an LED that indicates the graphics card is powered on. Its intensity can be tuned manually with partner software, either kept at a constant level or varied according to the adapter's load.

The graphics card boasts an advanced VRM. It includes 6 phases with overvoltage protection for the core, and 2 phases for memory (in a supplementary circuit).

Cooling system

For its new flagship NVIDIA developed an advanced cooling system. It boasts the excellent heat dissipation of a vapor chamber, larger heatsink fins, and an improved fan management system: you can now change the starting voltage and rotation speed to find the best balance between noise and temperature.

The copper vapor chamber works as a heatpipe, but provides better effectiveness. Also, GTX Titan utilizes a new thermal interface material by Shin-Etsu, two times better than that of GTX 680. Thanks to these improvements, the cooling system can handle higher clock rates.

The heat from the vapor chamber goes to a big dual-slot heatsink made of aluminum alloy, which has longer fins than in the GTX 6xx adapters, enlarging the dissipation area and increasing effectiveness. The graphics card also boasts an aluminum plate on the back for extra cooling. To decrease the noise level, the fan is made of the same dampening material as in GTX 680.

To give a better picture, NVIDIA measured the noise levels of three multi-GPU configurations in the Unigine Heaven benchmark (1920x1080, full-screen anti-aliasing, maximum tessellation):

As you can see, in spite of being more powerful, the new graphics adapter delivers excellent noise levels: three GTX Titan cards are quieter than three Radeon HD 7970 or three GeForce GTX 680 adapters. If the numbers are correct, the difference between AMD's and NVIDIA's top products is enormous!

Architecture

The new graphics adapter is based on the world's fastest GPU, GK110. It was first used by Oak Ridge National Laboratory in their Titan supercomputer, powered by 18,688 NVIDIA Tesla K20X professional GPUs. In November 2012, it showed a record performance of 17.59 petaflops in double-precision computations (in the Linpack benchmark).

The top-end GK110 has all the features of GK104 (used in GTX 680) and all the peculiarities of the Kepler architecture (including SMX streaming multiprocessors). GK110 features 5 graphics processing clusters (GPCs), each of which includes 3 SMX units. Curiously, GK110's layout uses odd quantities, unlike GK104's (4 GPCs x 2 SMX). Here's the diagram of the chip:

As in GK104, each SMX in GK110 has a PolyMorph Engine, 64 KB of shared memory, 48 KB of texture cache, 16 texture filtering units (240 physical TMUs per chip in total), and 192 ALUs for FP32 (green squares). But besides that, you can also see 64 orange squares in each SMX: these are ALUs for FP64, i.e. double-precision computing.

The memory subsystem includes six 64-bit channels, forming the 384-bit bus. The number of ROP blocks is tied to the memory controllers; therefore, GK110 features 48 of them. The total L2 cache size is 1.5 MB.

As we mentioned earlier, GTX Titan boasts an impressive 6 GB of memory. Users often asked NVIDIA for the increase, and they expectedly got it in the top GPU. 6 GB will improve performance at high resolutions on multiple displays, as well as in GPGPU tasks. Besides, don't forget that the next generation of consoles is coming: GTX Titan has a solid reply to the 8 GB of GDDR5 in the PlayStation 4.

The memory clock rate of GTX Titan is 6008 MHz, the same as in lower-end models. But thanks to the wider bus, the memory bandwidth grows considerably, to 288.4 GB/s.

GPU Boost 2.0

GeForce GTX Titan supports the new version of the GPU Boost technology, which automatically controls the GPU clock rate, depending on power consumption, to achieve maximum performance. The base core clock rate of GTX Titan is 836 MHz; the average boost clock rate is not much higher at 876 MHz. The real clock rate in games can climb higher, of course, but that is not the point of GPU Boost 2.0: it aims at maximum performance within the actual limits of power consumption and heat dissipation.

The second version of GPU Boost builds on NVIDIA's observation that at low temperatures the GPU's performance isn't always limited by its power consumption. Therefore, in GPU Boost 2.0 the growth of the clock rate is limited by the GPU's operating temperature, with the target fixed at 80°C.

The core clock rate increases automatically until the temperature reaches 80°C, and the GPU then holds that level by adjusting clock rate and voltage. As a result, the whole adapter operates more quietly: with a fixed temperature target, the cooling system is easier to control, and the fan speed changes less.

With the help of the partners' software users can alter the temperature threshold. For example, if an enthusiast wants to increase the performance, he can add 5°C to the default 80°C, thus expanding the voltage and clock rate limits.

In GPU Boost 2.0 the power target level doesn't represent the average consumption but the maximum value for the graphics card. If you set the target to 100%, the maximum consumption will be limited to 250 W; if you choose 106%, this value rises to 265 W. The typical consumption will vary with the environment.
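The power-target arithmetic described above is simply a percentage of the card's 250 W board power limit. A minimal sketch (our illustration, not NVIDIA's code):

```python
# GPU Boost 2.0 power target: a percentage of the 250 W board power limit.
TDP_W = 250  # GTX Titan maximum board power

def power_limit(target_percent):
    """Maximum allowed power draw, in watts, for a given power target."""
    return TDP_W * target_percent / 100

print(power_limit(100))  # 250.0 -> the default limit
print(power_limit(106))  # 265.0 -> the raised limit quoted above
```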

GPU Boost 2.0 suits liquid cooling much better than GPU Boost 1.0. A liquid cooling system can decrease the temperature considerably; therefore, the card can use higher voltage and clock rates before it reaches the threshold.

The temperature priority gives another bonus: in GPU Boost 2.0 users can manage overvoltage, which helps a lot in achieving higher clock rates. By default, the GPU voltage is limited by NVIDIA to prevent irreversible die damage. Naturally, overvoltage is off by default, and NVIDIA shows a strict warning about the risks before you can enable the feature.

In some cases, increasing the clock rate threshold doesn't bring extra performance, due to the lack of power. Here the overvoltage will surely help, especially if you have an extreme cooling system.

Another interesting feature of GPU Boost 2.0 is "screen overclocking". As you know, VSync (vertical synchronization) saves you from certain artifacts in games (e.g. screen tearing). But when VSync is on, the frame rate is limited by the display's refresh rate (usually 60 Hz today), even though the GPU can render faster. For example, if the GPU is able to provide 90 fps with VSync on, the display will show just 60 fps. But if you increase the refresh rate to 80 Hz, the GPU will be able to operate at 80 fps as well, providing the same level of quality.

GPU Boost 2.0 helps to solve this problem. With the help of utilities from video adapter manufacturers, users can increase the pixel clock rate of their displays, achieving a higher refresh rate and, therefore, a higher frame rate. Note that only select displays support this feature.
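The frame-rate ceiling VSync imposes, and what raising the refresh rate buys, can be summarized in one line (an illustration of the behavior described above, not NVIDIA's code):

```python
# With VSync on, the displayed frame rate is capped at the panel's
# refresh rate; with VSync off, the GPU's render rate passes through.
def displayed_fps(render_fps, refresh_hz, vsync=True):
    return min(render_fps, refresh_hz) if vsync else render_fps

print(displayed_fps(90, 60))  # 60 -> capped by a stock 60 Hz panel
print(displayed_fps(90, 80))  # 80 -> the same panel "overclocked" to 80 Hz
```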

Double-precision computing

GTX Titan is the first GeForce gaming graphics card to support full-speed double-precision computing with no software limitations. Earlier, all GeForce adapters were very slow at DP computing: each SMX of GK104 includes 192 units for single-precision and only 8 for double-precision computing. Now each SMX of GTX Titan includes 64 units for DP computing (alongside the same 192 for SP). Therefore, the speed of double-precision computing is 1/3 of single-precision, considerably more than before.
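The per-SMX ALU counts above directly give the DP:SP throughput ratios quoted in this section, which a couple of lines of Python can confirm:

```python
# DP:SP throughput ratio follows from the per-SMX ALU counts in the text.
from fractions import Fraction

SP_ALUS_PER_SMX = 192  # FP32 units per SMX (both GK104 and GK110)

print(Fraction(64, SP_ALUS_PER_SMX))  # GK110 with DP unlocked -> 1/3
print(Fraction(8, SP_ALUS_PER_SMX))   # GK104 (GTX 680)        -> 1/24
```

The 1/24 figure is also the Titan's default software-limited mode, matching the GTX 680.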

Fortunately, NVIDIA hasn't restricted DP performance in the GK110 GPU despite the potential competition between the Tesla and Titan series. This is a real gift to developers, engineers, and students: they can now afford a considerably cheaper solution that provides over 1 TFLOPS of DP performance. Development for NVIDIA processors becomes much more affordable, spreading the GPGPU computing NVIDIA is striving for.

By default, the speed of DP computing is restricted to 1/24 of SP computing, like in GeForce GTX 680. To turn the full speed on, go to NVIDIA Control Panel, choose 'Manage 3D Settings', find 'CUDA—double precision' and check the box near 'GeForce GTX Titan'. The changes will be applied and saved in the BIOS immediately.

Why not enable full-speed DP computing permanently? Because in that case GK110 would consume more power and wouldn't be able to operate at its 'gaming' clock rates. When full-speed DP computing is on, lower clock rates and voltages are applied. Double-precision computing isn't used in games, so if you don't run GPGPU applications, there is no need to turn it on.

Now let's proceed to testing the actual graphics card we have.

Gigabyte GeForce GTX Titan

Model name: GV-NTITAN-6GD-B

GPU: GTX Titan (GK110)

Interface: PCIe 3.0 x16

GPU clock rate (ROPs/shaders): 837-876 MHz (as the standard)

Memory clock rate, physical (effective): 1502 (6008) MHz (as the standard)

Memory bus: 384-bit

Stream processors: 2688

Texture units: 224 (BLF/TLF/ANIS)

ROP units: 48

Dimensions: 270x100 mm, 2 slots

PCB color: black

Peak power consumption (3D/2D/standby): 262/78/74 W

Output connectors: Dual-Link DVI-I, Single-Link DVI-D, HDMI 1.4a, DisplayPort 1.2

Multi-GPU mode: SLI (hardware)

Comparison with the reference design, front view

Gigabyte GeForce GTX Titan Reference NVIDIA GeForce GTX 680

Comparison with the reference design, rear view

Gigabyte GeForce GTX Titan Reference NVIDIA GeForce GTX 680

Gigabyte GeForce Titan is no bigger than traditional top graphics adapters, occupying two slots instead of the expected three. The PCB is different though, due to the different memory bus and chip count (24 memory chips on the 384-bit bus of GeForce Titan vs. 8 chips on the 256-bit bus of GeForce GTX 680). The digital VRM, based on the On Semiconductor NCP4206 controller, boasts 6 power phases for the core and 2 for the memory chips. The set of outputs enables simultaneous connection of up to 4 displays with Full HD resolution (similar to the GTX 6xx family).

Gigabyte GeForce Titan has two connectors for supplementary power: one 6-pin and one 8-pin.

Maximum resolutions and frequencies:

240 Hz maximum refresh rate

2048 x 1536 @ 85 Hz max. analog (VGA) resolution

2560 x 1600 @ 60 Hz max. digital (DVI) resolution (for Dual Link DVI-D)

Coolers

The graphics card features a closed cooling system with a fan at one end. The core heatsink features a vapor chamber.

Though the fan blades are optimized for lower noise, the graphics card does make some, and the maximum fan speed is nearly 2300 rpm.

The power transistors and memory chips on the front boast their own cooling plate. The memory chips on the back have no cooling at all. However, there is no need for extra cooling when the memory operates at the nominal clock rate.

Temperature monitoring results

We measured temperatures using the EVGA PrecisionX utility.

After 6 hours under maximum load, the top core temperature was 82°C—very good for a top-class graphics adapter.

Package contents

A basic package should include a user manual and a software CD.

Gigabyte GeForce GTX Titan ships with a basic package, 6-pin and 8-pin power splitters, an HDMI cable, and a DVI-to-VGA adapter. The bonus includes a giant mouse pad and... a deck of playing cards! Nice move.

Box

Performance tests

Testbed:

2 x Intel Core i7-3960X (o/c to 4 GHz) CPU

Corsair Hydro Series H100i Extreme Performance CPU cooler

Intel Thermal Solution RTS2011LC cooler

ASUS Sabertooth X79 motherboard on the Intel X79 chipset

MSI X79A-GD45(8D) motherboard on the Intel X79 chipset

16GB of 1600MHz Corsair Vengeance CMZ16GX3M4A1600C9 DDR3 SDRAM

Seagate Barracuda 7200.14 3TB SATA2 HDD

WD Caviar Blue WD10EZEX 1TB SATA2 HDD

2 x Corsair Neutron CSSD-N120GB3-BK SSD

2 x 1200W Corsair CMPSU-1200AXEU PSU

Corsair Obsidian 800D Full-Tower case

Windows 7 Ultimate 64-bit, DirectX 11

30" Dell UltraSharp U3011 display

VSync disabled

AMD Catalyst 13.2 Beta 7; NVIDIA drivers 314.09

Benchmarks:

Hard Reset — DirectX 11.0, built-in benchmark, maximum quality settings.

3DMark (2013) (FutureMark) — DirectX 11.0, FireStrike, Performance settings.

Aliens vs. Predator (Rebellion/SEGA) — DirectX 11.0, Very High settings, run from in game.

Nexuiz (2012) (IllFonic/THQ) — DirectX 11.0, built-in benchmark, maximum quality settings.

Crysis 2 Maximum Edition (Crytek/EA) — DirectX 11.0, Very High settings, Central Park level, launched with the Adrenaline Crysis 2 Benchmark Tool.

DiRT: Showdown (Codemasters) — DirectX 11.0, Ultra High settings, launched as follows: dirt showdown.exe -benchmark example_benchmark.xml.

Heaven Benchmark 2.0 (Unigine) — DirectX 11.0, High settings.

Valley Benchmark (Unigine) — DirectX 11.0, Maximum settings

Metro 2033 (4A Games/THQ) — DirectX 11.0, Super High settings, PhysX disabled, run from in game.

Total War: Shogun 2 (Creative Assembly/SEGA) — DirectX 11.0, maximum quality settings.

Sleeping Dogs (United Front Games/Square Enix) — DirectX 11.0, built-in benchmark, maximum quality settings.

Under each gaming benchmark you'll find a table where the difference between graphics adapters is expressed as a percentage relative to our hero. Positive numbers show the Titan's advantage; negative ones show the advantage of the other card.
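Our reading of this methodology, sketched in Python (the fps values are hypothetical, purely for illustration):

```python
# Percent difference relative to the GTX Titan: positive means the
# Titan is that many percent faster than the rival card.
def titan_advantage_percent(titan_fps, rival_fps):
    return (titan_fps / rival_fps - 1) * 100

print(round(titan_advantage_percent(60, 50), 1))  # 20.0  -> Titan 20% ahead
print(round(titan_advantage_percent(45, 60), 1))  # -25.0 -> rival 25% ahead
```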

Total War: Shogun 2

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -21.2 | -29.4 | -33.5 | -45.5
ASUS ARES II | + | -21.2 | -25.9 | -28.1 | -21.6
HD 7970 GHz Edition | - | -2.2 | 2.0 | 2.8 | 3.3
HD 7970 GHz Edition | + | 30.8 | 38.9 | 38.7 | 51.4
GTX 690 | - | -19.4 | -27.0 | -28.2 | -33.5
GTX 690 | + | -33.0 | -33.9 | -29.9 | -34.6
GTX 680 4GB | - | 5.2 | 8.0 | 9.9 | 14.8
GTX 680 4GB | + | 15.3 | 17.2 | 19.3 | 16.8

Hard Reset

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -0.9 | -4.9 | 0.0 | -8.0
ASUS ARES II | + | -15.8 | -15.3 | -31.8 | -24.2
HD 7970 GHz Edition | - | -3.7 | -2.2 | 2.0 | -4.3
HD 7970 GHz Edition | + | 13.3 | 21.7 | 8.6 | 52.9
GTX 690 | - | 8.0 | 11.5 | 0.0 | -3.2
GTX 690 | + | -5.8 | -6.0 | -21.0 | -28.8
GTX 680 4GB | - | 0.8 | 0.5 | 2.0 | 1.4
GTX 680 4GB | + | 1.4 | 2.1 | 2.0 | 15.5

Unigine Heaven Benchmark DirectX 11

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -26.1 | -27.8 | -29.0 | -28.3
ASUS ARES II | + | -31.7 | -32.3 | -33.8 | -33.4
HD 7970 GHz Edition | - | 51.6 | 39.2 | 42.2 | 54.5
HD 7970 GHz Edition | + | 32.8 | 40.3 | 29.3 | 29.6
GTX 690 | - | -27.6 | -24.2 | -24.9 | -16.9
GTX 690 | + | -26.6 | -28.0 | -29.1 | -22.7
GTX 680 4GB | - | 16.4 | 22.7 | 26.2 | 33.9
GTX 680 4GB | + | 17.4 | 19.1 | 26.3 | 38.1

Aliens vs. Predator DirectX 11

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -46.7 | -47.0 | -47.3 | -46.4
ASUS ARES II | + | -49.0 | -49.1 | -49.7 | -49.4
HD 7970 GHz Edition | - | 6.3 | 1.8 | 7.4 | 7.1
HD 7970 GHz Edition | + | 0.5 | 1.3 | 0.5 | -0.2
GTX 690 | - | -29.2 | -28.3 | -29.1 | -30.9
GTX 690 | + | -21.1 | -21.7 | -27.9 | -30.3
GTX 680 4GB | - | 14.4 | 9.1 | 9.4 | 20.8
GTX 680 4GB | + | 31.9 | 21.9 | 30.2 | 26.2

Nexuiz (2012)

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | 6.5 | -6.2 | -20.1 | -34.5
ASUS ARES II | + | 0.2 | -8.3 | -30.4 | -36.9
HD 7970 GHz Edition | - | 11.1 | 10.5 | 14.7 | 29.6
HD 7970 GHz Edition | + | 14.9 | 37.3 | 21.6 | 26.0
GTX 690 | - | 3.1 | -6.5 | -23.7 | -24.8
GTX 690 | + | -8.8 | -14.1 | -31.7 | -30.5
GTX 680 4GB | - | 28.6 | 30.8 | 26.2 | 26.7
GTX 680 4GB | + | 33.9 | 55.0 | 31.5 | 30.0

Unigine Valley Benchmark DirectX 11

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -11.3 | -22.9 | -29.8 | -35.3
ASUS ARES II | + | -24.4 | -33.1 | -34.2 | -31.2
HD 7970 GHz Edition | - | 27.3 | 26.7 | 27.9 | 28.8
HD 7970 GHz Edition | + | 31.6 | 31.8 | 31.3 | 31.8
GTX 690 | - | -10.9 | -21.9 | -30.6 | -27.4
GTX 690 | + | -20.6 | -25.8 | -27.5 | -20.1
GTX 680 4GB | - | 12.0 | 15.7 | 18.0 | 18.6
GTX 680 4GB | + | 19.7 | 22.9 | 24.0 | 26.6

3DMark FireStrike (2013)

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -32.0 | -35.1 | -40.4 | -42.6
ASUS ARES II | + | -31.2 | -32.5 | -37.2 | -39.5
HD 7970 GHz Edition | - | 34.7 | 31.2 | 21.1 | 17.2
HD 7970 GHz Edition | + | 38.8 | 36.8 | 28.0 | 24.3
GTX 690 | - | -10.6 | -18.7 | -16.9 | -17.8
GTX 690 | + | -7.7 | -12.0 | -9.4 | -21.4
GTX 680 4GB | - | 29.5 | 28.2 | 25.7 | 25.9
GTX 680 4GB | + | 32.5 | 31.9 | 29.9 | 30.4

DiRT: Showdown

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -11.4 | -21.2 | -35.9 | -50.1
ASUS ARES II | + | -22.1 | -32.7 | -38.9 | -48.4
HD 7970 GHz Edition | - | 57.9 | 45.3 | 27.9 | -2.1
HD 7970 GHz Edition | + | 34.5 | 29.6 | 21.5 | 1.5
GTX 690 | - | 1.7 | -3.8 | 5.6 | -11.4
GTX 690 | + | -13.7 | -10.3 | 0.0 | -7.7
GTX 680 4GB | - | 70.4 | 63.3 | 79.7 | 48.8
GTX 680 4GB | + | 55.9 | 50.6 | 68.8 | 53.0

Metro 2033

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -4.3 | -12.7 | -24.7 | -37.3
ASUS ARES II | + | -14.5 | -27.7 | -34.5 | -41.3
HD 7970 GHz Edition | - | 12.4 | 22.5 | 17.4 | 20.1
HD 7970 GHz Edition | + | 21.3 | 18.0 | 10.6 | 9.8
GTX 690 | - | -14.4 | -13.5 | -22.2 | -22.2
GTX 690 | + | -17.9 | -21.1 | -20.9 | -13.7
GTX 680 4GB | - | 9.0 | 18.0 | 13.2 | 19.5
GTX 680 4GB | + | 40.9 | 23.0 | 22.4 | 22.8

Sleeping Dogs

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -6.3 | -10.4 | -12.9 | -27.6
ASUS ARES II | + | -0.7 | -8.7 | -16.2 | -29.9
HD 7970 GHz Edition | - | -2.6 | 3.1 | 15.4 | 14.4
HD 7970 GHz Edition | + | 31.1 | 30.9 | 37.2 | 31.2
GTX 690 | - | 0.5 | 5.0 | -3.4 | -9.1
GTX 690 | + | 9.0 | 1.9 | -7.6 | -15.8
GTX 680 4GB | - | -0.4 | 8.4 | 3.6 | 7.4
GTX 680 4GB | + | 14.4 | 3.4 | 3.8 | 12.2

Crysis 2

Compared with | AA+AF | 1280x1024 | 1680x1050 | 1920x1200 | 2560x1600
ASUS ARES II | - | -17.3 | -22.4 | -26.4 | -30.6
ASUS ARES II | + | -16.5 | -20.5 | -21.8 | -25.0
HD 7970 GHz Edition | - | 39.2 | 46.5 | 43.1 | 41.3
HD 7970 GHz Edition | + | 38.5 | 48.3 | 43.1 | 36.4
GTX 690 | - | -11.6 | -27.1 | -29.3 | -26.7
GTX 690 | + | -11.2 | -28.6 | -31.8 | -29.1
GTX 680 4GB | - | 40.7 | 32.2 | 20.7 | 26.8
GTX 680 4GB | + | 29.0 | 16.5 | 19.3 | 19.2

Final thoughts

The capability rating compares each graphics card with GeForce GT 630, which is considered a baseline, or 100%. The rating is based on both synthetic and gaming results and shows, as the name implies, what a product is capable of.

The usability rating is obtained by dividing each card's capability rating by its price. It basically shows whether a given product is over or underpriced, considering what it can do, and thus how reasonable it is to buy it.
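The rating arithmetic above can be checked against the published tables; the figures match capability divided by price, scaled by 100 (our inference from the numbers, not a stated formula):

```python
# Usability rating: capability per dollar, scaled by 100 to match the
# published figures (prices in USD, capability relative to GeForce GT 630).
def usability(capability_rating, price_usd):
    return round(capability_rating / price_usd * 100)

print(usability(810, 485))    # 167 -> HD 7970 GHz Edition
print(usability(1040, 1114))  # 93  -> GeForce GTX Titan
```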

The complete ratings can be found in our i3Dspeed. Below are just the parts relevant to today's review.

Sorted by capability (high to low)

# | Card | Capability rating | Usability rating | Approx. price, USD
01 | 2 x ASUS Ares II (2 x Radeon HD 7970 GHz, 2x3072MB), 1100/1100/6600 | 1730 | 62 | 2800
02 | ASUS Ares II (2 x Radeon HD 7970 GHz, 2x3072MB), 1100/1100/6600 | 1350 | 96 | 1400
03 | 2 x GTX 680, 2x2048MB, 1000-1100/6000 | 1300 | 121 | 1076
04 | GTX 690, 2x2048MB, 914-1014/6000 | 1240 | 122 | 1018
05 | 2 x HD 7970, 2x3072MB, 925/925/5500 | 1230 | 130 | 946
06 | Titan, 6144MB, 837-876/6000 | 1040 | 93 | 1114
08 | GTX 680, 4096MB, 1071-1200/6000 | 840 | 138 | 609
09 | HD 7970 GHz, 3072MB, 1050/1050/6000 | 810 | 167 | 485
10 | GTX 680, 2048MB, 1000-1100/6000 | 780 | 145 | 538

Concerning price, NVIDIA GeForce Titan climbs to the level of the dual-GPU 'giants'. As for performance, the flagship sits between two Radeon HD 7970 cards working in CrossFire mode and NVIDIA's previous single-GPU leader, GeForce GTX 680. So yes, Titan is the fastest single-GPU graphics card today. However, while priced like dual-GPU solutions, it trails them noticeably in performance.

Sorted by usability (high to low)

# | Card | Usability rating | Capability rating | Approx. price, USD
25 | HD 7970 GHz, 3072MB, 1050/1050/6000 | 167 | 810 | 485
29 | GTX 680, 2048MB, 1000-1100/6000 | 145 | 780 | 538
30 | GTX 680, 4096MB, 1071-1200/6000 | 138 | 840 | 609
31 | 2 x HD 7970, 2x3072MB, 925/925/5500 | 130 | 1230 | 946
32 | GTX 690, 2x2048MB, 914-1014/6000 | 122 | 1240 | 1018
33 | 2 x GTX 680, 2x2048MB, 1000-1100/6000 | 121 | 1300 | 1076
34 | ASUS Ares II (2 x Radeon HD 7970 GHz, 2x3072MB), 1100/1100/6600 | 96 | 1350 | 1400
35 | Titan, 6144MB, 837-876/6000 | 93 | 1040 | 1114
36 | 2 x ASUS Ares II (2 x Radeon HD 7970 GHz, 2x3072MB), 1100/1100/6600 | 62 | 1730 | 2800

Among the top graphics adapters in our usability chart, GeForce GTX Titan only beats a tandem of ASUS Ares II cards (four Radeon HD 7970 GHz Edition GPUs in total). Even a single ASUS Ares II delivers much more performance for its price. And when Radeon HD 7990, expected to be slightly less powerful than ASUS Ares II, is out, Titan will drop even further down the charts due to its excessive price. Finally, according to our charts, GeForce GTX 690 remains NVIDIA's fastest graphics card, if we don't consider the number of GPUs. Still, GeForce GTX Titan is said to have good overclocking headroom, so tuning it seems promising to a certain degree.

Gigabyte GeForce GTX Titan copies the reference design, because NVIDIA supplies the GK110 chip only in ready-made cards. Titan occupies 2 slots and can be used in 3-way SLI configurations in standard PC cases, as opposed to the 3-slot ASUS Ares II. Just bear in mind that it's obviously not completely silent. Oh, and those playing cards seem like a nice touch.
