Presenting the GeForce GTX Titan
A few days ago, Nvidia announced their fastest single-GPU graphics card, the GTX Titan. It also ties the GTX 690 as their most expensive card, with a price tag of ₹68,499.
In terms of on-paper specifications, the story goes like this: the Titan is based on the GK110 GPU, the successor to the GF110 found in the GTX 580. While "little" Kepler (the GK104/106/107 in the GTX 680 and lower) emphasised gaming performance at the expense of compute performance (particularly of the double-precision floating-point variety), GK110 aims to make up for that.
So you have 14 SMXes (a full GK110 has 15; one is disabled in the Titan) and 6 ROP partitions, giving the GTX Titan a total of 2688 single-precision CUDA cores, 896 double-precision CUDA cores, 224 texture units and 48 raster operation units (ROPs). Then there's 6GB of (effective) 6GHz GDDR5 memory mated to a 384-bit bus, giving an effective bandwidth of 288 GB/s. Core clocks are lower than the GTX 680's defaults, at 876 MHz.
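Those headline numbers all follow from the SMX count, so they're easy to sanity-check. Here's a quick sketch; note that the per-SMX figures (192 SP cores, 64 DP cores, 16 texture units) are standard GK110 numbers assumed here, not quoted in the text above:

```python
# Back-of-the-envelope check of the Titan's quoted specs.
# Per-SMX counts below are assumed GK110 figures.
SMX_COUNT = 14       # a full GK110 has 15; one is disabled in the Titan
SP_PER_SMX = 192     # single-precision CUDA cores per SMX
DP_PER_SMX = 64      # double-precision CUDA cores per SMX
TEX_PER_SMX = 16     # texture units per SMX

sp_cores = SMX_COUNT * SP_PER_SMX    # 2688
dp_cores = SMX_COUNT * DP_PER_SMX    # 896
tex_units = SMX_COUNT * TEX_PER_SMX  # 224

# Memory bandwidth: effective data rate (GHz) * bus width (bits) / 8 bits-per-byte
bandwidth_gb_s = 6.0 * 384 / 8       # 288.0 GB/s

print(sp_cores, dp_cores, tex_units, bandwidth_gb_s)
# → 2688 896 224 288.0
```

All four results line up with the quoted specs, which is reassuring given how often launch-day spec sheets get garbled.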
In case you recognise the specs above (I'd treat you to ice-cream if you did), you'd realise that they're identical to those of the Tesla K20X, thousands of which power the Cray Titan supercomputer launched a while back. Yes, that's where the name comes from.
Real-world gaming performance (as this is a GeForce card, after all) sits, on average, between the Radeon HD 7970 GE and Nvidia's own GTX 690. At worst, the Titan posts lower frame rates than the 7970 GE; at best, in rare cases, it meets or beats the 690.
So who's buying this overpriced luxury card? Who's letting Nvidia charge far more than the GPU probably costs to make (as they're doing with all Kepler cards, really)? Well, a few groups, unfortunately.
Premium small form-factor (SFF) gaming PC builders in the US and other countries are including it in their small-but-powerful-and-very-expensive PCs, since the card has manageable thermal, noise and size characteristics in the confined spaces they work in.
Then there's also the bunch of folks who can afford two of these monsters (Titans, I mean), which makes sense: two Titans in SLI scale far better than two GTX 690s for the same price. They'd want to drive three high-definition screens and use Nvidia's 3D Vision, I suppose. These are probably the same people who own a Core i7-3970X for gaming purposes, just saying.
The last category would perhaps be CUDA developers who need a cheap workstation card. Yes, I know I wrote "cheap" for a ₹68,499 card that actually costs $999 in the US, but considering the K20X is many times as expensive as the Titan, it's not as bad as it looks, especially if you're going to turn a profit on this investment.
That wraps up the Titan very briefly, but the launch tells us two things:
1. Nvidia needs to seriously catch up to AMD's OpenCL and compute performance with next year's Maxwell cards.
2. AMD needs to seriously work on a card that beats the Titan at half the price. They're close with the 7970 GE (which is faster than the GTX 680 anyway), but they have to exceed it to bring Nvidia's pricing back down to earth.