Tuesday, October 20, 2015

The NVIDIA GeForce GTX TITAN Z: True Power Unleashed

The CPU may be the brain of your PC, but when it comes to gaming, the graphics card is the beating heart that pumps pixels out of your obelisk of a tower and into your monitor. A graphics card consists of dedicated video memory and a graphics processing unit that handles all sorts of calculations, like mapping textures and rendering millions of polygons. The graphics card is, simply, the most vital component of your gaming PC. And while that holds whether you're planning a savvy middle-of-the-road build, a budget rig, or a 4K monster, the card we're looking at today aims squarely at the last of those.

A powerful architecture

In March 2014, NVIDIA announced the GeForce GTX Titan Z at its GPU Technology Conference. It was touted as the world's fastest graphics card thanks to its pair of full GK110 GPUs, but it came with an equally stunning price of $2,999. NVIDIA claimed it would be available by the end of April for gamers and CUDA developers to purchase, but the launch was pushed back slightly; the card finally went on sale at the very end of May at the promised price.

The specifications of the GTX Titan Z are damned impressive: 5,760 CUDA cores, 12GB of total graphics memory, and 8.1 TFLOPS of peak compute performance.

NVIDIA is adamant, though, that the primary target of the Titan Z is not just gamers but the CUDA developer who needs the most performance possible in as small a space as possible. For that specific user, one who doesn't quite have the budget to invest in a lot of Tesla hardware but wants to develop and run CUDA applications with a significant amount of horsepower, the Titan Z fits the bill perfectly.
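
For that developer crowd, it's worth noting that a dual-GPU board like the Titan Z shows up to software as two separate CUDA devices. Here's a minimal sketch using the standard CUDA runtime API that lists whatever devices are present (the output format is my own; nothing here is Titan Z specific):

// list_devices.cu - enumerate the CUDA devices the runtime can see.
// A dual-GPU card such as the Titan Z appears as two entries here.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        std::printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, %d SMs, %.1f GB VRAM\n", i, prop.name,
                    prop.multiProcessorCount,
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}

Workloads then have to be split across the two GPUs explicitly, with cudaSetDevice(), just as they would be across two physically separate cards.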


The GeForce GTX TITAN Z Graphics Card


The GeForce GTX Titan Z is an absolutely stunning looking graphics card. The industrial design that started with the GeForce GTX 690 (the last dual-GPU card NVIDIA released) and continued with the GTX 780 and Titan family lives on with the Titan Z.

The all-metal finish looks good and stands up to abuse, keeping the PCB straight even with the heft of the heatsink. There is only a single fan on the Titan Z, center-mounted, with a large heatsink covering both GPUs on opposite sides. The GeForce logo up top illuminates, as we have seen on all similar designs, which adds a nice touch.


The rear of the card has a very thick and strong back plate that serves both as a heatsink and as a design element. It protects the back-side components while adding strength to the unit for installation and even shipment.

Output options on the Titan Z are identical to other reference NVIDIA designs and include a pair of dual-link DVI connections, a full-size HDMI port, and a full-size DisplayPort. There is plenty of room for air exhaust as well, which is important for moving as much hot air out of the chassis as possible.

I do think it's odd that NVIDIA chose to continue with just a single DisplayPort connection, as this limits 4K panel support to one per Titan Z. If you were planning on picking up a set of three 4K monitors, you'd need one card per monitor!

With a whopping TDP of 375 watts, the NVIDIA GeForce GTX Titan Z requires a pair of 8-pin power connections. The R9 295X2 also required only a pair of 8-pin connections, but it drastically exceeded the recommended spec for PCIe power draw. This beast has a very big appetite!
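
As a quick sanity check on that figure: the PCIe slot is commonly rated for 75W and each 8-pin connector for 150W, which puts the in-spec budget at exactly the Titan Z's 375W TDP. A trivial sketch of the arithmetic (the per-connector limits are the commonly cited PCIe figures, not NVIDIA-published numbers):

// power_budget.cu - back-of-the-envelope PCIe power budget for the Titan Z.
#include <cstdio>

int main() {
    const int slot_watts = 75;       // PCIe x16 slot, commonly cited limit
    const int eight_pin_watts = 150; // per 8-pin PEG connector
    const int budget = slot_watts + 2 * eight_pin_watts;
    std::printf("In-spec power budget: %d W (Titan Z TDP: 375 W)\n", budget);
    return 0;
}
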
Without its clothes, the Titan Z shows just how complicated a technical design it takes to mash a pair of huge GK110 GPUs onto a single card. There's a PCIe bridge chip in the middle, power-management hardware resting between the two GPUs, and 12GB of GDDR5 memory running at 7.0 GHz effective.

NVIDIA GeForce GTX Titan Z Specifications and Performance Unveiled

We already reported several detailed specifications of the GeForce GTX Titan Z, but we'll do a quick round-up. Built with the same heart and DNA as NVIDIA's GeForce GTX Titan and Titan Black, the GeForce GTX Titan Z is an engineering marvel with two chips under its hood that pack 7.1 billion transistors each. The GeForce GTX Titan Z replaces the GeForce GTX 690, boasting dual GK110 cores compared to the dual GK104 cores of its predecessor. Across its two GK110 cores, the card features 5,760 CUDA cores, 480 TMUs, and 96 ROPs.
The card features two 384-bit buses running across a massive 12 GB of VRAM (6 GB per GPU). This is an impressive figure, giving developers and gamers an unprecedented amount of VRAM to work with. The memory is clocked at 7 GHz effective. The core clock speeds are maintained at 705 MHz base and 876 MHz boost, and the card delivers a maximum of 8.1 TFLOPS of single-precision and 2.3 TFLOPS of double-precision performance, which is impressive.
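
Those throughput figures fall straight out of the core count and clocks. Assuming the usual 2 FLOPs per CUDA core per cycle (one fused multiply-add), the quoted 8.1 TFLOPS corresponds to all 5,760 cores running at the 705 MHz base clock; here's that arithmetic as a quick sketch:

// peak_flops.cu - derive peak single-precision throughput from the specs.
#include <cstdio>

int main() {
    const double cuda_cores = 5760;    // total across both GK110 GPUs
    const double base_clock_ghz = 0.705;
    const double flops_per_cycle = 2;  // one FMA counts as 2 FLOPs
    const double tflops = cuda_cores * base_clock_ghz * flops_per_cycle / 1000.0;
    std::printf("Peak single-precision compute: %.1f TFLOPS\n", tflops); // ~8.1
    return 0;
}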

Benchmarking the Titan X’s performance

So why does the Titan X rock such a ridiculous amount of RAM? The massive 12GB frame buffer is frankly overkill for today’s games, but it helps future-proof one of the Titan X’s biggest strengths: Ultra-high-resolution gaming. Higher resolutions consume more memory, especially as you ramp up anti-aliasing to smooth out jagged edges even more.
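
To put rough numbers on that: a single 32-bit color buffer at 4K is only about 32MB, but engines juggle many render targets at once, and multisampling multiplies the cost of each. A hedged back-of-the-envelope sketch (the buffer counts are illustrative, not measured from any real game):

// framebuffer_math.cu - rough memory cost of render targets at 4K.
#include <cstdio>

int main() {
    const double width = 3840, height = 2160; // 4K UHD
    const double bytes_per_pixel = 4;         // 32-bit RGBA
    const double mb = width * height * bytes_per_pixel / (1024.0 * 1024.0);
    std::printf("One 4K color buffer:    %6.1f MB\n", mb);     // ~31.6 MB
    std::printf("Same buffer at 4x MSAA: %6.1f MB\n", mb * 4); // ~126.6 MB
    // A deferred renderer may keep five or more such targets plus depth;
    // high-resolution textures swallow most of what's left of the 12GB.
    return 0;
}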

The Titan X is the first video card that can play games at 4K resolution and high graphics settings without frame rates dropping down to slideshow-esque rates.

Not at ultra-high-level details, mind you—just high. And still not at 60 frames per second (fps) in many cases. But you’ll be able to play most games with acceptable smoothness, especially if you enable MFAA and have a G-Sync-compatible monitor.

A 4K-resolution screenshot of Metro: Last Light's benchmarking tool.

All tests were done on my test bench, consisting of the following components:

>Intel’s Core i7-5960X with a Corsair Hydro Series H100i closed-loop water cooler
>An Asus X99 Deluxe motherboard
>Corsair’s Vengeance LPX DDR4 memory, Obsidian 750D full-tower case, and 1200-watt AX1200i power supply
>A 480GB Intel 730 series SSD (I’m a sucker for that skull logo!)
>Windows 8.1 Pro


First up we have Middle-earth: Shadow of Mordor. Shadow of Mordor garnered numerous industry awards in 2014 for its remarkable Nemesis system—and with the optional Ultra HD Texture pack installed, it can give modern graphics cards a beating. The add-on isn’t even recommended for cards with less than 6GB of onboard RAM, though it'll still run on more memory-deprived cards. 

The game was tested using the Medium and High quality presets, then with the Ultra HD texture pack installed and every graphics option manually cranked to its highest setting (which Shadow of Mordor’s Ultra preset doesn’t actually do). You won’t find numbers for the dual-GPU Radeon R9 295X2 here, because every time I tried to change the game’s resolution or graphics settings when using AMD’s flagship, it promptly crashed the system, over and over and over again. Attempts to fix the problem proved fruitless.




Sniper Elite III was released in the middle of 2014. While it’s not the most graphically demanding game, it scales well across various resolutions, and it’s an AMD Gaming Evolved title, a counterpoint to Shadow of Mordor’s Nvidia-focused graphics. Plus, it’s always fun to snipe Nazis in the unmentionables in slow motion.





Next up: Sleeping Dogs: Definitive Edition. This recent remaster of the surprisingly excellent Sleeping Dogs actually puts a pretty severe hurting on graphics cards. Not even the highest-end single-GPU options can hit 60fps in Sleeping Dogs: Definitive Edition with detail settings cranked, at 4K or 2560x1600 resolution.



I also tested the systems using two off-the-shelf benchmarking tools: 3DMark’s Fire Strike and Unigine’s Valley. Both are synthetic tests, but both are well respected.

Finally, here’s the power usage and thermal information. Power usage is the total power consumed by the PC at the wall socket, measured with a Watts Up meter during a Furmark run.

Nvidia’s GeForce GTX Titan X: The final verdict

Nvidia was right: Single-GPU graphics cards don’t come more powerful than the Titan X. It’s no contest. The Titan X truly is the first solo GPU card capable of playing 4K games at reasonable detail settings and frame rates. And that ferocious power pushes even further if you’re playing with MFAA enabled, especially if you’re lucky enough to have a G-Sync monitor to match.
