It came as no surprise when NVIDIA announced their new next-gen flagship card, the GTX780, last week, just a day after the announcement of the Xbox One. It seems the tech giant was silently lying in wait for both console manufacturers to put all their cards on the table before pouncing with their new hardware masterpiece.

The GTX780. Flagship build. Flagship specs. Flagship name.

Masterpiece it is. Built on the exact same PCB as the R13000 Titan, it is the same size as the Titan. The GTX780 also houses the exact same Kepler (GK110) chip that features on the Titan, and comes with the same superior aluminium shell boasting the same green GeForce GTX LED. One would almost be forgiven for buying a GTX780 and passing it off to their mates as a Titan – at 60% of the cost, why not?

Is it of Titan pedigree though?

Although the GTX780 looks the part, you always get what you pay for – at 60% of the Titan's cost, one would expect around 60% of the performance, right?

Wrong! On paper, the GTX780 has half the GDDR5 memory the Titan boasts. It is also a few hundred CUDA cores lighter and contains fewer texture units, resulting in a slightly lower texture fill rate. On the flipside, the GTX780 boasts a faster GPU clock and a newer version of GPU Boost, giving it a slight advantage when it comes to pixel-processing grunt.

Same same. We recommend buying a GTX780 and telling your mates it is a Titan.

So then, how much slower is the GTX780 than a Titan? In a word, marginally. In fact, according to Gamespot’s test results, the GTX780 outperformed the Titan in the majority of their game tests.

What’s even more impressive is that, on paper, the GTX780 is a staggering 50% better than the last-generation single-card flagship, the GTX680. In practice it benchmarks around 30% faster across the board. That is worrying news for users who have just forked out (a term used loosely) for an entry-level 650GT, because that card will be effectively redundant once its 7-series equivalent lands.

This card is definitely a winner. The only thing I am slightly disappointed about is that NVIDIA didn’t call it the Titan780, because it is every bit deserving of the name; although still expensive, it really is a Titan for the masses.

Tech Porn.

Interestingly though, the GTX690 still destroys both of these cards in a single-card shootout. Granted, that isn’t a fair comparison on a 1-vs-1 basis, because the GTX690 is really two GPUs in one, but once you factor in price, a GTX690 is still cheaper than two GTX780s and a no-brainer against the equally priced Titan.

Check out the full specification comparison below:

Spec                 GTX690           TITAN        GTX780
Shader Units         2x 1536          2688         2304
ROPs                 2x 32            48           48
Graphics Processor   2x GK104         GK110        GK110
Transistors          2x 3500M         7100M        7100M
Memory Size          2x 2048 MB       6144 MB      3072 MB
Memory Bus Width     2x 256-bit       384-bit      384-bit
Core Clock           915 MHz+         837 MHz+     863 MHz+
Memory Clock         1502 MHz         1502 MHz     1502 MHz
Price                R12000-R14000    R13000       R8000 (est.)
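As a quick sanity check on the value argument, here is a back-of-the-envelope Python sketch using the rand prices quoted above – with the GTX690 taken at the midpoint of its quoted range and the GTX780 at its R8000 estimate:

```python
# Back-of-the-envelope value check using the rand prices quoted above.
gtx690 = (12000 + 14000) / 2   # midpoint of the R12000-R14000 range
titan = 13000
gtx780 = 8000                  # estimated street price

two_780s = 2 * gtx780          # the SLI pair a GTX690 competes with

print(f"2x GTX780 = R{two_780s:.0f}, GTX690 = R{gtx690:.0f}")
print(gtx690 < two_780s)   # True: the GTX690 undercuts a GTX780 pair
print(gtx690 <= titan)     # True: and it matches or beats the Titan's price
```

Even at the top of its price range, one GTX690 is still R2000 cheaper than a pair of GTX780s.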

Improved software too?

Naturally, to accompany the improved GK110 chip that features in the GTX780, NVIDIA have introduced a whole new software package to complement the new hardware architecture. As mentioned, the revised GK110 chip features GPU Boost 2.0 technology, a big step up from the GPU Boost found on the GK104 chips: clock speeds are now boosted based on power consumption as well as heat. What’s more, users are now given control over how GPU Boost operates, based on frequency, power consumption and heat. For example, you can crank up the GPU Boost frequency at the expense of heat, or limit your boost clocks based on a maximum card temperature that you define – so if you don’t want your card to run hotter than, say, 65 degrees, you now have that ability…at the expense of speed, obviously.
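The boost behaviour described above can be sketched as a simple control rule. This is purely illustrative – it is not NVIDIA’s actual algorithm, and every clock value and threshold below is made up for the example:

```python
def boost_clock(base_mhz, max_boost_mhz, temp_c, power_w,
                temp_limit_c=80, power_limit_w=250):
    """Illustrative GPU Boost 2.0-style rule: boost only while BOTH the
    user-defined temperature target and the power target have headroom.
    All limits and clocks here are invented for the example."""
    if temp_c >= temp_limit_c or power_w >= power_limit_w:
        return base_mhz  # out of headroom: fall back to the base clock
    # Scale the boost by whichever budget (thermal or power) is tighter.
    temp_headroom = 1 - temp_c / temp_limit_c
    power_headroom = 1 - power_w / power_limit_w
    headroom = min(temp_headroom, power_headroom)
    return base_mhz + (max_boost_mhz - base_mhz) * headroom

# A user-defined 65-degree cap trades clock speed for lower temperatures:
print(boost_clock(863, 1000, temp_c=60, power_w=180, temp_limit_c=65))
```

Lowering `temp_limit_c` shrinks the thermal headroom, so the card boosts less aggressively – exactly the speed-for-temperature trade-off described above.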

The new software package, called NVIDIA GeForce Experience (GFE), ships with the latest NVIDIA driver package (320.18). GeForce Experience is a whole new software interface that doesn’t replace the NVIDIA control panel; rather, it integrates the existing control panel and a whole bunch of new features, including GPU Boost 2.0, into a larger graphics-management platform. It is a lot like Steam’s Big Picture. It includes a new automated settings-configuration utility for a number of ‘approved’ games, such as Borderlands 2, Battlefield 3 and many more, allowing you to quickly optimise each game’s graphics settings based on your current hardware setup, giving you maximum frames at the best possible settings. It works well for the most part, and is a great feature for console gamers porting to PC. For avid hardware junkies like myself, who like tweaking every possible setting to get the best possible gameplay and visual experience, it is a frivolous feature I will never use.

A very cool feature to look forward to down the line is ShadowPlay, a gameplay-capture system, like FRAPS, currently being built into GFE. It leverages the H.264 encoder built into Kepler (600- and 700-series) GPUs. NVIDIA have promised that, because this neat little feature takes advantage of the card’s native design spec, the performance cost of using it will be far lower than that of third-party recording software such as FRAPS. A performance reduction as little as 3% is what is currently on the table…although at the rate NVIDIA optimise software, only time will tell.

In summary:

So really, if you are looking to buy more than one GTX780, or if you are considering a Titan, don’t bother looking any further than a GTX690.

Yeah, after all is said and done, the GTX690 is STILL better.

Overall though, we are very impressed with the improvements made to the 7-series chipsets and cannot wait for the GTX770, GTX760 and, more importantly, the GTX790 to be revealed later this year. Here’s hoping the Titan pedigree runs all the way through the high-end 7-series cards…

 

Sources: Gamespot and Tom’s Hardware.