
Over the past few months we've been examining various aspects of the GPU market, and how gamers waiting for prices to calm down can take steps to improve their situation by optimizing games or considering a new AMD APU as an entry-level solution. We've also discussed how price inflation hits AMD and Nvidia GPUs differently, delved into the risks of buying a used GPU, and suggested steps readers can take to protect themselves and their investments. Our used GPU advice, however, presupposed that the reader was looking to buy a card of relatively recent vintage — think GPUs from Nvidia's Maxwell generation (2014), or AMD's Fury (2015) or first-generation Polaris (2016) product families. But what about older cards, like flagship models from years ago?

Techspot recently took the GTX 680 for a spin to test that hypothesis, and its results are interesting. The GTX 680, for those of you who don't recall, was a major architectural overhaul from Fermi, Nvidia's previous GPU architecture. The GK104 that Nvidia launched back on March 22, 2012 was a 1536:128:32 (core count:texture mapping units:render output units) configuration with 192 GPU cores per SMX block. It competed against AMD's GCN 1.0 architecture at launch and drew significantly less power than AMD's various solutions. This was the beginning of a situation that has persisted to the present day: While AMD has absolutely made its own power efficiency improvements from GCN 1.0 to its current Vega 64, Nvidia has maintained a consistent edge for the last six years.
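
For a rough sense of what that shader configuration translates to in raw compute, here's a back-of-the-envelope sketch of our own. It assumes the GTX 680's roughly 1006MHz base clock and two FP32 operations per core per clock, neither of which is a figure from Techspot's article:

```python
# Back-of-the-envelope FP32 throughput for GK104 (GTX 680).
# The ~1006MHz base clock and 2 FLOPs per core per clock (one FMA) are assumptions,
# not figures quoted in the article above.
cuda_cores = 1536              # from the 1536:128:32 configuration
base_clock_ghz = 1.006         # assumed base clock
flops_per_core_per_clock = 2   # one fused multiply-add = 2 FLOPs

peak_gflops = cuda_cores * base_clock_ghz * flops_per_core_per_clock
print(f"Peak FP32 throughput: ~{peak_gflops:.0f} GFLOPS")  # roughly 3,090 GFLOPS (~3.1 TFLOPS)
```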

On paper, the GTX 680 looks fairly modern. Its core count, fill rate (32.2GPixels/s), and memory bandwidth (192.2GB/s) are all on par with modern midrange cards. But given its age — six years is old for a GPU, even in our modern era — and its architectural obsolescence, can it compete with modern cards at all?
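
Those headline figures fall straight out of the card's configuration, as the quick sketch below shows. The ~1006MHz core clock, 256-bit memory bus, and roughly 6Gbps effective GDDR5 data rate are assumptions on our part rather than numbers quoted above:

```python
# Deriving the GTX 680's quoted fill rate and memory bandwidth from its specs.
# Clock speed, bus width, and memory data rate are assumed values.
rops = 32                    # render output units, from the 1536:128:32 configuration
core_clock_mhz = 1006        # assumed base clock
bus_width_bits = 256         # assumed memory bus width
effective_rate_gbps = 6.008  # assumed effective GDDR5 data rate per pin

pixel_fill_gpix_s = rops * core_clock_mhz / 1000                 # ~32.2 GPixels/s
mem_bandwidth_gb_s = (bus_width_bits / 8) * effective_rate_gbps  # ~192 GB/s
print(f"Fill rate: ~{pixel_fill_gpix_s:.1f} GPixels/s, bandwidth: ~{mem_bandwidth_gb_s:.0f} GB/s")
```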

According to Techspot's results, the answer is "absolutely, yes." The site put the GTX 680 up against the older AMD Radeon HD 7950 Boost, as well as matching it against the GTX 1050 Ti, GTX 1050, RX 560, and GTX 580. You should hit the site for a game-by-game breakdown; we've got the 8-game average result below:

Image by Techspot.

Overall performance from the 1050 Ti and the GTX 680 is remarkably similar. Even better, older cards like this should be largely immune from the cryptocurrency mining craze. The GTX 680 and other cards from the Kepler family were terrible at cryptocurrency mining thanks to a number of factors, including the way Nvidia organized the GPU — with 192 cores per SMX, Nvidia needed to extract very high levels of parallelism to keep each SMX unit busy. This is good news for modern buyers, since it means there's no need to worry about the card having been abused in mining workloads — though there is, of course, still the chance that it's been abused in general, given the age of these GPUs at this point.

Best of all, eBay prices on GPUs like the GTX 680 are pretty reasonable, with auctions and listings in the $65 to $100 range. The current lowest price on a GTX 1050 Ti, according to Newegg, is $220. Be advised that the GTX 780 and 780 Ti are both Kepler-era cards with significantly more horsepower than the GTX 680, and they're available on eBay as well, for around $100 to $150 (GTX 780) and roughly $200 (GTX 780 Ti). It's hard to recommend people drop $200 on a GTX 780 Ti given that it's limited to 3GB of RAM and was new in 2013, but it's even harder to recommend people spend $220 on a GTX 1050 Ti, an entry-level GPU currently selling for 1.58x its MSRP. If the GTX 680 can match the 1050 Ti, the 780 Ti will absolutely level it.
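
To put that markup in context, here's the quick arithmetic behind the 1.58x figure; the $139 launch MSRP for the GTX 1050 Ti is our own reference point and isn't quoted above:

```python
# Markup math for the GTX 1050 Ti, assuming a $139 launch MSRP.
msrp = 139          # assumed launch price of the GTX 1050 Ti
street_price = 220  # lowest current Newegg price quoted above

markup = street_price / msrp
print(f"Current price is roughly {markup:.2f}x MSRP")  # ~1.58x
```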

Also, check Techspot's link for some interesting data on system-level power consumption during these tests. GPU performance may not grow at the speed it used to back when doubling every single year was normal. But the huge efficiency gain from the GTX 680 to the GTX 1050 Ti shows how much progress we've made in performance per watt.