There are no marks on my bench!

Aug 14, 2009 21:22


I have two desktop computers. One is a year-old TranquilPC - small, black, sexy, fanless, draws about 20 watts and cost £200 new. The other is a six-year-old, huge roaring behemoth of a beige gaming machine that sucks up electricity like a fan heater, heats the room like a fan heater, and sounds like... well, a fan heater.

I dread to think how much the behemoth cost when new - it came to me for free from a friend about four years ago, because she was upgrading her primary gaming machine and this one just wasn't shit-hot enough any more. At the time it was so much more shit-hot than my main desktop machine that for the first year I owned it I didn't even have to switch it on for it to do everything I needed it to do.

Anyway, in an arm-wrestling contest of CPU grunt, behemoth and tranquil are reasonably evenly matched... despite his age, behemoth still wins - just about - but only with a lot of snarling and a noticeable dimming of the street-lights. In fact the only field in which behemoth remains unchallenged (and indeed the only reason I keep him around) is 3D graphics performance.

You see, as a young and virile gaming machine, behemoth was veritably tumescent in his polygon-pushing power, and although the long intervening years have taken their toll (these days he frequently awakens convinced that it is once again January 2003, having forgotten all about his BIOS settings) he still maintains enough of his faculties to play World of Warcraft at marginally-above minimum detail levels.

But things in computing move so fast, and given behemoth's burgeoning senility, I found myself wondering: if last year's netbook-class CPU can give a six-year-old number-crunching deity a run for its money, then perhaps the same might be true of graphics cards... and I could finally have just one quiet, low-power-consumption desktop computer that would fulfil all of my computational needs (which are, let's face it, meagre by the standards of today's latest games).

So, to try and answer this (seemingly straightforward) question, I started searching for graphics card reviews. For the benefit of anyone who has had the good fortune never to have looked at a graphics card review site, I'll give you a brief précis of the sorts of things you find, using the medium of absurdist comedy:

Both NVIDIA™ and ATI™ have been pushing the envelope recently, with their 27th Generation cards both scoring between 7 and 8 arbitrary points of shit-hot-ness more than their respective 26G cards. Well, today we pit their 28th Generation GPUs against each other in a BATTLE TO THE UTTER DESTRUCTION OF THE ENTIRE UNIVERSE ahead of next week's expected release of the 29th Gen cores.

In the red corner we have the ATI Radioactive 98000001+, based on the innovative Rocket-powered Ninja on Speed GPU core; and in the green corner NVIDIA's BruteForce 3.1415926535898 with its 16-pipe NerdyWankToy scrungertron-coprocessor.

On the next page we'll find out which card's tachyon field opened the biggest rift in the space-time continuum, and how many trillion giga-pixels a wood-chuck would chuck, if a wood-chuck could push polygons.

What they're actually talking about, when it comes down to it, beneath all the hype and the bullshit, is the difference between a frame-rate of 120 FPS and 121 FPS on a generation of games that I'm never ever going to play. More importantly, both of those graphics cards cost in excess of £200, which, if you have a longer memory than a goldfish, you might recall is what I consider to be a reasonable price for a whole fucking computer!
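For the terminally curious, here's a quick back-of-the-envelope sanity check of what that one extra frame per second actually buys you - sketched in Python purely by way of illustration, using only the frame-rates above, nothing from any review site:

    # What does going from 120 FPS to 121 FPS save per frame rendered?
    ms_at_120 = 1000.0 / 120   # ~8.33 ms per frame
    ms_at_121 = 1000.0 / 121   # ~8.26 ms per frame
    print(ms_at_120 - ms_at_121)  # ~0.07 ms saved per frame

Roughly seven hundredths of a millisecond per frame. Truly, a battle to the utter destruction of the entire universe.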

What I want to know, and what the internet will not tell me, is this: Which modern on-board no-name graphics chipset is going to give me a noticeable performance hike over a senescent, six-year-old mid-range gaming card? Is that really too much to ask?

rant, geekery
