The Most Disappointing Graphics Chips in the Last Decade (1998-2008)

Never has there been a better time to be a PC gamer.  With all the great titles available now, including Far Cry 2, Dead Space, and Left 4 Dead, I just don’t have enough time to bask in all the awesomeness.  Bolstered in no small part by great PC hardware from every player out there, 3D graphics have never looked so good nor performed so well.

But all has not been so rosy in the world of graphics accelerators.  The API showdown of the late 90s that sputtered out in the first part of this decade produced some truly awful and horrid graphics accelerators.  The shift from dedicated 3D accelerators to 2D/3D combo cards, the emergence of DirectX, and the downfall of Glide brought forth stupendously dumb products, vaporware, and massive disappointments.

So in this festive holiday season, let’s get our Grinch on and take a trip down memory lane:

#6 Intel i740 – Release Date Early 1998

Intel’s i740 was the company’s first foray into the 3D market. Some research firms said outlandish things like “Make no mistake about it, Intel’s entry into the 3D graphics is a wake up call to the industry and marks a significant milestone for 3D graphics capabilities that will forever change the landscape of the industry.”  Wow, that’s amazing, and I’m sure it was copied directly from an Intel slide deck.  However, the i740 was a massive disappointment.  It was designed to take full advantage of AGP, including AGP texturing: textures lived in system memory while the on-board memory was used exclusively for the frame buffer.  This meant the video card had to fight for precious memory bandwidth with the CPU and other devices in your PC.  PCI variants of this card often turned out to be faster than their AGP brethren because both textures and the frame buffer had to be stored locally, sidestepping the rather slow AGP texturing process.
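To put rough numbers on that bandwidth fight, here’s a quick back-of-the-envelope sketch.  The figures (AGP 2x at roughly 66MHz with two transfers per clock, and a 64-bit, 100MHz local SGRAM bus) are my assumptions based on typical i740 boards of the era, not official specs:

```python
# Rough peak-bandwidth comparison: AGP 2x texturing vs. local SGRAM.
# Clock/width figures are assumptions based on typical i740 boards of the era.

def bus_bandwidth_mb_s(clock_mhz, bus_width_bits, transfers_per_clock=1):
    """Peak bandwidth in MB/s for a simple synchronous bus."""
    return clock_mhz * (bus_width_bits / 8) * transfers_per_clock

agp_2x      = bus_bandwidth_mb_s(66.67, 32, transfers_per_clock=2)  # ~533 MB/s, shared with the CPU
local_sgram = bus_bandwidth_mb_s(100, 64)                           # ~800 MB/s, dedicated to the chip

print(f"AGP 2x (textures, shared bus): ~{agp_2x:.0f} MB/s")
print(f"Local SGRAM (PCI variant keeps textures here): ~{local_sgram:.0f} MB/s")
```

And that AGP figure is a theoretical peak shared with everything else in the system, which is why keeping textures local worked out better in practice.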

Furthermore, the i740 was one of the first cards to optimize for synthetic benchmarks, getting everyone excited about this low-cost, kick-ass solution that was going to change the game.  However, it only provided competition for the Riva 128 and got absolutely destroyed by the dominating Voodoo II.  Some may say that the Voodoo II was 3D-only and required a separate 2D solution, but remember that most PCs sold back then had built-in 2D video to begin with.  And the i740 only had a couple of months before the Riva TNT hit the market.  By January of ‘99, most review publications didn’t even include the i740 in their roundups, with newcomers like the Banshee, Savage3D, and Rage 128 coming down the pipe.  Personally, I used my i740 merely as a 2D companion to my Voodoo II SLI setup.

#5 3DFX Voodoo Banshee – Release Date Late 1998

Hey guys, what’s going on in this market?
With NVIDIA, Matrox, and ATI all with their eyes on the prize, 3DFX was the company to beat.  They had mind share.  Their Glide API was considered the best in the business.  But emerging standards such as Direct3D from Microsoft and the cross-platform, highly versatile OpenGL were making strides.  So much so that 3DFX was forced to create drivers to support these competing APIs.  The Voodoo Banshee came on the heels of the 3D-only Voodoo II and was designed to take on 2D/3D accelerators like the NVIDIA Riva TNT.  It wasn’t the first 2D/3D combo from 3DFX, the first being the Voodoo Rush, and it’s a toss-up as to which was the bigger failure.  However, the Rush was released in 1997, disqualifying it from this list.

While the Riva TNT couldn’t stand up to the Voodoo II because it had to be clocked lower due to heat (90MHz vs a planned 125MHz, actually making it a candidate for this list itself), it shined against the Voodoo Banshee, which discarded the second TMU (texture mapping unit) of the venerable Voodoo II design.  While a hit with the value crowd, the Banshee marked the first time that 3DFX started losing market share to NVIDIA.  OEMs by that point were hooked on the TNT’s support for 32-bit color, which the Banshee lacked and which 3DFX would not support until the Voodoo 4/5, released nearly two years later.  By that time the nails were being hammered into 3DFX’s coffin, and the company was later bought by NVIDIA.  Personally, I think 3DFX should never have bought STB.
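As a rough sketch of why losing that second TMU hurt: in games that layered two textures per pixel, the single-TMU Banshee needed two rendering passes where the Voodoo II needed one.  The clocks and unit counts below are the commonly cited figures and are assumptions on my part, not datasheet values:

```python
import math

# Toy model: effective pixel rate when every pixel needs two texture lookups.
# Clocks and TMU counts are commonly cited figures, not official datasheet values.

def dual_texture_pixel_rate(clock_mhz, tmus, textures_per_pixel=2):
    """Millions of dual-textured pixels per second for a single pixel pipeline."""
    passes = math.ceil(textures_per_pixel / tmus)  # extra passes once TMUs run out
    return clock_mhz / passes

voodoo2 = dual_texture_pixel_rate(90, tmus=2)    # single pass: ~90 Mpixels/s
banshee = dual_texture_pixel_rate(100, tmus=1)   # two passes:  ~50 Mpixels/s

print(f"Voodoo II (2 TMUs): ~{voodoo2:.0f} Mpixels/s dual-textured")
print(f"Banshee   (1 TMU):  ~{banshee:.0f} Mpixels/s dual-textured")
```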

#4 NVIDIA GeForce FX 5800 Ultra – Release Date Early-Mid 2003

Sorry, I can’t hear you over the SOUND OF HOW MUCH THIS CARD SUCKED.
The first fruit of the buyout of 3DFX, the GeForce FX 5800 was probably one of the most anticipated and most hyped NVIDIA graphics chips ever.  And it was the biggest, most abject failure since the NV1.  Nicknamed “The Dustbuster,” the GeForce FX 5800 Ultra was NVIDIA’s noisy attempt to reclaim the speed crown from ATI’s balls-to-the-wall Radeon 9700 Pro and its successor, the 9800 Pro.  It failed miserably.  The problem was the newfound importance of “shader” calculations in DirectX 9, which threw the simple concept of pixel pipelines out the window.

NVIDIA hyped up its Shader Model 2.0a features as the “dawn of cinematic computing,” but ended up falling well short of the competition in speed.  Keep in mind this was well after the 9700 Pro was released.  Its poor performance is attributed primarily to two factors.  The first was its mixed-precision shader programming of FP16 and FP32: the former fell below the precision DirectX 9 required, and the latter was just plain slow.  ATI’s 9700 Pro operated at FP24 100% of the time, which was the required DirectX 9 standard.  The second issue was that the card’s performance relied heavily on the driver’s shader compiler.  With instructions properly sorted and scheduled, updated drivers could significantly boost the GeForce FX 5800 Ultra’s performance.  Without them, pipeline stalls and poor instruction order would severely cripple the card in some games.  This required optimization would lead NVIDIA to create one of the most important programs in company history, “The Way It’s Meant To Be Played,” giving game developers complete access to technical resources to make their games better.  And hopefully better on NVIDIA cards.
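For a sense of what those precision levels actually mean, here’s a minimal sketch comparing the relative rounding error implied by each format’s mantissa width (10 explicit bits for FP16, 16 for FP24, 23 for FP32).  Treat it as illustrative only; the visible error in a real shader depends on the math being run:

```python
# Relative precision (one unit in the last place) implied by each shader format's
# mantissa width. Illustrative only; real shader error depends on the operations.

SHADER_FORMATS = {
    "FP16 (NVIDIA partial precision)": 10,   # below the DirectX 9 minimum
    "FP24 (DirectX 9 minimum, ATI R300)": 16,
    "FP32 (NVIDIA full precision)": 23,      # accurate, but slow on the FX
}

for name, mantissa_bits in SHADER_FORMATS.items():
    ulp = 2.0 ** -mantissa_bits
    print(f"{name}: relative step of ~{ulp:.1e}")
```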

Speaking of developers, probably the most embarrassing thing about the FX debacle was Valve’s Gabe Newell announcing that in his new blockbuster, Half-Life 2, the updated FX 5900 Ultra flagship was no faster than ATI’s mid-range Radeon 9600.  Furthermore, because of this, Valve decided to force FX-series video cards to render in DirectX 8.1 mode, their only strong point.  Ouch.

#3 S3 Savage 2000 – Release Date Late 1999

I don’t know the name of this card, I just know what it looks like when it LIES.
I have a strong loathing for this particular card.  See, back in the ‘90s, S3 was still a contender.  It was a time when anyone could snatch the performance crown with their next video card release.  And their proprietary S3TC texture compression was truly cool and made a real difference in 3D quality.  The flame wars of the late nineties over Matrox, NVIDIA, 3DFX, ATI, and S3 were epic.  All you forum fanbois of today are a bunch of weaklings compared to the utter rage that those discussions invariably led to.

But, back to the trash that was the Savage 2000.  At the turn of the century, hardware Transform and Lighting (TnL) was the buzzword, with NVIDIA’s GeForce (Geometry Force) SDR and DDR being the cards to beat.  S3 announced in 1999 that they had an answer.  Its awesome clock speed (on paper) of 175MHz meant it had a higher theoretical fill rate than its GeForce competition.  The hype on this card verged on the hysterical.

But it was all too good to be true.  Analysis showed that its TnL engine contained far fewer transistors than the GeForce chips, which raised some eyebrows, and it shipped without TnL enabled in the drivers.  Combined with a very disappointing production clock speed of 125MHz (vs the promised 175MHz, or up to 200MHz according to some sources), the Savage 2000 was a ho-hum product at launch with unpredictable performance.  To make matters worse, when drivers shipped that “enabled” hardware TnL, they were shown to have no impact on performance, leading most to believe that hardware TnL was broken, poorly implemented, or just not there at all.  Driver updates ceased in 2002.  It was the last video card S3 designed before being sold to VIA.  Of course that turned out just great for them.  </sarcasm>
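To put the paper spec and what actually shipped side by side, here’s some back-of-the-envelope fill rate math.  The pipeline and TMU counts (2 pipes with 2 TMUs each for the Savage 2000, 4 pipes with 1 TMU each at 120MHz for the GeForce 256) are the figures reviewers quoted at the time, so treat them as assumptions:

```python
# Theoretical texel fill rates. Pipeline/TMU counts and clocks are the figures
# quoted in period reviews, used here as assumptions rather than official specs.

def texel_rate(clock_mhz, pipelines, tmus_per_pipeline):
    """Peak texels per second, in millions."""
    return clock_mhz * pipelines * tmus_per_pipeline

savage2000_paper   = texel_rate(175, 2, 2)   # ~700 Mtexels/s as hyped
savage2000_shipped = texel_rate(125, 2, 2)   # ~500 Mtexels/s as it actually shipped
geforce256         = texel_rate(120, 4, 1)   # ~480 Mtexels/s

print(f"Savage 2000 @ 175MHz (paper):   ~{savage2000_paper:.0f} Mtexels/s")
print(f"Savage 2000 @ 125MHz (shipped): ~{savage2000_shipped:.0f} Mtexels/s")
print(f"GeForce 256 @ 120MHz:           ~{geforce256:.0f} Mtexels/s")
```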

#2 TWO WAY TIE!!

Matrox Parhelia – Released 2002

I rode the short bus to market.
The Matrox Parhelia was long overdue for Matrox.  Their very competitive and very popular G400 was getting long in the tooth.  ATI was gearing up to destroy everyone with the 9700 Pro, and the GeForce 4 was the current, seemingly untouchable king of the hill.

The Parhelia had amazing specs.  It had a 256-bit memory bus and was the first to feature a 512-bit “ring bus.”  Sound familiar?  However, it had absolutely ZERO bandwidth-saving features.  NVIDIA and ATI had LMA II and HyperZ, but Matrox had nothing.  It supported ridiculously high 16x fragment anti-aliasing, which was impressive.  It also had fantastic 2D performance thanks to its 10-bit, 400MHz RAMDAC.
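Here’s a quick sketch of why skipping the bandwidth-saving tech stung.  The memory clocks below (roughly 275MHz DDR for the Parhelia, 310MHz for the 9700 Pro, 325MHz for the Ti 4600) are the commonly cited retail figures and are assumptions on my part:

```python
# Raw peak memory bandwidth, ignoring the bandwidth-saving tricks (HyperZ, LMA II)
# the competition layered on top. Memory clocks are commonly cited retail figures.

def ddr_bandwidth_gb_s(mem_clock_mhz, bus_width_bits):
    """Peak GB/s for a DDR bus: two transfers per clock."""
    return mem_clock_mhz * 2 * (bus_width_bits / 8) / 1000

parhelia = ddr_bandwidth_gb_s(275, 256)   # ~17.6 GB/s raw, no compression/occlusion tricks
r9700pro = ddr_bandwidth_gb_s(310, 256)   # ~19.8 GB/s raw, plus HyperZ III savings
ti4600   = ddr_bandwidth_gb_s(325, 128)   # ~10.4 GB/s raw, plus LMA II savings

print(f"Parhelia:        ~{parhelia:.1f} GB/s raw, no bandwidth savers")
print(f"Radeon 9700 Pro: ~{r9700pro:.1f} GB/s raw + HyperZ III")
print(f"GF4 Ti 4600:     ~{ti4600:.1f} GB/s raw + LMA II")
```

Plenty of raw bandwidth on paper, in other words, but much of it went to waste drawing pixels the other guys were smart enough to skip.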

The problem was that the top of the line 256MB Parhelia at $399 got its ass kicked by the older GeForce 4 Ti 4600.  I mean it got destroyed.  It basically performed at the level of the previous-generation GeForce 3.  Its 4×4 pixel pipeline design offered zero real-world advantages.

But the worst part was that it was supposed to support DirectX 9.0 shaders, and didn’t.  Later in its life, Matrox acknowledged that its vertex shaders were not DirectX 9-compliant as advertised.  Not that it mattered; the Parhelia sucked in DX9 titles even without more complicated DX9 shader code to run.

In the end, it was concluded that Matrox engineers simply weren’t as talented as NVIDIA’s and ATI’s.  Ouch.  Matrox continues to pump out professional cards that perform well in 2D and multimedia applications, but they would never again set foot in the consumer 3D arena.  Well, unless you count the M-Series announced in June, which FINALLY added support for Windows Vista Aero.  Double ouch.

(Silver lining: the Parhelia chip found its stride in Matrox’s HD video editing solution, which I absolutely loved when I worked for BOXX Technologies.)

ATI Rage Fury MAXX (Dual Rage128 Pro) – Released January 2000

Windows 2000 or XP? NO DUAL CHIP FOR YOU!
OMG, a DUAL graphics chip video card? WOW.  Unlike the Voodoo II with its two separate texture mapping units, the Rage Fury MAXX featured two full Rage 128 Pro chips that worked in tandem on a single card to accelerate 3D games.  This was achieved by a method called “Alternate Frame Rendering”: one chip would render one frame, the other chip would render the next.
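For the curious, here’s a minimal sketch of the idea behind Alternate Frame Rendering, assuming a simple round-robin split between the two chips.  The names and structure are purely illustrative, not a model of how ATI’s driver was actually written:

```python
# Alternate Frame Rendering in miniature: even frames go to one chip, odd frames
# to the other. Purely illustrative; not a model of ATI's actual driver.

NUM_CHIPS = 2

def assign_frames(num_frames, num_chips=NUM_CHIPS):
    """Round-robin assignment of frame indices to chip indices."""
    return [(frame, frame % num_chips) for frame in range(num_frames)]

for frame, chip in assign_frames(6):
    print(f"Frame {frame} -> Rage 128 Pro chip #{chip}")

# Throughput can nearly double in the best case, but per-frame latency doesn't
# improve, and each chip needs its own copy of every texture in local memory.
```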

There were several problems with this.  ATI’s Rage 128 chip was crap when it launched, nearly NINE MONTHS LATE, and the drivers were abysmal.  I know, I bought one for $249 from Best Buy the friggin’ day it came out.  Back to the MAXX, though: while the Rage 128 drivers had matured, the chip itself had aged, and there was tough competition from NVIDIA and 3DFX with the GeForce series and the Voodoo 3 series.  So ATI decided to slap two of them together on one card and call it a day.  Believe me, it almost worked.  In late December 1999 the early previews were promising: it was beating the GeForce SDR at higher resolutions, albeit at a higher price tag.  By February of 2000, however, the reviews were swinging back toward NVIDIA’s corner, and the GeForce DDR provided better frame rates than the MAXX.  Combined with its lack of hardware TnL and higher price, reviewers had a hard time figuring out what to make of this card.

In the end, the reason this card sits so high on my list is that it flat-out did not work as advertised on Windows NT 5.x operating systems, meaning Windows 2000 and XP.  Those operating systems did not support the method ATI used for dual AGP graphics, so the Rage Fury MAXX only worked in single-chip mode.  Face, meet palm.

#1 BitBoys Glaze3D – Never released

Rumor was it would be bundled with Duke Nukem Forever.
Oh boy, there was nothing more fun than taking jabs at BitBoys and their never-released Glaze3D.  Seriously, this was the first time I had ever heard the word “vaporware.”  Its specs seemed to magically morph every time a new card was released by NVIDIA, 3DFX, or ATI, to make it look like a killer solution.

First announced in 2000, the BitBoys Glaze3D’s specs would place it as the equivalent of the still-three-years-away GeForce FX 5200 Ultra, while its claimed performance would place it at the same level as a GeForce 3 Ti 500.  I remember very well their claims of 200 frames per second in Quake III.  They released screenshots of what they said it was capable of, and they looked as good as a DirectX 9 video game.  Remember, this was back in 2000.

Screenshot from the year 2000. And 2001. And 2002. And 2003. And then we didn’t care.
It was a friggin’ running joke as to which would be released first: the Glaze3D or Duke Nukem Forever.  I said more than once that if BitBoys ever released a consumer desktop graphics card, I’d grind it up and drink it in a shake.

While BitBoys claimed that bug-hunting and production issues kept them from releasing the Glaze3D, that didn’t keep them from talking about new vaporware chips they were developing.  Subsequent vaporware featured embedded DRAM for stupid amounts of bandwidth to be thrown at anti-aliasing, as well as ever-evolving support for new DirectX standards.  In the end, BitBoys focused on handheld graphics and were eventually picked up by ATI in 2006.  So now AMD owns them.  I guess they showed us.

Dishonorable mentions:

The 2900 XT, for being a big, hot letdown that couldn’t beat the nearly year-old 8800 GTX.

The 7950 GX2 Quad SLI, for being a total bitch that only boutiques and certain OEMs got initial access to.  Oh, and for not having Vista support until it was completely irrelevant.

3DLabs P10/P9 – what happened to this “game changing” chip?  It made its way into workstations, but the big buzz was the purported advanced performance for consumers.  Oh well.

 
