If one thing is clear about video cards today, it’s that there’s always something better right around the corner. I find it a little disconcerting that last year’s fastest video card, the NVIDIA GeForce 7800 GTX 512MB, will probably be replaced by NVIDIA’s next card within a month or two, and that ATI’s X1800 XT, announced in October, will likely be replaced this month by something better. How companies like ASUS and GIGABYTE survive in this constantly churning video card circus, I have no clue.
In any event, NVIDIA has been around for over 13 years, a relatively short time in the scheme of video cards and computers; companies like GIGABYTE have been around considerably longer. NVIDIA is currently concentrated on the PC market, with forays into the gaming console market via the PlayStation 3 and into the handheld market through several cell phone deals. Today, NVIDIA is a multi-billion dollar company with hundreds of millions of dollars in the bank.
GIGABYTE is a Tier-1 motherboard manufacturer based in Taiwan and a computer component maker that has been in the industry for many years; I’ve reviewed many a GIGABYTE product over the last few years. One thing GIGABYTE has become known for recently is its dual-GPU video cards. In 2005 GIGABYTE announced its first 3D1 card, based on the ever-popular 6600GT. Today I’m reviewing the 3D1 card based on two 6800GT GPUs. The concept of putting two graphics chips on a single consumer gaming card originated with 3dfx and its Voodoo chip, used on multi-chip boards such as the Obsidian, way back in 1996. It’s hard to believe that was almost 10 years ago.
Brand | Gigabyte
Model | 3D1 68GT
Graphics Chip | NV40
Graphics Memory Type | GDDR3
Memory (MB) | 2 × 256
Graphics Core Clock (MHz) | 375
Memory Clock (MHz) | 1000 (effective)
Memory Speed (ns) | 2.0
RAMDAC Frequency (MHz) | Dual 400
Active Cooling on Graphics Chip | Yes
Heatsink on Memory | Yes
Video Capture | Yes
Ports
Dual Monitor Support | Yes
VGA Out | DVI-I, DVI-I, D-Sub, D-Sub
Video In and Out | S-Video Out
Package and Support
Printed Manual | Yes
Driver CD | Yes
Performance Tool Software | V-Tuner 2
Major Games | Joint Operations, Xpand Rally
Major Software | None
VR Glasses | No
DVD Player Software | PowerDVD
Video Recording Software | None
One thing that stood out when I opened the box was the 3D1 68GT’s sheer length. At over 10 inches, the card is longer than almost any video card I’ve ever seen; in a small case with crowded insides it would probably block one of the hard drive bays from being used. At a solid 2 pounds, it is also one of the heaviest cards in existence.
The card has a very interesting cooling design. A 7-fin radial fan sits in the center of the heatsink assembly and cools both graphics chips. The heatsink covering the front of the PCB is huge, and installing the card requires two adjacent slots to be free on the front side. The rear of the card carries another heatsink covering the majority of the PCB, with a second fan in its center.
PCI Express is the replacement for the AGP bus. Introduced on the PC with the Intel 925X platform, PCI Express is a serial, point-to-point protocol. A x16 slot offers a maximum of 4GB/second of bandwidth in each direction, nearly double that of AGP 8x. Today, almost every motherboard is sold with a PCI Express x16 slot for the graphics card, and few modern video cards still use the AGP standard.
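For the curious, here is a quick back-of-the-envelope sketch in Python of where that 4GB/second figure comes from; the per-lane rate and the AGP 8x number are the standard first-generation PCI Express and AGP figures, nothing GIGABYTE-specific:

```python
# Rough bandwidth comparison: first-generation PCI Express x16 vs. AGP 8x.
# Each PCIe 1.0 lane carries 2.5 Gbit/s with 8b/10b encoding, which works
# out to roughly 250 MB/s of usable bandwidth per lane, per direction.

PCIE_LANE_MB_S = 250          # usable MB/s per lane, per direction (PCIe 1.0)
LANES = 16                    # a full x16 graphics slot
AGP_8X_MB_S = 2133            # AGP 8x peak bandwidth, ~2.1 GB/s

pcie_x16 = PCIE_LANE_MB_S * LANES          # 4000 MB/s, i.e. ~4 GB/s each way
print(f"PCIe x16: {pcie_x16 / 1000:.1f} GB/s per direction")
print(f"AGP 8x:   {AGP_8X_MB_S / 1000:.1f} GB/s")
print(f"Ratio:    {pcie_x16 / AGP_8X_MB_S:.1f}x")   # roughly 1.9x, 'nearly double'
```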
One catch with GIGABYTE’s 3D1 cards is that they only deliver their best performance (i.e. 3D1 mode, with both GPUs active) in GIGABYTE-branded motherboards. The card will operate in any PCI Express x16 slot on any motherboard, but it will probably be limited to the performance of a single 6800GT. GIGABYTE accordingly recommends the card be used on a GIGABYTE SLI motherboard.
One rather unique feature of GIGABYTE’s 3D1 card is its ability to drive four monitors at once from a single card, accomplished with an included bracket that adds two D-Sub connectors. There are NVIDIA cards like the Quadro NVS and Matrox cards like the MMS series that support four monitors, but none of them come close to the gaming performance of the GIGABYTE 3D1 68GT. The card’s own bracket carries two DVI-I connectors, so two DVI displays and two D-Sub displays can be attached at the same time.
Features
DirectX 9.0c
Pixel Shader 3.0
Vertex Shader 3.0
HDR
SLI on one card
375MHz core
1.0 GHz memory
2x 256-bit memory buses
2 sets of 256MB memory
SLI Antialiasing
The 3D1 68GT has two 6800GT graphics processing units on one card; it is essentially SLI on a single PCB. The notable thing about this card is not any lack of performance but its appetite for power. A normal 6800GT already requires external power on top of the 75W maximum that the PCI Express x16 slot provides, and the 3D1 needs only the extra power supplied through its PCI Express power connector.
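To put the power question in perspective, here is a rough power-budget sketch; the per-GPU draw below is a purely hypothetical placeholder plugged in to illustrate the arithmetic, not a measured figure for the 6800GT:

```python
# Rough power-budget check: a PCI Express x16 slot is specified to deliver
# at most 75W to the card. Anything beyond that must come from an external
# power connector. The per-GPU figure here is only a placeholder assumption
# for illustration, not a measured value.

SLOT_LIMIT_W = 75             # maximum power from the PCIe x16 slot
assumed_gpu_draw_w = 55       # hypothetical draw per 6800GT under load
num_gpus = 2

total_draw = assumed_gpu_draw_w * num_gpus
external_needed = max(0, total_draw - SLOT_LIMIT_W)

print(f"Estimated board draw: {total_draw}W")
print(f"Slot can supply:      {SLOT_LIMIT_W}W")
print(f"From power connector: {external_needed}W")
```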
The key features required of any video card released today are support for DirectX 9.0c and Pixel and Vertex Shader 3.0. Only a few ATI, S3 and XGI cards are still stuck on the Shader Model 2.0 standard. Going forward, the minimum requirement for a mid-range to high-end card will be SM 3.0. SM 3.0 offers the following advantages over SM 2.0: longer instruction lengths for pixel and vertex shaders, dynamic branching in the pixel shader, geometry instancing, and 16-bit floating point support throughout the pipeline.
Modern games use pixel and vertex shaders in many wonderful ways. F.E.A.R., the latest game from Sierra, is the scariest game I’ve ever played; pixel shaders are used in the shiny reflective water, the fire, the walls and other special effects. Age of Empires III uses shaders in everything: the water, the cannon barrels, the reflections.
GIGABYTE clocked the 3D1 at 375MHz for each core, giving a maximum combined fillrate across the two GPUs’ 32 pixel pipelines (16 per chip) of 12 Gigapixels per second. That’s an amazing number; I remember the days of 3DFX, when pixel fillrates were around 667 Megapixels per second for the much-vaunted Voodoo 5 6000 that never came out. The card has two independent pools of memory, one per GPU, each clocked at 500MHz (1GHz effective) on its own 256-bit bus, providing 32GB/second of bandwidth to each GeForce 6800GT.
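Both of those headline numbers fall straight out of the clocks and bus widths quoted above; here is the arithmetic as a small Python sketch:

```python
# Fillrate and memory bandwidth arithmetic for the 3D1 68GT, using the
# figures quoted in the review (375MHz core, 16 pipelines per GPU,
# 500MHz GDDR3 on a 256-bit bus per GPU, two GPUs on the card).

CORE_MHZ = 375
PIPES_PER_GPU = 16
GPUS = 2

MEM_MHZ = 500                 # physical memory clock
DDR_MULTIPLIER = 2            # DDR transfers twice per clock -> 1GHz effective
BUS_BITS = 256                # memory bus width per GPU

# Pixel fillrate: clock * pipelines, summed over both GPUs.
fillrate_mpix = CORE_MHZ * PIPES_PER_GPU * GPUS
print(f"Combined fillrate: {fillrate_mpix / 1000:.1f} Gigapixels/s")   # 12.0

# Memory bandwidth per GPU: effective clock * bus width in bytes.
bandwidth_gb = MEM_MHZ * DDR_MULTIPLIER * (BUS_BITS / 8) / 1000
print(f"Bandwidth per GPU: {bandwidth_gb:.0f} GB/s")                   # 32
```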
Performance
With the 3D1 only working in dual-GPU mode on GIGABYTE SLI motherboards, my choices for showing its performance were limited. I chose GIGABYTE’s new GA-8N-SLI Quad Royal motherboard for the Intel platform, paired with a 3.8GHz Intel Pentium 4 670 and a gigabyte of Crucial Ballistix DDR2 memory. The configurations tested were the 3D1 in single-GPU mode, the 3D1 with both 6800GTs active, and an EVGA 7800GT for comparison.
Test System
Intel Pentium 4 670 3.8GHz
1GB Crucial Ballistix DDR2 667MHz
2 74GB WD Raptor SATA HDDs
GIGABYTE 3D1 68GT
EVGA 7800GT
Test Software
3DMark05
3DMark03
Quake 4: 1024×768, 1280×1024, 1600×1200, 32-bit, no AA/no AF and 4x AA/16x AF
Doom 3: 1024×768, 1280×1024, 1600×1200, 32-bit, no AA/no AF and 4x AA/16x AF
F.E.A.R.: 1024×768, 1280×960, 1600×1200, 32-bit, no AA/no AF and 4x AA/16x AF
COD2: 1024×768, 1280×1024, 1600×1200, 32-bit, no AA/no AF and 4x AA/16x AF
Far Cry: 1024×768, 1280×1024, 1600×1200, 32-bit, no AA/no AF and 4x AA/16x AF
Gaming
Many of today’s popular games are bottlenecked by the CPU rather than the video card; the ones that aren’t are few and far between. As we move further into 2006, games will only get more demanding, with titles like F.E.A.R. and Splinter Cell: Chaos Theory taxing the video card more and more.
The first game I played on the 3D1-68GT was First Encounter Assault Recon, or F.E.A.R. In single-GPU mode I was able to play at 1024×768, 32-bit, with maximum settings, and the game looks and plays wonderfully on a 6800GT; it scared the pants off me the first time I played it. In dual-GPU mode the game ran solidly at 1600×1200 with 4x AA and 16x AF.
The second game I tried was Civilization IV from Sid Meier and Firaxis Games. Civilization was one of my first computer games many moons ago. Turn-based, with upgraded graphics over the previous Civilization, Civ IV takes some getting used to if you haven’t played the series before; if you have, it’s as good a game as you can get. The 3D1 was able to play it at maximum resolution and settings.
One thing I miss in modern gaming is a modern flight simulator. Microsoft is currently working on a sequel to its best-selling Flight Simulator series; Flight Simulator 2004 was the most recent entry in the long-running franchise. The 3D1-68GT played it with little trouble at 1600×1200 with 8x SLI AA and 16x AF. Hopefully the next Flight Simulator from Microsoft will make extensive use of Pixel Shader 3.0 and Vertex Shader 3.0.
Conclusion
The first thing I have to say about the GIGABYTE 3D1 68GT is that it is a great gaming card in its own right. In most games and benchmarks it outperforms a single 7800GT by a significant margin. The idea of putting two graphics chips on one card goes back to the old days of 3DFX. The 6800GT was a great card when it was released in 2004, but today it has been replaced by the 7800GT as the mainstream card from the folks at NVIDIA.
The issues I have with the 3D1-68GT are its size, its weight, and its price/performance ratio. The card’s length rules it out of any small form factor PC. It also weighs more than a motherboard or any other component that mounts on one, which puts a real strain on the PCIe slot. At around $550 retail, it costs about the same as an ATI X1800XT, the current top of ATI’s lineup, so it’s definitely not a budget product; it’s aimed squarely at high-end gamers.
Having said all of that, I can recommend this card to owners of GIGABYTE SLI motherboards without qualm. Combined with a fast CPU, this dual 6800GT is the equal of almost every NVIDIA video card on the market besides the elusive, very hard to find 7800GTX 512MB. If you want SLI-level performance in your games without actually buying two cards, the dual 6800GT is a fine product. Its main drawback, beyond what I’ve already stated, is that 3D1 mode only works on GIGABYTE SLI motherboards, which narrows the user base considerably. GIGABYTE has a great idea here and a very decent product, and it deserves a silver medal for the sheer guts of engineering this card and getting it to market.