For those of you not already familiar with the Computer Game Developers Conference, it's a trade show and conference series for game technology companies, developers, and would-be developers. It's not the games showcase that E3 is, especially since E3 is so close, but it is a good place to learn about the tools of the trade and get a peek into the near future of gaming equipment.
If nothing else, CGDC is a great place to corner hardware vendors for a while and talk about their plans. The goal this year was to try to get an early impression of how competing 3d accelerator cards compared to Voodoo2 and perhaps detect a few common trends. One thing is for sure - you couldn't turn your head without bumping into Unreal, Forsaken, or Motorhead being garishly displayed somewhere.
First off is USB. From the look of things, USB won't stand for Useless Serial Bus for much longer. CH Products, Thrustmaster, and Logitech are all working on USB joysticks, starting with the mainstream sticks, but we should see the hardcore controls switching over sometime next year. At one of the booths, I even spied something that looked a lot like a USB version of a 10BaseT hub - probably being used to expand the available ports.
PCI sound cards are very hot. The days of ISA are numbered, but it seems like there are never enough PCI slots to go around. Nevertheless, PCI-based sound does help reduce the CPU load of sound generation, and the new generation of PCI sound is designed to support running many sound streams simultaneously.
Used properly, this could be a real boon for sims, where it would be an advantage to independently layer sounds for turbine whine, exhaust roar, airflow roar, and the many other sounds associated with a modern-day combat vehicle. Rather than recording and premixing the sounds in a studio, having each one run independently gives an audio programmer more flexibility. For example, he could change only the part of the soundscape that should respond to what's going on, rather than being limited to gross changes.
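To make the idea concrete, here's a minimal sketch (illustrative C++, not any particular card's API or driver model) of the difference between a premixed clip and independently layered streams - each stream keeps its own gain, so the engine can fade just one component of the soundscape:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical per-stream state; names are illustrative, not a real API.
struct SoundStream {
    const short* samples; // looping source data
    size_t       length;  // samples in the loop
    size_t       cursor;  // current playback position
    float        gain;    // per-stream volume, updated by game logic
};

// Mix N independent streams into one 16-bit output block. Because each
// stream keeps its own gain, the engine can fade just the turbine whine
// as RPM drops while the airflow roar tracks airspeed - something a
// single premixed studio recording can't do.
void MixStreams(std::vector<SoundStream>& streams, short* out, size_t count)
{
    for (size_t i = 0; i < count; ++i) {
        float acc = 0.0f;
        for (size_t s = 0; s < streams.size(); ++s) {
            SoundStream& st = streams[s];
            acc += st.gain * st.samples[st.cursor];
            st.cursor = (st.cursor + 1) % st.length;
        }
        if (acc > 32767.0f)  acc = 32767.0f;   // clamp to 16-bit range
        if (acc < -32768.0f) acc = -32768.0f;
        out[i] = (short)acc;
    }
}
```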
Logitech is developing a new force feedback joystick, and it looks like it may be the first to have useful application for sim players. It doesn't have all the buttons and hats of the high-end combat sticks, but it does have a superior second-generation force-feedback system. Rather than the relatively blunt and clunky gear-driven design of the Sidewinder FF Pro, the Logitech stick is cable-driven, handles a wide variety of more subtle effects, and has much lower response latency. It's a good step closer in quality to the awesome $2000+ prototype stick we played with last year. Why should you care?
It doesn't matter very much for "special effects" - things like shaking from a crash, buzzing from machine-gun recoil, or other incidental effects. However, if you want to be able to play a flight simulator where you can feel air compressibility, control surface flutter, or other flight-model effects, the response needs to be smoother, have a wider dynamic range, and happen faster. The Wingman Force seems to have the potential to do a respectable job at it. It will support both serial and USB ports, and is expected to be in stores around August or September.
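To illustrate why latency and dynamic range matter here, consider a hypothetical sketch (all names and constants are invented for the example) of how a sim might derive stick force from flight-model state rather than from a canned rumble effect:

```cpp
#include <cmath>

// Illustrative flight state; the constants below are made up.
struct FlightState {
    float airspeed;        // m/s
    float airDensity;      // kg/m^3
    float stickDeflection; // -1..1
};

float StickForce(const FlightState& fs, float timeSec)
{
    // Aerodynamic hinge force scales with dynamic pressure q = 1/2*rho*V^2,
    // so the stick stiffens smoothly as speed builds - a continuous effect,
    // not an on/off buzz.
    float q = 0.5f * fs.airDensity * fs.airspeed * fs.airspeed;
    float force = 0.0004f * q * fs.stickDeflection;

    // Past some critical speed, superimpose a small ~25 Hz oscillation to
    // mimic control-surface flutter. Rendering a subtle, fast effect like
    // this is exactly what needs low latency and a wide dynamic range.
    if (fs.airspeed > 200.0f) {
        float flutter = (fs.airspeed - 200.0f) * 0.001f;
        force += flutter * std::sin(2.0f * 3.14159f * 25.0f * timeSec);
    }
    return force; // normalized; a driver would map this to motor torque
}
```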
When it comes to 3d chipsets, it's abundantly clear that everyone is gunning for 3dfx. Almost every 3d technology on display was at least comparable to Voodoo1. 3dfx has literally set the standard by which 3d chipsets are judged, and this year, nobody would flinch if you asked them how their chipset compared to 3dfx. They know they have to be willing to be compared directly against the Voodoo if they're going to win over gamers. Not only that, but it seems like chipset developers are really embracing OpenGL now. The competition doesn't want 3dfx being the only viable choice for the vocal Quake market out there, and by the end of the year, virtually every next-generation chipset that comes out will support OpenGL - and most of them will support the entire ICD, not just a mini-GL driver.
Most of you have probably already heard about the Matrox G200 chipset. Matrox didn't get very far in the world of 3d acceleration with any of the Millennium or Mystique versions, although they made a noble effort. The "revenge" sequel is the G200, and Matrox does indeed have something interesting in their corner. The G200 is the basis for a new version of the Mystique and Millennium. They support: twin 64-bit rendering pipelines, a 32-bit Z-buffer, 32-bit internal rendering, a 230 or 250 MHz RAMDAC, the expected suite of 3d rendering features (alpha blending, bilinear filtering, trilinear mipmapping, fogging, anti-aliasing, and specular highlights), support for strips, fans, and vectors, MPEG-2 hardware acceleration, and a 3d setup engine designed to handle parallel instructions.
What will all this mean to you? Let's take a look at each feature, especially because you'll hear about a number of these features scattered among the competition's new chipsets as well. Matrox claims the 64-bit symmetric rendering architecture is designed to speed up memory transfer rates. The 32-bit Z-buffer improves the quality of polygon depth sorting for more accurate depth rendering. 32-bit internal rendering means that the system does its internal color computations in 32 bits before rendering to 16 or 24 bits on screen. The high-speed RAMDAC is there to improve screen refresh rates at high resolution.
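To see why the internal precision matters, here's a small sketch (illustrative C++, nothing Matrox-specific) showing how rounding error accumulates when a blended effect like fog is computed at 16-bit frame-buffer precision instead of at full precision:

```cpp
#include <cstdint>
#include <cstdio>

// One multiplicative fog pass: scale a color channel to 90% of its value.
uint8_t Fog8(uint8_t c) { return (uint8_t)(c * 9 / 10); }  // 8-bit math
uint8_t Fog5(uint8_t c)                                    // 5-bit math
{
    uint8_t q = c >> 3;             // truncate to 5 bits (as in RGB565)
    q = (uint8_t)(q * 9 / 10);
    return (uint8_t)(q << 3);
}

int main()
{
    uint8_t full = 200, low = 200;
    for (int pass = 0; pass < 4; ++pass) {
        full = Fog8(full); // high-precision internal math, quantize on output
        low  = Fog5(low);  // math done at frame-buffer precision each pass
    }
    printf("8-bit internal: %d, 5-bit internal: %d\n", full >> 3, low >> 3);
    // Prints 16 vs 15: the low-precision pipeline has already drifted a full
    // step after a few passes, which shows up on screen as banding.
    return 0;
}
```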
Support for primitives beyond plain triangle lists - strips, fans, and vectors - generally allows more flexibility for developers, and these tend to run a little faster thanks to less overhead than the same structures built entirely from triangle lists. MPEG-2 and DVD acceleration might be part of an expansion - they allow full-screen high-quality video in either compression format. Finally, the 3d setup engine is designed to keep the CPU working on geometry transforms instead of edge interpolation. The Mystique and Millennium are basically the same, but the Millennium uses SGRAM instead of SDRAM, adds TV Out, and gets the 250 MHz RAMDAC.
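To put rough numbers on the strip and fan savings mentioned above (simple arithmetic, not vendor data): in a strip, every triangle after the first costs only one new vertex, so setup work and bus traffic shrink as meshes grow:

```cpp
#include <cstdio>

int main()
{
    for (int n = 1; n <= 100; n *= 10) {
        int asList  = 3 * n;  // independent triangle list: 3 vertices each
        int asStrip = n + 2;  // strip: 2 starting vertices, then 1 per triangle
        printf("%3d triangles: %3d vertices as a list, %3d as a strip\n",
               n, asList, asStrip);
    }
    return 0;
}
```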
The show demos of the G200 were impressive. Demos at CGDC were typically running at 1024x768, and at a very reasonable frame rate to boot. Certainly better than Voodoo1, and not nearly as good as Voodoo2 SLI, but I was under the impression that it wasn't far from a single Voodoo2. The G200 shares the available memory between the frame buffer and texture maps. Once you get over 1024x768, the frame rate becomes fill-limited and starts to drop, but the strong performance (on a P2-300) is nothing to sneeze at. OpenGL support, good speed at high resolutions, advanced rendering for 2d and 3d, improved color rendering, and all for an estimated $169 street price (for the Mystique). The G200 seems to have the pieces in place to make an exceptional value for a do-it-all graphics card this July.
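To put "fill-limited" in perspective, a little back-of-the-envelope arithmetic (the overdraw and frame-rate figures here are assumptions, not measurements) shows how fast pixel demand grows with resolution:

```cpp
#include <cstdio>

int main()
{
    const double overdraw = 2.5;  // assumed average times each pixel is drawn
    const double fps      = 30.0; // target frame rate
    int modes[][2] = { {640, 480}, {800, 600}, {1024, 768}, {1280, 1024} };
    for (int i = 0; i < 4; ++i) {
        double mpixels = modes[i][0] * modes[i][1] * overdraw * fps / 1.0e6;
        printf("%4dx%-4d needs roughly %.0f Mpixels/s\n",
               modes[i][0], modes[i][1], mpixels);
    }
    // 640x480 needs ~23 Mpixels/s, but 1280x1024 needs ~98 - so a card that
    // flies at low resolution can hit its fill-rate ceiling well before
    // 1280x1024.
    return 0;
}
```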
nVidia still hasn't shown their big hand at CGDC, the Riva TNT. Instead, they were handing out spec sheets for the TNT, and one of their reps gave us a little more information. The TNT should handle fill rates a little more than twice as fast as the 128ZX. It will also have 32-bit internal rendering, full-scene antialiasing, TV Out, improved AGP support, an improved geometry setup processor, a 250 MHz RAMDAC, support for 200 MHz memory, and full OpenGL ICD support. However, without any actual equipment on display, all the numbers have limited meaning. On paper, at least, the TNT sounds like it will compete closely with the G200.
There has been some confusion between the PCX2, the PVNG, and the PVRSG. To keep it simple, the new PowerVR chipset is referred to by the company as the "Second Generation", the PVRSG. A couple of demos were on display, but there wasn't enough time to meet with the company reps to discuss details. People experienced with 3d hardware probably remember that the PowerVR doesn't use a Z-buffer; the PVRSG instead uses 32-bit accuracy for its hidden surface removal, which is the equivalent of a 32-bit Z-buffer without the heavy memory requirements. It also supports full-scene antialiasing and bump-mapping. At the show, early impressions were that the PVRSG isn't nearly as smooth as Voodoo2 SLI, but it may be close to a single Voodoo2.
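For a sense of what "heavy memory requirements" means here, a quick calculation (plain arithmetic, not PowerVR's own numbers) of what a conventional full-screen 32-bit Z-buffer costs in card memory:

```cpp
#include <cstdio>

int main()
{
    int modes[][2] = { {640, 480}, {800, 600}, {1024, 768} };
    for (int i = 0; i < 3; ++i) {
        // 4 bytes of depth per pixel for a 32-bit Z-buffer.
        double mb = modes[i][0] * modes[i][1] * 4.0 / (1024.0 * 1024.0);
        printf("%4dx%-4d: %.1f MB of Z-buffer\n", modes[i][0], modes[i][1], mb);
    }
    // At 1024x768 that's about 3 MB - a big bite out of a typical card's
    // memory that would otherwise hold textures and the frame buffer.
    return 0;
}
```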
3Dlabs was present showing off the Permedia 2. The boards have been out for a little over six months now, and the technology is really more akin to Voodoo1 than anything else. However, it is worth noting that it offers a full OpenGL implementation, geometry setup, 2d with a 230 MHz RAMDAC, MPEG-2 acceleration, and full-scene antialiasing. Sounds like a good card for a CAD/3DStudio/Lightwave animator who likes to play games and isn't too picky about the resolution the games run in.
The rumors are true about S3. They're finally getting serious about real 3d acceleration. S3 showed off the Savage3D at CGDC. Believe it or not, even the Savage3D looked like it could compete with the PVRSG and G200. Its special features include trilinear filtering, 125 MHz SDR SGRAM support, TV Out, OpenGL, and a 250 MHz RAMDAC. Notice the similarity here? The performance of many of the new generation of cards seems fairly similar at first glance. While there are certainly considerable differences between them, the indication is that the fill rate on these cards is so fast that unless you're trying to push high resolutions (especially over 1024x768), the bottleneck will be how fast your CPU can do geometry calculations. This is a big part of why the fancy Sega arcade boards continue to be faster than PC 3d cards that look faster on paper.
Which brings us to Rendition. While Rendition didn't have any new chipsets to show off at CGDC (the most I could get out of them was "yes, we're working on something new"), they are trying to help solve the CPU problem. The effort is dedicated to helping board vendors add Fujitsu's geometry processor to boost the number of polygons per second. While this important feature should certainly work in Rendition's proprietary RRedline API, and apparently can be adapted to OpenGL fairly easily, whether the geometry processing can be made to work in D3D without support from Microsoft is another issue.
While my personal 3d programming experience may be somewhat limited, it does seem that any use of geometry transform acceleration will require explicit support within the API in order to be put to use. Otherwise, you rely solely on the CPU for transforms. Should it succeed, Rendition and Fujitsu could be paving the way for a coming generation of 3d cards that combine high-quality rendering with highly detailed scenes. Unfortunately, it sounds like it will be an uphill battle. We'll try to get a little more information on this as soon as possible.
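For a rough idea of the work being offloaded, here's a sketch (hypothetical code, column-major matrix convention) of the per-vertex transform loop a CPU otherwise has to grind through every frame:

```cpp
#include <cstddef>

struct Vec3 { float x, y, z; };

// Transform n vertices by a 4x4 matrix (column-major) and do the
// perspective divide - the inner loop a geometry processor would take
// off the CPU.
void TransformVertices(const float m[16], const Vec3* in, Vec3* out, size_t n)
{
    for (size_t i = 0; i < n; ++i) {
        const Vec3& v = in[i];
        float w  = m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15];
        out[i].x = (m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12]) / w;
        out[i].y = (m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]) / w;
        out[i].z = (m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]) / w;
    }
    // At roughly 30 floating-point operations per vertex, 100,000 vertices
    // per frame at 30 fps is on the order of 100 MFLOPS of transform work -
    // a serious load for a 1998-era CPU, and the reason an API hook for
    // offloading it matters so much.
}
```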
CH had a new entry-level USB stick on display, modeled after their "Gamestick 14". This version was much simpler: a couple of buttons, a trigger, a hat, and a throttle control. It's designed to look like a futuristic flight stick and, interestingly enough, is ambidextrous as well. The throttle was awkward to use, but if the production version has a smoother throttle and a fairly low price point (unknown as of CGDC), it should make a good beginner's stick. Naturally, you'll want to add some rudder pedals if you're going to use it for a helicopter sim.
Now you probably have some idea of the trends in hardware at CGDC, but let's just recap:
PCI sound has become the new standard for sound card manufacturers. While it promises a little extra speed for gamers, it will have to offer something substantially newer to entice people to upgrade from their current sound card equipment - which is where the large number of simultaneous sound streams comes in. Game developers will need to support it to make it worthwhile, which somehow seems doubtful at the moment.
USB is "in", although you may have to wait a little over a year before you can replace your high-end sticks with USB versions. Nevertheless, it should be much tidier than the current tangle of cables we find ourselves in.
OpenGL is definitely in. Amazing how industry support for an entirely different API can be spurred on by just one series of games (discounting modeling packages, anyhow). Expect substantially more games to be announced supporting OpenGL by the end of the year. One game that was originally being written for D3D is having its engine re-written to use OpenGL.

Fill rates are up everywhere. In order to get good 3d performance, you've got to have both good polygon performance and good fill rate. That means that faster 3d cards will let you run higher resolutions, but Intel (or maybe Fujitsu) will be what lets you increase the object detail or number of objects.
3d cards will support faster RAM configurations, more accurate color and z-plane rendering, TV Out, and full triangle setup, and the ones that support 2d will have greatly improved AGP support and at least a 230 MHz RAMDAC - good timing considering how affordable 19" and 21" monitors have become.
However you look at it, it does look like the PC will transition over the next couple of years into a machine that handles the needs of gamers much more smoothly. Less joystick mess, smoother and more standardized 3d acceleration, more flexible sound. Rather than having game-playing ability force-fed into a machine that was never really meant for it, the gamer's PC will be a much more game-friendly environment. This will allow developers to focus more on the game and less on hardware limitations. That, after all, is the real goal for both hardware and game designers.