
ELSA Gladiac GeForce2
Mini Review by Dr. John

  The ELSA 'Gladiac' video card has been out for a while, but we had not taken a careful look at it until now. The Gladiac is based on NVidia's GeForce-2 GTS graphics chip, coupled with double data rate (DDR) memory.  The model we looked at (revision B) carried 32MB of Infineon DDR SGRAM (synchronous graphics RAM).  The layout of the board was nice, and the hardware appeared well built. I would have liked to see a slightly larger heat sink to go with the fan, but since core overclocking on GTS cards does little to increase performance, this was not a big problem.

  Setup: I installed the Gladiac in a 700MHz Pentium III test system:

  • 700MHz Pentium III "E" (retail)
  • Asus CUV4X motherboard (BIOS rev. 1004)
  • 128MB PC-133 SDRAM
  • Antec 250W power supply

  The setup had a fresh install of Windows 98SE, plus DirectX 7.0a, version 4.22 of VIA's AGP miniport driver (from the VIA 4-in-1 driver set), and version 5.30 of NVidia's 'Detonator' unified driver set.  BIOS settings included 2x AGP, a 64MB AGP aperture, and SDRAM configured by SPD.  When the bus frequency was set at 100MHz, I set the memory clock/CPU clock ratio to 4/3, meaning that the memory ran at 133MHz.  When the front side bus was overclocked above 112MHz, the memory clock/CPU clock ratio was set back to 1/1.
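  The memory clock follows directly from the bus frequency and the chosen ratio. Here is a minimal sketch of that arithmetic (the helper name is my own invention, not from any real tool):

    # Illustrative only: memory clock derived from FSB speed and clock ratio.
    def memory_clock(fsb_mhz: float, ratio: float = 4 / 3) -> float:
        """Effective SDRAM clock for a given front-side-bus speed and ratio."""
        return fsb_mhz * ratio

    print(memory_clock(100))       # 100MHz FSB at a 4/3 ratio -> ~133MHz memory
    print(memory_clock(115, 1.0))  # above 112MHz FSB the ratio was set to 1/1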

  Unlike our earlier experiences testing GeForce cards, the newer Detonator drivers (5.30 and above) made the Gladiac installation a plug & play experience.  No more black screens, no more flashing textures, just good clean 3D acceleration! If you buy a GeForce or GeForce-2 card, get the latest Detonator drivers from NVidia's web site.

Performance Scaling: To get some basic numbers, I tested the Gladiac in three systems, with and without overclocking.  The first was a Celeron-2 600 running at 600MHz, and overclocked to 765MHz (85MHz bus).  The second was a Pentium III 500E with PC-133 SDRAM, running at both 500MHz and 750MHz. The third was a PIII 700E running at 700MHz.  The graph below shows the numbers from the newest version of 3D Mark 2000 (version 1.1, Madonion.com).

  In the legend of the graph above, the system speed is the first number and the bus speed the second. Clearly, CPU speed has a big effect on the Gladiac's performance. In tests with the Celeron-2, the CPU is obviously the rate-limiting factor, since changing the resolution and color depth had little effect on Direct 3D performance.  The Celeron-2, even when overclocked to 765MHz, did not equal the performance of the 500MHz Pentium III.  But at higher resolutions and 32-bit color, the Gladiac starts to become the limiting factor, and the performance differences between test systems become much smaller.

Card Overclocking:  To test the benefit of overclocking the GeForce-2 GTS, I ran 3D Mark 2000's Direct 3D tests with and without overclocking and compared the results.  I used the Coolbits tweak, built into the Detonator drivers, to do the overclocking.

To enable Coolbits, open the registry key HKEY_LOCAL_MACHINE\Software\NVIDIA Corporation\Global\NVTweak, make a DWORD value in the NVTweak key called Coolbits (no quote marks), and set the value to 3.
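If you would rather script the tweak than edit the registry by hand, here is a minimal sketch using Python's standard winreg module; the key path and value come from the article, and the script is just one way to apply them (Windows only, run with administrator rights):

    # Sketch: apply the Coolbits tweak via Python's winreg module.
    import winreg

    key_path = r"Software\NVIDIA Corporation\Global\NVTweak"

    # CreateKeyEx opens the key, creating it if it does not already exist.
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                            winreg.KEY_SET_VALUE) as key:
        # A DWORD named Coolbits set to 3 unlocks the clock sliders.
        winreg.SetValueEx(key, "Coolbits", 0, winreg.REG_DWORD, 3)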

  The chart below is for a Pentium III system at 700MHz (100MHz front side bus), with the Gladiac running at both default and overclocked speeds.  The default speeds on the Gladiac are 200MHz on the graphics core and 333MHz on the DDR memory.  I was able to overclock the Gladiac to 216MHz on the core and 360MHz on the memory without creating visual artifacts.  Overclocking the Gladiac did not have a dramatic effect on its Direct 3D performance.  The chart compares results at default settings with a core setting of 212MHz and the memory set to 360MHz.

  Overall, the increase in Direct 3D performance from GeForce overclocking was minimal, under 5% even in the best circumstances. As with other GeForce-2 GTS cards, boosting the core speed without boosting the memory speed had almost no effect on benchmark scores (less than 1% at 216MHz).

  Only memory overclocking had a significant effect on performance, and even then the boost was marginal. Overclocking the memory alone from 333MHz to 360MHz took the 3D Mark 2000 score from 4478 to 4688, or about 4.7 percent (1024x768x32).  Obviously, memory speed, rather than core speed, is the bigger bottleneck for GeForce-2 GTS cards.
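  That percentage comes straight from the two scores; a quick sanity check of the arithmetic (illustrative only):

    # Percentage gain between two 3D Mark 2000 scores (illustrative).
    def percent_gain(before: float, after: float) -> float:
        return (after - before) / before * 100

    # Memory at 333MHz -> 360MHz, 1024x768x32: 4478 -> 4688 points
    print(f"{percent_gain(4478, 4688):.1f}%")  # prints 4.7%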

Bus Overclocking:  I ran the Gladiac at a bus speed of 155MHz with the PIII 500E.  The AGP slot was set to 1/2 of the 155MHz bus, for an AGP frequency of 77.5MHz.  The card did not give me any problems at that speed.  I also tried the 120MHz/40MHz frequency option, where the AGP slot runs at 80MHz (twice the PCI frequency).  In most overclocking situations, 80MHz is the fastest you will need an AGP card to run.  The system ran perfectly at this speed, meaning that the Gladiac is stable when the AGP slot is pushed from the default 66MHz up to 80MHz, an increase of roughly 20%.
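The AGP figures above follow from simple dividers off the bus clock. A small sketch of that math (the function name is my own; the frequencies are the ones from the article):

    # Illustrative AGP clock math for the bus speeds discussed above.
    def agp_clock(fsb_mhz: float, divider: float) -> float:
        """AGP slot frequency as a fraction of the front-side-bus clock."""
        return fsb_mhz / divider

    print(agp_clock(155, 2.0))  # 1/2 divider at 155MHz FSB -> 77.5MHz AGP
    print(agp_clock(120, 1.5))  # 120/40 option: AGP = 2x PCI -> 80.0MHz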

Transform and Lighting Engine:  The so-called transform and lighting (T&L) engine in GeForce and GeForce-2 cards has been the source of much controversy.  The fact is that any type of 3D hardware needs software support.  Software developers must write specific code into games to make the Pentium III's "streaming-SIMD" instructions speed a game up, and likewise must write different code to take advantage of the T&L engine on GeForce cards.  Because GeForce cards are quickly becoming the dominant video platform for the PC, you can expect most future games to include T&L code.  But older games will not benefit from the GeForce and GeForce-2 as much, due to the lack of T&L code.

  To see how the T&L engine stacks up against the PIII's SIMD extensions, I ran 3D Mark with each of the two 3D optimization methods enabled and compared the results.  The two sets of bar graphs below show the difference on a Pentium III 500E running at 500MHz.

  The next thing I wanted to check was whether the PIII optimizations would fare better at a higher clock speed.  So I set the 500E's bus to 150MHz (a 750MHz system rating) and ran 3D Mark 2000 with both GeForce T&L and Pentium III SIMD optimizations.  The graph below shows the results.

  Even at 750MHz (with a 150MHz bus) the T&L engine outperforms the PIII's SIMD instructions by a wide margin (nearly 30% at lower resolutions).  At higher resolutions and color depths, the performance difference between the two 3D-boosting methods is greatly reduced, as the GeForce's fill rate becomes the bottleneck.  But T&L is clearly superior to SIMD at common gaming resolutions.  This means that GeForce and GeForce-2 cards will age well, as more games come out with T&L support.

Full Scene Antialiasing (FSAA):  Full scene antialiasing can improve picture quality, but can also introduce some blurring of 3D objects.  The GeForce-2 has hardware-assisted FSAA, which can be enabled with the newer Detonator drivers. I tested only two settings: moving the D3D FSAA slider to the midpoint, and moving it all the way to the right.

  Image quality was very good at the middle setting, though slight blurring was evident.  Benchmark scores dropped significantly, but at lower resolutions they were still plenty good enough for gaming.  Setting FSAA to 'full' did not improve image quality over the middle setting to my eye.  The chart below shows the performance hit for the two FSAA settings I used.

  I'm not certain that the FSAA slider works correctly in NVidia's 5.30 driver set.  Benchmarks and image quality appeared almost identical at most resolutions and color depths whether the slider was set in the middle (+4) or at maximum (+8). And as you can see from the graph above, Direct 3D benchmarks at most resolutions and color depths were the same at the 1/2 and full FSAA settings.  This suggests that the slider behaves more like an on-off switch than a slider. I installed the newer 5.32 Detonator driver files and tried again; the 3D Mark scores dropped by a few points, but the FSAA slider still exhibited the same odd behavior.  The newest Detonator-3 (6.18) drivers just came out, but I did not have time to test them with the Gladiac before posting this.

  I think for most people it will be a toss-up whether you prefer 1280 x 1024 with 32-bit color and no FSAA, or 800 x 600 with 16-bit color and mid-level FSAA. The image is sharper at the higher resolution, but more realistic with FSAA at the lower one.  Some people will find FSAA irritating due to the slight blurring it introduces, but it does eliminate most of the artifacts present in 3D accelerated moving images.

Quake III Arena Scores:  No review would be complete without Quake III.  Most new games primarily support Direct 3D, but there are some great OpenGL games out there besides Quake, like Homeworld. I tested the Gladiac in an Athlon 700 system overclocked to a 105MHz bus (735MHz total), running on an Asus K7V motherboard.

Settings: V-sync off, sound disabled, geometry and detail set to maximum, lightmap, demo001.

Incompatibilities:  There have been reports of Gladiac incompatibilities with certain systems. I can confirm that initially we could not get the Gladiac to work on an Abit VT6X4 motherboard.  I was using a Gigabyte FC-PGA to slot-1 adapter card and the Intel boxed 800E processor (SL463).  The system booted into Windows without problems, but would not run any 3D application without hanging.  I downloaded the latest flash BIOS update for the Gladiac and updated the card's BIOS from version 0.00.03 to version 1.03.13.  The update worked without a hitch, and it fixed the incompatibility with the VT6X4 motherboard.  We also tested the Gladiac with an Asus K7V motherboard and an Athlon processor without any problems.

Summary:  As NVidia continues to add features like improved FSAA support and level-of-detail bias controls to its driver set, there are fewer and fewer reasons to recommend a Voodoo5 5500 card. In the current driver sets, NVidia's FSAA is not as well implemented as 3dfx's on the V5 5500. Yet GeForce-2 GTS cards like the Elsa Gladiac have substantially more 3D horsepower than the Voodoo5 5500, and the 32MB models cost about the same. If you have a very large monitor, like a 21" model, and you expect to run your Gladiac at high resolutions like 1600 x 1200, you will want the 64MB model.  But for folks with 19" or smaller monitors, the 32MB model should be fine.

  Until GeForce-2 Ultra cards hit the shelves, GeForce-2 GTS boards are the fastest consumer-level 3D accelerator cards available. They have just about all the features you could want (except maybe environmental bump mapping).  The Elsa Gladiac is a well built, excellent performing GeForce-2 GTS card that runs reliably at high AGP frequencies, making it a good card for overclockers. All of the 'Cons' listed below can be considered relatively minor.


Pros: 
  • Among the fastest gaming cards available 
    ('nuff said! but there's also...)
  • Most features you could want in a 3D card
  • High quality construction
  • Really good drivers and utilities (finally!)
  • Overclocks well on the AGP slot (at least 80MHz)
  • Decent full-scene spatial antialiasing (FSAA)

Cons: 
  • Fairly high price
  • Possible motherboard incompatibilities 
    (a BIOS update for the Gladiac will fix this)
  • No environmental bump mapping
  • FSAA needs more work
  • No Glide support

Price: Approximately $300 US

Rating, Elsa Gladiac 32MB: 4.8 out of 5 smiley faces (96%) 
:) :) :) :) :)-

Availability: Good

 

Copyright August 16th, 2000