Directx 8.1

It's actually down to the card itself, not the software, as to how well DirectX 8.1 and 9 perform. For example, in upgrading from the poor-SM2.0-performing GeForce FX 5900XT to a GeForce 6800, I saw a massive increase of over 5,400 in the DirectX 9-heavy 3DMark03 (from 4,900 to 10,300), compared with only 900 in 3DMark2001SE, which is more CPU dependent. The GeForce 6800 runs both DirectX 8.1 (UT2004) and DirectX 9 (Far Cry) games superbly, whereas the GeForce FX 5900XT ran UT2004 well but struggled with Far Cry when SM2.0 was forced (it uses SM1.1 and 1.4 by default). Compare Far Cry's image quality on a GeForce FX 5900XT, which uses SM1.1 and 1.4, with that of an ATI Radeon 9800, which can run SM2.0 well, and the difference is immediately apparent: the GeForce 5900XT produces nasty, blocky-looking banding in places (like the floor of the blue room in the Architect level). So Far Cry runs better with lower shader models, but at the expense of image quality. With the GeForce 6 series, you can have both the image quality and the performance.

I hear what you say, but I have two real problems with that argument. The first is that I discount all 3DMark-type benchmarks: they are synthetic, and all manufacturers code their drivers to behave differently when they detect these applications. The second is real-world performance: I see a marked increase in framerates at 1024x768 with medium/high quality when I swap my DX8 card for my 6800GT.
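
Coming back to the point about it being down to the card itself: Direct3D 9 games query the hardware's capabilities at startup and pick the best shader path the card reports. Below is a minimal sketch of that kind of check using the standard IDirect3D9::GetDeviceCaps call; the fallback thresholds and messages are my own illustrative assumptions, not how Far Cry's renderer actually chooses its path.

// Minimal sketch: ask Direct3D 9 which pixel shader model the card exposes
// and pick a rendering path accordingly. Build on Windows and link d3d9.lib.
// The path-selection logic is illustrative only, not any real engine's code.
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        printf("Direct3D 9 is not available on this system\n");
        return 1;
    }

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        printf("Could not query the adapter's capabilities\n");
        d3d->Release();
        return 1;
    }

    // PixelShaderVersion encodes the highest pixel shader model the driver reports.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
        printf("SM2.0 or better: take the full DirectX 9 path\n");
    } else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) {
        printf("SM1.1/1.4 only: fall back to the DirectX 8.1 path\n");
    } else {
        printf("No pixel shaders: use the fixed-function path\n");
    }

    d3d->Release();
    return 0;
}

Note that a GeForce FX 5900XT will pass the SM2.0 check here, which is exactly why the interesting question is not whether a card can run a path but how fast it runs it: that card's weak SM2.0 performance is what the benchmark and Far Cry observations above are really measuring.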