GeForce 2 Fix!

If you guys are using Detonator 2, reinstall your original drivers! For some reason the Detonator 2 drivers do not allow the use of the W-buffer.

(Edited by Chill at 3:56 am on Dec. 8, 2001)
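
If you want to see what your current drivers actually report, here is a minimal sketch against the DirectX 8 SDK that queries the HAL device caps for W-buffer support. Purely illustrative, not anything from OFP itself:

    // Minimal sketch: ask the installed driver whether it exposes
    // W-buffer support through DirectX 8 (this is what would make
    // the option selectable in a game's video menu).
    #include <d3d8.h>
    #include <stdio.h>

    int main()
    {
        IDirect3D8 *d3d = Direct3DCreate8(D3D_SDK_VERSION);
        if (!d3d) {
            printf("DirectX 8 is not available\n");
            return 1;
        }

        D3DCAPS8 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        if (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER)
            printf("Driver reports W-buffer support\n");
        else
            printf("No W-buffer support with this driver\n");

        d3d->Release();
        return 0;
    }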

Most people have moved on and are using Detonator 3 or Detonator 4 drivers. Which particular version -- not generation -- of the Detonators are you using, and with which card?

Umm, I had Detonator 2 and the W-buffer works for me, but I switched to an older driver just in case. Is this an NVIDIA card problem?

No, NVIDIA cards are officially supported and work fine. However, they should be run in 32-bit color mode, not 16-bit. Using 16-bit mode can cause problems like flickering or missing textures on things like helmets.

Haha Mister Frag, you try running OFP on a GeForce2 MX at 1024x768 in 32-bit color -- you'll be watching a slideshow. It runs smooth as silk on mine in 16-bit color but terribly in 32-bit, which is why I wish the W-buffer worked. Stinks that it's the older drivers that work; I guess we either have to live with it running badly and looking good, or running well and looking bad, if you've got a slower GeForce2.

Ok, I downgraded my NVIDIA drivers to version 6.something and could then enable the W-buffer -- whoohoo! But then I couldn't go above 800x600 in OFP, not to mention it ran pretty poorly. Is there any way to hack the newer drivers to enable the W-buffer?

Quote from TheRealAlan on 6:58 am on Dec. 8, 2001:

"Haha Mister Frag, you try running OFP on a GeForce2 MX at 1024x768 in 32-bit color -- you'll be watching a slideshow. It runs smooth as silk on mine in 16-bit color but terribly in 32-bit, which is why I wish the W-buffer worked. Stinks that it's the older drivers that work; I guess we either have to live with it running badly and looking good, or running well and looking bad, if you've got a slower GeForce2."

GeForce2 MX (Creative 3D Annihilator MX)

1024x768, 32-bit color mode

Latest Detonator Drivers

---- Absolutely no problems here, smooth as can be.

Ok since I seem to be the idiot here, I have the following:

1.2 GHz T-Bird

384 MB RAM

GeForce2 MX 32 MB

DirectX 8.1

and a VIA chipset with the latest 4-in-1 drivers

Anything I've ever tried to run in 32-bit color has been very choppy, even scrolling through webpages with the desktop at 32-bit. And I'm a tweaking nut, always trying to get things to run better, but I've never gotten the performance out of my GeForce2 that I wanted -- though I figured that's why it was a $50 card. So if there's some trick to getting 32-bit color playable, please let me know. The very newest Detonator drivers that came out last week seem to decrease performance, so I went back to version 22.83 or whatever they are.

What kind of performance increase would the W-buffer give for, say, a GeForce2 Pro?

At the moment I run at 1024x768x32 on a P3 800 and get around 25-35 fps in that Malden intro (where the APC comes along and shoots those Russian dudes).

23.11 makes OFP run slightly smoother, but I still can't enable the W-buffer from the video menu.

Do I have to do some command-line thing or what?

nocabiwik, the W-buffer only matters if you're running in 16-bit color, and it's there to make the game look better, not run better. In 32-bit color you don't need it. And you're not going to be able to enable it anyway, because apparently the last several generations of drivers haven't supported it.
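
For anyone curious why the W-buffer makes 16-bit mode look better: a normal Z-buffer stores depth non-linearly, so almost all of its 65,536 steps are spent close to the camera, while a W-buffer spreads them evenly over the whole view distance. A little arithmetic sketch (the near/far plane distances below are illustrative guesses, not OFP's actual settings):

    // Depth resolution of a 16-bit Z-buffer vs. a 16-bit W-buffer.
    // The near/far plane distances are illustrative guesses, not OFP's.
    #include <cstdio>

    int main()
    {
        const double n = 1.0;         // near plane, meters (assumed)
        const double f = 2000.0;      // far plane, meters (assumed)
        const double steps = 65536.0; // 2^16 quantization steps

        const double distances[] = {10.0, 100.0, 1000.0};
        for (double z : distances) {
            // A Z-buffer stores a + b/z, so one 16-bit step at eye depth z
            // spans z^2 * (f - n) / (n * f * steps) meters.
            double zStep = z * z * (f - n) / (n * f * steps);
            // A W-buffer stores eye depth linearly: every step spans the
            // same (f - n) / steps meters, no matter how far away you are.
            double wStep = (f - n) / steps;
            printf("at %6.0f m: z-buffer step = %8.3f m, w-buffer step = %.3f m\n",
                   z, zStep, wStep);
        }
        return 0;
    }

With those assumed planes, a 16-bit Z-buffer step at 1000 m spans about 15 m -- exactly the range where distant objects start flickering and fighting -- while the W-buffer keeps every step at about 3 cm.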

TheRealAlan, have you tried running MadOnion's 3DMark2001 and using the online results browser to compare your throughput to that of other people with similar systems?

I have no idea right now what might be wrong with your particular configuration, but a GeForce2 MX should run OFP just fine.

Regarding the W-Buffer, I can enable it on my system (with the V23.11 Detonator drivers) using NVMax.
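
For the record, once a driver exposes the capability, turning the W-buffer on from the application side is a single Direct3D 8 render state; presumably a tweaker like NVMax just flips the driver-side capability bit so games see it. A hedged sketch of the application side:

    #include <d3d8.h>

    // Sketch: request W-buffering if the driver allows it, otherwise
    // fall back to an ordinary Z-buffer. 'device' is assumed to be an
    // already-created Direct3D 8 device.
    void EnableWBufferIfSupported(IDirect3DDevice8 *device)
    {
        D3DCAPS8 caps;
        device->GetDeviceCaps(&caps);

        if (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER)
            device->SetRenderState(D3DRS_ZENABLE, D3DZB_USEW);
        else
            device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
    }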

In 16-bit mode it is so, so smooth! There is absolutely no visible difference between 32-bit and 16-bit, except in 16-bit it is a lot smoother! I have a P3 1 GHz with a 64 MB GeForce2 card.

Do you have UDMA enabled? I had to download a fix for my chipset to get it to work -- it speeds everything up no end. Before I installed the fix, UDMA mode wouldn't even cut and paste files above 20 megs; I can now install games in seconds rather than 20 minutes.

Ok people, the answer to your problems is as follows:

I e-mailed NVIDIA a few months back asking about the exact same GeForce 2 MX 32-bit versus 16-bit performance issues you are discussing. They said that some of the GeForce 2 MX family of cards have a narrower memory interface than others, which is the reason why some people report that their cards run 16- and 32-bit colour with no framerate loss, whilst others have really poor 32-bit performance.

One example:

Many different brands of GeForce 2 MX 200 have a 64-bit memory interface, whilst the GeForce 2 MX 400 has a 128-bit interface. I have actually compared these two cards in performance tests (using the Palit Daytona brand cards) and the 400 version runs 32-bit colour at exactly the same speed as it runs 16-bit.

The 200 version, on the other hand, ran 32-bit colour at a crawl, especially when I was testing it on Return to Castle Wolfenstein and Op Flash. At 16-bit colour it ran like a dream, but in many games the 16-bit colour mode caused a lot of graphic glitches, in Op Flash especially.

This isn't restricted to the 200 and 400 versions of the GeForce 2 MX either, because I have tested Op Flash on an old Hercules 3D Prophet GeForce 2 MX (yep, the very first version) and in 32-bit colour mode it ran exactly the same as in 16-bit.

The whole problem comes down to what size memory interface the manufacturer decides to put on its GeForce 2 card. Some brands put a 128-bit interface on their lowly GeForce 2 MX 100s, while other brands put a 32-bit interface on the same version of the card.

It seems like when it comes to building the cards there is no industry standard for what the features should be, especially when it comes to memory.

So I hope that explains it for you. The people who are complaining about bad 32-bit framerates might have a 64 MB GeForce 2 MX 200, but one with a memory interface of only 64 bits or less. The people who say their cards run at exactly the same speed in 32-bit colour as in 16-bit might have the same 64 MB GeForce 2 MX 200, but with a 128-bit memory interface.

Like I said, the whole issue comes down to what size memory interface the manufacturer has decided to put on the card. You might seem to have a great 64 MB GeForce 2 MX 400, but it might only have a 64-bit memory interface.

Keep it in mind when you're going to look for another GeForce 2 card.

I also have a feeling that nowadays anything above the GeForce 2 MX 400 DOES have a 128-bit memory interface. I searched a lot of different card specs on the net, and all of the cards newer than the GeForce 2 MX 400 (from the GeForce 2 GTS DDR all the way up to the latest GeForce 3 cards) were listed with a 128-bit interface.

Oh, and I tested the cards using the latest, and some of the older, Detonator drivers, and I also tweaked the BIOS settings (and used NVMax) to get the best performance possible. So it DOESN'T have anything to do with what features are enabled or disabled; it is (as NVIDIA itself explained) the memory interface.

Memory interface, my friends... memory interface indeed.
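
To put rough numbers on why the interface width matters: peak memory bandwidth is roughly bus width times memory clock (doubled for DDR), and 32-bit rendering pushes about twice the data of 16-bit. A quick sketch, with clock speeds taken from typical published specs (assumptions, not my own measurements):

    // Rough peak-bandwidth comparison for a few GeForce2 configurations.
    // Clock speeds are typical published figures (assumed, not measured).
    #include <cstdio>

    // Peak bandwidth in GB/s:
    // bus width (bits) / 8 bits-per-byte * clock (MHz) * pumps (1=SDR, 2=DDR)
    double bandwidthGBs(int busBits, double clockMHz, int pumps)
    {
        return busBits / 8.0 * clockMHz * pumps / 1000.0;
    }

    int main()
    {
        printf("MX 200,  64-bit SDR @ 166 MHz: %.1f GB/s\n", bandwidthGBs(64, 166, 1));
        printf("MX 400, 128-bit SDR @ 166 MHz: %.1f GB/s\n", bandwidthGBs(128, 166, 1));
        printf("GTS,    128-bit DDR @ 166 MHz: %.1f GB/s\n", bandwidthGBs(128, 166, 2));
        return 0;
    }

Under those assumptions the 64-bit card has about 1.3 GB/s against the 128-bit card's 2.7 GB/s, so a 64-bit MX simply can't feed 32-bit rendering at the same framerate -- which matches what everyone here is seeing.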

Hey RealAlan,

What motherboard do you have?

A7V?

A7V Pro?

A7V266?

This might help us come up with a fix for you.

thanks,

Burl

A7V

ThunderBird 750

512 MB PC133 SDRAM

20 GB WD HDD

GeForce2 MX400 64 MB (Jaton)

SoundBlaster Live! Value

Netgear 10/100 NIC
