Helmut_AUT

Performance woes - 1920x1200 with 9600GT


You guys are funny. Want me to provide FRAPS screenshots of Stalker and other games so you can compare how they look and run here, or what?

I know what I'm seeing. And when I see Fallout 3 (released about six months ago, so I would count that as pretty current) running at full detail at 1920x1200 at full tilt, with grass, trees, fences and houses everywhere at a very large view distance - and then I turn A2 down to 500 meters VD, set all details to low and get half the framerate at best - I guess that means my 9600GT magically degraded in the last six months?

I really shouldn't care what some fellow gamers on a message board tell me, but the fact is that I won't accept the argument that the 9600GT can't run modern games at high resolution. It can, and it does.

However, it never managed to get anywhere near the framerate out of A1 that it gets in Stalker and FO3 either, so maybe we could just agree that BIS never had the fastest graphics engine in town - especially since walking into a forest on Chernarus can bring much newer cards to their knees.

But I guess I'm just stupid for expecting a current game at low details to perform on par with a six-month-old game at high details, hmm?

http://www.guru3d.com/article/geforce-9600-gt-512mb-6way-shootout-review/25

Nope, and no amount of arguing will change anything. Sincerely, enjoy your 9600 GT.

Eth

Edited by BangTail


I really shouldn't care what some fellow gamers on a message board tell me, but the fact is that I won't accept the argument that the 9600GT can't run modern games at high resolution. It can, and it does.

OK, run ArmA on an empty island with no AI... tell me how it goes.

You can't compare Stalker/Fallout to this game at all because of the AI etc. Whereas the "enemies" just spawn near you in those games, they don't in ArmA, so the CPU is doing a lot more than just processing 10 monsters near you.

It's not apples to apples, and while I agree that the ArmA II engine is far from optimised at this stage (and drivers also), a 9600GT costs almost as much as the game itself... doesn't that say something? It's a midrange card from over a year ago - heck, I have had 2 GFX cards in that time :) It's a budget card aimed at casual gamers who usually don't need all the "bells and whistles" or run super high resolutions with AA/AF. I.e. I'd get one as an option for an HTPC perhaps.

LOL, I have a state-of-the-art system and I can't get more than 35 fps on average, and like I said I'd be happy with 50 :) (Normally anything less than 100 and I cry.)


Dear Ethne,

your gleefully posted benchmark actually proves nothing. They are using software-based anti-aliasing, while newer drivers since then have brought both more performance AND support for normal hardware FSAA (I'm using 2x FSAA). They are using 16x aniso, I'm using 2x aniso.

And my 9600GT is factory overclocked to 2000 MHz memory and 700 MHz core, not 1800 and 650.

They are getting 32 avg., up to 37 for the overclocked card; I'm getting 40+, which is easily explained by the above differences in settings, drivers (174 vs. 185) and clock speed compared to a standard sample.

So, the fact remains I can run modern games - as recent as F3, which performs even better than Stalker since Nvidia really optimized their drivers for F3 and Oblivion - at very playable framerates (always above 30) with max details, but A2 gets framerate drops down to 15 FPS even at the lowest quality settings. Don't tell me it's just the resolution - the facts clearly speak against it. A2 has no advanced graphics effects or features (compared to Stalker, F3) that would justify so much lower a framerate by themselves, and even turning the view distance down to 500 m has no noticeable effect.

Frag, I always test ArmA framerates on empty islands in the editor, running around a specific path or being driven by AI, with some weapons firing to get smoke FX. So the CPU has zero to do in these tests.

As you say, your "state of the art system" gets you 35 avg, I get 29 avg on my settings. I guess you can also get drops as low as sub-20 on your system inside forests with smoke lingering or multiple light sources at night. Shouldn't the gap be MUCH larger if the engine were properly using its resources?
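(For anyone who wants to check numbers like these themselves: a minimal Python sketch along the following lines can summarize a FRAPS per-second FPS log. I'm assuming the usual benchmark "fps" CSV with one frames-per-second value per line; non-numeric header lines are skipped, and the file name below is just a placeholder.)

```python
# Minimal sketch: summarize a FRAPS per-second FPS benchmark log.
# Assumption: one numeric FPS value per line (FRAPS' "fps" CSV output);
# non-numeric lines such as headers are skipped.
def summarize(path):
    samples = []
    with open(path) as f:
        for line in f:
            token = line.strip().rstrip(",")
            if token.replace(".", "", 1).isdigit():
                samples.append(float(token))
    if not samples:
        raise ValueError("no FPS samples found in %s" % path)
    return min(samples), max(samples), sum(samples) / len(samples)

low, high, avg = summarize("arma2 fps.csv")  # placeholder file name
print("min %.0f / max %.0f / avg %.1f" % (low, high, avg))
```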

It's turned into quite a pointless discussion here now - I actually had forgotten about this topic until someone bumped it yesterday. But if you guys insist on knowing better than me what my framerates are and should be in other current games, I'm here all night to prove you wrong.

Edited by Helmut_AUT

Dear Ethne,

your gleefully posted benchmark actually proves nothing. They are using software-based anti-aliasing, while newer drivers since then have brought both more performance AND support for normal hardware FSAA (I'm using 2x FSAA). They are using 16x aniso, I'm using 2x aniso.

Gleeful? Hardly. I do tire of people wondering why their 2+ year old hardware won't run new games at high resolutions/details. It's great that it ran FO3 and STALKER to your satisfaction. You were lucky; it doesn't change the fact that the 9600 GT is an old, budget-oriented card.

And my 9600GT is factory overclocked to 2000 MHz memory and 700 MHz core.

They are getting 32 avg. at 1920x1200; I'm getting 40+, which is easily explained by the above differences in settings, drivers and clock speed.

It's not just about speed; there are MANY other factors involved.

So, the fact remains I can run modern games - as recent as F3 - at very playable framerates with max details, but A2 gets framerate drops down to 15 FPS even at the lowest quality settings. Don't tell me it's just the resolution - the facts clearly speak against it.

Not a fact when it comes to ArmA 2 (and probably most other hardware-intensive games now, and in the future).

I'm not going to continue this any further.

Budget card = budget performance. That's just the way it is, I'm afraid.

As I said before, I could have saved thousands if the 9600 GT could compete with other more expensive solutions. Sadly, it can't.

I sincerely wish you the best of luck finding a solution that makes the game playable for you :)

Eth

Edited by BangTail

The 9600GT only has 64 shading units - I'm gonna try a "Green edition" (need the low wattage for my weak PSU) 9800GT with 112 SPUs and see if that makes a remarkable difference.

It won't. Get an HD4890 or the like. Your CPU could also be an issue if it's an old Athlon dual-core.

Edited by echo1

It's turned into quite a pointless discussion here now - I actually had forgotten about this topic until someone bumped it yesterday. But if you guys insist on knowing better than me what my framerates are and should be in other current games, I'm here all night to prove you wrong.

Yes it has, hasn't it :) I do wish you luck in getting things running better, and I'll wait patiently over the next few weeks to hopefully get a performance boost... My GPU is not even in 2nd gear (only goes 10°C above ambient, ffs!) and my CPU is used at most 30%, so yeah, I feel your pain - fingers crossed for the next update!


Ethne, if it makes you feel all superior to claim that a 9600GT can't run recent games at 1920x1200, then be my guest. End of discussion for me too, since you obviously ignore your own benchmark links.

Frag, I expect either the 1.02 or the 1.03 patch to enable FSAA and come with at least some performance improvements. Reason being, they should know by now that people REALLY want FSAA, and while they are redoing the graphics code to include it they might as well pull out a few stops - for example, by adding an option to disable motion blur, or reducing shader demands.

Loveliest of all would be an option to totally disable post-processing. I still think they added it in A1 just to generate effects (sun glare, NVG eye adjustment) on top of an old engine that couldn't otherwise do them, while other current games use different, more efficient shading methods for light effects.

That is perhaps the weirdest thing about ArmA's engine of all: most other games have some quality settings with a massive impact on framerate - like 30% more frames from disabling such options. In A1 and A2, the most you get by going from super high to super low might be 15%; there is not a single option that has a particularly significant impact. Which leads me to believe it's the heavy shading that cannot be turned off.
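(Writing out what I mean by that 15%, as relative gain, with illustrative numbers rather than a measured benchmark:)

```latex
\text{gain} = \frac{f_{\text{low}} - f_{\text{high}}}{f_{\text{high}}},
\qquad \text{e.g.}\quad \frac{25.3 - 22}{22} \approx 0.15 = 15\%
```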

Edited by Helmut_AUT


I hope they will improve ArmA 2 with 1.02 like they did ArmA with 1.08 :) It is bad when you play in MP and looking down at the ground (only the ground texture) does not increase your fps. In the editor, or in MP with only two mates, it's near 40 fps, but with many players scattered around Chernarus I get only ~25 fps with my 4850e @ 2.7 GHz + HD 4850. But as said above, high-end systems get similar results...

Minimal PC Requirements

CPU: Dual Core Intel Pentium 4 3.0 GHz / Intel Core 2.0 GHz / AMD Athlon 3200+ or faster

RAM: 1 GB

Video Card: NVIDIA GeForce 7800 / ATI Radeon 1800 with Shader Model 3 and 256 MB VRAM or faster

OS: Windows XP

Optimal PC Requirements

CPU: Intel Core 2.8 GHz / AMD Athlon 64 X2 4400+ or faster

RAM: 2 GB

Video Card: NVIDIA GeForce 8800GT / ATI Radeon 4850 with Shader Model 3 and 512 MB VRAM or faster

OS: Windows XP or Vista

<- from the official wiki.

Have to say I get the guy's point - he's at the so-called 'recommended optimal' spec & can't play with anything over low settings. Makes you wonder how the game performs with the 'recommended minimum' spec...

I remember this farce well from ArmA 1 - the specs quoted on the box were woefully understated, to the point of being a downright lie... it'll probably be the 1.08 patch before you get a playable game...


That is perhaps the weirdest thing about ArmA's engine of all: most other games have some quality settings with a massive impact on framerate - like 30% more frames from disabling such options. In A1 and A2, the most you get by going from super high to super low might be 15%; there is not a single option that has a particularly significant impact. Which leads me to believe it's the heavy shading that cannot be turned off.

Totally agree. I have come to the same conclusion. There must be something wrong in the engine. The difference between very low and very high is too small.

The only thing that matters is fillrate. If I put it to 200%, my game turns into a slideshow at 1-5 fps. :D


Have to say I get the guy's point - he's at the so-called 'recommended optimal' spec & can't play with anything over low settings. Makes you wonder how the game performs with the 'recommended minimum' spec...

If you are referring to the original poster, his CPU isn't much better than the bare minimum, and for what it's worth, the graphics card is somewhat slower than the recommended one.

At any rate, the vast majority of games on the market will not run at 1920x1200 at high quality settings on a PC of the "recommended" spec. It's fair enough, though - if you gave the specs required to run at smooth framerates on high-res screens (bearing in mind that most people out there have screens of 1680x1050 or less) at high details, they'd be so high as to put people off. BIS is no different from most other companies in this regard.

If you are referring to the original poster, his CPU isn't much better than the bare minimum

I think his CPU is faster than the rec. opt. spec if you go by this -

AMD Athlon 64 X2 4400+ or faster (I've got a 5000+).

Fair point about the resolution - though I've seen plenty of people above the rec. opt. spec unable to get above low-medium settings at 1280 x 1080 res.

Rec. min. spec should be the bare minimum required to have a PLAYABLE game at low settings... Rec. opt. spec should be what's required to have a PLAYABLE game at HIGH settings at a reasonable resolution...

It really cheered me up, when I was preordering the game, to have a quick look at the Troubleshooting section afterwards - and see whole threads devoted to pipeline burst ratios/PCI-E exchange rates, DPC latencies etc. & a whole pile of other technical cr*p which is way over the head of the average PC user...

I particularly liked the 'Check your CPU>GPU<CPU etc etc' thread - lots of tests suggested by one of the moderators... basically saying that if certain ratios aren't high enough, you're basically stuffed, regardless of the hardware you're using. What's the point in providing a benchmark spec at all?

(What cheered me up was the warm sense of déjà vu from ArmA 1...)


The system requirements are kinda strange, because they seem to put a 2.8 GHz Core 2 Duo and an Athlon X2 4400+ in the same category (very big difference). I'd say the Intel spec is more correct, and the corresponding AMD should be 3 GHz+ (the X2 6000+, I think, is the appropriate model).

I have a similar spec to his, so I'm somewhat concerned - although my monitor is 1600x1200, not 1920x1200.

Edited by echo1


To clarify my issue with the "Minimum" and "Optimal Recommended" Specs:

Like I said, I would expect "Optimal PC Requirements" (thanks, Gutter, for posting them) to allow for high-quality graphics (at least medium to high settings) at an average resolution (1280x1024). If using a higher-than-average resolution (1920x1200), it should in turn be possible to run with low details and still get a very good framerate.

In my case, even when I was using 1280x1024, I still got regular slowdowns under 20 FPS (on an empty editor map, just with some smoke on screen) with medium to high quality settings. It's certainly not right to call these "optimal system specs" unless BIS thinks framerates under 20 are "optimal".

And in many Crysis benchmarks and other modern games, an overclocked 9600GT like mine is maybe 10, max. 15% slower than an 8800GT. That bit of performance difference doesn't explain slowdowns of 15 frames per second.
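(Rough arithmetic to illustrate, with assumed framerates rather than measured ones: if a recommended 8800GT managed 30 FPS in a scene, a card 15% slower should still manage roughly 25 FPS - a drop to 15 FPS is a deficit of a whole different order:)

```latex
f_{9600} \approx (1 - 0.15) \cdot f_{8800} = 0.85 \cdot 30 \approx 25.5\ \text{FPS},
\qquad\text{versus}\quad 1 - \tfrac{15}{30} = 50\%\ \text{observed deficit}
```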

As for the CPU: running the included Warfare missions on Chernarus gives me a constant 16 FPS. The CPU is bottlenecking, even if I disable some AI on both sides. So with a CPU better than the "Optimal PC Requirements" (I have an Athlon 64 X2 5000+), the majority of the included missions (also the freeform campaign missions) are unplayable due to the CPU bottleneck.

It's not like I believed the official system specs - as a long-time gamer I've learned that they are usually false - but I find it really cheap that a moderately aged dual-core CPU can't even play the included campaign because it requires too much AI handling. Basically, right now I can only play small editor missions that I create myself (thankfully SecOps and the ACM make this a bit more random) - and seriously, the 5000+ AMD isn't SO old that it should be unable to run the included content.

Edited by Helmut_AUT


Well, you'll get more answers when it releases in Europe - I'm sure there will be more complaints here :D

Panthe, you are right that the sluggish feel never entirely goes away, even in scenes with 40+ FPS (open field). It's like the movement is stuck at first, then accelerates to normal speed, compared to the direct, even movement you get in normal shooters, where mouse movement and screen movement seem directly related. Not so with BIS engines.

If you haven't, you might want to try the 185 drivers for the 8800 GTS - they gave me 10% free performance coming from the 182s. On WinXP they work very well; I've heard of problems for Vista users, however.

I'm running on XP and upgraded my Nvidia drivers from 182.x to 185.x yesterday; unfortunately I experienced a noticeable performance loss. How much exactly, I'll have to FRAPS first. When running on 182.x, forcing VSync off gave me a little boost, but with 185.x I saw no difference. So far it all sounds bad for me, but the good news is I no longer experience the "Receiving..." bug mid-game mentioned in that other thread, hence I can now fully enjoy multiplayer games.

Are you using Kegetys' no-bloom, no-blur mod yet to lower your shader load - though is it even allowed on every server? I'm giving it a shot now; the blurry zoom as the tank gunner is almost unbearable anyway.

Wishing you the best of luck with your performance tuning too, since we both run similar graphics cards. I've subscribed to this thread and will read up on it in case you achieve some major breakthrough :)


Shame to hear about the 185 problems for you. What card are you using? Maybe some default settings in the driver kicked in that you had disabled before?

Kegetys' mod gets me about 5 FPS, up from 22 to 28 even. Very important, although I liked the effect itself. Another thing that gained a few FPS is setting object detail to very low - I haven't yet seen what the difference is visually, but it helps notably with framerate.

VD, on the other hand, I now have at 3000 instead of 1200, and the benchmark in the Strelak village comes out the same for me.

I think that was about the deal - newer driver, no blur and very low object detail brought my average from 22 to 29, and the lows from 15 to 19. It doesn't look quite as nice as A1 with everything low, however, which is kind of ironic.
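(If anyone wants to flip the same settings outside the game, something like this Python sketch should do it. I'm assuming the plain key=value; format of the .ArmA2Profile file; viewDistance is a real key there, while sceneComplexity is my best guess for the value behind the object detail slider - verify against your own profile before trusting it.)

```python
import re
from pathlib import Path

# Placeholder path - point this at your own profile file.
profile = Path(r"C:\Users\me\Documents\ArmA 2\me.ArmA2Profile")

def set_key(text, key, value):
    # Rewrite a 'key=value;' line, assuming the profile's usual layout.
    return re.sub(r"%s=[^;]*;" % key, "%s=%s;" % (key, value), text)

text = profile.read_text()
text = set_key(text, "viewDistance", 3000)       # metres
text = set_key(text, "sceneComplexity", 100000)  # guessed object-detail key, lower = less
profile.write_text(text)
```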

I'm running on XP and upgraded my Nvidia drivers from 182.x to 185.x yesterday; unfortunately I experienced a noticeable performance loss. How much exactly, I'll have to FRAPS first. When running on 182.x, forcing VSync off gave me a little boost, but with 185.x I saw no difference. So far it all sounds bad for me, but the good news is I no longer experience the "Receiving..." bug mid-game mentioned in that other thread, hence I can now fully enjoy multiplayer games.

Are you using Kegetys' no-bloom, no-blur mod yet to lower your shader load - though is it even allowed on every server? I'm giving it a shot now; the blurry zoom as the tank gunner is almost unbearable anyway.

Wishing you the best of luck with your performance tuning too, since we both run similar graphics cards. I've subscribed to this thread and will read up on it in case you achieve some major breakthrough :)

Get rid of the 185 drivers, they are no good. Go with the 186s; they give better performance.

Another thing that gained a few FPS is setting object detail to very low - I haven't yet seen what the difference is visually, but it helps notably with framerate.

Did that - looks the same and performance is way better. Great tip!

Get rid of the 185 drivers, they are no good. Go with the 186s; they give better performance.

Also updated today - didn't know new drivers were out. Works like a charm, thanks!

Hi Guys.

Hi

Has anyone got a good handle on what the true performance bottlenecks for high-resolution graphics in A2 are?

Yes: match your hardware to your display, then choose the appropriate settings.

The weird thing is that most of the settings have little to no impact. I get 20 avg. on my 9600GT at 1920x1200, ...

And at that resolution (1920x1200) you will be getting less than 18 fps in Crysis. Your bottleneck is your card or your display.

The 9600GT only has 64 shading units - I'm gonna try a "Green edition" (need the low wattage for my weak PSU) 9800GT with 112 SPUs and see if that makes a remarkable difference.

It didn't "make a remarkable" difference. A 9800GT will be getting less than 25 fps in Crysis at 1920x1200. Then you have the VSync frame rate of 60 Hz/30.

Your bottleneck is your card or your display. Or play at 1280x1024 with default settings.
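(To spell out that "60 Hz/30" point - a general property of plain double-buffered VSync, nothing ArmA-specific: a frame that misses a refresh waits for the next one, so the framerate snaps to integer divisors of the refresh rate:)

```latex
f = \frac{60\ \text{Hz}}{n}, \quad n = 1, 2, 3, 4, \ldots
\;\Rightarrow\; 60,\ 30,\ 20,\ 15\ \text{fps}
```

So a card that could render at, say, 55 fps gets held to 30 until it can sustain the full 60 - one reason people force VSync off when benchmarking.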

Edited by kklownboy


I had the same problem as the topic author (a limit of 30 fps, with drops to 20 around more objects). I solved it by changing Nvidia driver settings, like canceling the AA override or something like that... I hope that gives you a clue. I own an 8800 GTS, and fps improved a bit, also after removing that stupid fps cap.

Yes: match your hardware to your display, then choose the appropriate settings.

Can we please stop with the "9600GT cannot run games at 1920x1200" myth? If you consider F3 and Stalker to be ugly games, maybe. Personally I'd say that A2 at low details and low view distance can actually look a bit worse than those games - and it still runs like crap.


The 9600GT is a low-to-midrange card these days (hell, an 8800GTX kicks its ass, & the top-of-the-line 9 series cards do too). It's another case of people overstretching their demands on hardware: big resolutions (in ArmA 2) require big graphics cards & big CPUs.

Edited by Razorman


And why does A2 require a "big graphics card" when at the lowest settings it doesn't look any better than other recent games running at twice the speed?

What exactly is ArmA 2 doing more, graphics-wise, that justifies the double demand on hardware? Is it just that they are triple-shading everything on the screen because the core engine is too old to do these effects any other way?

EDITED the rest since Razor and I are obviously much more in agreement than I thought. Sorry for the harsh tone.

Edited by Helmut_AUT

And why does A2 require a "big graphics card" when at the lowest settings it doesn't look any better than other recent games running at twice the speed?

What exactly is ArmA 2 doing more, graphics-wise, that justifies the double demand on hardware? Is it just that they are triple-shading everything on the screen because the core engine is too old to do these effects any other way?

EDITED the rest since Razor and I are obviously much more in agreement than I thought. Sorry for the harsh tone.

Not to start this again, Helmut, but - the game does have issues, there is no questioning that - the 9600 GT will never play ArmA 2 at 1920x1200 at a reasonable clip. As others have already explained, there is a lot more going on in ArmA 2 than in FO3 or STALKER.

Chances are that even after significant patching, the 9600 is not going to run ArmA 2 very well.

Eth

