Helmut_AUT Posted June 8, 2009

Hi guys,

Has anyone got a good handle on what the true performance bottlenecks for high-resolution graphics in A2 are? The weird thing is that most of the settings have little to no impact. I get 20 FPS average on my 9600GT at 1920x1200, and changing view distance, detail level or textures to low does little to increase that number. OTOH some scenery will provide 30+, but near objects (houses, trees) it's always down to 20. I can only assume it's the shading/post-processing that truly bottlenecks the performance? Kegetys' shader "fix" gives me at least 5 FPS more in some situations, which is after all 20% more.

The 9600GT only has 64 shading units - I'm gonna try a "Green edition" 9800GT with 112 SPUs (I need the low wattage for my weak PSU) and see if that makes a remarkable difference.

Also, I get the impression that on the larger Warfare maps, my CPU (5000+ AMD) actually starts bottlenecking me. Would be cool if there was a good way to benchmark these bottlenecks.
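The 64-vs-112 shader unit comparison can be put into rough numbers. The shader clocks below are approximate reference values (and "Green edition" boards are typically downclocked further), so treat this as a back-of-the-envelope sketch, not a benchmark:

```python
# Rough theoretical shader-throughput comparison of the two cards discussed.
# Clock values are approximate reference specs, not measured board clocks.
def shader_throughput(units, shader_clock_mhz):
    """Relative measure: shader units x shader clock (MHz)."""
    return units * shader_clock_mhz

gt9600 = shader_throughput(64, 1625)         # 9600GT: 64 SPs @ ~1625 MHz
gt9800_green = shader_throughput(112, 1375)  # "Green" 9800GT: 112 SPs @ ~1375 MHz (assumed)

ratio = gt9800_green / gt9600
print(f"Theoretical shader throughput gain: {ratio:.2f}x")  # ~1.48x
```

Even with the lower Green-edition clocks assumed here, the extra shader units would come out well ahead on paper - if shading really is the bottleneck.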
novatekk Posted June 8, 2009

Hey Helmut,

For me the real bottlenecks were the object detail level and, unfortunately, the high screen resolution, which I assume you really want to hold on to. Because I don't want vehicles more than 20 meters away to look like bricks, I wanted to keep object detail on High or above. Just like you, I tried every possible combination and found that turning my resolution down to 1280x960 gave me the best results. I know this sounds like a low resolution, but it allowed me to put everything on Very High (except the detail settings, which are on High), the viewing distance at 3000m and fillrate at 100%, and still get between 40 and 50 frames in dense areas and 60+ in small villages and fields. Because of the smooth gameplay and high settings I really don't mind the "low resolution" anymore. In fact it looks better than 1920x1200 with mediocre settings, or even a fillrate below 100%. If there is any chance of lowering your resolution I would really recommend it, especially with the card you have now.

EDIT: Okay, I just read in another topic that you have a 27" monitor, so I guess the 1280x960 thing is not really going to work out for you. Still, my advice would be to lower your screen resolution to, say, 1680x1050 (or something like that), at least to see if that really makes all the difference.

System: Core 2 Duo E6850 3 GHz, XFX 280GTX 1GB, 4GB RAM (3.5GB usable), Windows Vista 32-bit Ultimate
Helmut_AUT Posted June 8, 2009

Thanks for the input. I actually tried 80% fillrate (which should have pretty much the same effect as a lower 1680x resolution) and the improvements weren't that great, just a few FPS. I really love high-resolution gaming, and anything less than native resolution looks really bad on that screen, so I'd rather turn down details and keep the resolution. Maybe once FSAA is enabled, lower res might be an option.

What drives me mad is simply that I can't put a finger on what exactly the problem in A2 is. Some situations actually give me +5 FPS if I switch to "over the shoulder", which takes out the motion blur - that seems a clear indicator of a shading bottleneck. In other situations it doesn't help at all. I'm really curious to see if the increased shader power of the 9800GT will make a difference.

Unfortunately I use a pre-built brand-name system with a 3-year onsite warranty - generally very convenient, but it means my choice of power supply, and thus graphics card, is limited. While the 9600GT and 9800GT are generally very close in performance, and the "green" edition actually runs lower clock speeds, I do wonder if more shaders can help a bit.
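The "80% fillrate is pretty much a lower resolution" comparison checks out as simple pixel arithmetic, assuming the fillrate slider scales the number of 3D-rendered pixels linearly (while the UI stays at native resolution):

```python
# Pixel-count arithmetic behind the "80% fillrate ~ 1680x1050" comparison.
# Assumption: ArmA 2's fillrate setting scales the 3D-rendered pixel count
# linearly relative to the native display resolution.
def pixels(w, h):
    return w * h

native = pixels(1920, 1200)   # 2,304,000 px
lower  = pixels(1680, 1050)   # 1,764,000 px

print(f"1680x1050 renders {lower / native:.1%} of the native pixel load")
# -> about 76.6%, so an 80% fillrate at 1920x1200 actually shades slightly
#    MORE pixels than dropping outright to 1680x1050.
```

Which would explain why the two give near-identical framerates: they put almost the same shading load on the card.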
thr0tt Posted June 8, 2009

http://forums.bistudio.com/showthread.php?t=73947

Try here, it may help.
novatekk Posted June 8, 2009

Well, for me, 1680x1050 @ 87% fillrate looked and performed far, far worse than 1280x960 @ 100% fillrate. But like you said, you really love high resolutions, so I guess that is a no-go ;)

I do agree that with A2, just like with A1, it is really hard to put your finger on the problem. With A1 I had days where everything seemed to run OK, and the next day, same settings, it all felt sluggish. Let's hope that in the future they will replace the fillrate option with some proper AA. In the meantime, good luck with your problem :(
Spokesperson Posted June 8, 2009

I don't get a significant difference in FPS when lowering my resolution, so I keep 1920x1200. I have another screen which is 1280x1024, but there's no performance difference at all. 3 FPS better, or not even that, isn't worth it.
novatekk Posted June 8, 2009

To Spokesperson: giving your specs might actually help when discussing something like performance. Also, a monitor is not just capable of displaying one resolution.
Helmut_AUT Posted June 8, 2009

"I do agree that with A2, just like with A1, it is really hard to put your finger on the problem. With A1 I had days where everything seemed to run OK, and the next day, same settings, it all felt sluggish."

That's the kicker in the end: even at 30+ FPS the game controls sometimes feel as "sluggish" as at 20. I seriously have no idea if I'm truly suffering low FPS or just generally slow input response, but it just doesn't "feel" fast enough for my brain.
thr0tt Posted June 8, 2009

Reduce everything in-game to its lowest settings, keep the resolution at your screen's native res, and reduce fillrate to the minimum (50% is the floor, I think). Restart the game - it will look terrible, but see if it's still sluggish?
maddogx Posted June 8, 2009

Sluggish controls are usually caused by VSync + triple buffering. It's best to force both off in your graphics driver control panel.
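The reason buffering makes controls feel sluggish independently of the FPS counter is simple arithmetic: each frame queued between input sampling and display adds one refresh interval of latency. A rough sketch, assuming one full refresh interval per queued frame:

```python
# Back-of-the-envelope input latency added by buffered frames under VSync.
# Assumption: each pre-rendered/queued frame adds one full refresh interval
# between sampling mouse input and displaying the result.
def added_latency_ms(refresh_hz, queued_frames):
    frame_time_ms = 1000.0 / refresh_hz
    return queued_frames * frame_time_ms

for frames in (1, 2, 3):
    print(f"{frames} queued frame(s) @ 60 Hz: +{added_latency_ms(60, frames):.1f} ms")
# Three queued frames at 60 Hz already adds ~50 ms - enough to make mouse
# response feel indirect even when the FPS counter looks fine.
```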
Helmut_AUT Posted June 8, 2009

Ah, I ought to try that, Maddog and Thr0tt.
Helmut_AUT Posted June 8, 2009

Okay, I tried VSync off. I also tried lowering the resolution (not just fillrate) to 1680x1050. The result? Less than 1 FPS difference, using the "Trial by Fire" single-player mission where you assault that town on Utes.

I mean, I'm aware that the 9600GT is not the hottest card anymore - it never was. But the "recommended" (not minimum) specs are an AMD Athlon 64 X2 4400+ or faster (I have a 5000+), 2GB RAM (got that) and a 512MB 8800GT, which in most game benchmarks is only 15% faster than the 9600GT, if at all. Something is seriously wrong with this graphics engine if a system close to recommended, at lowest graphics details and 1000m view distance, only manages a 20 FPS average.

The FPS hardly scale with the settings either. Setting textures to low makes the only notable difference (it "feels" faster), but the average FPS still varies by only 3 frames. I'm tempted to say it's the CPU, but on an empty editor map, moving into a village, I get the same 21 FPS.

I would be very surprised if we aren't going to see some patches for the graphics. They are good, but not good enough to explain such resource hunger.
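For benchmarking runs like the one above, averages alone hide a lot; a frame-time log tells you more. A minimal sketch of summarising one - the input format here (one cumulative millisecond timestamp per frame, the kind of dump a tool like FRAPS can produce) is an assumption, so adapt it to whatever your tool actually writes:

```python
# Minimal frame-time log summary: average FPS plus the worst single frame.
# Assumed input: a list of cumulative per-frame timestamps in milliseconds.
def summarise(timestamps_ms):
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_fps = 1000.0 * len(deltas) / (timestamps_ms[-1] - timestamps_ms[0])
    worst_frame_fps = 1000.0 / max(deltas)
    return avg_fps, worst_frame_fps

# Hypothetical run: a steady 50 ms/frame with one 100 ms hitch in the middle.
log = [0, 50, 100, 200, 250, 300]
avg, low = summarise(log)
print(f"avg {avg:.1f} FPS, worst frame {low:.1f} FPS")
```

A large gap between the average and the worst frame points at periodic stalls (streaming, AI spikes) rather than raw GPU load - exactly the distinction between "low FPS" and "feels sluggish".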
xclusiv8 Posted June 15, 2009

Well, since changing your graphics options doesn't give you an FPS boost, that must mean your CPU is the bottleneck.
kklownboy Posted June 15, 2009

And/or 1680x1050 is not a "recommended" resolution - more like 1024x768 for the "recommended" specs. Welcome to the brave new world of LCDs and new games: you need more power if you think you can run 1600/xxxx or above. It's good to keep a nice CRT around for varying resolutions and refresh rates - on eBay you can get a 21" Sony for less than $100, and it will run anything from 8/6 to 20/15. Pick your poison.

Rule of thumb: pick your platform by CPU (the biggest, baddest you can get) - everything else can be upgraded. Pick your display (resolution) by what video card you have or will buy (more card than you need to run it at native). Don't read the box specs.
Helmut_AUT Posted June 15, 2009

I disagree with your last statement. I think it's ridiculous that the lying on box specs is now at a point where "recommended" systems can't even play at low resolution/high detail or high resolution/low detail. I'm not saying I expect "recommended" to run 2560x1200 all maxed out, but the game right now really needs optimization.

Using Kegetys' no-blur mod gives a few more frames, and depending on the scene I can now get 30+. Still, it slows down to 15 occasionally.
Heatseeker Posted June 16, 2009

20 FPS at 1920x1200 is realistic for that video card. We all love high-resolution gaming, but you're not going to achieve it with a single mid-range graphics card unless you play old games.

The good news is that high-end rigs aren't getting very high frame rates either, so there's hope the game can be optimised further down the road. The bad news is... well, I think you might be expecting too much for too little, and should have spent more on components and less on that 27" high-resolution display.
Panthe Posted June 16, 2009

"That's the kicker in the end: even at 30+ FPS the game controls sometimes feel as 'sluggish' as at 20. ..."

Do you get the same feeling when looking at only the sky and moving about with the mouse? Bohemia's engines have had this sluggish movement feel since ArmA 1, and I could never get rid of it, no matter the system it runs on. It felt best when reaching 60 FPS, which is easily achieved by looking at the sky for benchmarking. The game has never come even close to the real-time, precise movement of any cursor-aimed first-person game, so I personally just accepted that - wish I could tell you more.

The tricks I'm trying with my older 8800 GTS are forcing VSync off in the drivers and perhaps adding Kegetys' no-bloom, no-blur mod. That also reduces your shader load.
mr_centipede Posted June 16, 2009

@Helmut: Are you using Vista? I remember someone back in the ArmA 1 days who had set his power management to "Balanced", which throttled the CPU so the game couldn't use its full power. Check that, if you're on Vista.
bangtail Posted June 16, 2009

No offense to the OP, but a 9600GT @ 1920x1200? The 9600 is a budget card, so expect budget performance. You have unrealistic expectations, and "minimum recommended" means the absolute minimum that will play the game at the absolute lowest settings, not the minimum recommended to play at high resolutions. The 8xxx/9xxx GT/GTS/GTX cards are old tech now. PCs are not consoles.

I wish everything ran perfectly on a 9600 GT - it would have saved me a few thousand dollars :(

Eth
Helmut_AUT Posted June 16, 2009

Ethne, Heatseeker - do you think I'm a tech noobie or something? I run modern games with modern graphics engines like Stalker: CS and Fallout 3 at 40+ FPS, everything on high detail at that resolution, with 4x FSAA. The 9600GT is an AWESOME card for the money, and absolutely capable of 1920x1200 resolutions. You can't tell me Fallout 3 or Stalker with max view distance and max objects look any worse, or are more "dated" games, than ArmA 2. The fact is that even A1 ran much slower for most people than any other graphics-intensive game, and the A2 engine has the same low performance. So much for "unrealistic expectations" or "expecting too much".

Ethne, the RECOMMENDED specs are not the MINIMUM specs. Recommended is what BIS wrote on the box - an 8800GT - and RECOMMENDED should mean being able to play the game at high details at an average resolution (1280x1024), or at low details at high resolution. Recommended used to be what you needed to really enjoy a game; these days marketing seems to dictate that recommended is what you need to barely play. I don't even want to know how A2 would play on the minimum listed.

Panthe, you're right that the sluggish feel never entirely goes away, even in scenes with 40+ FPS (open field). It's like the movement is stuck at first, then accelerates to normal speed, compared to the direct, even movement you get in normal shooters, where mouse movement and screen movement seem utterly, directly related. Not so with BIS engines. If you haven't already, you might want to try the 185 drivers for the 8800GTS - they gave me 10% free performance coming from the 182. On WinXP they work very well; I've heard of problems for Vista users, however.

Centipede, I'm on WinXP, but thanks for thinking outside the box.
FraG_AU Posted June 16, 2009

"Ethne, Heatseeker - do you think I'm a tech noobie or something? I run modern games with modern graphics engines like Stalker: CS and Fallout 3 at 40+ FPS ..."

Helmut - the 9600GT is really designed for 1280x1024 (or close to it), and in reality is lower spec than the 8800GT, which is 2-3 years old. Now, to help you diagnose: what sort of FPS are you getting at 1024x768, or 1280x1024? Do they jump a huge amount? If not, obviously it goes back to your CPU as well. But sorry mate, your expectations are very, very high for such a mid-range card.
bangtail 0 Posted June 16, 2009 Ethne, Heatseeker - do you think I'm a tech noobie or something?I run modern games with modern graphic engines like Stalker:CS, Fallout3 at 40+ FPS on everthing high detail at that resolution, with 4x FSAA. The 9600GT is an AWESOME card for the money, and absolutely capable of 1920x1200 resolutions. You can't tell me Fallout3 or Stalker with max view distance and max objects looks any worse or are more "dated" games than Arma2. Fact is that even A1 ran very much slower for most people than any other graphic-intense game, and the A2 engine has the same low performance. So much for "unrealistic expectations" or "expecting too much". Ethne, the RECOMMENDED SPECS are not the MINIMUM SPECS. Recommended is what BIS wrote on the box, a 8800GT, and RECOMMENDED should mean to be able and play the game at high details on average resolution (1280x1024) or low details at high resolution. Recommended used to be what you needed to really enjoy a game, these days Marketing seems to dictate that Recommended is what you need to barely play. I don't even want to know how A2 would play on Minimum listed. Panthe, you are right that the sluggish feel never entirely goes away, even in scenes with 40+ FPS (open field). It's like the movement is stuck at first, then accelerates to normal speed, compared to the direct, even movement speed you get in normal shooters where mouse movement and screen movement seem utterly directly related. Not so with BIS engines. If you haven't, you might want to try the 185 Drivers for the 8800GTS - they gave me 10% free performance coming from the 182. On WinXP they work very well, I heard problems for Vista Users however. Centipede, I'm on WinXP. But thanks for thinking out of the box. I never questioned your "tech level". I merely commented that the 9600 GT is a budget card based on 2+ year old technology and as such, you can't expect it to play the latest games at 1920 x 1280 w/ detail @ high. 
A2 looks better than FO3 and STALKER IMHO and is infinitely more complex. I seriously doubt you are playing STALKER on a 9600 GT @ Full detail with 4 x AA @ 1920 x 1280 with 40+ FPS. Eth PS : A1 runs perfectly, even if I switch SLI off as does A2. No matter how you slice it, the 9600 just isnt up to the challenge of current games @ high detail/resolutions. Share this post Link to post Share on other sites
Heatseeker Posted June 16, 2009

"A2 looks better than FO3 and Stalker IMHO and is infinitely more complex. ..."

Stalker: CS in full detail (DX10) looks great - wet surfaces, smooth water edges, sun rays, the whole nine yards. ArmA II doesn't look better, nor is it that complex; it's just different, since it's a really big game.

We are in agreement over the 9600GT. I checked some benchmarks and it really isn't a card for high resolutions in demanding games: at 1920x1200 it gets around 20 frames per second, scoring below even an 8800 GT.
bangtail Posted June 16, 2009

"Stalker CS in full detail (DX10) looks great ..."

I generally think ArmA 2 looks better than Stalker, but it's all subjective at the end of the day, and ArmA 2 is more complex (not necessarily just graphically), JFYI. Sorry, never liked Stalker TBH; it makes decent use of DX10, but that's about all.

Eth
Helmut_AUT 0 Posted June 16, 2009 (edited) You guys are funny. Want me to provide FRAPS screenshots of Stalker and other games so you can compare how they look and run here, or what? I know what I'm seeing. And when I see Fallout3 (released about six months ago, so I would count that as pretty current) running at full detail 1920x1200 at full tilt, with grass, trees, fences, houses everywhere in a very large view distance, and then I turn A2 down to 500 meters VD, set all details to low and get half the framerate at best - I guess that means suddenly my 9600GT degraded in the last six months magically? I really shouldn't care what some fellow gamers on a message board tell me, but the fact is that I won't accept the argument that the 9600GT can't run modern games at high resolution. It can, and it does. However it never managed to get anywhere near the framerate out of A1 that it gets for Stalker and FO3 either, so maybe we could just agree that BIS never had the fastest graphics engine in town, especially since walking into a forrest at Chenarus can bring much newer cards to their knees. It's not hard to see why that is - they slapped a full-screen 100% everthing post processing process on the engine back in A1 to get current graphics without a total rewrite of their core engine. And post processing is costly for every game released so far. But I guess I'm just stupid for expecting a current game at low details to perform equal to a six-month old game at high details, hmm? Or maybe the guys at other software companies simple realized that shading the whole screen just to generate some effects is not the best solution. Oh, and don't take this as an Anti-BIS Rant: I love the game, just as I loved A1. But trying to put the blame on medium range hardware when other stuff runs much faster on that same hardware is just fanboism. 
And at the end of day, a 9600GT averages 10 to 15% less performance in benchmarks compared to THEIR recommended specs, so maybe we can also agree that the recommended specs are wishfull thinking. Edited June 16, 2009 by Helmut_AUT Share this post Link to post Share on other sites
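The "shading the whole screen" cost argument above can be made concrete with rough arithmetic. The pass count here is an illustrative assumption (the actual number and cost of ArmA 2's post passes isn't documented in this thread):

```python
# Rough cost of full-screen post-processing, to illustrate the argument
# that post effects re-shade every pixel. The pass count (4) is an
# illustrative assumption, not a measured value for ArmA 2.
def post_pixels_per_frame(w, h, passes):
    return w * h * passes

px = post_pixels_per_frame(1920, 1200, 4)
print(f"{px / 1e6:.1f} M extra shaded pixels per frame (4 full-screen passes)")
# -> 9.2 M pixels per frame, on top of the scene itself - which is why
#    removing blur/bloom (as the Kegetys mod does) frees up shader time.
```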