
Low CPU utilization & Low FPS


It's good that you and your group are among the lucky demographic that can play it well. But there are many in the other demographic as well. This is well documented going back several years now.

I have a friend who played Arma 2 fine on a far inferior rig to mine, while I played it with fps dipping into the teens no matter what settings I chose. I've seen people with the beefiest rigs one can imagine struggle, and (of course) they could only watch as their hardware went under-utilized. There is no rhyme or reason to it. Some folks do alright with it. Others spend as much time reading, tweaking, and testing as they do actually playing and enjoying the game. And that is frustrating to the nth degree when coupled with the lack of acknowledgement from the developers. I was cranking along at 20-23 fps last night, watching my video card sit in the 30% range, never above 50%, of its potential.

I feel a bit duped. I could be wrong, but this seems to be an issue with the foundation of the game that was carried over from Arma 2 rather than addressed at the outset. I'm annoyed that I ended up giving them money, because I told myself I wouldn't until I knew they had resolved this. Now they have it. So, one more person they don't have to worry about, I guess.

I think it's great that there is a true PC game still out there, and that they have always been good about keeping the patches coming. But they have seemingly refused to address, or chosen to ignore, the most fundamental flaw of their game for years now.

It certainly may not be the case, but from the consumer side it *feels* like they know, and have known, but for whatever reason choose not to fix it: a fundamental flaw in the engine they built on, "too much work, not worth it", or something similar.

Under-utilization of hardware/resources, and terrible scaling, as evidenced by the lack of measurable results from resolution changes, etc.

This whole notion that Arma is simply too demanding for our hardware to handle is a fallacy that needs to be buried (imo), or at least a claim that is unproven, considering Arma won't even use enough of our hardware (some of us, at least) to make that determination. And it does little more than provide cover for an actual, long-standing, fundamental issue that has gone unaddressed (and publicly unacknowledged) for years.

Well sixgears2, I don't know your PC specs, but everyone in my Arma 3 group can easily play in urban environments on ultra settings, myself included.

Setting everything to low is only useful for PvP/TvT, where every frame counts. In coop, playing on a well-hosted server that holds a constant 50 fps (mine ^^), you can get frame rates from 50-120 fps. And you won't believe it, but for every player on my server the GPU is at 100% load and the CPU at around 70-90%.

The last dev updates fixed the main engine problems: PhysX and PiP were both broken, and both are fixed now.

Slowest rigs in my group: an i7 920 + GTX 560 Ti, and slowest of all a Phenom II X4 955 + HD 5850 (both non-OC).

Fastest rigs: i7 2700K at 4 GHz or more + HD 7970 (5 rigs).

My rig: i7 2700K @ 4.6 GHz + HD 7970 with 2133 MHz RAM, hosting dedi and client on the same machine, or even hosting locally.

I mean, sixgears2, I can locally host an Evolution mission with a high enemy count, with fps dropping to around 45. That wasn't even thinkable in Arma 2, or Arma 1 for that matter. So if your game buckles in urban environments, your rig must be slower than that Phenom II X4 955 with the 5850, or you can't set up your settings right. I'll post the settings of the Phenom rig as soon as I have a chance to ask its owner.

Then don't expect anything other than disappointment. With my settings I get a solid 45-60 fps anywhere I go, as long as the mission scripts are clean and there isn't an obnoxious amount of AI around. It seems you don't know how to properly set up your game for your system if you're not getting that. And if you are basing your experience on Wasteland, a mod that was rushed into A3 and runs on servers that are game clients, well...

I don't play Wasteland, but I have played almost everything else. My settings are mostly standard/high, with shadows, PiP, and clouds off and all ranges below 1400.

I get 15-50 fps, mostly around 25, on any server with more than 10 people once any fighting is going on.

I would like to see a video of you playing any mission on any server with 20+ people, some AI running around/shooting at something, and vehicles/helis, at those 45-60 fps. I really would, because I have never seen one, from anyone.

Am I calling you a liar? Yes, I am.

Edited by white


Well sixgears, if you had read the thread properly, I already wrote that AMD FX processors simply have far too slow a front-end to properly feed even low-end cards like yours. I also wrote that with AMD CPUs, object detail must be set to standard or low, because they can't handle the sheer mass of objects and their front-end stalls. You can fight this by overclocking the uncore and core parts of the FX as far as they will go, but even a 5 GHz FX-8350 only performs about as fast as an i7 920, which is very old.

I also said that I asked for the exact settings on the 955: everything ultra with objects on low, no AA, and no post-FX. The owner of that system knows he only just meets the recommended specs, so he also knows he is normally only guaranteed to play on standard settings. And since he is a smart guy, he played Evolution not with a 2500 m view distance but with 1000 m, and 800 m object distance, because he knows the weakness of the Phenom II.

I know a screenshot would prove it, but I'm afraid I can't drag him into this disagreement between you and me.

One last thing about the AMD FX: under heavy floating-point load the effective core count shrinks by half, because each FP unit is shared by two integer cores. So under heavy FP load an FX-6300 is effectively a triple-core processor, and with that you don't even match the recommended specs.

@white: with a properly hosted server, and the right settings on your end (for heavy fighting set particle quality lower, plus everything else I already mentioned), you will get a constant 40 fps at least.

As for your X6 processor, maybe an 1100T: I recommend you too overclock your uncore clock as far as possible, because it's the weak spot of any AMD these days.

May you all please realize that you can't play everything on max with budget hardware. I played OFP on a Pentium 3 and regret it even today. I played Arma 1 on a Pentium 4 at first and it was horrible. I played and hosted Arma 2 on a Phenom II X2 550, and it was just pain. For Arma 2: Operation Arrowhead I got an i7 2600K (right as it was released) and it ran like a charm; if I hadn't roasted my 2600K I would still have it today and would have had a fast CPU for over two years.

And the Arma 3 alpha runs a lot faster than Arma 2: Operation Arrowhead.

Hosting a local game with 150 bots plus at least 14 players in Arma 2? Not even thinkable at more than 15 fps, and now 60-80 fps is no problem.


Particles are on low as well, and my specs are in my sig. This Thuban has 6 full cores, each with its own decoders. And about those cores: around page 22 or so I disabled 4 of them, leaving only 2 available, and the performance was the same; I proved it with screenshots. The performance issue in this engine lies on the first core (in the one major game thread). Yeah, the magic of properly hosted servers and properly made missions: I've played on a lot of different servers and missions, and I'm yet to catch that unicorn. It reminds me of the "no true Scotsman" fallacy for some reason. 60 fps minimum, you say? With 150 AI, 14 players, and let's say also flying a heli (since they're an integral part of most missions)? Well sir, I extend my skepticism to you as well.

Edited by white


My original post was in response to the post you're talking about, so clearly I read it. Time to work on that situational awareness of yours, friend.

Funny that my CPU has no issues with a weak front-end in any other game on the market. I guess ArmA 3 must just be that much better looking. Wait, that's not true. Maybe it's got more going on. Nope, that's not the case either. There are dozens of games out there that look better and handle more than ArmA 3, and every single one works fine with AMD CPUs. That brings us back to the engine being the problem.

The fact that ArmA 3 uses my hex core as a triple core (if that's true) just goes to show you that something isn't right, regardless of how much technical mumbo jumbo you throw out. And you still haven't explained why my CPU never goes over 35% load while my GPU hovers at 65%. And how about all those Intel CPU enthusiasts having nearly identical problems?

You've yet to provide me with a compelling excuse for ArmA 3's issues. Instead, you've forced me to read a bunch of barely coherent and likely incorrect technical junk that doesn't manage to illuminate anything other than your raging bias towards Intel. If the issue was being reported only by AMD FX users maybe I'd believe you. However, it isn't and I therefore think you're full of it.


i7 3930K @ 4.4 GHz with two GTX 690s in quad SLI. 30 fps on ultra. I can play every other game at 60+ fps on triple screens, then skyrocket to the maximum limit (usually 120 to 150; I've seen 400+ fps in games before). Arma 3? 30 fps on ultra with 3 screens. 30 fps on ultra with 1 screen. Da faq?


I didn't originally have a problem with the first release of the game; it ran well. Then my motherboard failed and I rebuilt around a different motherboard, a new patch came out, and I went back to Windows 7, having previously used Windows 8. Between those three changes my FPS on very high has gone from the mid-70s to high 20s/low 30s at the same settings. So initially I didn't have this problem: my 3930K @ 4.4 and 2x 680s ran the game very well, and now it runs it very poorly, and not a whole lot has changed.

The machine itself runs other games just as well as it did before. There is no performance problem in Crysis 3, Far Cry, BF3, etc. 3DMark shows performance is almost identical; it's almost the same hardware, so no surprises there.

The very noticeable difference in Arma 3 between the two setups is that GPU usage has dropped from around 70% per card (140% in all, so some SLI scaling) to a miserable 30% per card (60% of one card). So there is no point having two cards, and not even a single 680 is being fully used. I can't explain why the same hardware is producing dramatically different results: either it's the move from an Asus motherboard to the ASRock, or it's the patch that was released, or Windows 8. Windows sounds very unlikely and the hardware is otherwise running great, so I think it was the patch, but it's very strange and deeply annoying. It has kind of destroyed my enjoyment of the game: I have had to drop the graphics enormously to get it playable, and now I have a game that doesn't look much better than Arma 2 and still doesn't really run very well.


I'd be curious if some other folks can try this and report their findings.

For the hell of it, I created a mission in the editor with absolutely nothing in it. Just dropped in a player character, saved, exported to SP missions, and fired it up.

50-75 fps generally. The difference? Well, while you might say it's because nothing is going on, it was actually because the game was using 90-100% of my GPU on average. There were some dips down into the 70% usage range, but they were short, and I still maintained 40+ under the same settings. It was pretty much a 50+ fps experience.

So the infantry showcase and a mission with nothing in it will both make fuller use of my hardware, and perform great as a result. Anything else, even the most basic of missions, and I get low hardware usage (i.e. GPU <50%) and a crappy framerate to match.

Of course, there is still the whole issue of the game not scaling performance with settings changes, but at least these are a couple of small examples of the game actually using the horsepower that is available.

Makes me wonder if this issue might have something to do with how they are processing the AI, and that is somehow causing low hardware utilization. But that is outside my area.
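
If anyone wants to repeat this with hard numbers instead of eyeballing an overlay, here's a minimal logging sketch for the test mission's init.sqf, using only vanilla scripting commands (diag_fps and diag_fpsMin are the engine's built-in frame-rate counters; the 5-second interval is just my arbitrary choice):

    // init.sqf - log average and minimum fps every 5 seconds (on screen and in the .rpt file)
    [] spawn {
        while {true} do {
            hintSilent format ["avg fps: %1\nmin fps: %2", diag_fps, diag_fpsMin];
            diag_log format ["t=%1s | avg fps: %2 | min fps: %3", round time, diag_fps, diag_fpsMin];
            sleep 5;
        };
    };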


I've done exactly that, and still experienced usage in the 40-50% range on my GPU and usually 70%/50%/40%/30% across my 4 cores. If I throw AI into the mix I get even lower usage, usually around 20-30% GPU, and the CPU usage still hovers in the same 70%/50%/40%/30% range.

Keep in mind this was AI doing nothing: they had no waypoints and no scripting active.

I've also tried lowering settings, and for me there is absolutely no difference between ultra and low (or very low) on most of them. The only ones that really have any effect are shadows, post-processing, and terrain detail between low and standard (it seems to have more to do with grass being preloaded and rendered than with the actual terrain detail). Even dropping my resolution down to 1280x720 results in almost identical performance to the 1920x1080 I normally run, at the exact same settings. Usage doesn't change with a decrease in resolution, although it does increase when I lower the settings mentioned above.

Edited by Insanatrix

i7 3930K @ 4.4 GHz with two GTX 690s in quad SLI. 30 fps on ultra. I can play every other game at 60+ fps on triple screens, then skyrocket to the maximum limit (usually 120 to 150; I've seen 400+ fps in games before). Arma 3? 30 fps on ultra with 3 screens. 30 fps on ultra with 1 screen. Da faq?

Your issues are due to the fact that your computer was designed to control Mars rovers and calculate interstellar trajectories. NASA would like it back now. :D

Seriously, though, that's a ridiculous rig. Almost so ridiculous I don't believe you. I will therefore require pictures to masterb... erm, use to verify the truthfulness of your claim.

If what you say is true (still skeptical about the $5000 rig), then all of us mere mortal PC owners are doomed.


Some guy, with no island at all, just the sea, went upwards with the camera until he was down to 1-2 fps. With nothing but water.
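
That test is easy to reproduce from the debug console; here's a rough sketch with vanilla commands (the coordinates and altitude steps are made-up values, just pick any empty patch of sea):

    // free camera over open water, climbing in steps while reporting fps
    _cam = "camera" camCreate [1000, 1000, 50];    // [x, y] somewhere offshore
    _cam cameraEffect ["internal", "back"];
    [_cam] spawn {
        _cam = _this select 0;
        for "_alt" from 100 to 5000 step 100 do {
            _cam setPos [1000, 1000, _alt];
            hintSilent format ["alt: %1 m | fps: %2", _alt, diag_fps];
            sleep 2;
        };
    };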


Strangely, for me a mission with nothing in it = my GPU running at >90%. The infantry showcase also gives me 90+% GPU usage. As a result, I get great performance. Anything else, even a tiny mission, and GPU usage goes way down and the game becomes unplayable as a result.


What kind of GPU? I've got a 1.5 GB GTX 480 that's factory OC'd to 751/1900, and I OC'd it a little further to 791/1950.


@Mobile_Medic

That's extremely interesting. I don't have a background in programming, but it seems like you're onto something there. I wonder why the game would use less of your hardware in situations where it should clearly use more...

I may not even be close here (remember: not a programmer), but what if BI has attempted to mitigate the stress placed on lower-end machines by somehow altering the code to "lock" usage at certain levels during intense sequences/heavy processing? That would explain the reports of low-end rigs running the game fine at or near 100% usage while higher-end computers aren't able to utilize their full potential or architecture, and therefore run worse.

I sincerely hope I'm an idiot and completely wrong.

GTX 580 3 GB, OC'd to 824 MHz.

You definitely have a more powerful card than mine, but it's only about a 25-35% increase. I got better performance with the latest dev patch, but I'm talking about going from 10-15 fps to 28-34 fps. Certainly nothing as high as 50-75 fps, even with all settings on low.

---------- Post added at 14:36 ---------- Previous post was at 14:34 ----------


I've thought that myself: that they would put a lock on how fast threads can work, in order to limit the stress on the streaming and mitigate any stuttering or pausing it could induce.


@sixgears... something to do with AI, pathing, etc., if that were the case (or part of the cause) for this low hardware usage, *might* explain why resolution and other settings changes have no effect on framerate once you hit that cap. But I am also not a programmer. I guess I could test that by lowering my resolution from 2560x1600 down to 1280x800 again, this time inside the empty mission, and seeing if there is an fps change...

It is a clearly demonstrable effect for me, though. But I would have expected insanatrix to benefit similarly, especially with a lower-end video card, under an empty-map scenario.

Well, I just tried it (lowering down to 1280x800 in an empty mission). The result was 60-70 fps, with the GPU being utilized at 35-42% or so. It is to be expected that the GPU would not need to work as hard at that low resolution; the point is that, in this case, the change produced a noticeable effect: an increased framerate with lower GPU utilization.

So, the summary of that little experiment:

2560x1600 (the native res of my monitor) vs 1280x800 in an actual mission (not a showcase): both produce low GPU usage and the same equally bad performance. The lower resolution provides no benefit (as widely reported since Arma 2).

However, the same comparison in an empty mission:

2560x1600 = 100% GPU usage and 50-60 fps at Camp Tempest.

1280x800 = 35-40% GPU usage and 60-70 fps at Camp Tempest.

Conclusion (for this individual): empty missions and showcase missions (*except* for Helicopters) use 90-100% of the GPU and provide an accompanying fps boost, making the game awesomely playable for me at 50+ fps at my current settings. Resolution changes appear to have a measurable effect (lower GPU usage but higher fps = net gain).

The Helicopters showcase, or any other mission that is not an empty map or one of the other showcases, provides the familiar Arma 2 experience: low GPU usage (30-50% for me) and an accompanying low framerate, ranging from barely playable to completely unplayable at 15-30 fps. Resolution changes (even going from 2560x1600 down to 1280x800) have seemingly no effect on performance.


See, I get the same results as you, but on a much smaller scale. My GPU usage will increase by 10-20%, so it goes from around 20-30% to 40-50%, but it never goes above 50%. CPU usage does go down, but by a very small amount. I tested with 50 AI, and usage was around 65%/45%/40%/25%, versus what I normally get on a plain empty map.


Does the slight bump in GPU usage carry any fps bump for you (even if on a smaller scale than what I am observing)?

I dropped a squad of friendly and a squad of enemy AI into my empty map as well, and I still had 90-100% GPU usage for the most part (some dips here and there to around 70%, with an accompanying frame-rate drop into the 40s, but still great, and short-lived). So the mere presence of AI did not cause the GPU usage to buckle in that case (for me). With a small number of AI, that is.

But I also tried a very small mission I downloaded. At first it was full GPU usage and a great frame rate. When I got to the place where the mission's two enemy AI were, my GPU usage collapsed and my framerate went with it.
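
For anyone who wants to repeat the squad test without touching the editor, a rough debug-console sketch (BIS_fnc_spawnGroup is the stock spawning function; the positions and the 8-man group size are placeholders I picked):

    // drop one BLUFOR and one OPFOR group near the player, then watch gpu usage and fps
    _pos = getPos player;
    _blu = [[(_pos select 0) + 100, _pos select 1, 0], west, 8] call BIS_fnc_spawnGroup;
    _opf = [[(_pos select 0) - 100, _pos select 1, 0], east, 8] call BIS_fnc_spawnGroup;
    hint format ["%1 AI spawned", count (units _blu + units _opf)];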


Any increase in usage does carry an increase in overall fps.

For instance, sitting in Agia Marina between the buildings with no AI present, I get around 40-50% usage. If I put in around 20-25 AI, fps drops by around 5-7 and GPU usage falls to around 30-45%. CPU usage tends to stay pretty steady at around 65-70%, 40-50%, 30-40%, 30-40% for each CPU core respectively.

But if I put myself in an open field in the middle of nowhere, with no editor-placed entities aside from myself, my GPU usage never goes above 50%. It seems completely backwards, because with more entities and more AI you should see an increase in usage, as there is more to do.

On the topic of AI, I've tried using the disableAI command, http://community.bistudio.com/wiki/disableAI , with every value, and it has seemingly no impact on performance or usage. From what I can gather, just having the entity placed on the map causes the slowdown, more than the AI FSMs or logic do. With a group of 25 AI, the difference between disableAI and a unit placed with no initialization was maybe 1 fps at most.
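
For reference, that test in script form looks roughly like this (a sketch; the subsystem names are values from the wiki page linked above, and _grp is a placeholder for whichever group you spawned):

    // switch off every AI subsystem for each unit in a group,
    // then compare fps against an identical, untouched group
    {
        _unit = _x;
        { _unit disableAI _x } forEach ["TARGET", "AUTOTARGET", "MOVE", "ANIM", "FSM"];
    } forEach units _grp;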


Another random test...

Using the Helicopters showcase as the test bed for this one (since it also has low GPU usage): if I max the view distance at 12 km, GPU usage is ~30% and fps is sub-20. If I back the view distance all the way down to 500 m (can't see a thing), GPU usage actually goes up into the 70-92% range, and fps is 55-77. Any dips in GPU usage below that did not result in fps drops; clearly it was being utilized more efficiently.

So why would drawing so much less allow the GPU to be more fully utilized? I don't know. The fps increase is to be expected; what isn't is the GPU being utilized at 70+% when drawing only 500 m versus only 30% when drawing 12 km. Backwards. P.S. The difference between 3 km (for example) and 500 m is significant, whereas the difference between 3 km and 12 km is much less noticeable (both in fps and in GPU usage).

So now (for me) I've been able to establish that non-Helicopter showcase missions, an empty map, and minimum view distance all increase GPU usage (and are accompanied by big fps jumps). The Helicopters showcase, missions with AI, etc. all seem to utilize the GPU poorly (and are accompanied by massive fps drops, to the point of being unplayable).

EDIT to add: the effect of minimum draw distance on GPU usage in the "Whole Lotta Stratis" (on foot) mission ranged from not noticeable to mildly noticeable, so not as pronounced as in the Helicopters showcase. But it was noticeable about 3/4 of the time (a net +10% GPU usage on average versus a 3 km view distance).
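
The view-distance sweep is easy to script as well; a minimal sketch with vanilla commands (the 500 m step and the 5-second settle time are arbitrary):

    // step the view distance from 500 m up to 12 km, logging fps at each step
    [] spawn {
        for "_vd" from 500 to 12000 step 500 do {
            setViewDistance _vd;
            sleep 5;    // give streaming a moment to settle before sampling
            diag_log format ["view distance: %1 m | avg fps: %2", _vd, diag_fps];
        };
    };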

---------- Post added at 21:58 ---------- Previous post was at 21:17 ----------

I guess I should have checked back a few pages further; this guy seems to support the idea that AI-related issues are causing the low GPU usage:

http://forums.bistudio.com/showthread.php?147533-Low-CPU-utilization-amp-Low-FPS&p=2365202&viewfull=1#post2365202

Edited by Mobile_Medic


My best guess is that as you lower the view distance, there's less data needing to be simultaneously streamed or cached: textures, LODs, object data and related configuration data, etc.

Conversely, the higher the view distance, the more has to be held in RAM or streamed from the HDD. Even if something is out of view, say 50 AI soldiers on the other side of a hill 2 km away while your view distance is 3 km, all the associated data still has to be held in RAM or in the disk cache.

It still reinforces my belief that it's an issue with disk streaming causing a bottleneck on all threads. It's the only constant I can put my finger on that would cause a decrease in usage as demand goes up. The entities in a mission, and how large their textures and associated data are, can wildly influence performance based on size rather than on rendering quality or, in the case of AI, processing demand. I just don't see the AI being the cause, because disabling all AI processing for placed entities impacts FPS by such a small margin.

Edited by Insanatrix


@Mobile_Medic

Now I'm just confused. There's really no reason I can think of for that kind of bizarre behavior, though I have to admit I'm intrigued by the streaming-bottleneck hypothesis the other gentleman is proposing. Sadly, I'm simply not willing to dial back my settings, and thus my enjoyment, since I am a self-proclaimed graphics whore, just to force the engine to behave as it should in the first place.

It's starting to sound like this problem is far too deep-seated to be fixed with simple patches. It may be time for BIS to build a new engine entirely. Bummer.

I guess I'll just avoid cities and stick to lightly scripted missions for now. At least it runs properly that way. I like the game too much to walk away now even if my hope for a timely fix is fading quickly.

Edited by Sixgears2


It still reinforces my belief that it's an issue with disk streaming causing a bottleneck on all threads.

Nah, it's just the multithreading. If you overclock, performance scales with clock speed; per-core performance is the most important factor in Arma. If streaming were the problem, people with an SSD would have great performance compared to those with a normal hard drive, which isn't the case: there's no noticeable difference.


Also, as I have said, people have made RAM drives, and those things are bloody fast, yet there is no FPS gain.

Don't get me wrong: they make gameplay smoother, with less stutter when loading textures, according to people who have used them. But there is no performance gain so to speak, just smoother gameplay, as Arma does stream a lot of data.

Share this post


Link to post
Share on other sites
