k3lt

Low CPU utilization & Low FPS


Then it's something wrong with your rig. I can play fine on a full 60-man Wasteland server, averaging 30 fps. It depends on the server hosting, of course; there are servers where you get 5 fps, but you simply leave those and look for better ones.

30 fps average is horrible. 30 fps minimum is passable, but I really prefer 40 fps+.


I had an interesting bug of some sort last night. I was playing as usual on MP servers at 30-50 fps. Then all of a sudden I dropped to 3-6 fps and stayed there, so I turned my eye towards the MSI Afterburner overlay I have running to see some info; I also had HWiNFO64 up.

To start off, here is my rig:

CPU - AMD FX8320 (8 core) @ 4.4GHz

GPU - HD6870 @ slightly overclocked

RAM - 8GB @ 1600MHz

HDD - Regular 7200RPM

And the in-game stats when this occurred:

FPS - 3-6

Core usage - #1 23.6% #2 22.4% #3 10.6% #4 10.7% #5 9.9% #6 4.5% #7 18.7% #8 17.6%

GPU usage - 0% (Why?)

VRAM usage - 841 MB out of 1GB

RAM usage - No idea, didn't have that selected.

CPU Heat - 44 Celsius

GPU Heat - 34 Celsius

So I started to wonder: why is this game only running on my CPU, and why isn't it using more power? Usually the game uses around 50-70% of at least 2 cores, and my GPU usage is usually around 50%. This didn't fix itself even after playing on at around 5 fps for 10 minutes. To fix it I had to restart the game, and I haven't seen it happen again.

Does anyone know anything at all about what the hell made the game engine go crazy like that? I'd had the game running for about 30 minutes when the bug occurred. I don't expect a magic fix for this or anything; I just wanted you guys to know that some weird things can occur.

EDIT:

This was on an almost empty server; there were about 10 of us on it. I also had a friend with me, and for him everything was normal.

Edited by lundell318
Added some info.


What kind of "optimization guide" includes 8x AA?

What kind of "optimization guide" includes 8x AA?

Well, read the description more carefully, then.


Well, I have to say that lowering Objects Quality and Terrain Quality to Standard gave me an FPS boost. And the best part is that I don't see any big negative visual change from lowering these options.

Actually, it does use the CPU.

It's got to. I got a 10-15 fps increase after I lowered Terrain/Objects from Ultra to Standard, and that's on a Titan.


Objects quality low: http://i.imgur.com/Yq1QeOt.jpg

Objects quality ultra: http://i.imgur.com/hW9URAm.jpg

This is with a 600 m object draw distance; with a 2 km object draw distance there was a 3-4 fps gain.

You can ignore the high fps; it was run in SP to make sure a server wasn't preventing these values from changing. (Also, FYI, most multiplayer servers don't allow changing object/terrain detail or draw distance, so this has no effect in MP, where the real problem is.)

Edited by k3lt

Objects quality low: http://i.imgur.com/Yq1QeOt.jpg

Objects quality ultra: http://i.imgur.com/hW9URAm.jpg

This is with a 600 m object draw distance; with a 2 km object draw distance there was a 3-4 fps gain.

You can ignore the high fps; it was run in SP to make sure a server wasn't preventing these values from changing. (Also, FYI, most multiplayer servers don't allow changing object/terrain detail or draw distance, so this has no effect in MP, where the real problem is.)

Object detail has nothing to do with draw distance in MP. As far as I know, it's the view distance that gets changed in MP.


Screenshots, gentlemen (with fps counters); then we can see the fps levels and differences, performance at a glance. ;)

I change my settings almost daily; it's down to the weather I set for testing scenarios.

One set of settings does not do it all, unfortunately. Yet, anyway.

Edited by ChrisB


In arma3alpha.cfg, set GPU_MaxFramesAhead=1; and check your fps/smoothness. ;)
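In case it helps anyone find it: a minimal sketch of the edit, assuming the default file location under your Documents folder (Documents\Arma 3 Alpha\arma3alpha.cfg); the rest of the file stays untouched:

    // arma3alpha.cfg - queue at most 1 frame ahead of the GPU
    GPU_MaxFramesAhead=1;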


What about this: GPU_DetectedFramesAhead=0;

Mine is zero. Should I change it or keep it as it is?


GPU_MaxFramesAhead=1 helped quite a bit on my side, but the performance could still be better.

Sometimes it seems like the more crap's going on, the less the CPU is doing, which seems a bit odd to me.


My system info is in my sig; I also have 2x HD5870 in CrossFire mode. At pretty much any setting, only about 40% of my CPU and 35% of my system RAM is being used. One graphics card's GPU is maxed out at 100% with about 800-900 MB of its onboard RAM in use. The other card is idle...

So for me the CPU is not a problem; it seems to be related to the GPU. Does anyone know if the beta or final release will support CrossFire?

I think CPU usage will go up if I use CrossFire, but I doubt it will ever get close to 100%.

ALSO, I've just had a thought. I know that in some other games that only used one graphics card I got worse performance than my brother, who has a system identical to mine but with only one HD5870. I think it may have been Rage, and the solution at the time was to disable one of my cards... Not a solution IMO, so I never played it. Does anyone know if SLI/CrossFire is causing issues with Arma III?

Edited by charliemilk

What about this: GPU_DetectedFramesAhead=0;

Mine is zero. Should I change it or keep it as it is?

Not sure if it has changed, but changing that value had no effect in previous Arma engines (it seems it gets rewritten on every load just to record what the engine "detected", nothing more).

If it's still the same, only the GPU_MaxFramesAhead= cvar actually matters, and changing the one you mentioned has no effect.

I'm pretty sure the GPU_MaxFramesAhead cvar is the pre-rendered frames setting. (If someone could correct or confirm this, I would be very appreciative.)

Quoting Tweakguides:

Maximum Pre-Rendered Frames: If available, this option - previously known as 'Max Frames to Render Ahead' - controls the number of frames the CPU prepares in advance of being rendered by the GPU. The default value is 3 - higher values tend to result in smoother but more laggy gameplay, while lower values can help reduce mouse and keyboard lag.

And according to NVIDIA:

The 'maximum pre-rendered frames' function operates within the DirectX API and serves to explicitly define the maximum number of frames the CPU can process ahead of the frame currently being drawn by the graphics subsystem. This is a dynamic, time-dependent queue that reacts to performance: at low frame rates, a higher value can produce more consistent gameplay, whereas a low value is less likely to introduce input latencies incurred by the extended storage time (of higher values) within the cache/memory. Its influence on performance is usually barely measurable, but its primary intent is to control peripheral latency.

I did this tweak (GPU_MaxFramesAhead=1) in both Arma 2 and Arma 3. I don't get a big fps boost, but the game itself seems to "respond" better.

It's kind of hard to explain; I notice it immediately even though I don't get a big fps difference.

Although it contradicts the Tweakguides explanation, the game feels much smoother with GPU_MaxFramesAhead=1, at least on my computer.

I do advise people to try it and test it out; even if you don't see an fps increase, the game just runs better (at least for me).

Edited by neolitejukebox

In arma3alpha.cfg, set GPU_MaxFramesAhead=1; and check your fps/smoothness. ;)

I've been using it since the very beginning; someone mentioned it before.

Honestly, I didn't notice any visible fps increase.


It doesn't increase fps, but it can reduce input lag.


Not that I would be worried about input lag when running at 15-20 fps. :p


At low fps, a few frames of input lag can be your worst enemy.

In Arma you won't notice this often: for the buffer of 3 frames to fill up, you need to be GPU limited, so that the GPU starts lagging behind the CPU and can sit at 100% the whole time, keeping things as smooth as can be. If you have a reasonable GPU you will never have low fps due to a GPU bottleneck in Arma; it's the CPU that doesn't keep up.

Reducing pre-rendered frames is a must on systems with relatively slow GPUs, like notebooks: way less input lag at low fps.


Sometimes it seems like the more crap's going on, the less the CPU is doing, which seems a bit odd to me.

This behaviour might indicate some really shoddily written code. :p


- If the CPU and the GPU are both at 100% usage then, ideally, the system is perfectly balanced and the engine is working fine

- If the CPU is at 30% and the GPU at 100%, then the CPU is bottlenecked by the GPU

- If the GPU is at 30% and the CPU at 100%, then the GPU is bottlenecked by the CPU

- If neither is ever at 90-100%, then the game engine is broken somewhere, which is the case for Arma 3 right now

Plain and simple.
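That rule of thumb is simple enough to check against logged monitoring data. A minimal sketch in Python, assuming you have per-sample CPU and GPU utilization percentages exported from your monitor of choice (the function name and the 90% threshold are my own choices, not anything official):

    def diagnose(cpu_samples, gpu_samples, busy=90.0):
        """Classify a bottleneck from average CPU/GPU utilization (percent)."""
        cpu = sum(cpu_samples) / len(cpu_samples)
        gpu = sum(gpu_samples) / len(gpu_samples)
        if cpu >= busy and gpu >= busy:
            return "balanced: both saturated, engine is using the hardware"
        if gpu >= busy:
            return "GPU-bound: the CPU is waiting on the GPU"
        if cpu >= busy:
            return "CPU-bound: the GPU is waiting on the CPU"
        return "neither saturated: an engine problem, not a hardware one"

    # Readings like the ones reported in this thread:
    print(diagnose(cpu_samples=[30, 28, 33], gpu_samples=[42, 40, 45]))
    # -> neither saturated: an engine problem, not a hardware one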

Edited by k3lt

What does it mean if the CPU is at 100%? 100% of what, exactly?

Utilization/usage. And I know it's nearly impossible to hit 100% on a CPU running a game; realistically it's about 80-90% depending on the game/engine. It was just an example. :)


@NeuroFunker

Your guide is half good. It points in the right direction, but there's so much wrong stuff. Let me start:

(I have a 670 too, with a 3770K and 16GB of RAM)

Tab - BASIC.

Resolution, gamma, brightness and interface size aren't important, so let's begin with Visibility:

I can pull a 5k overall view distance with fps around 50, which is OK for me and, I think, for most of the people here. I keep objects at 1600 max; it might sound like a big difference from the overall rendering distance, but in reality it looks great and I find it optimal. I agree on the shadows part, though.

Overall - you set it to 2.8k (I use 5k)

Object - you set it to 1.8k (I use 1.6k)

Shadow - 200m

Next Tab - Rendering:

Rendering Res - 100%

Vsync - enabled (why you would want this enabled while averaging 30 fps is beyond me; I always use adaptive vsync, and I recommend turning this off for both NV and ATI users unless you get 60+ framerates)

It is not correct that AA and AtoC don't affect performance; they kill performance. Just try looking through a scope with dense vegetation in front of you. Good luck! Enjoy a slide show.

Antialiasing - 8xAA (wtf?) - I disable AA and use only PPAA

PPAA - FXAA Ultra

AtoC - All trees + grass - if you want to keep AA (2-4x max advised), disable this unless you enjoy a slide show whenever you aim through a scope with dense grass

Postprocess Quality - Normal (I keep this at the max setting; it doesn't affect my performance the way you described)

HDR Quality: Standard

Anisotropic Filtering - Very High (I keep it at Ultra)

PiP (Picture in Picture)- Ultra

Dynamic Lights - Ultra

Next Tab - Quality:

Texture Quality - Ultra (I keep it at Very High; there's no visible difference, and Ultra just eats more VRAM for no purpose)

Objects Quality - Standard (I use High)

Terrain Quality - Standard (I use High)

Clouds Quality - Ultra

Shadows Quality - High

Particles Quality - Very High (I keep it at High; there's a slight difference, but much better fps when there's a lot of smoke, for example)

So, with my settings, I give you better overall quality and a better framerate. The most noticeable killers in your settings are, in order (highest to lowest): Vsync, AtoC (AA), and object render distance. You can gain more by pushing Objects and Terrain Quality to High and the overall view distance to 5 km.
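For what it's worth, the view and object distances also live as plain values in your user profile file, so you can set them there directly. A sketch, assuming the alpha follows the usual convention (a .Arma3AlphaProfile file next to arma3alpha.cfg in Documents) and the key names I remember from earlier Arma titles; check your own file before editing:

    viewDistance=5000;
    preferredObjectViewDistance=1600;
    shadowZDistance=200;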

Edited by Minoza

Utilization/usage. And I know it's nearly impossible to hit 100% on a CPU running a game; realistically it's about 80-90% depending on the game/engine. It was just an example. :)

No, I mean in more detail. It was rather a rhetorical question of mine; I wanted to go deeper into that matter with an open mindset.

Because, as I understand it, it's always just some kind of approximation of the real thing. But I really have no technical hardware insight.

Just an example of how I think it works: let's say your CPU has 10 cycles or time slices in which to do something, and for 9 of them it does nothing, but then along comes this one task and the CPU is at 100%. Now you measure your CPU load, but the measurement cannot resolve every small slice; say it averages over all 10. So it will tell you the CPU load is 10%, even though it was 100% at some moments and 0% at others.
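To make that concrete, a toy illustration in Python (purely made-up numbers, not a real measurement):

    # Ten time slices: the CPU is fully busy in one and idle in nine.
    slices = [100, 0, 0, 0, 0, 0, 0, 0, 0, 0]

    # A monitor that only reports the average over the whole window says:
    print(sum(slices) / len(slices))  # 10.0 -> "10% load"

    # ...even though the actual work ran at 100% for its slice, and
    # anything waiting on its result (the next frame) had to wait for it.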

So what's basically needed is some splitting of tasks across several time slices? Because my CPU sits at around 30% load in Arma, but when I overclock it I get double the fps and still only 30% load. So when overclocked, those little spikes that need 100% of the CPU are calculated much faster, but there are still those times when the CPU is idle. Right?

So it all comes down to this: is it possible to split the calculations across several time slices?

And I always thought CPUs have several kinds of instructions, like integer operations, floating point, and some other special stuff (described in the CPU manuals). So if we find out which operations/functions Arma mostly needs, we could rethink the code behind those calculations?

The problem is: what if we have 10 tasks and 10 time slices, but every task needs the output of the task before it? Then each task must wait for the prior one, and only then can it start its own work.
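That's the serial-dependency problem in a nutshell. A toy sketch in Python (hypothetical stages, not Arma's actual pipeline): each stage consumes the previous stage's output, so no amount of extra cores lets them run at the same time:

    # Hypothetical per-frame stages; each needs the previous result.
    def read_input(state):  return state + 1
    def run_ai(state):      return state * 2
    def run_physics(state): return state - 3

    state = 0
    for stage in (read_input, run_ai, run_physics):
        state = stage(state)  # must wait for the stage before it
    print(state)  # -1 - one fixed order, no parallelism possible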

Normally such complicated tasks and calculations are only for science stuff, etc., but maybe herein lies Arma's problem: the AI?

But then I don't understand why we get low fps even in multiplayer, without AI?

Basically, without looking deep into the code, and without some hardware understanding as well, we can't really tell what the problem is, or whether it is even possible to "fix" it. Right?

Edited by tremanarch

