Low CPU utilization & Low FPS

please stop using steam hardware survey...

oh, i should be 2 of the other 5 guys then, since i have 2 pcs running same steam account, both clocked @ 4ghz. I wonder who the other 3 are :rolleyes:

One of them would be me. ;)

Btw, what Grumpy Old Man said seems quite valid: if there is nothing CPU-heavy going on in the game, it's no wonder that certain worker threads have nothing to do.

please stop using steam hardware survey...

oh, i should be 2 of the other 5 guys then, since i have 2 pcs running same steam account, both clocked @ 4ghz. I wonder who the other 3 are :rolleyes:

What a coincidence! I'd be the remaining two after counting MadDogX! :eek:


I should have clarified what I said: I wasn't idle. I was idle in-game, but the server wasn't. It was actually Bohemia's Zeus server, a US one, running NATO Zeus Master Altis with 100% of the slots filled, so there was a lot of activity going on in-game. I'll try to get a 240-second recording. Now here is a bit of interesting information: let's throw the CPU graph and the storage and memory activity into one analysis page, which will link the time differences together.

http://i.imgur.com/pK3U9yn.png (214 kB)

There is actually no relationship between the memory being mapped and the CPU. But I will do a more detailed recording and see what I can get out of it.

http://i.imgur.com/tz5CTvF.png (150 kB)

Both traces only seem to find a relationship between CPU time, thread use and the thread timeline, and again, on a full Zeus server the worker threads ARE doing... wait for it... nothing. That being said, the thread operation time seems to be taken up by readying threads and context switching them. So, by raw count, events seem to be the primary thing taking up time on the CPU? The events graph is the closest match to the thread-utilisation graph. If that is so, why is Arma 3 doing so much thread readying and context switching?

Page faults appear to have no relationship. Nor do read-write operations.

http://imgur.com/2Qb2vv4,YP3bVMc

I think this is more than enough to conclude that Arma 3's multi-threading is inefficient. It's quite fascinating to note that BattlEye has a very linear relationship when it comes to committing memory. Very linear.
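For anyone who wants a lighter-weight look at the same counters without a full WPA trace, here is a minimal sketch using Python and the third-party psutil module (the process name is a placeholder I picked, and on Windows the voluntary/involuntary context-switch split may not be reported separately):

```python
import psutil  # third-party: pip install psutil

PROC_NAME = "arma3.exe"  # hypothetical process name, adjust as needed

# Find the first process whose name matches.
proc = next((p for p in psutil.process_iter(["name"])
             if (p.info["name"] or "").lower() == PROC_NAME), None)
if proc is None:
    raise SystemExit(f"{PROC_NAME} not found")

print("pid", proc.pid)
for _ in range(10):                       # ten one-second samples
    cpu = proc.cpu_percent(interval=1.0)  # can exceed 100% on multi-core CPUs
    ctx = proc.num_ctx_switches()         # cumulative; the voluntary/involuntary
                                          # split may not be available on Windows
    print(f"cpu={cpu:6.1f}%  threads={proc.num_threads():3d}  "
          f"ctx_switches={ctx.voluntary + ctx.involuntary}")
```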

Edited by Polymath820

please stop using steam hardware survey...

oh, i should be 2 of the other 5 guys then, since i have 2 pcs running same steam account, both clocked @ 4ghz. I wonder who the other 3 are :rolleyes:

OK, I am the next one :p


Hi all

Just ran some tests today with DDR3 speeds in Arma 3; here are the results:

Resolution 100% (1440x900)

i7 4790k (stock)

Asus ROG Maximus VII Hero

Kingston HyperX Beast DDR3 2400 8gb

Gigabyte Windforce 7870 OC @1100

Benchmark Altis V 0.60 (2 RUNS)

DDR3 1600 - 52-54 FPS

DDR3 2400 - 62-63 FPS (XMP Settings)

Game Settings

Texture: Ultra

Objects and Terrain: High

Distance: 3000

Object Distance: 2000

FSAA: 4X

Shadow: Ultra

Cloud: Ultra

HDAO: High

Particles: High

FXAA: Ultra

ATOC: All Trees and Grass

ANISO. FILT: Ultra

Glad I bought faster RAM :yay:
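As a quick sanity check on those numbers (illustrative arithmetic only, using the midpoints of the FPS ranges quoted above), the faster RAM works out to roughly an 18% gain:

```python
# Midpoints of the reported benchmark ranges.
fps_ddr3_1600 = (52 + 54) / 2
fps_ddr3_2400 = (62 + 63) / 2
gain = (fps_ddr3_2400 / fps_ddr3_1600 - 1) * 100
print(f"DDR3-2400 vs DDR3-1600: +{gain:.0f}% FPS")  # roughly +18%
```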


The benchmark has you teleporting all over the map, with assets being loaded from disk and transferred to RAM, so bandwidth matters. But how is performance impacted when you are in a town and have been there for a while? Nothing should be loading from the hard disk, so bandwidth should no longer be an issue.

I tried local LAN yesterday (just me running client + server). Tried with JSRS, BabeMidTex and a bunch of mods. 20 FPS in town (Kavala), tops.

Ran without mods: 20 FPS tops. Mods had no noticeable impact.

I'm not running Blastcore. And GL5 really drops my FPS by 10, because the town gets set on fire and half of it is burning after a while (tested a week ago).

7-10 fps is not playable.

I should mention I have 250-300 AI on that mission. Tried with unit caching and without it (most AI are in town, so it doesn't cache a lot of units); very little gain, if any. Tried with AI mods (bCombat, FFIS, and another whose name I forgot) and without them.

Tried Process Lasso. Tried overclocking from 4 GHz to 4.4 GHz (both memory and multiplier). Doesn't matter what I throw at it: same FPS, and the GPU is idling.

Edited by mamasan8

There is actually no relationship between the memory being mapped and the CPU. [...] I think this is more than enough to conclude that Arma 3's multi-threading is inefficient.

There will always be a relationship between mapped memory and the CPU.

Context switching is when a process is switched out for another one; this takes an unavoidable amount of time. It happens when the task at hand is waiting on the synchronization of timings/threads (pretty much the basis of Amdahl's law), or when the task is waiting for I/O (THIS INCLUDES MEMORY!!). What this tells me is that it's not a stupidly large amount of data being referenced from memory at each call (that would cause lock-ups and stuttering), but rather a LOT of references being made in total.
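As a rough illustration of the Amdahl's law point (a minimal sketch with made-up parallel fractions, not measured Arma numbers): even a modest serial portion caps what extra cores can do.

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup when only part of the work can run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# With only half of the frame time parallelizable, even 8 cores barely help.
for p in (0.5, 0.9, 0.99):
    print(f"p={p:.2f}: 4 cores -> {amdahl_speedup(p, 4):.2f}x, "
          f"8 cores -> {amdahl_speedup(p, 8):.2f}x")
```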

This makes sense, because while the memory operation itself might not take that long, the context switch back to the task when the memory reference has completed ends up taking even more time. So it's not that the multi-threading is inefficient per se, but rather that the memory references cause the threading to BECOME inefficient.

All in all, the problem is that things are referenced in memory far too often; instantiated game objects have their pointers and are constantly being referenced. This explains why people gain so much from faster RAM.
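To see why the access pattern (and therefore memory latency) matters, here is a small self-contained sketch that follows the same number of "object links" once in memory order and once in a shuffled order. It is illustrative only; CPython adds interpreter overhead and the exact gap will vary, but scattered references consistently come out slower.

```python
import random
import time

N = 1 << 21  # ~2 million linked "objects"

# Links laid out in memory order: node i points to i+1.
next_seq = list(range(1, N)) + [0]

# The same cycle visited in a shuffled order (poor cache locality,
# similar to chasing scattered game-object pointers every frame).
perm = list(range(N))
random.shuffle(perm)
next_rand = [0] * N
for i in range(N):
    next_rand[perm[i]] = perm[(i + 1) % N]

def chase(links, steps):
    """Follow `steps` links starting from node 0."""
    node = 0
    for _ in range(steps):
        node = links[node]
    return node

for name, links in (("sequential", next_seq), ("random", next_rand)):
    start = time.perf_counter()
    chase(links, N)
    print(f"{name:10s} walk: {time.perf_counter() - start:.2f} s")
```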

Edited by ruhtraeel

Tried overclocking from 4 GHz to 4.4 GHz (both memory and multiplier). Doesn't matter what I throw at it: same FPS, and the GPU is idling.
I have the complete opposite result with lots of AI in a town: the more CPU or RAM overclocking, the more FPS. I assume your result is not representative :o

@Geraldus

Congrats :). Try to overclock your RAM a little bit!


Maybe someone wants to give the GeForce v340.52 Beta a go, as my 3DMark scores improved quite a bit, but I have no time at the moment to run some A3 benchmarks.

New benchmark scores with the latest GeForce v340.52 Beta:

3DMark Fire Strike (standard): 18240

Fire Strike (Extreme): 10185

:)

Maybe someone wants to give the GeForce v340.52 Beta a go, as my 3DMark scores improved quite a bit [...]

Does it really make any difference if you get better frames with the latest drivers? When AI is loaded and the fighting breaks out, you still get the same results as before.

Edited by Nikiforos

They're actually WHQL drivers as well, not beta.

Thanks for the info. Missed that.

@Nikiforos:

It depends. Sometimes you get an improvement in A3 and sometimes not.

:)


You know, I got a new GPU and I get the same frames running the Altis benchmark, same settings as before. Of course, when I'm alone on the map I get much better FPS than before, but I guess that's not the question.


New drivers remind me of this:

http://www.extremetech.com/gaming/180088-nvidias-questionable-geforce-337-50-driver-or-why-you-shouldnt-trust-manufacturer-provided-numbers

A lot of hyperbole. I wish it were true since I now own an Nvidia card. Been using ATI for 10 years prior. Never seen such ridiculous claims.

Btw, still getting 14-20 fps on my MP mission, run locally. Went from 337 to 340 drivers.

You know, I got a new GPU and I get the same frames running the Altis benchmark, same settings as before. Of course, when I'm alone on the map I get much better FPS than before, but I guess that's not the question.

That's really sad, and I wonder if I'll get any improvements...

:confused:

A lot of hyperbole. I wish it were true since I now own an Nvidia card. [...] Btw, still getting 14-20 fps on my MP mission, run locally. Went from 337 to 340 drivers.

Thanks for checking it out and sharing your outcome.

I don't really believe any ads without doing some research about it.

:)


Honestly, I think the days of driver updates massively improving performance are long gone. Barring, of course, some horrible bug in the driver that causes abnormally low performance on release of a certain title.

Honestly, I think the days of driver updates massively improving performance are long gone. Barring, of course, some horrible bug in the driver that causes abnormally low performance on release of a certain title.

Yeah, it's usually only after some new GPUs hit the market and they optimize the drivers for them or add new SLI profiles.

:)


That's not true, maverick. ~2 months ago Nvidia released a driver which pushed the frames up in several games. Arma 3 was one of the games which gained 0% from it. No surprise for me when I look at the "performance patches and fixes" implemented in the year since release: same amount as the driver -> 0%.

The driver didn't just benefit SLI; single-GPU systems got a great boost too... The driver is there, but getting the devs to start working with it to improve some frames... Hahahahahaha, makes me laugh, this will not happen.

SLI

http://abload.de/thumb2/780tisli-4770k-per_1s3kn8.png http://abload.de/thumb2/780tisli-3960x-per_1pcjc5.png

Single-GPU

http://abload.de/thumb2/780ti-3960x-perm4dq1.png http://abload.de/thumb2/750ti-3960x-perdzdw3.png

~2 months ago Nvidia released a driver which pushed the frames up in several games. Arma 3 was one of the games which gained 0% from it. [...]

Yeah, true, but that's not happening all the time, and not ALL games are affected.

:)


I just wonder... since those tests were with an i7, how much of the performance was gained because the previous drivers were buggy/insufficient, and were some optimizations turned on by default in the new driver? Like 'brilinear', aniso optimization, etc.

I mean, an i7 is hardly bottlenecking any system, so it's kind of irrelevant to test a system with an i7 installed. A mid-range GPU and CPU would have told us something.

I know both companies "optimize". Brilinear is an old trick both AMD and Nvidia have used for years: a mix between bilinear and trilinear filtering, depending on the situation.

Optimize = better performance, lower quality picture.
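For anyone curious what that trade-off looks like, here is a conceptual sketch (not any vendor's actual implementation; the band width is a made-up parameter): brilinear snaps to a single mip level whenever the computed LOD is close to an integer, and only does the full two-level trilinear blend in between.

```python
import math

def brilinear_mips(lod: float, band: float = 0.25):
    """Return (mip_low, mip_high, blend) for a 'brilinear' filter.

    Pure trilinear always blends two mip levels. Brilinear skips the
    blend (i.e. falls back to bilinear on one level) whenever the
    fractional LOD is within `band` of an integer, which saves texture
    fetches at a small image-quality cost.
    """
    low = math.floor(lod)
    frac = lod - low
    if frac <= band:              # close to the lower mip: bilinear only
        return low, low, 0.0
    if frac >= 1.0 - band:        # close to the upper mip: bilinear only
        return low + 1, low + 1, 0.0
    # In between, remap the remaining range to a full 0..1 trilinear blend.
    return low, low + 1, (frac - band) / (1.0 - 2.0 * band)

# Example: LOD 2.1 stays on mip 2, LOD 2.5 blends mips 2 and 3 equally.
print(brilinear_mips(2.1))   # (2, 2, 0.0)
print(brilinear_mips(2.5))   # (2, 3, 0.5)
```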


Doesn't matter; if the devs won't start to optimize the prehistoric engine (I bet they will never start), it's senseless to discuss anything related to performance. Nothing will increase FPS effectively and instantly until engine optimizations are done (never).

You can compare it to this: buy a Fiat Panda and put Lamborghini wheels on it; even with those self-fitted optimized parts, your Fiat won't reach 140 km/h. It's the same with Arma: you can optimize all you want, but Arma won't run faster, because that Fred Flintstone engine was never designed for it.


I have to agree with LSD_Timewarp82. People need to understand that BI is the only one who can optimize the game properly; nothing from the outside will do it.

If the devs won't start to optimize the prehistoric engine, it's senseless to discuss anything related to performance [...]

I can only agree with this if OFP is any guideline.

I fired it up a month ago, thinking I should get good FPS. I locked the frame rate at 100, but it still dropped to 60 FPS in places, sitting in the back of the truck with basically just the road and a couple of trees being drawn. What surprised me was that the dip came when OFP was drawing LESS.

In any other game from that time period I would probably get 300-500 FPS.

Then I think about Mafia and Hidden & Dangerous, which also had abysmal performance that never got fixed.

http://www.rockpapershotgun.com/2011/07/26/czech-veterans-form-new-studio-warhorse/

Think that game is a buy? =)

Edited by mamasan8


Got a second R9 280X real cheap for a CrossFireX config. Loaded up Arma 3 and was surprised to get about 1.5x the usual FPS, around 90-110 FPS; I wasn't really expecting the scaling to be all that great, even though it is in most everything else bar some niche games and flight simulators. Hop into a multiplayer server, and it's back down to 24 FPS. SMH.

