k3lt

Low CPU utilization & Low FPS


Ah, but that is not a good test unless you had a script delete all the bodies and other ground objects (blood, casings, dropped weapons, etc.).

I thought as much. So objects lying around in the world cause FPS drops too, and I'd think that could be optimized on today's computers! A wild guess.

I like Arma's hyper-realistic approach, but what if it could delete generic items like helmets and vests after 30 minutes or so to clear out some of the junk, and maybe "magically" merge items lying within 50 m of each other into one stack? Stuff like that might work. I don't know the code, but most of the objects generated by dead bodies are clothing and gear no one uses...
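Something like this is already scriptable today. A rough, untested sketch in SQF (allDead, allMissionObjects and the GroundWeaponHolder class are standard scripting names; the 30-minute interval is just the suggestion above):

```
// Untested sketch: every 30 minutes, delete corpses and the
// "weapon holder" containers that hold gear dropped on the ground.
while {true} do
{
    sleep 1800; // 30 minutes
    { deleteVehicle _x } forEach allDead;
    { deleteVehicle _x } forEach (allMissionObjects "GroundWeaponHolder");
};
```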


I'm not sure those modules are already in A3, but in A2 there are modules in the mission editor that spawn and despawn units just beyond player range. Takes a little effort, though.


Tried adding -winxp and -exthreads=7 to the launch parameters. No difference in FPS at all.

tl;dr: launch parameters do absolutely nothing for performance.
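For anyone comparing notes, these are the startup parameters most often suggested for performance (values here are illustrative, and as this post says, they may well change nothing):

```
arma3.exe -cpuCount=4 -exThreads=7 -maxMem=2047 -malloc=tbb4malloc_bi -noSplash
```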


Those who always expect their multicore CPU to be maxed out by games fail to realize that there is always overhead from syncing, or a minimum timeframe needed to finish an operation on the actual primary thread.

There is also Amdahl's law (http://en.wikipedia.org/wiki/Amdahl's_law) and many more problems in multithreaded coding (there are whole books about it).

So 99+% utilization of both CPU and GPU, or of all of multiple CPUs/GPUs, in complex games is yet to be seen; those figures belong to benchmarks and specialized tasks...

We will work on improving the multithreaded capability of the Arma 3 engine,

yet this feature has been in the Arma 2 engine since 2009: http://www.bistudio.com/english/company/developers-blog/91-real-virtuality-going-multicore

Ironically, the last paragraph of the article still applies.

Edited by Dwarden
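For reference, the cited law puts a hard ceiling on what extra cores can buy. If a fraction $p$ of a frame's work can run in parallel across $N$ cores, the overall speedup is

$$S(N) = \frac{1}{(1-p) + p/N}$$

so a simulation that is half serial ($p = 0.5$) can never run more than twice as fast no matter how many cores are added; pushing every core to 99% would require an almost perfectly parallel workload.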


Personally, I don't expect my CPU to be maxed out by Arma 3, but not ~30% utilization either. Since I have well over the recommended specs, I would expect at least 30 FPS on low visual settings; instead I'm stuck with 15-20 (on populated multiplayer servers, or with large numbers of AI/vehicles, even lower than that).

And if you've noticed, even people with latest-generation Intel i5/i7s have issues obtaining a stable FPS.

http://feedback.arma3.com/view.php?id=716&nbn=83#bugnotes

Edited by k3lt

> Those who always expect their multicore CPU to be maxed out by games fail to realize that there is always overhead from syncing, or a minimum timeframe needed to finish an operation on the actual primary thread.
>
> There is also Amdahl's law (http://en.wikipedia.org/wiki/Amdahl's_law) and many more problems in multithreaded coding (there are whole books about it).
>
> So 99+% utilization of both CPU and GPU, or of all of multiple CPUs/GPUs, in complex games is yet to be seen; those figures belong to benchmarks and specialized tasks...
>
> We will work on improving the multithreaded capability of the Arma 3 engine, yet this feature has been in the Arma 2 engine since 2009: http://www.bistudio.com/english/company/developers-blog/91-real-virtuality-going-multicore

Just get the game to where I won't need a $3000 computer to run it at 40 FPS in towns or around heavily populated AI. I should be able to run this game with my card and CPU with no problem. I do hope for the best and have some faith in you devs; get 'er done, please.

> Personally, I don't expect my CPU to be maxed out by Arma 3. Since I have well over the recommended specs, I would expect at least 30 FPS on low visual settings; instead I'm stuck with 15-20 (on populated multiplayer servers, or with large numbers of AI/vehicles, even lower than that).
>
> And if you've noticed, even people with latest-generation Intel i5/i7s have issues obtaining a stable FPS.
>
> http://feedback.arma3.com/view.php?id=716&nbn=83#bugnotes

Expect the MP part to improve when servers get true dedicated servers...


Oh, actually it's good that you mentioned that... how is it even possible that multiplayer servers affect a client's performance (FPS-wise)? I think the Arma franchise is the only game around with that issue. Could you shed some light on it, i.e. how exactly does that happen?

Also, could you predict what kind of percentage performance gain we can expect from true dedicated servers? (Obviously not asking for an exact number, just an estimate.)

Throw us poor souls a bone. ;)

Edited by k3lt


I'm sorry, but this is a fallacy. I can expect this engine to do so, as CryEngine can do so, and it -is- your competition. Complex games are already in the works, or out in beta as we speak, that crunch unbelievable amounts of algorithms while pushing CPUs well in excess of 75% usage, closer to 90%. It can be done, and the fact that Arma cannot is one of the most defining reasons it hasn't taken off. A dual-core i3 can run this game at virtually the same speed, clock for clock; this was certainly true of Arma 2. This is unacceptable, period. CryEngine can get into as much detail as water currents across rivers, spanning islands much larger than Stratis, so stop pulling our legs.

I'm not stupid, and neither are the people who wish to play your game. Get it in gear; make this game playable for everyone, and reward those who bought serious machines to play it. The fact that my older machine sees no FPS difference either way (aside from 3D resolution, not enough VRAM), yet NEITHER my GPU nor my CPU is pushed much past 30-50% and I lag regardless, is -NOT ACCEPTABLE, PERIOD-. Get it together, so that we can make this one of the largest games on PC and show people what a -real- PC game is.

P.S.: Love Arma, love B.I., and will continue to support you; I bought the digital collector's pack and will buy all DLC and expansions as I did before. But it's hard to get my friends to play, because they have $3,000-7,000 custom-built rigs that can't run the game any better than vastly inferior machines, all of which lag. It's a major put-off. And to anyone who says you don't need high FPS because of the pace of the game: that is a HORRIBLE and SERIOUSLY incorrect mindset. Smoothness is smoothness; if you move at all, if anything moves, you want it to be smooth. People can and do see higher FPS; the eye can be trained like any other part of your body. Stop with the FUD.

Edited by Kevaskous


I guess people don't see the massive difference. I don't want to take sides, just to offer some ideas:

Arma gives high priority to syncing all those objects, etc.,

while Battlefield 3 doesn't sync nearly as much. Have you driven two jeeps together in BF3? You see strange rubber-banding lag. Chests and similar objects aren't synced between clients at all!


What about a benchmark mission, like ArmA2Mark, in order to test different settings reliably across different PCs?

The engine supports autotest benchmarking, so it's just a question of time.

> I guess people don't see the massive difference. I don't want to take sides, just to offer some ideas:
>
> Arma gives high priority to syncing all those objects, etc.,
>
> while Battlefield 3 doesn't sync nearly as much. Have you driven two jeeps together in BF3? You see strange rubber-banding lag. Chests and similar objects aren't synced between clients at all!

Syncing is mostly a matter of bandwidth and latency; like anything it has some performance cost, but not of the magnitude we see now.


Did a couple of co-op missions with a friend. The FPS I get there is magnitudes higher than on normal MP servers or Wasteland servers... but I suppose it was just 2 players vs. AI, hence the decent FPS.

> Just get the game to where I won't need a $3000 computer to run it at 40 FPS in towns or around heavily populated AI. I should be able to run this game with my card and CPU with no problem. I do hope for the best and have some faith in you devs; get 'er done, please.

You need a new video card. See my other posts: I had a 6850, upgraded, and now I don't complain anymore.

The game runs fine, and I have an even worse CPU.

I got a 7870 for $240 and it runs the game way better.


I'm confused as to how the CPU and Arma work together. I have an AMD 6100 with '6 cores'; I know not all 6 are used for the game, as I've checked with Afterburner, but what kind of percentages should I be looking for on each core and overall? Can somebody give me a brief rundown? Thanks.
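A rough rule of thumb, not an official figure: the overall CPU percentage in Task Manager or Afterburner is averaged across all cores, so a game that keeps about two threads busy on a 6-core chip would read roughly

$$\frac{2\ \text{busy cores}}{6\ \text{cores}} \approx 33\%\ \text{overall}$$

i.e. expect one or two cores near 100% while the rest sit mostly idle; the ~30-35% overall readings reported for 6-core CPUs later in this thread fit that pattern.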

> Those who always expect their multicore CPU to be maxed out by games fail to realize that there is always overhead from syncing, or a minimum timeframe needed to finish an operation on the actual primary thread.
>
> There is also Amdahl's law (http://en.wikipedia.org/wiki/Amdahl's_law) and many more problems in multithreaded coding (there are whole books about it).
>
> So 99+% utilization of both CPU and GPU, or of all of multiple CPUs/GPUs, in complex games is yet to be seen; those figures belong to benchmarks and specialized tasks...
>
> We will work on improving the multithreaded capability of the Arma 3 engine,
>
> yet this feature has been in the Arma 2 engine since 2009: http://www.bistudio.com/english/company/developers-blog/91-real-virtuality-going-multicore
>
> Ironically, the last paragraph of the article still applies.

That's true, but it should still get pretty high. For example, I am currently compiling the entire Linux 2.6 kernel with make and the -j 4 parameter to use all 4 cores (for an operating systems computer science course). CPU usage is still 100 percent on at least one core most of the time.

If the process niceness algorithm in the OS is decent, we should still be seeing decent speed even with all the context switches and so on. Unless the paging is messed up and we end up page-faulting constantly, wasting time fetching pages from disk; that doesn't seem to be the case, because people with SSDs still get this problem.

From what I read of that last paragraph, I wouldn't say that using the CPU is as simple as spawning a bunch of threads and spinning them in an infinite loop. Firstly, the OS wouldn't let that one process take up all the CPU's time; it's not running a FIFO process-management algorithm or anything (that would be disastrous for concurrency). High CPU usage should mean that each process runs its fair share of time on the CPU, then gets switched out for another process depending on its niceness value. Even the overhead of finishing an operation on the primary core shouldn't mean the CPU idles and its usage tanks; the scheduler should immediately load another runnable process to work on.

Edited by ruhtraeel
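For concreteness, the build described above is parallelized with make's jobs flag, which runs up to that many compile jobs at once:

```
make -j4    # run up to 4 compile jobs in parallel, roughly one per core
```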


I'm not a C/C++ developer, but if there are any out there, could they try building a malloc implementation with the latest version from http://threadingbuildingblocks.org/? I can see that they released an update on February 7, 2013.

I don't know if the update brings any major performance gains.

I'm currently using -malloc=TCMalloc_bi, as I think it performs the best.

-malloc=TBB4Malloc_bi is good as well, so it could be fun to compare the current version we have with the updated one. :-)
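A simple A/B comparison would just be two launches of the same scenario with different allocators selected via the startup parameter (allocator names as used in the posts above):

```
arma3.exe -malloc=TCMalloc_bi      (run 1)
arma3.exe -malloc=TBB4Malloc_bi    (run 2, then compare average FPS)
```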


You can add me to the list.

I've got a Q6600 overclocked to 3.5 GHz, with a GTX 670 running at 1200 MHz. The problem exists for me too: I have low CPU usage, and in turn my GTX 670 is not used to its full capacity. I get ~20 FPS with settings turned to high, but not ultra.

Moreover, the longer I play, the slower the game becomes. At one point, playing MP, I got down to 9-11 FPS, so I turned everything down to low and lowered the resolution from 1920x1200 to 1280x768, and the FPS did not change.

I doubt this is a problem with my hardware, as every other game does take advantage of everything I have.

Arma 2 did this too, by the way, so whatever was carried over to Arma 3 included whatever problem I, and many others on here, seem to be having.

I used CPU affinity to make Arma 3 use 2 cores instead of all cores, and the FPS did not change, meaning not even 2 cores are being fully used. I tried all the parameters that were mentioned in this thread and got no performance increase.

My specs again:

Intel Q6600 2.4 GHz @ 3.5 GHz

GTX 670 2GB @ 1200 MHz

8GB 1066 DDR2 memory, 5-5-5-15

Edited by Quakky
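For anyone repeating the affinity experiment without third-party tools: Windows can pin a process to specific cores at launch with a hex bitmask (0x3 = cores 0 and 1; the executable path is illustrative):

```
start /affinity 0x3 arma3.exe
```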

> I'm sorry, but this is a fallacy. I can expect this engine to do so, as CryEngine can do so, and it -is- your competition. Complex games are already in the works, or out in beta as we speak, that crunch unbelievable amounts of algorithms while pushing CPUs well in excess of 75% usage, closer to 90%. It can be done, and the fact that Arma cannot is one of the most defining reasons it hasn't taken off. A dual-core i3 can run this game at virtually the same speed, clock for clock; this was certainly true of Arma 2. This is unacceptable, period. CryEngine can get into as much detail as water currents across rivers, spanning islands much larger than Stratis, so stop pulling our legs.
>
> I'm not stupid, and neither are the people who wish to play your game. Get it in gear; make this game playable for everyone, and reward those who bought serious machines to play it. The fact that my older machine sees no FPS difference either way (aside from 3D resolution, not enough VRAM), yet NEITHER my GPU nor my CPU is pushed much past 30-50% and I lag regardless, is -NOT ACCEPTABLE, PERIOD-.
>
> P.S.: Love Arma, love B.I., and will continue to support you. But it's hard to get my friends to play, because they have $3,000-7,000 custom-built rigs that can't run the game any better than vastly inferior machines, all of which lag. And to anyone who says you don't need high FPS because of the pace of the game: smoothness is smoothness; people can and do see higher FPS, and the eye can be trained like any other part of your body. Stop with the FUD.

No, you CAN'T! Can you jump 300 meters high if you train hard?

It's not the eye; it's the brain that "computes" that.

30 is the max; you will not notice the difference between 40, 60, or 200 FPS unless you move your mouse like an epileptic, though you may perhaps notice some ghost images at 30-40.


The eye doesn't see FPS the way cameras do. 24 FPS is the minimum for our brain to perceive a series of pictures as continuous motion, but in fact our eyes can notice 400 FPS or more. There is a clear difference between 100 FPS and 30 FPS; to me, 30 FPS is choppy and broken: playable, but not smooth. Around 60 FPS it's smooth enough that it plays great, but 80-120 FPS is completely smooth to me. That's the kind of FPS I play at in fast-paced shooters like Counter-Strike and Hard Reset (where there's a lot going on and the difference in FPS becomes obvious), among others.

This "30 FPS is all the human eye can see" stuff is a myth.

People do jump higher with training, though of course there's a limit; with the eye, it's more about learning how to perceive the difference than training the eye to go higher. By the way, there's a reason 120-144 Hz gaming monitors exist. Your reflexes, on the other hand, are something you provably can train to get a lot faster.

http://www.100fps.com/how_many_frames_can_humans_see.htm

----------

> Those who always expect their multicore CPU to be maxed out by games fail to realize that there is always overhead from syncing, or a minimum timeframe needed to finish an operation on the actual primary thread.
>
> There is also Amdahl's law (http://en.wikipedia.org/wiki/Amdahl's_law) and many more problems in multithreaded coding (there are whole books about it).

Of course there are, but when CPU utilization won't go higher than 35% (on a true 6-core CPU) at any given time and the game drops as low as 10 FPS (with 35% CPU and 40% GPU usage on ultra), I think complaining is completely reasonable. As for syncing, I accept that as a performance cost, but when someone else on the same server has a CPU with higher per-core performance and gets higher FPS, and that pattern holds consistently, it shows the game is per-core dependent; the issue becomes clear that the fault is in not using the CPU properly (more than 2 cores).

> So 99+% utilization of both CPU and GPU, or of all of multiple CPUs/GPUs, in complex games is yet to be seen; those figures belong to benchmarks and specialized tasks...

Actually, some gaming benchmarks at low resolution do get close to 100% CPU usage, and they're used for CPU performance comparisons on review sites. But that's not really what we are asking for; even 75% would be fine. The point is that being limited to 2 cores is bad in CPU-bound software. 50% usage on 4 cores and 35% on 6 cores clearly shows the game is bound by 2 cores at most. If Bohemia officially states that this is how it is and we need dual cores with better per-core performance, fine, but don't state that the game is multicore, because it isn't.

> We will work on improving the multithreaded capability of the Arma 3 engine,
>
> yet this feature has been in the Arma 2 engine since 2009: http://www.bistudio.com/english/company/developers-blog/91-real-virtuality-going-multicore
>
> Ironically, the last paragraph of the article still applies.

That's all we want to hear: that you will work on it, hopefully making multicore work properly for the first time in an Arma game. Can Bohemia look into this issue (it's already reported, with the maximum number of votes) and make a statement on whether proper multicore support will be possible, or whether this usage will continue as-is? My worry is that despite your efforts to optimize the game further, which is natural, this specific problem won't be addressed or recognized, like it wasn't in Arma 2. Issues like this are completely acceptable in an alpha, but if the game launches like this I will ask for my money back. (Had to say it; most companies only respond to that.)

And that article is wrong in practice: if I turn off 4 of my 6 cores, I get exactly the same performance in both Arma 2 and Arma 3. Neither game uses more than 2 cores' worth of work; Windows just makes it look like it does, because it keeps switching which 2 cores are in use at any given moment among the available ones, but the workload never exceeds the equivalent of exactly 2 cores, like I said before. Also, in 3D rendering, using more cores scales almost perfectly no matter how many are thrown at it; it all depends on how the workload is coded.

If that wasn't intended, the code is broken in both games and you haven't realized it so far, despite years of people shouting about this. But hey, here it is.

Anyway, that's my only major gripe and game-breaking issue with the Arma series: the CPU being severely underutilized (FPS dropping very low, affecting how shootouts are decided). I could go on about all the things I do love about the series, but I guess that's a different topic.

Edited by white

> No, you CAN'T! Can you jump 300 meters high if you train hard?
>
> It's not the eye; it's the brain that "computes" that.
>
> 30 is the max; you will not notice the difference between 40, 60, or 200 FPS unless you move your mouse like an epileptic, though you may perhaps notice some ghost images at 30-40.

You are a complete idiot, and have clearly only been playing on a shit PC. You do not see in FPS; there is NO LIMIT to the amount of FPS you can see, no limit. What happens is that there are diminishing returns the higher you go. You're never going to see the difference between 1,000,000 and 1,000,030 FPS, but you would still be seeing the 1,000,030. There is a huge difference between 40 and 60 FPS, and just as big a difference between 60 and 120 FPS. Go out, get a good graphics card and a 120 Hz monitor, and you will see the difference.
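The diminishing-returns point is just frame-time arithmetic: each step up in frame rate buys a smaller reduction in per-frame latency.

$$t_{\text{frame}} = \frac{1000~\text{ms}}{\text{fps}}: \qquad 30~\text{fps} \approx 33.3~\text{ms}, \quad 60~\text{fps} \approx 16.7~\text{ms}, \quad 120~\text{fps} \approx 8.3~\text{ms}$$

Going from 30 to 60 fps cuts 16.7 ms per frame; 60 to 120 cuts only a further 8.3 ms, which is why each doubling is less dramatic but still visible.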

> No, you CAN'T! Can you jump 300 meters high if you train hard?
>
> It's not the eye; it's the brain that "computes" that.
>
> 30 is the max; you will not notice the difference between 40, 60, or 200 FPS unless you move your mouse like an epileptic, though you may perhaps notice some ghost images at 30-40.

This is completely and absolutely wrong. It takes 24 consecutive frames per second for your brain to take a string of images and perceive it as "video". That's true. But the eye sees much, much more.

Of course, the higher you go, the harder it is to notice; that is also true. But that point is well into the hundreds. Go compare a 24 FPS video with a 48 FPS video. Go watch a 60 FPS Twitch stream. It's a night-and-day difference, especially in competitive fast-paced combat. You are much more aware of your surroundings when everything updates that fast, and you can move the view much faster and still get a clear glance at what's in that area.

> You are a complete idiot, and have clearly only been playing on a shit PC. You do not see in FPS; there is NO LIMIT to the amount of FPS you can see, no limit. What happens is that there are diminishing returns the higher you go. You're never going to see the difference between 1,000,000 and 1,000,030 FPS, but you would still be seeing the 1,000,030. There is a huge difference between 40 and 60 FPS, and just as big a difference between 60 and 120 FPS. Go out, get a good graphics card and a 120 Hz monitor, and you will see the difference.

Just to add to this: the minimum frame time (i.e., maximum frame rate) at which your brain registers enough information to form an image is about 1/220 of a second (220 FPS). Past that point you'll only register differences in smoothness or responsiveness; you won't gain any more "visual smoothness". In other words, watching a movie at more than 220 FPS is unnecessary, but when playing a game or interacting with anything, you can keep pushing until response time is minimal, which is far beyond 220 FPS.

