
Erikson20

What is bottlenecking my PC?


Hi. Today I just bought a new graphics card, but I still only manage to get up to 30 FPS (down to 15 FPS in towns).

My specs are:

Graphics card - AMD Radeon R9 290

CPU - AMD Phenom II X6 1090T

RAM - 8GB DDR3

And the startup parameters I use are:

-cpuCount=6 -exThreads=7 -maxMem=7168 -maxVram=4095 -noPause -noSplash -world=empty -high -malloc=tbbmalloc
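
For reference, here is a minimal Python sketch of how I pass those parameters to the executable from a script instead of a shortcut (the install path is a placeholder, and the values are simply the ones listed above, unchanged):

import subprocess

# Minimal launcher sketch: the path below is a placeholder, and the parameters
# are just the ones from this post, passed through unchanged.
ARMA3_EXE = r"C:\Games\Arma 3\arma3.exe"  # placeholder install path

params = [
    "-cpuCount=6", "-exThreads=7", "-maxMem=7168", "-maxVram=4095",
    "-noPause", "-noSplash", "-world=empty", "-high", "-malloc=tbbmalloc",
]

subprocess.run([ARMA3_EXE, *params])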


It's a game problem, not a hardware problem. No matter what CPU/GPU you have and how much RAM you use, it won't get better, since something is wrong with the engine.

Even if you had 128 GB of RAM, an Intel Core i8 at 9.9 GHz and an Nvidia GeForce 1280GTX with 30 GB of VRAM, nothing would change. BI are the only ones who can access their engine, so only they are able to fix it.

We have had this problem since OFP.

Watch this video:

  Erikson20 said:
Hi. Today I just bought a new graphics card, but I still only manage to get up to 30 FPS (down to 15 FPS in towns).

My specs are:

Graphics card - AMD Radeon R9 290

CPU - AMD Phenom II X6 1090T

RAM - 8GB DDR3

And the startup parameters I use are:

-cpuCount=6 -exThreads=7 -maxMem=7168 -maxVram=4095 -noPause -noSplash -world=empty -high -malloc=tbbmalloc

I have nearly the same gear, but with an AMD FX-8350 CPU.

Until yesterday I had a Radeon R9 290X in my PC too; I have now sent that graphics card back to the vendor because I got more performance with my old GeForce GTX 670!

It is a CPU problem.

I have poor performance too, mostly in multiplayer matches.

Edited by Clawhammer


Part of it is the AI. It all runs on a single core. Not a problem for those who play MP only, but for those who play single player it's definitely one of the big performance hits. Another reason is that, as explained in the video, the CPU has to do a crap-ton of calculations, loading from the hard drive (64-bit would help), and caching all at the same time. My suggestion: until BI releases an update that drastically improves performance, have the auto-detect find the best settings for you, then tweak them until you find your balance of playability and prettiness.
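
To put rough numbers on why one single-threaded chunk caps things no matter how many cores you add, here's a quick Amdahl's-law sketch in Python; the 60/40 split between single-threaded and parallel work is a made-up illustration, not a measured figure for the engine:

# Amdahl's-law illustration: if a fixed share of each frame is single-threaded
# (AI, scripting), extra cores only speed up the remainder. The 40% parallel
# share below is an invented example, not a measurement of Arma 3.
def speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (1, 2, 4, 6, 12):
    print(f"{cores:>2} cores -> {speedup(0.4, cores):.2f}x")
# Even at 12 cores the speedup stays below 1.6x when 60% of the frame
# waits on a single thread.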

  devilslayersbane said:
Part of it is the AI. It all runs on a single core. Not a problem for those who play MP only, but for those who play single player it's definitely one of the big performance hits. Another reason is that, as explained in the video, the CPU has to do a crap-ton of calculations, loading from the hard drive (64-bit would help), and caching all at the same time. My suggestion: until BI releases an update that drastically improves performance, have the auto-detect find the best settings for you, then tweak them until you find your balance of playability and prettiness.

Currently I can set everything in multiplayer to the lowest settings and nothing helps :P


Lowest settings aren't optimal for Arma 3. Try this guide; it was linked in the latest SITREP. Don't OC if you don't have to. I'm running a 3-year-old gaming laptop and I get 30-40 FPS in MP depending on the map.

  devilslayersbane said:
Lowest settings aren't optimal for Arma 3. Try this guide; it was linked in the latest SITREP. Don't OC if you don't have to. I'm running a 3-year-old gaming laptop and I get 30-40 FPS in MP depending on the map.

I know this guide, these parameters, and the dirty "set shadows to Ultra" tricks, but currently none of it helps much. I'm hoping the next patch brings back the performance we had before patch 1.28 came out.


Due to Arma 3's map size, the FPS you get in game is almost entirely dependent on the server itself. If the server is not sending enough messages per simulation tick, you will get FPS drops. I've found that at 384 messages per tick with a maximum of 2048 bytes each, it takes about 39.86 MB/s of upload for the server, nearly 1/3 of a 125 MB/s upload (1 Gbps port).

Now you ask what this has to do with FPS?

Well, server simulation tick time and client FPS are related. Why? Arma 3 does something pretty difficult: it keeps track of not just one or two objects but thousands. Every rock, mine, car, player, house, etc. reports its position to the client, and the server updates it as necessary; players with a longer draw distance ask for updates more frequently, those with a shorter one ask less often. With millions of objects in a scene, the game not only has to keep track of all of them but also receive network updates about them.

Your system may well be able to pull 60+ FPS; that doesn't mean the server can.
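
To show roughly where that upload figure comes from: the message count and maximum size are the ones above, while the simulation tick rate is an assumption on my part (about 50 per second), so treat this Python sketch as back-of-the-envelope only:

# Back-of-the-envelope check of the upload figure. Messages per tick and max
# message size come from this post; the tick rate is assumed, not measured.
messages_per_tick = 384
max_message_size = 2048   # bytes
ticks_per_second = 50     # assumed simulation tick rate

bytes_per_second = messages_per_tick * max_message_size * ticks_per_second
print(f"{bytes_per_second / 1e6:.1f} MB/s upload")  # ~39 MB/s, about 1/3 of 125 MB/s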

Edited by Polymath820

  Polymath820 said:
Due to Arma 3's map size, the FPS you get in game is almost entirely dependent on the server itself. If the server is not sending enough messages per simulation tick, you will get FPS drops. I've found that at 384 messages per tick with a maximum of 2048 bytes each, it takes about 39.86 MB/s of upload for the server, nearly 1/3 of a 125 MB/s upload (1 Gbps port).

Now you ask what this has to do with FPS?

Well, server simulation tick time and client FPS are related. Why? Arma 3 does something pretty difficult: it keeps track of not just one or two objects but thousands. Every rock, mine, car, player, house, etc. reports its position to the client, and the server updates it as necessary; players with a longer draw distance ask for updates more frequently, those with a shorter one ask less often. With millions of objects in a scene, the game not only has to keep track of all of them but also receive network updates about them.

Your system may well be able to pull 60+ FPS; that doesn't mean the server can.

Yeah, but my friend gets twice as many FPS at the same time, and my graphics card is better. He has an Intel i7 2600K, which is only 4 cores (3.4 GHz), but it's supposed to perform better than my 6-core AMD (3.2 GHz).

  Erikson20 said:
I just overclocked my processor to 3.7 GHz and my FPS is the same :confused:

Do not listen to those who say that your system is the problem.

It's a fact that it's a game problem and only BI can fix it.

128 GB of RAM, an Intel Core i10 with 8 cores at 12 GHz, and a GeForce GTX999 with 20 GB of VRAM won't fix the problem.


I OCed my processor from 2.8 GHz to 3.5 GHz. It didn't give that much more FPS in game overall, but in a really heavy urban firefight, without the OC I would usually get so laggy that the audio would glitch and chop and the game would freeze for seconds at a time; with the OC I stayed at around 7-15 FPS during the same heavy urban firefight (same mission) without any glitching or real freezing. So it had an effect for me, but my CPU overheated when I OCed anyway.

  MikeTim said:
I OCed my processor from 2.8 GHz to 3.5 GHz. It didn't give that much more FPS in game overall, but in a really heavy urban firefight, without the OC I would usually get so laggy that the audio would glitch and chop and the game would freeze for seconds at a time; with the OC I stayed at around 7-15 FPS during the same heavy urban firefight (same mission) without any glitching or real freezing. So it had an effect for me, but my CPU overheated when I OCed anyway.

Overclocking doesn't improve anything if the server you are playing on isn't up to the challenge. Best case, a high-performance server needs an upload speed of 39.89 MB/s or more.

The server needs to have network tweaks done.

http://forums.bistudio.com/showthread.php?156684-Tutorial-Server-bandwidth-amp-Optimisation

Edited by Polymath820


I made some tests, trying to find ze best malloc flag. I was using the ArmA 3 v0.51 Stratis benchmark (because I had that...) and tried different malloc flags without touching anything else.

Startup options (of course I changed the malloc flag): -noSplash -skipIntro -cpuCount=8 -exthreads=7 -high -maxMem=8192 -noLogs -malloc=tbbmalloc -empty

System:

CPU: i7 4770K @ 4.4 GHz

RAM: 16 GB CL9 @ 1600

GPU: GTX 660 Ti OC 2 GB

HDD: LSI 9271-8i SAS2 8x + Cache Vault => RAID 10 with 6x 480 GB SATA III Intel SSD MLC 2.5" (DC S3500 series)

Arma 3 settings: mostly high, some very high - but anyway, the settings were untouched, the same ones I use to play with.

Each malloc startup option was tested 5 times; the posted results are rounded.

tbb4malloc_bi: 94.1 FPS

tbbmalloc: 93.3 FPS (Fred41)

system: 88.9 FPS

tbb3malloc_bi: 95.6 FPS

jemalloc_bi: 91.9 FPS

So whatever makes you happy - but I'd say just don't specify a malloc parameter, since the results show it's all so close together - and tbb3 won ;P

Have a nice weekend, Nik
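
If anyone wants to see just how close together the allocators are, here is a trivial Python sketch that takes the rounded averages above and prints each one relative to the system allocator (nothing is re-measured, it just reuses the numbers from this post):

# Relative comparison of the rounded averages above; the values are copied
# from the results, nothing is re-measured here.
results_fps = {
    "tbb4malloc_bi": 94.1,
    "tbbmalloc":     93.3,
    "system":        88.9,
    "tbb3malloc_bi": 95.6,
    "jemalloc_bi":   91.9,
}

baseline = results_fps["system"]
for name, fps in sorted(results_fps.items(), key=lambda kv: -kv[1]):
    print(f"{name:<14} {fps:5.1f} fps  ({(fps / baseline - 1) * 100:+.1f}% vs system)")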


Yeah, it's a large chunk of change because you will need a new board, but this game really needs a 4770K or 4790K to shine... and even then it can be brought to its knees by enough AI instances. I can go from 80+ FPS in Stratis MP to less than 20 FPS in SP Stratis with a lot of AI. I use max settings with 1600/1200/100 draw. Consider reducing draw distance as the first step to cut CPU overhead; I went from my 770 SLI sitting at 60% core utilization to over 90% with those draw distance settings. It will take time to find what works best for you. Max single-core CPU performance, memory speed/latency, a fast SSD, and a pretty high-end graphics card are needed to really enjoy this game on max settings, and even then your single-core performance will likely be your bottleneck.

It's best (and not easy) to find that sweet spot where you are close to maxing out your CPU and GPU cores to get the best FPS. Obviously it's a dynamic target, and it moves as AI instances increase or decrease. It's not hard to find the sweet spot for MP; it's SP scenarios that are more difficult to tune for.
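
If you want to see where you actually sit on the CPU side, a tiny Python sketch like the one below can log per-core load while the game runs (it uses the third-party psutil package, so assume you have it installed); one core pinned near 100% while the others idle is the classic single-thread ceiling. GPU utilization you would still read off your vendor's tool.

import psutil  # third-party package: pip install psutil

# Print per-core CPU load once per second for a minute while the game runs.
# One core pegged near 100% while the rest idle points at a single-thread cap.
for _ in range(60):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{load:5.1f}%" for load in per_core))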

I get 78 FPS from the Stratis benchmark on max settings with the draw distances mentioned above.

1080p 144hz

4770k @ 4.3

8GB 2133

2x 770 SLI 4GB

SSD

Edited by stang725

  Niktator said:
I made some tests, trying to find ze best malloc flag. I was using the ArmA 3 v0.51 Stratis benchmark (because I had that...) and tried different malloc flags without touching anything else.

Startup options (of course I changed the malloc flag): -noSplash -skipIntro -cpuCount=8 -exthreads=7 -high -maxMem=8192 -noLogs -malloc=tbbmalloc -empty

System:

CPU: i7 4770K @ 4.4 GHz

RAM: 16 GB CL9 @ 1600

GPU: GTX 660 Ti OC 2 GB

HDD: LSI 9271-8i SAS2 8x + Cache Vault => RAID 10 with 6x 480 GB SATA III Intel SSD MLC 2.5" (DC S3500 series)

Arma 3 settings: mostly high, some very high - but anyway, the settings were untouched, the same ones I use to play with.

Each malloc startup option was tested 5 times; the posted results are rounded.

tbb4malloc_bi: 94.1 FPS

tbbmalloc: 93.3 FPS (Fred41)

system: 88.9 FPS

tbb3malloc_bi: 95.6 FPS

jemalloc_bi: 91.9 FPS

So whatever makes you happy - but I'd say just don't specify a malloc parameter, since the results show it's all so close together - and tbb3 won ;P

Have a nice weekend, Nik

Someone humor me: overclock your video card and see what FPS improvement you get. Overclocking the video card is probably the better option. Why? Because an R9 290X with its thousands of shader cores will translate even small overclocks into large FPS improvements: cores * GPU clock.
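
To put rough numbers on that cores * GPU clock idea, here is a small Python sketch of theoretical shader throughput for a reference R9 290X; the 2816 shaders and ~1000 MHz clock are the commonly quoted reference figures, used here as assumptions:

# Theoretical shader throughput scales with shaders * clock, so a clock bump
# gives the same percentage gain in peak FLOPS. The figures below are the
# commonly quoted reference-card values, taken as assumptions.
shaders = 2816            # R9 290X stream processors (reference card)
base_clock_mhz = 1000     # reference clock, approximate
oc_clock_mhz = 1100       # example +10% overclock

def peak_gflops(shaders, clock_mhz, flops_per_clock=2):
    return shaders * clock_mhz * flops_per_clock / 1000.0

gain = peak_gflops(shaders, oc_clock_mhz) / peak_gflops(shaders, base_clock_mhz) - 1
print(f"{peak_gflops(shaders, base_clock_mhz):.0f} -> {peak_gflops(shaders, oc_clock_mhz):.0f} GFLOPS ({gain:+.0%})")
# In practice that only shows up as FPS if the GPU, not the CPU or the server,
# is the limiting factor.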

Edited by Polymath820

  Polymath820 said:
Someone humor me: overclock your video card and see what FPS improvement you get. Overclocking the video card is probably the better option. Why? Because an R9 290X with its thousands of shader cores will translate even small overclocks into large FPS improvements: cores * GPU clock.

Well, I'd say the CPU is the bottleneck. But actually, my test was about malloc. I was "inspired" by a statement that optimisation flags are kind of useless and that some may not even exist. So I picked malloc.

I just bought a GTX 870 from MSI - overclocked from the factory. Hopefully it will arrive on Monday. That should be enough to get some more eye candy.

TBH, even the 660 would be fine if they started optimising Arma 3. But it seems like they just need so long to "finish" the game that the hardware will be fast enough by then - even without touching the engine :j:

  Erikson20 said:

CPU - AMD Phenom II X6 1090T

This right here is the problem.

Even if it runs at the same clock speed as an Intel processor, it will still perform significantly worse.

  Niktator said:
Well, I'd say the CPU is the bottleneck. But actually, my test was about malloc. I was "inspired" by a statement that optimisation flags are kind of useless and that some may not even exist. So I picked malloc.

I just bought a GTX 870 from MSI - overclocked from the factory. Hopefully it will arrive on Monday. That should be enough to get some more eye candy.

TBH, even the 660 would be fine if they started optimising Arma 3. But it seems like they just need so long to "finish" the game that the hardware will be fast enough by then - even without touching the engine :j:

Overclock the GPU and see if you get a different result.


Oops, I did not buy an 870, it's a 970 of course ;)

I can't overclock my 660 Ti any further without watercooling. The 970 is still not here, so I'll have to wait. I really can't wait for the first benchmark with the 970 on untouched settings.


It won't matter. Pay $2000 to barely manage 30 FPS on a bad but populated Arma 3 server, or you're stuck. The developers of this game are shit at optimizing. Their optimizations are nonexistent, and so is their ability to do anything but spam content onto an overloaded and crappy base system, i.e. a simple system that lets you walk endlessly in any direction, built on highly outdated/unoptimized code. I can run Crysis 3 on ultra and the latest DX11 games at 60+ FPS, but regardless of whether I use that system or my laptop, which has a 4.0 GHz Intel processor, I can only manage 10-25 FPS on any populated Arma 3 server due to the horrible optimization. I feel like complaining, but the devs clearly do not see this as a problem.

If the average user could get 40 FPS and not be restrained by the bullshit bottlenecking related to their multiplayer code, then maybe the game would be less boring overall.

I used to play OFP as a kid and I tried to like Arma as a series, but I grew up and realized Arma is a waste of time and horribly broken. Defending it as not boring or not broken is just a waste of time, because it actually is both. There are a few optimizations I would suggest code-wise for them to try; I would also suggest doing a simple trial-and-error or debug session and logging how long certain operations take in milliseconds. Maybe that way they could finally find out why there's only 20 FPS in multiplayer, by checking what is bottlenecking the main threads. Hackers are too bored to fix your game, because for them it's about 5x harder to do something like that without source code. Anyway, don't waste money or time. Just have fun with what you have, because everyone has the same problem. If you get 30 FPS you might have a big advantage in a PvP game, but it's really not your fault. It happens whether you're on lowest settings or Ultra; it doesn't matter at all.


Well, that's not true at all. You can get 40 FPS with mid-class hardware. Sure, you can't max it out, but Arma 3 looks decent on normal settings.

I want to max it out as well as I can, so this gets expensive. It's always the same: the last 15% doubles the price ;)

