incontrovertible

ArmA 3 Alpha Performance Tweaks and Settings Guide

Recommended Posts

Good guide, but:

"Shadow Quality: Low or DISABLED, higher means more frame lag for not much gain visually or otherwise."

With shadows on Low I get 30 FPS; just changing shadows to Standard/High instantly gives me a +30 FPS increase, because they're shifted onto the GPU. There isn't much difference in the frame hit between High and Ultra, though.

The difference between Disabled and High is only 4 FPS.


Pretty good. My one bit of constructive criticism is that it would be good if you explained what certain settings do, and maybe provided comparison screenshots, rather than just saying "not much gain visually or otherwise". I find that which settings improve the visuals largely depends on the player; for example, I can't stand the pop-in on lower terrain settings.


I went from 75 FPS to 75 FPS when adding -cpuCount 4 -exThreads 7! :)

Oh, and this is on the current Arma 3 Alpha update, developer build.


Out of interest, will Windows 8 be supported for this game?

After all, it rarely seems to use all the cores.

Especially if you are the owner of a "budget" laptop! :j:

I went from 75 FPS to 75 FPS when adding -cpuCount 4 -exThreads 7! :)

So what you're saying is that it's useless?

Also, can we please get stuff like this onto the wiki? It's nice to have a centralised place, instead of everything being scattered all over.

Good guide, but:

"Shadow Quality: Low or DISABLED, higher means more frame lag for not much gain visually or otherwise."

With shadows on Low I get 30 FPS; just changing shadows to Standard/High instantly gives me a +30 FPS increase, because they're shifted onto the GPU. There isn't much difference in the frame hit between High and Ultra, though.

The difference between Disabled and High is only 4 FPS.

That is a good find and I will add it to the guide.

Pretty good. My one bit of constructive criticism is that it would be good if you explained what certain settings do, and maybe provided comparison screenshots, rather than just saying "not much gain visually or otherwise". I find that which settings improve the visuals largely depends on the player; for example, I can't stand the pop-in on lower terrain settings.

Yeah, fair enough. I have, however, tried to keep the guide fairly simple and short so it isn't totally overwhelming. I will have another sit-down soon and see how to add that information while keeping it simple and short.

I went from 75 FPS to 75 FPS when adding -cpuCount 4 -exThreads 7! :)

Oh, and this is on the current Arma 3 Alpha update, developer build.

Yep, not all settings will do something for everyone; it depends on your PC's build.

Out of interest, will Windows 8 be supported for this game?

After all, it rarely seems to use all the cores.

Especially if you are the owner of a "budget" laptop! :j:

I am running Windows 7, but some of my mates are running it on Windows 8, so it definitely works.

Well done, please keep updating this as Fuse said :D

Cheers, I will do my best to keep it in good condition.

Thank you for all the replies and feedback, I really appreciate it.


This was a good help. I thought I had already balanced my FPS against quality, but running through this I gained about another 4 FPS and got better-looking visuals!


Here's some information and my settings:

Windows 7 64-bit

i5 2500 @ 3.3GHz

GTX 580 1.5GB, latest drivers

8GB DDR3

Raptor 300GB and 120GB hard drives, 10,000RPM

Thermaltake Level 10 GT

I start my test from the sign in the town, record with Fraps for 60 seconds, and finish towards the radio tower, following the line in the road.

By putting Object Detail on Standard I averaged a gain of 10 FPS.
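(For reference, the Fraps averages below are just frames divided by seconds, e.g. 3761 frames over 60 s ≈ 62.7 FPS.)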

2013-03-24 19:15:23 - arma3

Frames: 3761 - Time: 60000ms - Avg: 62.683 - Min: 54 - Max: 77 WITH HIGH OBJECT DETAIL

2013-03-24 19:20:53 - arma3

Frames: 3755 - Time: 60000ms - Avg: 62.583 - Min: 53 - Max: 76 WITH HIGH OBJECT DETAIL

2013-03-25 15:50:27 - arma3

Frames: 3697 - Time: 60000ms - Avg: 61.617 - Min: 53 - Max: 79 WITH HIGH OBJECT DETAIL

2013-03-25 15:54:54 - arma3

Frames: 4487 - Time: 60000ms - Avg: 74.783 - Min: 64 - Max: 97 WITH STANDARD OBJECT DETAIL

2013-03-25 15:57:27 - arma3

Frames: 4519 - Time: 60000ms - Avg: 75.317 - Min: 64 - Max: 96 WITH STANDARD OBJECT DETAIL

2013-03-25 16:06:23 - arma3

Frames: 4348 - Time: 60000ms - Avg: 72.467 - Min: 62 - Max: 90 WITH STANDARD OBJECT DETAIL

2013-03-25 16:08:31 - arma3

Frames: 4487 - Time: 60000ms - Avg: 74.783 - Min: 62 - Max: 92 WITH STANDARD OBJECT DETAIL

The linked images show the settings and start point I used. If you guys run the same settings and post your specs and FPS data, we can compare. I know it's not an intense scene, but it's in the town and easy to do.

Sorry, not sure how to embed my images, so here's the link:

http://www.flickr.com/photos/94398048@N06/8590748678/in/photostream/

This was a good help. I thought I had already balanced my FPS against quality, but running through this I gained about another 4 FPS and got better-looking visuals!

That's awesome, I'm glad it helped. I'm continuing to update it, so stay tuned!

-snip-

Great idea, mate. I ran through the same test on my settings, then changed all the Quality tab settings to Lowest, Low, Standard, High, Very High and Ultra.

Preset           Frames   Time (ms)   Min   Max   Avg
My MP settings   4416     60000       60    96    73.600
Lowest           6191     60000       85    130   103.183
Low              5725     60000       82    123   95.417
Standard         4854     60000       69    103   80.900
High             4400     60000       60    96    73.333
Very High        3980     60000       53    86    66.333
Ultra            3599     60000       47    78    59.983


Thanks for all the time and effort you put into making this guide, mate; it helped heaps.

When I get time I will post my benchmarks on a lower end PC to show the benefits to slower rigs.


It would be really helpful if we could get official word on which settings are handled by the CPU, GPU, etc.

E.g. my PC runs SLI but has a mid-range CPU, so I would like to keep the GPU-bound settings high while setting the CPU-bound ones to low or disabled.

It would be really helpful if we could get official word on which settings are handled by the CPU, GPU, etc.

E.g. my PC runs SLI but has a mid-range CPU, so I would like to keep the GPU-bound settings high while setting the CPU-bound ones to low or disabled.

That's easy. Change a setting and see if CPU or GPU usage changes.
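If you'd rather watch it from a command prompt than Task Manager, something like the line below works for the CPU side; a rough sketch, assuming an English Windows install (the performance-counter instance for arma3.exe is "arma3"). For GPU usage, a tool like GPU-Z or MSI Afterburner is the usual option.

rem Sample arma3.exe CPU usage once per second; the value sums across cores, so it can exceed 100
typeperf "\Process(arma3)\% Processor Time" -si 1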


Going to have to nitpick some of that:

  1. HDR quality - Low has better performance and, IMO, looks better.
  2. Shadow quality - lower values have worse performance; the highest has the best performance.
  3. Maxframes - 4 is best. 1 is basically the lowest the GPU can do, and it has the opposite effect of what you described: it only pre-renders 1 frame and makes FPS more volatile. I saw no difference in performance between 1 and 4, just smoothness.

Additionally, GPU driver settings:

  1. Threaded optimization: no clear impact on performance, but it does decrease the CPU's thread usage and smooths it out.
  2. Texture filtering: Full quality. This actually increases performance when loading new textures, at least on an older card.

It would be really helpful if we could get official word on which settings are handled by the CPU, GPU, etc.

CPU/GPU = View distance, Terrain quality, Object distance, Object quality (the CPU handles the geometry and the GPU has to display it all)

HDD/GPU = Texture quality (HDD because textures stream from it; SSDs rule)

GPU = Clouds, Particles, Shadows, AA, PPAA, ATOC, AF, Dynamic lights, Post processing, HDR (all processed on the GPU)

Edited by SIMJEDI

Going to have to nitpick some of that:

  1. HDR quality - Low has better performance and, IMO, looks better.
  2. Shadow quality - lower values have worse performance; the highest has the best performance.
  3. Maxframes - 4 is best. 1 is basically the lowest the GPU can do, and it has the opposite effect of what you described: it only pre-renders 1 frame and makes FPS more volatile. I saw no difference in performance between 1 and 4, just smoothness.

Additionally, GPU driver settings:

  1. Threaded optimization: no clear impact on performance, but it does decrease the CPU's thread usage and smooths it out.
  2. Texture filtering: Full quality. This actually increases performance when loading new textures, at least on an older card.

Hey, thanks for the reply. Yeah, I have noticed some of that as well, especially the shadows since the patch. I will update the guide with this information when I can.

CPU/GPU = View distance, Terrain quality, Object distance, Object quality (the CPU handles the geometry and the GPU has to display it all)

HDD/GPU = Texture quality (HDD because textures stream from it; SSDs rule)

GPU = Clouds, Particles, Shadows, AA, PPAA, ATOC, AF, Dynamic lights, Post processing, HDR (all processed on the GPU)

Thanks for the info, I will add it to the guide.

I will also be re-doing my benchmarks as I have found a consistent way to do so.


Just figured I would add something I tried that worked well for my system.

I am using your command line (-cpuCount=4 -exThreads=7 -high -maxMem=8192 -noPause -noSplash -world=empty).

When in game, open the Windows Task Manager and use the Set Affinity option (right-click arma3.exe). I uncheck CPU 0 and 1.

I noticed a pretty good gain overall. I also noticed much more even usage across the other three cores. Oddly enough, the game stopped using the virtual cores.

The reasoning behind this, for me, is that Windows likes to put all of my other stuff on CPU 0 and 1 for some reason.
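To save doing this by hand every launch, something like the batch file below should work. This is just a rough sketch under a few assumptions: an 8-thread CPU like the 2600k (so unticking CPU 0 and 1 corresponds to the affinity mask FC), a default Steam install path, and the start /high switch standing in for the -high parameter.

rem launch_arma3.bat - hypothetical launcher, adjust the path to your own install
rem /high sets the priority class; /affinity FC (binary 11111100) leaves CPU 0 and 1 unticked
cd /d "C:\Program Files (x86)\Steam\steamapps\common\Arma 3"
start "" /high /affinity FC arma3.exe -cpuCount=4 -exThreads=7 -maxMem=8192 -noPause -noSplash -world=empty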

Also my specs:

i7 2600k @ 4.6GHz

16 GB DDR3 1600MHz

GTX580SC

Hope this helps some people! Also, I think you have to do it every time you launch the game.

EDIT: Forgot to mention this is on the current Dev Build. Will test on current public build.

EDIT 2:

Run 1

Public Build, No Affinity Change
Test One - 42.237
Test Two - 30.6387
Test Three - 33.0383
Test Four - 29.1971
Test Five - 53.3964
Scoggs's OFPMark is 3770.15

Public Build, Affinity Change
Test One - 51.9878
Test Two - 36.5458
Test Three - 38.004
Test Four - 36.4299
Test Five - 56.3672
Scoggs's OFPMark is 4386.69

Development Build, No Affinity Change
Test One - 42.6261
Test Two - 33.0228
Test Three - 33.6573
Test Four - 30.4646
Test Five - 54.1586
Scoggs's OFPMark is 3878.59

Development Build, Affinity Change
Test One - 53.3266
Test Two - 38.2214
Test Three - 37.2911
Test Four - 38.2531
Test Five - 56.897
Scoggs's OFPMark is 4479.78

Tested with ArmA3mark0.7.Stratis.pbo

Run 2 (this time restarting arma3.exe after every run)

Public Build, No Affinity Change
Test One - 41.8149
Test Two - 31.4658
Test Three - 31.8796
Test Four - 30.1432
Test Five - 52.1855
Scoggs's OFPMark is 3749.78

Public Build, Affinity Change
Test One - 52.1574
Test Two - 36.8886
Test Three - 37.7067
Test Four - 34.9854
Test Five - 55.6148
Scoggs's OFPMark is 4347.06

Development Build, No Affinity Change
Test One - 41.9304
Test Two - 32.2809
Test Three - 32.8107
Test Four - 29.8954
Test Five - 53.7728
Scoggs's OFPMark is 3813.8

Development Build, Affinity Change
Test One - 52.8827
Test Two - 38.4797
Test Three - 39.3654
Test Four - 37.5704
Test Five - 57.1333
Scoggs's OFPMark is 4508.63

Tested with ArmA3mark0.7.Stratis.pbo

Edited by Scoggs
Added Benchmarks

Just figured I would add something I tried that worked well for my system.

-snip-

Your system is similar to mine; what settings did you have the game on? Thanks.


Setting affinity didn't do anything for me.

But raising the process priority to 24 (Realtime) gave me better FPS in firefights with AI. I used Process Explorer to do that.
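For anyone who doesn't want to open Process Explorer each time, the priority class can also be set from a command prompt once the game is running; a rough sketch (128 is the High priority class, 256 is Realtime):

rem Bump a running arma3.exe to High priority (use 256 instead of 128 for Realtime)
wmic process where name="arma3.exe" CALL setpriority 128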

Just figured I would add something I tried that worked well for my system.

-snip-

Interesting results! However, when I tried it on my i7 3770k I got no real difference in performance. =[

Also, I've updated the guide with benchmarks for each setting (except PiP) so you can see where the performance drains are coming from.

