R3fl3x

ArmA2 / OA (low) performance issues


well, that depends on many factors, but overall system smoothness is considerably boosted, including generic games (made on mainstream engines) and synthetic benchmarks.

you should only look at the "minimum" FPS, NOT the average.

also, it's more about responsiveness, i.e. how your system "feels" in action/work, rather than something fictional/numerical/scientific. if you had some low-latency modules and had time to test them, you'd LOVE the result, trust me :-)

basically it also depends, aside from software heavily tuned for underperforming/slow memory, on the generation/quality/performance/features of the memory controller in the CPU you use. that's where both SB-E and IB notably excel over previous generations, for example.

and there AMD performs better too, within the same generation/era, despite heavy Intel-side optimization in most software (including software supplied/shipped by Microsoft).

it's hard to explain in words, sadly, but in practice it has a similar effect, which actually makes ("Pentium") i5/i7 processors more attractive than ("Celeron")/i3/Atom-branded ones, despite similar clocks and the more attractive prices of the latter.

Overall, the lowest latency between mouse and screen is what matters, not fps; that's why I like Tech Report reviews like this one: http://techreport.com/articles.x/22653/10

As for low-latency memory making the system more responsive, I'd like to see it tested properly before advocating it. Also, a 1600 CL8 kit (10 ns) actually has a lower absolute latency than a 1333 CL7 kit (10.5 ns), so just looking at CAS isn't the best idea.
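For reference, the absolute first-word latency follows directly from the CAS count and the memory clock. A quick sketch of the arithmetic (the helper name is mine; DDR3 timings assumed):

```python
# First-word CAS latency in nanoseconds for DDR memory.
# The memory clock is half the DDR data rate (two transfers per
# clock), so one clock cycle lasts 2000 / data_rate nanoseconds.

def cas_latency_ns(data_rate_mts, cas_cycles):
    """Absolute CAS latency in ns for a given DDR data rate and CL."""
    clock_mhz = data_rate_mts / 2.0   # DDR3-1600 -> 800 MHz clock
    cycle_ns = 1000.0 / clock_mhz     # DDR3-1600 -> 1.25 ns per cycle
    return cas_cycles * cycle_ns

print(cas_latency_ns(1333, 7))   # DDR3-1333 CL7 -> ~10.5 ns
print(cas_latency_ns(1600, 8))   # DDR3-1600 CL8 -> 10.0 ns
print(cas_latency_ns(1600, 9))   # DDR3-1600 CL9 -> 11.25 ns
```

So a bigger CL number at a higher data rate can still mean the same or lower latency in nanoseconds.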


well, as said above, it's not about a "difference in numbers/math" but a PRACTICAL usability boost, clearly seen in virtually ANYTHING you do on a PC every day.

sure, the RESULTING latency is what matters, not the latency figure at a different base frequency. also, some chips work better with low-latency, low-frequency modules, such as AMD before FX, while some benefit more from frequency (SB, and especially IB, and to a lesser extent SB-E).

returning to practical advice: generally speaking, there were even 1600 CL5 kits available, and 1333 CL4 (I've never heard of CL3 for DDR3), which pack a LOT more considerable/noticeable advantages, but they alone cost as much as a GOOD CPU ;-) CAS 6 at 1600 is pretty common at present and is probably the best in terms of bang/buck ratio among widely available parts.

side note: some chips/CPUs don't properly use high-frequency modules, and some can't use them at all, but nearly ANY of them benefits from low latencies/timings, so for OLDER CPUs that's the safest path. sadly, major tweaking in the BIOS is essential, because even DQS training or XMP/EMP (disabled by default in both cases, usually) won't free you from that if you want to unleash the memory's full potential.

Edited by BasileyOne


Well if you can see it you can measure it.


with a proper tool, obviously.

which, sadly, hasn't been created yet for that purpose.

synthetic benchmarks, however, are pointless for showing/revealing that, sadly.

but most of them aren't bad at reflecting the "average/approximated performance" deltas resulting from hardware management/upgrade/downgrade/purchase.

point is: sometimes you have to start building your OWN living/working experience, and let others have theirs, instead of venting anger at them through rhetorical oppression in the name of education.

i.e. growing up isn't a bad thing, IMO.

and it's eventual for most of us.

which is good news, IMO.


well, you can use Fraps to log the frame time of every frame and make 99th-percentile plots (like the Tech Report ones) of the ArmA2 benchmark 2 at different memory speeds.


Hey Guys,

I have a very strange performance issue. It started a few days ago. So far I had played Arma almost maxed out at a constant 30-50 FPS while recording with Bandicam. A few days ago my FPS dropped to 10-20 without any reason; I didn't change anything in the Arma settings. The funny part is that I get 3-4 FPS in the briefing screen and in the editor when I'm looking at the 2D map. I get lag when I try to place a single unit in the editor. In game I get 10-20 FPS, and the game runs terribly. I tried disabling mods and DLCs, changing graphics settings, changing GPU drivers, and disabling software running in the background. Nothing helps. Another weird part is that the game sometimes goes back to its previous (good) performance. It changes during play (on the fly), without changing any options. Suddenly I'm getting my ~40 FPS back and the game runs smoothly, but after ~10 minutes it goes back to 10-20 FPS. Can anyone help me?

My rig:

i7 860 @ 2.8 GHz

8GB Ram

GTX 460 (drivers 296.10)

W7 64


well, Fraps measures AVERAGE framerate :-)

it was never meant to measure UI/engine response time, nor the LOWEST framerate :-)

secondly, it's very poorly designed software. if you insist on framerate measurement/caps, i suggest RivaTuner, PowerStrip (not bad for inexpensive software) or MSI Afterburner.

and third, ArmA2 is more seriously encumbered by:

1. access to a reasonable AMOUNT of memory (no workarounds/fixes other than building x64/64-bit binaries), rather than RAM speed.

2. computation speed. both GPR/ALU and FPU speed matter.

3. scalability of the workload. AI, sound, terrain and GFX are all severely bottlenecked in terms of scalability across cores/threads. also, the C programmers' mentality has never helped them design that kind of thing, ever since Unix's creation. check the Erlang community's approach for a reference/example of how it could be done.

but "in general" fast/low-latency memory is a COOL thing.

it improves your PC's performance SERIOUSLY.

you can deny its importance and focus on something else when you're on a low budget building your PC/workstation/network, but otherwise GOOD RAM always comes in handy =)


probably something is wrong WITHIN your PC.

something resource-stealing, obviously.

my guess is you fell victim to mysterious/shady space aliens preparing an assault on Earth, with the (obvious ;]) help of your PC, of course ;-)


Fraps measures min/max/average framerate as well as frametimes.
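Given per-frame times, the minimum and average framerates fall out directly. A small illustration (numbers invented to show how a couple of slow frames drag the minimum far below the average):

```python
# Minimum vs. average framerate from per-frame render times (ms).
# The average hides spikes; the slowest frame sets the minimum.

def avg_fps(frame_times_ms):
    """Average framerate over the whole capture."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    """Instantaneous framerate of the single slowest frame."""
    return 1000.0 / max(frame_times_ms)

times = [16.7] * 58 + [100.0, 120.0]   # mostly smooth, two heavy stutters
print(round(avg_fps(times)))           # ~50 fps average looks fine...
print(round(min_fps(times)))           # ...but the worst frame is ~8 fps
```

This is why a log of individual frametimes tells you more about stutter than an average-FPS counter.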


I have:

AMD 9150e Quad Core 1.8 GHz

4gb RAM

Windows 7 32 bit

ATI Radeon HD 3200 (512mb Video memory)

47" Vizio LCD

I have installed Arma II on my system before and it's just so laggy. Can I modify the settings to make it run better?


Not really; it's a very slow Phenom I quad with an onboard graphics chip, and there's only so much lower settings can do.


sadly, no.

even a Phenom II (Deneb/Thuban), which is free from the TLB issue/bug, has a bigger cache and a better memory controller, and runs at TWICE the clock speed, doesn't perform satisfyingly.

i would only recommend getting an IB or SB-E Intel CPU :[ they worked a LOT better in ArmA2's case :( maybe future BIS games will become more CPU-vendor-agnostic in terms of engine binary coding/linking/assembling.


I want to know why I have such really bad fps; here are the system specs:

AMD Phenom II X4 @ 3.6 GHz

MSI DKA790GX Platinum

GeForce GTX 570 @ stock

6GB DDR2

Win 7 64-bit

I have really bad fps: on open land it maxes out at ~20-25 fps, and in towns it's just a slideshow at ~10-15 fps. I play OA unmodded, or with the ACE or DayZ mod; it's all the same -.-

I don't know, but I think more than the hardcoded 2GB of RAM would really be better for this game; the GPU and CPU are not fully utilized: the GPU is at a maximum of 20% load, and no CPU core reaches 70%...

My graphics options in arma2oa.cfg:

language="English";
adapter=-1;
3D_Performance=93750;
Resolution_Bpp=32;
Resolution_W=1920;
Resolution_H=1080;
refresh=60;
winX=16;
winY=32;
winW=800;
winH=600;
winDefW=800;
winDefH=600;
Render_W=1920;
Render_H=1080;
FSAA=1;
postFX=0;
GPU_MaxFramesAhead=1000;
GPU_DetectedFramesAhead=3;
HDRPrecision=8;
lastDeviceId="";
localVRAM=1293352960;
nonlocalVRAM=2147483647;
vsync=0;
AToC=3;
PPAA=0;
PPAA_Level=0;
Windowed=0;
serverLongitude=0;
serverLatitude=0;
class ModLauncherList
{
class Mod1
{
	dir="ca";
	name="Arma 2";
	origin="NOT FOUND";
};
class Mod2
{
	dir="expansion";
	name="Arma 2: Operation Arrowhead";
	origin="GAME DIR";
	fullPath="I:\steam\steamapps\common\arma 2 operation arrowhead\expansion";
};
class Mod3
{
	dir="baf";
	name="Arma 2: British Armed Forces (Lite)";
	origin="NOT FOUND";
};
class Mod4
{
	dir="pmc";
	name="Arma 2: Private Military Company (Lite)";
	origin="NOT FOUND";
};
class Mod5
{
	dir="@dayz";
	name="@dayz";
	origin="GAME DIR";
	fullPath="I:\steam\steamapps\common\arma 2 operation arrowhead\@dayz";
};
class Mod6
{
	dir="Expansion\beta\Expansion";
	name="Expansion";
	origin="GAME DIR";
	fullPath="I:\steam\steamapps\common\arma 2 operation arrowhead\Expansion\beta\Expansion";
};
class Mod7
{
	dir="Expansion\beta";
	name="beta";
	origin="GAME DIR";
	fullPath="I:\steam\steamapps\common\arma 2 operation arrowhead\Expansion\beta";
};
};

I hope my English is good enough ;-)
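One value that stands out in the config above is GPU_MaxFramesAhead=1000; a common community tweak (not an official fix, and worth backing up the file first) is to drop it to 1 to limit the render-ahead queue. A sketch of patching such a key in an arma2oa.cfg-style file; the helper function is hypothetical:

```python
# Sketch: rewrite a `key=value;` line in cfg-style text. The key
# name comes from the config above; setting GPU_MaxFramesAhead=1
# is a community tweak, not an official recommendation.

import re

def set_cfg_value(text, key, value):
    """Replace `key=...;` with `key=value;` wherever it starts a line."""
    pattern = re.compile(r"^(%s=).*?;" % re.escape(key), re.MULTILINE)
    return pattern.sub(r"\g<1>%s;" % value, text)

cfg = "vsync=0;\nGPU_MaxFramesAhead=1000;\nAToC=3;\n"
print(set_cfg_value(cfg, "GPU_MaxFramesAhead", 1))
```

Read the real file, run it through the helper, and write it back; the game regenerates missing keys, but a backup is still the safe path.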


I moved to a spare Win7 x64 installation as I was getting a lot of stuttering and had Latency problems.

I've fixed the latency problems now but under the new Win7 x64, which has a lot less installed than the other one, I'm experiencing a massive drop in FPS.

In the old Win7, with Cat 12.6 and beta 96476, I get around 60FPS in E08 with these settings at 1920x1200 and VD 3000:

Texture Detail - V.High, Video Memory - Default, AF - Very High, AA - Disabled

Terrain Detail - V.Low, Shadows - High, Objects - Normal, HDR - Normal

PP - Low, ATOC & PPAA - Disabled, Vsync - Disabled

and around 46 FPS with Objects - V.High.

In the new Win7, with Cat 12.8, I get about 42 FPS with Objects - Normal and 33 FPS with Objects - V.High (which is necessary to stop the horrible LOD switching bug).

I also tested in XP, with Cat 12.4 and I get 60 FPS and 45 FPS.

CCC settings are the same for all three installations, except XP doesn't have the Tessellation settings.

http://img145.imageshack.us/img145/966/a2oacccsettingswin7a.png (665 kB)


Not sure what you are after? Your settings would be ugly IQ for me, and a 6950 is too low for 1920x1200 (24in?). Hope you find the happy medium of playability and IQ.

Do you set up your video card settings in the editor, or do you use E08? 3000 VD is nice, but you could be fine with 2000?

Your CCC setting of High Quality AI can be upped to Very High.

Also, are you running the 12.7 CAP 3 from AMD?


What are you talking about, my 6950 is too low for 1920x1200? I get 60fps in my old Win7 and XP, but 42fps in my new Win7; that's the issue.

Set up my video card in the editor? I said in my post that I tested with E08.

How is upping AI to High Quality (not Very High) going to fix my low fps in my new Win7?

What is 12.7 CAP 3 going to do for me when I haven't got Crossfire? I haven't got it installed in any of the three OSes anyway, so it's obviously irrelevant to the FPS drop.


The game renders towns and trees pretty smoothly if you use an SSD instead of an HDD. My previous rig was only a Phenom II X2 with a 512MB GPU; moving through towns and trees became really smooth when I switched to an SSD. If I use my old HDD, it lags when looking at towns or trees and when zooming in. Lowering the resolution and textures usually helps increase the speed.


Catalyst 12.8 is rubbish with Arma2OA :mad:

Thanks to advice from kn00tcn at guru3d, I copied the 12.6 atiumdag.dll into the Arma2OA folder and that's got my FPS back to normal (58 FPS instead of 42 FPS), but I don't know if this affects any other games or if it would be easier just to roll back to 12.6.

---------- Post added at 13:11 ---------- Previous post was at 12:49 ----------

So here are some benchmarks with 12.8 plus the 12.6 atiumdag.dll. Note I kept having a problem where I set ATOC - All Trees, but when I next went into the options after running E08 it was reset to Disabled, so I can't be sure it was on for these tests.

There appears to be a bug with GPU Shadows (High or Very High) as they look rather strange and flicker on and off a lot, so I much prefer Normal. AA - Low helped quite a bit with flickering textures (not Shadows) as well. I didn't seem to get any benefit from reducing VD from 3011 to 2046.

58 FPS

Texture Detail - V.High, Video Memory - Default, AF - Very High, AA - Disabled

Terrain Detail - V.Low, Shadows - High, Objects - Normal, HDR - Normal

PP - Low, ATOC & PPAA - Disabled, Vsync - Disabled, VD 3011

46 FPS, Objects - V.High

45 FPS, PPAA - SMAA V.High

44 FPS, ATOC - All Trees

44 FPS, CCC AI - High Quality (from Quality)

40 FPS, PP - Normal, Vsync - Enabled

43 FPS, PP - Low

44 FPS, Shadows - Normal

48 FPS, GPU Overclocked to 890/1350 (from 800/1250)

46 FPS, VD 2046, Shadows - High

48 FPS, VD 2046, Shadows - Normal, ATOC - All Trees

45 FPS, VD 2046, AA - Low

What are you talking about, my 6950 is too low for 1920x1200? I get 60fps in my old Win7 and XP, but 42fps in my new Win7, that's the issue.

Set up my videocard in the editor? I said I my post I tested with E08.

How is upping AI to High Quality (not Very High) going to fix my low fps in my new Win7?

What is 12.7 cap 3 going to do for me when I haven't got Crossfire? I haven't got it installed in any of the three OS anyway, so it's obviously irrelevant to the FPS drop.

I say that a 6950 is too low because you can't run with nice filters. It's an IQ game, not a highest-FPS game, but that's why there are so many options to tune the game's display. I myself couldn't stand playing with all the flicker and jaggies your setup has... but if you don't care, fine. I can play the game in the 20-30 fps range just fine. Yeah, 40-50 is better, but not at the cost of low or no filtering. The more AA you use, the less "flickering/crawling" you will have, up until you run up against the wacked textures (worst are the window jambs). Now, what one puts up with is subjective, but your current in-game settings for getting near your Vsync at your display resolution are jaggie and crawly. I had a 6790, and I used the 12.6 with it. To get what I deem "playable" IQ, I had to run at most 1600x1200... I now use a 7970 (it rocks). I play at 1920x1200 in the ~50s with all the filters at almost max (6xAA not 8xAA... Multisample in CCC, not SSAA...). Then again, it's what you want to put up with for IQ. I still found the 12.7b to be better frame-wise with my 6970 than the 12.6. That's my system; your mileage will vary.

As for the CAPs, they have more than CF performance in them; they can improve your current driver. Even the new 12.8 CAP has registry entries about PowerTune and switchable graphics that aren't in the release notes...

Shadows set lower than High are rendered on the CPU. They will improve with more AA when on High or VH.

As for setting up your video settings during lots of CPU-intensive stuff: that isn't the way to find your best IQ-to-FPS. E08 is a tool to work with, but you will find out more by testing with no CPU load, then adjusting for MP or high-CPU/AI SP. Just a helpful suggestion.

Your pic of the CCC 3D app settings has "CAT AI" set to "Quality"; I was suggesting you use the "Higher Quality" setting. It should help a lot with IQ issues.

Hey, glad you got your FPS up.

KK


I think the results I posted above show that I can run at around 45 FPS with nice filters (AA Low, ATOC All Trees, PPAA SMAA V.High, AI High Quality). I'd be happy with a smooth 30 FPS too if it can Vsync at that rate as I hate tearing. Perhaps I can use Radeon Pro to limit it to 30 FPS and use Triple Buffering.

I did find that AA Low greatly reduces a lot of annoying flickering (mainly on buildings) but doesn't do anything for the shadows. I tried AA Normal (37 FPS, VD 3000, GPU stock clocks) as well, which didn't seem to make any extra difference to the flicker on buildings or help with the flickering shadows. I'll test with higher AA, though, to see what difference, if any, it makes and how much FPS it costs, but I don't think any amount of AA is going to fix those GPU shadows. I don't really get bothered by jaggies, but the flickering is very annoying.

This is Shadows on High (V.High is the same):

http://img51.imageshack.us/img51/9144/a2oashadowshighaalow.png (131 kB)

and this is Shadows on Normal:

http://img6.imageshack.us/img6/8392/a2oashadowsnormalaalow.png (115 kB)

Clearly there's something not right with the GPU-rendered shadows: not only do they look strange, but they flicker like mad as well.

Monitoring the GPU/CPU usage during E08, the GPU is at 100% and the CPU around 62% (2 cores around 70%, 2 around 80%). What "no CPU stuff" test do you recommend?

I might try the caps after backing up my system just to see if they make any difference.


NVIDIA claims that FXAA works like 4x MSAA... Will the game apply AA (in this case FXAA) if I disable AA in game (video/advanced) and if I also disable AToC in the config file? I would enable FXAA via the config file.

The bottom line is that I want some sort of AA applied, but I'd like to try FXAA alone.

Edited by Spotter


In 1.62 you can enable FXAA via the Video Options menu in-game. TBH it is not a replacement for 4x MSAA, though. Both together (or MSAA + SMAA) are best IMO.


AToC only works with in-game MSAA on.

SMAA is a bit blurry to me; I like FXAA Normal Sharp~ ?

FXAA alone will not remove the jaggies/crawlies shimmer and flicker, let alone make wires straight.


Oh, I didn't know that; I'm still using v1.6, though I'm gonna update now. Could you explain the settings from your sig? It says (you wrote it :)) MSAA-VH, SMAA-Ultra, AToC-disabled.

1. Where do you find the SMAA settings? I don't even know what kind of AA it is.

2. Is MSAA actually the setting available in-game in the Video options?

3. Are you sure that any AA works when you disable AToC? I thought AToC=0 disabled all AA.


Thanks. I have FSAA=1 in the config file; is it something useful or should I turn it off?


1. SMAA is in the same drop-down menu as FXAA in 1.62.

2. The "Antialiasing" setting (from Disabled to Very High, and then 5, 6, etc.) is MSAA.

3. AToC doesn't affect whether AA works; rather, it's the other way around: if you disable antialiasing, AToC has no effect. Basically it is an alpha-transparency improvement for AA on foliage.

AToC, while making the trees not as nice as without it, does allow you to see through the leaves when right up in them. It evens the odds with the AI while you are in the clutter. The grass is really nice to me with AToC, but the trees lose some...

