Louisville15

Does ARMA II support SLI?


No, it won't do much.

I wouldn't push it if you're using a stock Intel cooler; best to wait until you're properly set up for OCing :)


You could try adding -winxp to the end of the target line in your ArmA shortcut. I've never tried it with Win7, but with Vista it gives me an extra 5 fps.
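For example, the full target line would look something like this (assuming a default install path; adjust it to wherever the game actually lives on your machine):

"C:\Program Files\Bohemia Interactive\ArmA 2\arma2.exe" -winxp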


Wish I'd seen this thread earlier; it would have saved you a lot of time.

ARMA 2 is more CPU limited than GPU limited in city areas (around buildings), even if you run all GPU settings at high.

In Benchmark 1 you are CPU limited; that's why you gained 2 fps when you overclocked slightly and saw no change when you turned SLI off.

A single GTX 470 is enough to ALMOST max out ARMA 2 in most scenarios (except smoke grenades).

What you need is a 4 GHz or higher CPU; sadly even that still won't give you 60 fps in towns/city areas. ARMA 2 uses quad-core CPUs but IMO still has issues. For example, my quad @ 3.8 GHz gives me about 22 fps in large towns, yet its usage shows only 65%... and the GPUs (SLI) sit at less than 50%.

So basically the engine still needs more optimisation, as it isn't using the CPU/GPU to the max and gives low fps in towns.

The story is different when you're in the woods/desert (non-built-up areas)... there the load shifts to the GPU.

My two GTX 275s in SLI run at 96% in those areas with all settings on Very High.

Through my testing I found that "Terrain Detail" and "Object Detail" are the biggest problems for the CPU; I would recommend leaving them on Normal and putting everything else on Very High.

This gives the CPU extra headroom, which lets your GPUs work harder. Unfortunately, even a 4.5 GHz CPU won't be enough to stress three GPUs in SLI in ARMA 2.
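Side note: if you'd rather set those two outside the game, they should map to the terrainGrid and sceneComplexity entries in your <name>.ArmA2Profile file under My Documents, if I'm remembering the names right. Something like:

terrainGrid=12.5;
sceneComplexity=300000;

(terrainGrid=12.5 should be the Normal terrain setting; the sceneComplexity number is from memory, so treat both values as examples and check what your own file says before changing anything.)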

Finally, SLI does work VERY WELL in ARMA 2 and scales up to 90%; you're simply CPU limited like the rest of us :)


Well, to be honest, I've read through every post and it could have been answered and solved with one simple reply... it was obvious he was CPU limited from the start.

No need to go on a wild goose chase ;)


Is anyone monitoring their GFX memory usage and not just their CPU usage?

With my 285s I saw it hit 100% in plenty of cases, and even my 3 GB 580s can get near 100% at times, although I'm now running triple-monitor resolutions; I was on a single 1920x1200 back when I had the 285s.

I see very few games use as much GFX memory as ArmA 2, to be honest.
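If you're on NVIDIA and a reasonably recent driver, you can also log it from a command prompt with nvidia-smi; support for these queries varies by driver and card, so treat this as a rough sketch rather than gospel:

nvidia-smi --query-gpu=index,memory.used,memory.total,utilization.gpu --format=csv -l 1

That prints per-GPU memory use and load once a second, which makes it easy to see whether you're actually running out of video memory.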

> Well, to be honest, I've read through every post and it could have been answered and solved with one simple reply... it was obvious he was CPU limited from the start.
>
> No need to go on a wild goose chase ;)

I needed to be sure that's what the problem was (I said it was a CPU limitation in my first or second post in this thread, and several people mentioned the built-up areas etc.).

It's nice of you to show up at the end of the discussion and 'save us all time' though.

Thanks :rolleyes:

Regardless, Louisville, it seems your problem has been identified even if it hasn't been solved (yet) :D

Edited by BangTail

> You could try adding -winxp to the end of the target line in your ArmA shortcut. I've never tried it with Win7, but with Vista it gives me an extra 5 fps.

This option disables DirectX 9Ex (available in Vista/Seven).

Note: DirectX 9Ex improves things like ALT-TAB behaviour.

Edited by Dwarden


> It's nice of you to show up at the end of the discussion and 'save us all time' though.
>
> Thanks :rolleyes:

Hehe, OK, no offence taken :)

I too wish we could fix the issue where, in cities/built-up areas, the game's fps drops very low even though CPU and GPU usage is also low.


Yeah, there is definitely a problem where built-up areas are concerned.

> Is anyone monitoring their GFX memory usage and not just their CPU usage?
>
> With my 285s I saw it hit 100% in plenty of cases, and even my 3 GB 580s can get near 100% at times, although I'm now running triple-monitor resolutions; I was on a single 1920x1200 back when I had the 285s.
>
> I see very few games use as much GFX memory as ArmA 2, to be honest.

It's OK. ArmA 2 uses really high-resolution content, including high-res textures and high-poly models, so even 2/3/4 GB of onboard RAM isn't overkill, and it only costs the manufacturer about $4-$10 these days [RAM is cheap as dirt].


Just to add my 2c:

Once I added a second GTX 275 I could run at 3840x2160 without taking much of a performance hit. EVGA Precision (or MSI Afterburner for ATI cards) definitely shows usage going up on both cards. At 1920x1080 it's usually only about 40% usage per card (or 80% with SLI disabled), and trying 3840x2160 on one card results in a MASSIVE performance hit, but in SLI there's almost none, just higher usage of 80-90% on both cards.

So at the end of the day, yes, SLI works, as does CrossFire; I've tested 2x 6970s and they both hit 100% usage. But keep in mind the CPU almost always holds it back: add in some AI and things happening, and GPU usage is generally nowhere near 100% unless you're looking at a complex scene. On an empty island even a crappy CPU will see triple-digit framerates with a good SLI setup. Once things are moving around and thinking, the CPU struggles, the framerate drops significantly, and GPU usage drops massively because the cards are waiting on the CPU.


Yes Millenium7, this is why I haven't upgraded my pair of GTX 275s. My i7 930 @ 3.8 GHz places a ceiling on my performance that an upgraded SLI setup would still run into. I would have to change board + CPU + graphics cards, and the cost of that upgrade doesn't justify the performance increase at the present time.


I concur with my 285s. And also, there's the fact that I'd have to remake my entire watercooling loop, which I am putting off AS LONG AS HUMANLY POSSIBLE.


Yeah, most missions are CPU limited, but if you make a big smokescreen your GPUs will be the limit no matter what.

> I concur with my 285s. And also, there's the fact that I'd have to remake my entire watercooling loop, which I am putting off AS LONG AS HUMANLY POSSIBLE.

I upgraded from two 285s to two 580s and I feel your pain; I can now make toast on my radiators.


Yup, that's why I waited for EVGA to do 3 GB cards; the Palit (which I had for a day) and the Gainward both have awful cooling solutions. Thankfully, EVGA stuck to the stock cooler design.

> Yup, that's why I waited for EVGA to do 3 GB cards; the Palit (which I had for a day) and the Gainward both have awful cooling solutions. Thankfully, EVGA stuck to the stock cooler design.

I would agree; for having two fans they are pretty damn awful. They are on liquid now, though, and I can still make toast with them. Albeit silently now. ;)

God knows how much heat 480s must produce if 580s are their cooler cousins.


I never had a problem with any Fermi products, and I had my 480s on air at the time. I just put a couple of spot fans near the intakes and they ran fine.

I think a lot of people have HP desktops etc. and throw a 480 in and just expect it to run cold (LOL).

Bring on Kepler I say!

> I would agree; for having two fans they are pretty damn awful. They are on liquid now, though, and I can still make toast with them. Albeit silently now.
>
> God knows how much heat 480s must produce if 580s are their cooler cousins.

Mate, that is goddamn insane. I have one loop in my system cooling the MOSFETs, NB, CPU and both GPUs (with a 4x120 rad at the back cooling the entire loop), and no component ever pushes 50 degrees. I'm freakin' chuffed with it: it's my first build, there are no leaks, and I'm no stranger to the odd overclock either.

...aaand because I'm bored, here's a picture of it. It's about two years old this year.

[image: puter.jpg]

Edited by CameronMcDonald


You better hope that shite doesn't leak, mate. You should consider adding a sanitary napkin to it. :p
