SLI, DirectX, and FPS

Posted by mac1

I just installed Armed Assault on a PC with two Nvidia 7800 cards.

(1) Is it possible to run it in SLI? (I read somewhere that if you rename the ArmA.exe file to 3DMark06 you can run it in SLI and get a performance gain.) Will a proper SLI profile be released eventually?

(2) I noticed the game needs DirectX 9.1 and is supposed to update to it as part of the install process. My machine was at 9.0c before the install and stayed at 9.0c afterwards. Do I need to install 9.1 separately?

(3) Is there any way to display the frame rate in game?

---

3. Not through ArmA itself, no.

But you can use a program called FRAPS, which records screenshots/movies and also shows FPS on screen.

---
> (2) I noticed the game needs DirectX 9.1 and is supposed to update to it as part of the install process. My machine was at 9.0c before the install and stayed at 9.0c afterwards. Do I need to install 9.1 separately?

There is no DirectX 9.1. DX9.0c is the latest version (apparently there are bi-monthly updates, but they retain the 9.0c name).

---
> (1) Is it possible to run it in SLI? (I read somewhere that if you rename the ArmA.exe file to 3DMark06 you can run it in SLI and get a performance gain.) Will a proper SLI profile be released eventually?
>
> (2) I noticed the game needs DirectX 9.1 and is supposed to update to it as part of the install process. My machine was at 9.0c before the install and stayed at 9.0c afterwards. Do I need to install 9.1 separately?
>
> (3) Is there any way to display the frame rate in game?

I use a 7950 GX2 card. All SLI options are available to me.

1) Yes. I made a profile with nHancer and set SLI to AFR, with no compatibility flags selected. There is no official Nvidia SLI profile for ArmA. I'm using the 91.47 drivers.

2) Download the most recent DX9 update from Microsoft. There was an update this month, February 2007.

3) Yes: download and install FRAPS.


---

I run dual Nvidia 7900 GTs and can play ArmA with full graphics and no lag or stutters at all.

---

I have noticed the best SLI profile for ArmA so far is F.E.A.R.'s. Quick way to try it: rename arma.exe to fear.exe.
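
If you would rather not touch the original file, the same trick should work with a copy. A minimal sketch (the install path here is an assumption; point it at your own ArmA folder):

```python
import shutil
from pathlib import Path

# Hypothetical install location; adjust to your own setup.
arma_dir = Path(r"C:\Program Files\Bohemia Interactive\ArmA")

# Copy instead of renaming, so the original arma.exe keeps working.
# The driver selects its SLI profile by executable name, so launching
# the copy named fear.exe picks up the F.E.A.R. optimizations.
shutil.copy2(arma_dir / "arma.exe", arma_dir / "fear.exe")
```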

---
> I have noticed the best SLI profile for ArmA so far is F.E.A.R.'s. Quick way to try it: rename arma.exe to fear.exe.

I have ATI Crossfire, and renaming the .exe to fear.exe works for Crossfire users too. I use ATI Tray Tools to display my FPS on screen.

---

Thanks.

(1) When you say try the GRAW or F.E.A.R. profiles, do you mean that you rename the ArmA.exe file in the game folder to fear.exe, select the F.E.A.R. SLI profile in the Nvidia utility, and then run the game using the renamed fear.exe executable?

Does this give a good performance gain over a single card? (Someone also told me to try this with 3DMark06.) Which of these profiles should give the most performance gain: 3DMark, F.E.A.R., or GRAW?

(2) In terms of displaying a frame rate in game, is FRAPS the only way to do this (I am running Nvidia cards), and where can I download it?

---
> (1) When you say try the GRAW or F.E.A.R. profiles, do you mean that you rename the ArmA.exe file in the game folder to fear.exe, select the F.E.A.R. SLI profile in the Nvidia utility, and then run the game using the renamed fear.exe executable?

(1) Yes, that's how it works: the driver keys its profile on the .exe name, so the renamed executable picks up the F.E.A.R. profile.

---

Driver support is the key here, as I have tested this. Please use v93.71, as this set of drivers will keep and store profiles. Create a profile for ArmA:

Use SLI: Alternate Frame Rendering (AFR)

Image settings: High

In testing, in a certain scene (yours may vary depending on the scene), I would get 21 FPS without SLI and 29 FPS with it. The reason SLI doesn't work with other sets of video drivers is that those drivers do not support saving working profiles. The "fake" profiles mentioned above also work, but that's really not the way to do it; then again, it is a means to an end. v93.71 does work.

The reason SLI doesn't kick the frame rates up much higher is that the game makes heavy use of vertices, which is also why the game doesn't run very well to start with.

The GeForce 7900 GTX, for example, has 24 pixel shaders and 8 vertex shaders, the reasoning being that modern games are biased more towards pixel work than vertex work. However, there can be problems with this fixed approach, and ArmA is a good example.

Consider a typical scene in Oblivion, such as a cave; the geometry required to create the cave is relatively simple, and there are only two or three character models for a couple of goblins, plus a few objects such as chests. To make these objects look good, the GPU has to calculate HDR lighting effects, reflections and so on, which require complex pixel shader calculations. Here, the 7900 GTX's balance of pixel shaders to vertex shaders makes sense.

However, when you go outside in ArmA, the balance changes. With the draw distance on full, there's more terrain to generate, plus a huge amount of vegetation, all made up of vertices. The 7900 GTX's fixed and unbalanced hardware architecture means that it finds the outdoor scenes much harder.

With a unified architecture like the 8800 GTX, there's no distinction between pixel and vertex pipelines. There are only stream processors, and each processor, Nvidia claims, 'is capable of being dynamically allocated to vertex, pixel, geometry, or physics operations'. The benefit is clear, since with a unified architecture, each part of the GPU can be kept busier for longer regardless of the type of scene being rendered. For example, instead of the vertex pipes lying largely idle when a 3D scene is geometrically simple, they can be reconfigured to work on whichever task the game throws at the GPU. The GPU's dispatch and control logic dynamically assigns work to the stream processors, and this occurs automatically so that game developers don't need to worry about it.

To keep all the processors busy, the work needs to be split into small chunks. Nvidia calls this 'thread granularity' (a term borrowed from ATi), and states that the 8800 GTX has 32-pixel granularity, as opposed to 48 for ATi's X1900-series. Despite borrowing the term from ATi, Nvidia still has its own name for it, which is GigaThread.

It's important to point out that each stream processor is a scalar processor, so it isn't equivalent to a single pixel shader, which operates on vector or scalar instructions (or a combination of both) although rarely with 100 per cent efficiency. Nvidia says that its 128 scalar processors can deliver up to double the performance of a GPU with 32 GeForce 7-series pixel processors, although this is, of course, a theoretical figure.
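
To make the fixed-versus-unified point concrete, here is a toy calculation, not real GPU behaviour: the unit counts mirror the 7900 GTX's 24+8 split, the per-scene work figures are invented purely for illustration, and clock speeds, thread granularity, and scalar-versus-vector efficiency are all ignored.

```python
# Toy model: why a fixed pixel/vertex split struggles on vertex-heavy
# scenes while a unified architecture does not.

def frame_time_fixed(vertex_work, pixel_work, vertex_units=8, pixel_units=24):
    # Each pool can only do its own kind of work, so the slower pool
    # (the bottleneck) sets the frame time.
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def frame_time_unified(vertex_work, pixel_work, units=32):
    # Every unit can take either kind of work, so the total work
    # simply spreads across all units.
    return (vertex_work + pixel_work) / units

scenes = {
    "indoor (pixel-heavy, e.g. an Oblivion cave)": (80, 720),
    "outdoor (vertex-heavy, e.g. ArmA terrain)":   (640, 320),
}

for name, (v, p) in scenes.items():
    fixed = frame_time_fixed(v, p)
    unified = frame_time_unified(v, p)
    print(f"{name}: fixed={fixed:.1f}, unified={unified:.1f} "
          f"(fixed is {fixed / unified:.2f}x slower)")
```

On the pixel-heavy scene the fixed split is nearly competitive, but on the vertex-heavy scene the 24 pixel units sit mostly idle while the 8 vertex units become the bottleneck.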

The main reason for these changes is that the 8800 GTX is the first DirectX 10, Shader Model 4-compatible GPU. DX10 is a massive leap forward from DirectX 9c and has a unified instruction set, which offers game developers much greater flexibility and resources. DX10 also adds a new feature called geometry shaders, which is a powerful new method of quickly generating geometry, without burdening the CPU. Geometry shaders can be used for many purposes, from simply adding extra detail (fur, for example) to generating particle effects.

So even if you have an 8800 GTX, running the game under DirectX 9 holds back the "thread granularity" scheduling and the new geometry shaders from making full use of the card. Unless BIS recodes the game to take advantage of DX10, the problem of poor FPS on high-end systems will never fully go away: the 7900 GTX is limited by its fixed, unbalanced hardware architecture, and the 8800 GTX, with its DX10 features unused, also finds the outdoor scenes hard to render.

PACO454

---

Paco, very good post. I really hope that BIS can step up and code for more efficient utilization of existing hardware. I have tried the 91.47, 93.71, and 93.81 drivers with my GX2. Of the three driver sets, the 91.47s run ArmA (and every game I have) at their best. Their best means high quality settings, 1920x1200 resolution, and very playable frame rates that range across the board from 40 to 100 FPS. The only exception is ArmA.

I am currently using the fear.exe SLI profile rather than my own self-created profile for ArmA. I find that the shadow/shading details run best with this profile. I'm running shadow/shading detail on high right now and I get a pretty constant 38 FPS; sometimes more, but no real FPS drops. The only downside is that V-sync is still controlled by ArmA. Forcing it off inside the fear.exe profile makes ArmA run worse. Go figure.

Whatever optimizations Nvidia did for F.E.A.R., the profile seems to run the game better when using high detail for the shadows.

Just my observations.

---

Informative post Paco, good stuff.

I have dual 8800 GTXs. When I run the game in SLI, my frame rate is worse than when running one card. I've attributed it to poor driver support for the 8800 from Nvidia, and poor optimization from BIS.

Pretty maddening, actually. I dropped a boatload of cash on what I thought would be the ultimate rig for ArmA, and it runs like crap. Crap being a relative term, of course.

---
> The reason SLI doesn't kick the frame rates up much higher is that the game makes heavy use of vertices, which is also why the game doesn't run very well to start with.

No, the reason SLI doesn't kick up your FPS more than that is that when you create a game profile in the Nvidia Control Panel, it doesn't add any compatibility flags (optimizations). That can only be done directly in the profile with a tool such as nHancer. F.E.A.R. runs with a lot of different optimizations in the driver that ArmA also benefits from.

Here are the measured results for ArmA on my machine. The card is a GeForce 7950 GX2, and the same settings are in use in all three tests:

Single-card config: 30 FPS.

SLI without optimizations: 40 FPS.

SLI with F.E.A.R.'s optimizations: 55 FPS.

As you can see, enabling SLI by itself gives a slight FPS improvement, but the real performance boost comes from the compatibility flags in the driver. There are 31 of them, and unfortunately there is no information about them, except that a very few have the names of games in them. So I'm not about to start testing all kinds of combinations to see if I can push the performance further. Most likely it is as fast as it can go (going from 30 to 55 is close to twice as fast).
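
For what it's worth, the arithmetic on those three figures (nothing more than the numbers above):

```python
single, sli_plain, sli_flags = 30, 40, 55  # FPS figures from the tests above

print(f"SLI alone:            {sli_plain / single:.2f}x over single card")  # 1.33x
print(f"SLI + F.E.A.R. flags: {sli_flags / single:.2f}x over single card")  # 1.83x
print(f"Flags on top of SLI:  {sli_flags / sli_plain:.2f}x")                # 1.38x
```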

---
> No, the reason SLI doesn't kick up your FPS more than that is that when you create a game profile in the Nvidia Control Panel, it doesn't add any compatibility flags (optimizations). [...] The real performance boost comes from the compatibility flags in the driver.

I stand corrected. However, if the F.E.A.R. profile is so heavily optimized, why isn't there an ArmA profile? Who does this optimization, Nvidia or the game maker?

PACO454

---
> Single-card config: 30 FPS.
>
> SLI without optimizations: 40 FPS.
>
> SLI with F.E.A.R.'s optimizations: 55 FPS.

If this is true, then my best bet to increase FPS would be to buy another 7900 GS instead of shelling out big money for an 8800.

Intel 6400 OC'd @ 2.4 GHz

Asus P5N-SLI

7900 GS @ 650/1.3

2 GB Team Xtreem DDR2 @ 667

Current ArmA FPS: 25-35

---

Duuude, I didn't believe it at first, but nHancer + 91.47 really changed the FPS. I'm using a GX2.

It was around 28 before, and now it's around 50 (in a tested scene).

Thanks guys :P

---

I find it strange that Paco copy-pastes an article about Oblivion from Custom PC UK and just replaces "Oblivion" with "ArmA".

Unless he is an author on that site, of course ;)

---
> If the F.E.A.R. profile is so heavily optimized, why isn't there an ArmA profile? Who does this optimization, Nvidia or the game maker?

Because ArmA isn't such a "big game". There isn't enough eye-candy value in ArmA for Nvidia to prioritize it. I guess they don't see a non-mainstream game as worth the effort of promoting their video cards.

Nvidia does the optimizations.

---
> I find it strange that Paco copy-pastes an article about Oblivion from Custom PC UK and just replaces "Oblivion" with "ArmA". Unless he is an author on that site, of course ;)

Strange? It covers the same issues whether I replaced a few words or not; it makes no difference. I'd say you're flame-baiting, off topic, and looking for a fight.

PACO454

---
> Because ArmA isn't such a "big game". [...] Nvidia does the optimizations.

Thanks for the response. I tried the F.E.A.R. profile and went, on a test map, from 41 to 79 FPS. I'd say you hit the golden egg. I think the game is a BIG game with good eye candy; too bad Nvidia doesn't think so. I'll write them nevertheless; it can't hurt.

41 to 79 FPS, that's BIG TIME. Thank you for the heads-up.

PACO454

---

Slightly off-topic, but has anyone had any ArmA success with any of the performance-booster programs?

---
> 41 to 79 FPS, that's BIG TIME. Thank you for the heads-up.

Just happy it helps.

> Slightly off-topic, but has anyone had any ArmA success with any of the performance-booster programs?

I believe most of those programs are a scam. I've never heard of them improving anything except fixing simple "wrongs" with a computer, in which case it's the fix that improved performance, not the program itself. But that's a different topic.

Please stick to SLI with ArmA here.

---

Thanks.

I have now tried the F.E.A.R. SLI profile, and it gives me a good increase in frame rate.

However, although it gives me an increase in FPS on all the training maps (they now run at 20-30 FPS according to FRAPS with either the normal or high default detail levels), on a couple of the missions (Sanitizing Op and Convoy Ambush) the game seems to run choppy, not smoothly, particularly when I am running around, even though the frame rate is at least 20-30. (I have already tried disabling one of the cores on my CPU, and I am running AMD's Dual-Core Optimizer.) It is almost the same effect I got with some games that didn't like dual cores.

The system I am running is:

AMD 4800+ dual core, two Nvidia 7800 GTXs in SLI, 2 GB memory, 93.71 drivers, out-of-the-box UK version of the game.

Does anyone know what might be causing this, and what can I do to get it running smoothly?

---

Maybe that's because of the way the card handles the graphics? I am running a single 8800 GTS and can run 1.02 at 40 FPS in non-forested areas and 20-30 FPS in forested areas in most missions.
