Geezer_AU

The Value of FPS (And a bit of a rant)


OK... So...

Some people say that because movies and TV run at around 24-30 FPS, that should be enough in a computer game.

What they don't realize is that the optical camera that films a movie captures natural motion blur, which helps trick your visual perception, whereas computer games don't have this (although the technology is starting to emerge right now).
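For the curious, here's a rough sketch of what a camera does that a game doesn't. This is my own toy Python with made-up function names, not anything from a real engine:

```python
# Toy sketch of camera-style motion blur via temporal supersampling.
# render() and blurred_frame() are invented for illustration only.

def render(t):
    """Stand-in for drawing the scene at instant t (seconds): here it
    just returns the position of an object moving at 10 units/s."""
    return 10.0 * t

def blurred_frame(frame_index, fps=24, shutter=0.5, samples=8):
    """Average several renders across the shutter-open interval.
    shutter=0.5 mimics a film camera's 180-degree shutter (~1/48 s)."""
    frame_time = 1.0 / fps
    t0 = frame_index * frame_time
    return sum(render(t0 + shutter * frame_time * i / samples)
               for i in range(samples)) / samples

# A game shows one crisp instant per frame; film effectively averages
# the motion over the shutter interval, which smooths 24 FPS playback.
print(render(0.0), blurred_frame(0))
```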

The next thing to consider is that people's visual perception differs. Sometimes I notice TV flicker/refresh (usually related to an excess of my favorite beverage), but 95 percent of the time I just see fluid, smooth animation.

Remember though that the next man may be more sensitive in this respect than you.

Of course a twitch shooter such as CS really does require a high FPS to play with any kind of competitiveness (I've watched my younger bro play...), whereas a game like ARMA, in my opinion, really does not.

Now I realize that BIS is fully responsible for some major shortcomings in regards to system optimization, especially in relation to the very latest technology (I laughed when I read the specs of their dev/test rigs!).

Now... I'll admit I am a fanboy... But I remember when OFP arrived, out of the blue, created (as I recall) by two brothers living in the Czech Republic who were fed up with the lack of access to PC games and so decided to write their own (accurate solar constellations and all!).

I've never known any game (or developer) to give the support that BIS has (still releasing updates for OFP, even after removing all copy protection).

I think that if someone creates a game that cannot be run with all graphical options turned to max (with today's top-end tech), you should respect their daring approach.

BIS could so very easily have made it so any reasonable PC could run it at full options (just by lowering the possible visual quality), and probably no one would complain... But they have given us a gift by making the game look better and better as technology catches up.

As we all know, ARMA, like OFP, is a slave to the AI, the physics, the world size and a valiant attempt at realism (shit, the army uses this engine to train troops for real combat!).

BIS have done the same thing with ARMA as they did with OFP: the same thing that has people still playing OFP now, five years after its first release, still getting a kick from the extras we gain each time we upgrade our systems. (To be honest, I think only the current top-end spec can run OFP at full available options.)

I think it would be fair to assume that, to play a game of such scope and depth as ARMA, you should expect pretty basic visuals. Fortunately BIS has given us enough graphics/game options to scale the game to an individual system.

So rather than whinge about not being able to run the game at full options despite having the latest tech, appreciate what you can get despite not owning a military mainframe.

Geezer


Great post, very nice. I myself am currently running ArmA on an old rig and feeling the pain.

Same as with OFP, it will probably take another 3-4 years, if not more, before people's hardware catches up to ArmA's standards. I still can't play OFP with the view distance set to more than 50%.

Quote: "Some people say that because movies and TV run at around 24-30 FPS, that should be enough in a computer game..."

I play ALL my games at an FPS between 20 and 30, and while this may be annoying for the first few minutes, after that your eyes more or less get used to it and it feels smooth.

And it's not like my eyes are special, or so crappy that I can't see a difference above 30 FPS; it's just what I said above.

This does NOT (contrary to what some believe) cause headaches like a low refresh rate does, and it doesn't affect my scores (I always got top scores in BF1942; I never seriously played any other fast-paced games, although my CSS scores were never bad for a 'noob').

I'm not sure why it feels smooth after a while. I know that the image you see remains on your retina for a short while; maybe your brain, or your eyes, lengthen this period. I really don't know.

But for the love of god, just try it: a STABLE 25 FPS (22-28) may feel even better after a while than an FPS fluctuating between 50 and 100.
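To put rough numbers on that (my own illustration, nothing measured):

```python
# What the eye seems to track is frame-TIME consistency, not average FPS.
# Purely illustrative numbers.

def frame_time_ms(fps):
    return 1000.0 / fps

stable = [frame_time_ms(25)] * 6                    # 40 ms every frame
fluctuating = [frame_time_ms(f) for f in (100, 50, 90, 55, 100, 50)]

for name, times in (("stable 25 FPS", stable),
                    ("fluctuating 50-100 FPS", fluctuating)):
    print(f"{name}: spread of {max(times) - min(times):.1f} ms "
          f"between the quickest and slowest frame")

# The stable case delivers a frame every 40 ms like clockwork; the
# fluctuating case jumps between 10 and 20 ms per frame, and that
# unevenness can be what registers as stutter.
```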


I'd rather have that goddamn texture bug fixed at 30 FPS.

100 FPS with a black screen is no use to me.

Quote: "OK... So... Some people say that because movies and TV run at around 24-30 FPS, that should be enough in a computer game..."

GPUs, Resolution, Refresh and FPS 101:

This is true, Geezer. The big change is the Nvidia 8800 and DX10, and both the Xbox 360 and PS3 now also simulate motion blur just like an optical camera. There are effects that blur the scene now; although not true motion blur, they are very similar.

DX9 can also do motion blur, but DX10 does it much better (see the Crysis video).

Both the PS3 and Xbox drone on about 60 Hz on many titles. This is because many consoles are run on HDTV 720p/1080i cathode-ray-tube sets. These sets and older TVs normally work at a 50-60 Hz refresh; some sets can do 100 Hz, and on a CRT tube 100 Hz is flicker-free and much easier on the eyes.

Now, computer CRT monitors often work from 60-120 Hz (and higher). CRT tubes draw a picture differently to an LCD monitor, but in the most basic sense most PS3 and Xbox 360 games are HARD-LOCKED to 60 FPS max.

Now, most LCD monitors are 60 Hz. LCDs work differently to CRT tubes, and in general 60 Hz (60 FPS) is flicker-free on an LCD.
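The arithmetic behind those refresh numbers, if you want to check it yourself (simple division, nothing more):

```python
# Interval between redraws at the refresh rates mentioned above.
for hz in (50, 60, 100, 120):
    print(f"{hz} Hz -> one redraw every {1000 / hz:.1f} ms")

# A CRT's phosphor dims between redraws, so 50-60 Hz can visibly
# flicker and 100 Hz looks steady; an LCD stays backlit the whole
# time, so 60 Hz is already flicker-free.
```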

Now as soon as you read benchmarks for a game running on an 8800 GTX, you see things like:

Quake 4, 1024x768: 259 FPS

HL2, 1024x768: 200 FPS etc etc.

So if even the best monitors on the planet can only display 120 FPS, and about 90% of LCDs only do 60 FPS, why on earth do you need 250 FPS?

Well you don't, is the simple answer.

200 FPS can't be seen on current monitors; frames are being skipped. It's just a way of saying this card and system is fast, that's all.
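Back-of-envelope version of that point (my own numbers):

```python
# A panel can only show `refresh_hz` images per second; anything the
# card renders beyond that is skipped (or torn). Illustrative only.

def displayed_fps(rendered_fps, refresh_hz=60):
    return min(rendered_fps, refresh_hz)

for fps in (259, 200, 92):
    shown = displayed_fps(fps)
    print(f"rendered {fps} FPS on a 60 Hz LCD -> {shown} shown, "
          f"{fps - shown} frames/s thrown away")
```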

But then you run at high res. My 32" LCD monitor does 60 Hz at 1600x1024; I want 4x anti-aliasing (on a big screen, AA makes a huge difference to jagged lines) and 8x AF (saying that, I always run 16x, but most benchmarks only ever measure 8x AF!).

FEAR at 1600x1024, 4xAA, 8xAF:

8800 GTX KO ACS3: 92 FPS (enable supersample AA and that's more like 60 FPS). Whoopee: HD res, ALL the eye candy and the magic 60 FPS or over.

8800 GTS 320: 66 FPS (with supersample AA, about 40 FPS). Not bad, totally playable.

7900 GTX 256: 49 FPS (SSAA on, about 30 FPS). Stable, not too bad; might lag a bit in heavy action. Best to turn SSAA off.

x1950: 39 FPS. Oh no, it's drop-the-screen-res time, as the x1950 would only be stable at 1280x1024 (52 FPS).

All of the above cards went over 150 FPS at 1024x768 with no AA on.

Moral?

If you want high res, 1600x1024 OR BETTER, you need a 7900 GTX or X1950XTX even for FEAR to hit stable 30-60 FPS gameplay with anti-aliasing and the hardware eye candy on.

Now FEAR is an OLD game, and even it starts to slow the fastest cards running on an Intel X6800 Core 2 Duo! So sorry, all you 6600 and x800 owners saying FEAR runs fine at high res maxed out: oh no it didn't... not unless 800x600 is high res...

Same test with Oblivion, not as graphically complex as ARMA but the only thing close on the PC.

1600x1024, 4xAA, 8xAF, outdoors:

8800 GTX KO ACS3: 36 FPS

7900 GTX: 13 FPS!

http://tomshardware.co.uk/2007....e4.html

Above is the link to the results. That's right: an X6800 CPU and a 7900 GTX could manage 13 FPS outdoors in Oblivion all maxed out. So what can it manage in ARMA?

So, boys and girls, unless you have a 7900 GTX, an X6800 and 2 GB of RAM, please don't ever complain about GFX quality in ARMA, as your machine can't even run FEAR or Oblivion looking sexy.

To those who say they run FEAR fine on an x800 or 6800: yes, you do, at low res with no anti-aliasing etc.

Please don't say it was running fine maxed out, because it wasn't. If you have a modern CPU and a powerful GPU, as long as you get about 20-30 FPS with settings high at 1280x1024 or better, that's about right.

But note: the fastest current PC on the planet scored about 70 FPS in most of the 3DMark06 tests, and that was a hydrogen-cooled quad core Intel with 8800 GTX KO SLI. That means it's just over the 60 FPS barrier, so ArmA producing about 30 FPS is about right on high-end kit.

And in all the above benchmarks, what isn't mentioned is the MIN FPS, or the fact that in online play you can get frame lag and stutters due to the server or your web connection.

In general, to set up ARMA for your PC: run FRAPS to find out your FPS while playing offline. Set a screen res that suits your monitor and set the refresh to 60. Then fiddle with the settings: start at very low, and whatever drops your machine's FPS below 30, turn it back off (post-processing, view distance, etc).

If with everything on very low you can't get better than 30 FPS, you need to either fine-tune your machine (new drivers, BIOS, etc) or upgrade your hardware.

If you hit a stable 30 FPS but feel the game looks bad, then it's your machine, NOT the game.

Start with all GFX settings VERY LOW or OFF and try to hit 30 FPS or better. If even with everything off or low you can't hit 30 FPS, turn the screen res down, e.g. from 1280x1024 to 1024x768. If that's not your LCD's native res, tough: your machine is either not powerful enough or not tuned enough to play ARMA properly at that res.
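If it helps, the tuning procedure above boils down to a simple loop. A sketch, where the setting names, costs and measure_fps() are all invented stand-ins for watching FRAPS, not real ArmA options or numbers:

```python
TARGET_FPS = 30

def measure_fps(settings):
    """Toy stand-in for a FRAPS reading: assume a 60 FPS baseline and a
    fixed FPS cost per notch of each setting. Pure illustration."""
    cost = {"view_distance": 8, "shadows": 6, "post_process": 12}
    return 60 - sum(cost[name] * level for name, level in settings.items())

def tune(settings, max_level=3):
    """Raise each setting one notch at a time; back off whenever the
    measured FPS falls below the target."""
    for name in settings:
        while settings[name] < max_level:
            settings[name] += 1
            if measure_fps(settings) < TARGET_FPS:
                settings[name] -= 1   # too slow: turn it back down
                break
    return settings

result = tune({"view_distance": 0, "shadows": 0, "post_process": 0})
print(result, measure_fps(result))   # the highest mix that keeps 30 FPS
```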

Note: PAL signals work at 25 FPS (UK TV and video), NTSC (US TV and video) at ~30, and film at 24 FPS.


Visually a game may still look fine running at 25 FPS, but when your mouse response feels like sludge it starts to seriously impact the playability, and especially the enjoyment factor. Such was the situation I was in with my old 6800GT; it was particularly acute when looking into bushy areas. It was insanely maddening to get killed over and over because my mouse was too lagged to aim and hit anything quickly.
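A simplified model of why that happens (my own assumption of about two frames between input and display; real render-ahead queues and display latency vary):

```python
# Each rendered frame adds at least one frame-time of input-to-photon
# delay. Simplified: ignores display latency and driver queues.

def min_input_lag_ms(fps, frames_in_flight=2):
    """Assume input is sampled one frame and shown the next."""
    return frames_in_flight * 1000.0 / fps

for fps in (20, 25, 35, 50):
    print(f"{fps} FPS -> at least ~{min_input_lag_ms(fps):.0f} ms from "
          f"moving the mouse to seeing the crosshair move")
```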

Now with my x1950 I have the game running nicely on all normal settings, with low shadows and post-processing and no AA. It plays at only a marginally higher FPS than on the 6800GT (35-50 against 20-25), but the major difference is that mouse response is barely affected, and I can aim at an enemy in the bushes and hit him fast. In fact, with the much greater shading capability of this card it has no issues at all with bushes, and I can walk into a forest without dreading the performance hit.

In short, I disagree that ArmA can be played well at low FPS. Get a person running the game poorly into a server with you and see who is faster and more accurate on the draw in a bushy area, or when all hell is breaking loose.

Also, a developer doesn't need to limit the graphics to provide playability and performance; all they need to do is incorporate a good degree of scalability. Just look at Stalker: it can be intensely demanding on a system, or happily run in DX8 mode on a Ti4200, like a friend of mine is doing. With ArmA there are several decisions in this area that cost the game sales, because people with less-than-stellar PCs just can't run it well enough. One of them, in my opinion, is the forcing of HDR: this alone ensures that someone just meeting the recommended specs (as I was a short while ago) can barely run the game in a playable state.


The only problem I see with making a game scalable to the always-evolving PC is that the game never actually catches up.

Looking at OFP on a high-end system doesn't make it look any better graphically. It may play much better, but by today's standards of graphics it's really very poor.

ArmA will not last 5 years like its predecessor OFP, unless the devs have delayed Game 2 by 3 years or more; ArmA is too much of the same to last another 5 years. Already we're seeing some of the modders move on, and with the delay in tools all we're getting is reskins (very good reskins, I may add). And on the topic of reskins, doesn't this tell you that the content for ArmA is severely lacking?

I personally don't want to wait another 3 years before any half-decent mod comes out, because hopefully by then we should be playing Game 2. And then the cycle starts all over again, with the 'fanboys' sticking to their false hopes.

Basically I'm saying that ArmA, IMHO, is a stop-gap before the release of Game 2. So why should a game which is only supposed to last 3 years max not be playable now with max settings? Or am I missing something here, and I've failed to see that Game 2 isn't scheduled for release until 2012?

Quote: "The only problem I see with making a game scalable to the always-evolving PC is that the game never actually catches up... ArmA, IMHO, is a stop-gap before the release of Game 2."

HUH? It looks better than Oblivion, which until Crysis kind of held the best-looking WOW-factor award on the PC (possibly tied with Company of Heroes).

10 km view distances, amazing HDR (when set to 32), awesome shadows. Sound and physics need a bit of a boost, but GFX-wise it's on par with everything else.

ARMA is basically a flight sim in GFX scale.

How many flight sims let you go inside a house?

Some models need extra textures on the inside, but are we playing the same game?

The trees in this game are amazing. With post-processing on, the blur applied by layer to a 10 km landscape is amazing, as are the night effects. I'm getting fed up with people saying ArmA looks merely OK because they are running a crap PC and not seeing it maxed out.

It looks a lot better than BF2, so which game are we comparing it to? Christ, BF2 can't even do widescreen or time-of-day effects, let alone 10 km view distances.

As for GFX scale: to all those using a Ti4200, please just go buy an Xbox 360, it's about 10x faster than your PC. And you will stop lagging up my online PC games...

Quote: "Visually a game may still look fine running at 25 FPS, but when your mouse response feels like sludge it starts to seriously impact the playability... This alone ensures that someone just meeting the recommended specs can barely run the game in a playable state."

30 FPS should be considered your MIN frame rate. You have to sacrifice eye candy, AA, SSAA, even AF and screen res, until you can make sure that 95% of the time you don't drop below 30 FPS.

I can totally max ARMA, even with 10 km VD and SSAA, and generally beat 30 FPS. But if I crawl in the grass in a forest and look down my optical sights: BANG, 10-15 FPS.

So either I avoid the grass and woods, or I drop some features and make sure that 95% of the time I don't drop below 30.


A 20 FPS minimum is enough for me in ANY game (except 360 games).

Personally I prefer 30 FPS, but I don't notice the difference between 30 FPS and 100000000000000000000000000 FPS. lol.

Quote: "Already we're seeing some of the modders move on, and with the delay in tools all we're getting is reskins... doesn't this tell you that the content for ArmA is severely lacking?"

Oh my... Why is it I'm seeing this over and over (not just from you, MrR)? The game has only been out for 6 months; hell, it doesn't even have a fully global release yet. How can the tools possibly be "delayed"? (Yes, I'm aware there were RUMOURS that the tools would be out with 505, but it was just that, a rumour.)

Aside from HL2, I know of NO game which was released with its toolkit/SDK/whatever. It is in fact common for the tools not to be released until approximately 6 months after the game's release, so that the actual game gets a good playing first. (Yes, I know the ArmA campaign is a bit naff; that's beside the point.)

I believe we have ALL become "spoilt" by the notion that OFP would port over easily and quickly to ArmA. It's just not the case, however. Patience is a virtue.

(Sorry for the OT.)


A note about LCD screens. LCD screens are always 'drawn'; they are backlit and have no flicker. They work by shining a light through crystals that, very simply, change colour. They can change colour faster than any CRT monitor can draw a whole screen; they are usually limited to a certain Hz rating by the limits of CRT protocols or hardware conventions.

The frames that you're seeing on screen also correspond to frames that the computer is rendering in real time. A faster framerate is generally more playable and will allow you greater precision. The point where this would become excessive is where your computer is rendering scenes faster than the refresh rate of your interface devices, such as your mouse or joystick. It is true that you can't perceive much past 30 frames visually, but the world that affects us is also not limited to what we can perceive, virtually or actually. How much better is 120 frames than 60? Not much. But I'd bet there is an incremental edge to be had at those fast frame rates. This is certainly my experience playing IL-2, where a faster framerate allows you to much better gauge speed, especially when something is streaking across your monitor at a combined speed of 1200 km/h.

Edit: it also seems to help with continuity of images. An example is where a P-51 Mustang opens up its 6 machine guns, each with a fire rate of around 750 rpm. Bear in mind that it was a while ago that I noticed this, but at framerates of, let's say, around 30, it just looks like a stream or cloud of shells falling out of the wings. At 60 you can see every individual shell falling out of the ejection ports and continuing off screen.
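The arithmetic behind both examples, for anyone checking (rounded figures, my own working):

```python
# How far a target moves between frames at a 1200 km/h closing speed.
speed_m_per_s = 1200 / 3.6                     # = 333.3 m/s
for fps in (30, 60, 120):
    print(f"{fps} FPS: target jumps {speed_m_per_s / fps:.1f} m per frame")

# P-51: 6 guns x 750 rounds/min = 75 rounds/s in total.
rounds_per_s = 6 * 750 / 60
for fps in (30, 60):
    print(f"{fps} FPS: {rounds_per_s / fps:.2f} new shells per frame")
# 2.5 shells per frame at 30 FPS smear into a stream; at 60 FPS it is
# ~1.25 per frame, close to one-per-frame, so shells read individually.
```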


Turning ArmA down to low-ish detail and getting 60-90 FPS is really an eye-opener on how cool high FPS can be. Everything seems so smooth, and aiming is responsive.


So, my game looks fine at 17 FPS, but I think it's a bug with Fraps. I see freezes in many games; in ArmA I have no freezes, but 15-20 FPS. In Stalker I have 75-150 FPS playing on max. Strange.

Quote: "Turning ArmA down to low-ish detail and getting 60-90 FPS is really an eye-opener on how cool high FPS can be..."

Yes, but it looks like Britney Spears without hair and makeup.

Quote: "Turning ArmA down to low-ish detail and getting 60-90 FPS is really an eye-opener on how cool high FPS can be..."

Quote: "Yes, but it looks like Britney Spears without hair and makeup."

Yes, low res is 'who let the dogs out' ugly. If you're a BF2/CS kind of person, max FPS is needed to get every advantage over the other guy. I often hide in woods, so if people come after me, I know people with bent hardware will lag and I can pop up or roll about and nail them.

Until ARMA, the PC-breaker benchmark of choice was Oblivion. Now it's ArmA, till Crysis arrives.


ArmA in the lowest detail still looks better than OFP.

I have a mid-to-high-range PC, and I still play with somewhere around "medium" settings.

That's the way games are released for the PC these days. They're made so that even high-end PCs can't play on the highest visual settings; as technology advances, you can enable more and more of them. It gives the game more longevity.

Every time I upgrade my PC, I always find myself loading up old gems that I couldn't run at full graphical settings before. It's one of the ways that game developers keep interest in a game long after it's been released. If you're like me, when you get new hardware you load up games that you've previously shelved, just to see them played the way they were supposed to be played.

Nothing wrong with it; it's the way things are done.

FPS is very important, much more important than eye candy. Just tune your graphical settings to where you get 30 FPS and above, and raise them as you upgrade your PC.

Quote: "ArmA in the lowest detail still looks better than OFP... FPS is much more important than eye candy. Just tune your graphical settings to where you get 30 FPS and above, and raise them as you upgrade your PC."

Case in point, Bishop: both FEAR and HL2 could run max settings on my last GFX card, the 7800 GTX, but enable supersample AA at 4xAA and both games would drop below 30 FPS in some areas.

13 months later I got an 8800, went back and maxed them out, and couldn't drop below 60 even with SSAA; I enjoyed them both again. Same with Far Cry: I even edited the Far Cry cfg to turn LOD off, so it displays everything high-res as far as the eye can see, and still no slowdown on an 8800.
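A rough model of why SSAA bites so hard (my own worst-case estimate; the real FEAR numbers earlier dropped less, because games aren't purely fill-rate-bound):

```python
# N-x supersample AA shades N times the pixels, so a purely
# fill-rate-bound game would fall to base/N FPS. Worst case only.

def ssaa_floor(base_fps, factor=4):
    return base_fps / factor

for base in (92, 66, 49):           # the FEAR figures quoted earlier
    print(f"{base} FPS -> worst case ~{ssaa_floor(base):.0f} FPS at 4x SSAA")
```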

ARMA can't be maxed at high res on any machine without slowdown; next year's CPUs and GPUs are required for that. But it seems about half the people who own an 8800 expect ARMA/Crysis/Alan Wake to run fully maxed at 2000+ res at 100 FPS.

Boy, are they in for a shock.

The best one was the guy moaning on another BB about Command & Conquer 3 running at 30 FPS on his machine (quad core, 8800 SLI), when it also ran at 30 FPS on his mate's dual core with a single 8800. He didn't know CnC3 is capped at 30 FPS. This came after a page-long rant about how crap EA are, how they can't programme a game, and how his machine should do 60 FPS because 30 sucks...


Some good points have been brought up here. Thank you.

To try to steer this debate more towards the value of what ARMA has to offer, rather than the value of FPS, we need to remember what sort of game ARMA, as with OFP, was created to be.

To quote myself (ahem): "I remember when OFP arrived, out of the blue, created (as I recall) by two brothers living in the Czech Republic who were fed up with the lack of access to PC games and so decided to write their own (accurate solar constellations and all!)"

Now, "accurate solar constellations" are of course just a touch of the realism those brothers strove to bring us.

ARMA and OFP are a VERY particular kind of game.

I think most of us (OFP vets) already knew that ARMA was going to be not so much a stopgap before 'Game 2' (which is being made by a different developer (and publisher?) altogether) as an update to OFP (OFP 1.5, as it was nicknamed during development).

Anyone who played OFP would already have known that there was never gonna be any chance of playing ARMA on full graphics settings, at any decent frame rate, with any decent hardware.

ARMA is a simulator on a huge island, not a 'gunfest' in a small field or city block.

You just can't expect the same level of visuals from a sim of this size and caliber as you would from a game like FEAR etc.

As I said before: "BIS could so very easily have made it so any reasonable PC could run it at full options (just by lowering the possible visual quality), and probably no one would complain."

Take your eyes off the settings menu, set your options while watching your FRAPS, and enjoy the experience BIS has created for you.

Cheers Geezer


Well, Oblivion is still a PC breaker with texture addons, custom shaders and a very tweakable config.

Not to mention Oblivion supports multicore systems (so you can tweak the AI, engine I/O, physics and rendered area to extremes).

With 4 dual cores you can get tons of exterior cells populated with AI and a huge "populated" view distance...

Quote: "Well, Oblivion is still a PC breaker with texture addons, custom shaders and a very tweakable config... With 4 dual cores you can get tons of exterior cells populated with AI and a huge 'populated' view distance..."

ArmA and Supreme Commander break even the fastest PCs out of the box; no mods required.

Both are the latest examples in a long line of games that are released 12 months before the hardware and drivers needed to run them maxed out.

Far Cry, Doom 3, HL2 and Oblivion are also great examples: even the very best and most expensive hardware on the day of release could not max them totally.

If you want ARMA at 1600x1024, everything in-game maxed with 10 km VD, SSAA enabled and HDR at 32, at 100 FPS minimum, next year's hardware is the only way, as even the very best PCs on the planet CAN'T do that yet.

Supreme Commander may be 2 years away from max-out, as it can drop to 2 FPS on an OC'ed quad core Intel with 8800 SLI if you have a battle with 3 AI armies on a large map at max GFX settings!


It's all very well saying that ArmA will require future-spec machines to run at max settings... but is it justified? I look at the graphics on my setup (3 GHz C2 Duo, 640 MB 8800 GTS) and don't see much to warrant the poor performance. Even on medium settings it's not smooth. Whether that is a problem with the game or the drivers, I am not sure.

Quote: "It's all very well saying that ArmA will require future-spec machines to run at max settings... but is it justified?"

Drop screen res, drop visual features; a mix of both can result in 60+ FPS that will never drop.
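Why the res drop buys so much, roughly (fill-rate-only estimate, my own arithmetic):

```python
# GPU load scales roughly with pixels shaded per frame.
resolutions = [(1600, 1024), (1280, 1024), (1024, 768)]
base_px = resolutions[0][0] * resolutions[0][1]

for w, h in resolutions:
    px = w * h
    print(f"{w}x{h}: {px / 1e6:.2f} MPix, roughly {base_px / px:.2f}x the "
          f"FPS of 1600x1024, all else equal")
```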

You have to find a screen res and visual mix that suits you for gameplay and looks.

For taking screenshots I max it all out, but that can occasionally end at 5-10 FPS; not so hot for MP.

By dropping some visual features I get a super-stable minimum of 30 FPS, which is great.

ArmA, like Supreme Commander, is about 12 months ahead of its time.

In 12 months the next-gen video cards will be 2x faster than the 8800, the new CPUs 2x faster, etc. Crysis will be the same; it has always been this way and it always will be.

Almost all games can be maxed out on an 8800 better than on any other video card. ARMA and Supreme Commander are 2 new games where we get the best performance available but can't max them totally yet.

Try the Nvidia 102 drivers from 3dguru; they are really good, along with the 105 beta.


^^

The way I see it, that's far from true: ArmA is not 12 or 24 months ahead of currently available HW. While it is a very demanding app that should benefit from future HW, it doesn't take advantage of SLI or dual-core CPUs, to give just one simple example of technology that's been out for a very long time now.

Since you seem to be quite the enthusiast, you might have read about DX10 and understand that the leap from DX9 to the DX10 API will not be as big as the one from DX8 to DX9 in terms of graphical features. It will be a massive leap in terms of efficiency and better use of resources (SM4).

ArmA is just not programmed to take advantage of all this.


BIS said ArmA doesn't support dual-core CPUs and doesn't support SLI. OK, so I changed my CPU to a P4 3.8 GHz (which should be the fastest single-core CPU) and my GPU to a 7800 (not an 8800, but it should be far better than ArmA's requirements).

Turned on the machine.

Started in an open field. OH! A nice 50 FPS.

My 6 AI teammates came over: dropped to 40 FPS.

Arrived at the combat position beside the wood, went prone in the grass: dropped to 24 FPS... still acceptable.

Enemy soldiers were coming! Fire! 20 FPS! No problem; this time I just wanted to hold the position and didn't need to move, so I could still do my job at 20 FPS.

An enemy tank was coming! Retreat to the forest! My god, 15 FPS.

No way to continue. Gave up. Quit the game.

Turned off the machine.

My spec:

P4 3800

2 GB RAM

7800 GTX

Graphics settings: 1280x1024, postprocess very low, AA off, the rest low.

....


