teach

ARMA 2 Review: A tale of wasted FLOPS


http://jonpeddie.com/reviews/comments/arma-2-review-a-tale-of-wasted-flops

ARMA 2 Review: A tale of wasted FLOPS

Posted by Ted Pollak and Robert Dow on February 10th, 2011

For the uninitiated, ARMA 2 is widely known among enthusiast PC gamers as one of the most system-intensive and realistic games on the market.

This is because the environmental effects are dynamic: view distance can be set to 10,000 meters and is affected by light, reflection, rain, and mist. As in real life, if you are walking through the forest with the sun in front of you, the terrain can become a shadowed nightmare as your retinas struggle with both light and darkness. Additionally, there is what is known as the "sandbox element." Sandbox games can be set up for unscripted play in which you do not know where the enemy is coming from. It's no coincidence that ARMA 2 is one of only three games for which PC Gamer Magazine hosts a server.

We decided to test this beast on some of our hottest hardware, and unfortunately, the results were what I expected. ARMA 2, like many other advanced simulations, is "bound" to the CPU and does not properly utilize the power of modern graphics processing units, which is probably why the GPU suppliers never use such titles as benchmark examples.

We ran a number of graphics settings combinations at various common gaming resolutions. We ran these tests on enthusiast- and performance-class AIBs and found that, in the case of Nvidia-based AIBs, there was little performance variation between a $350 GPU and a $550 GPU; we got similar results with AMD AIBs.

The GTX 570 and GTX 580 are almost in a dead heat on average performance across all benchmarks, as measured in FPS using the in-game FPS counter, yet the GTX 580 has 176 gigaflops more computing power (1581 vs. 1405). We even tested the GTX 560 (1262 gigaflops) in a few scenarios and there was still no significant drop in performance. To put this in perspective, a PlayStation 2 is capable of about 3.5 gigaflops. It's almost like having 91 PS2s just sitting around watching us sweat and not helping! Well, get to work, you little buggers!
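A quick back-of-the-envelope check of those figures (our arithmetic, not the reviewers'): the quoted 91-PS2 figure lines up with the gap between the GTX 580 and the GTX 560 rather than the 580/570 gap.

```python
# Back-of-the-envelope check of the FLOPS figures quoted above (illustrative only).
GFLOPS = {"GTX 580": 1581, "GTX 570": 1405, "GTX 560": 1262, "PS2": 3.5}

gap_580_570 = GFLOPS["GTX 580"] - GFLOPS["GTX 570"]   # 176 GFLOPS left idle
gap_580_560 = GFLOPS["GTX 580"] - GFLOPS["GTX 560"]   # 319 GFLOPS left idle

print(round(gap_580_570 / GFLOPS["PS2"]))   # ~50 PS2-equivalents of unused compute
print(round(gap_580_560 / GFLOPS["PS2"]))   # ~91 PS2-equivalents, the figure quoted
```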

I think it is high time for simulation and advanced PC gaming companies to write CPU/GPU load-balancing code into their products. Simulation gamers often have the best equipment, spending well over $2,500 on their machines. If Bohemia Interactive were to write such engine code and supply it with some kind of expansion pack, it'd be a hit.

In fact, let's not stop at games. Adobe didn't. During my recent installation of Premiere Elements I was asked whether I had Nvidia GPUs in my system so that the software could use the available cycles on those chips. Let's start this kind of thinking for games as well, and let's make it brand-agnostic. Granted, since I'm not a programmer, I don't know the complexity of what I am asking for, so any comments from those in the know would be welcome.
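For what it's worth, brand-agnostic GPU compute APIs do exist. The sketch below is only an illustration of the detection step being asked for: it assumes Python with the third-party pyopencl package and an OpenCL runtime installed, and simply lists whatever GPUs are present, regardless of vendor.

```python
# Minimal sketch: vendor-neutral GPU discovery via OpenCL (NVIDIA, AMD, Intel).
# Assumes the third-party pyopencl package and an OpenCL driver are installed.
import pyopencl as cl

for platform in cl.get_platforms():
    for dev in platform.get_devices():
        if dev.type & cl.device_type.GPU:
            print(f"{platform.name}: {dev.name}, "
                  f"{dev.max_compute_units} compute units, "
                  f"{dev.global_mem_size // 2**20} MiB memory")
```

A game could run the same enumeration at startup and decide how much work to offload based on what it finds, instead of hard-coding a single vendor's API.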

If you play these types of games and are building a system, here's my advice if you do not have an unlimited budget: focus on the motherboard, CPU, CPU cooling, and hard drive for your initial investment, since those are harder to upgrade. A current performance-level (i.e., mid-range) gaming GPU is all that is needed, and it is easy to upgrade later.

Our test rig was running Windows 7 64-bit on a 3.33 GHz Core i7-980X with 4 GB of RAM.

What do we think?

There have been tens of thousands of forum posts from people trying to unlock the secret to increasing this game's performance. In fact, an entire benchmark called ArmaMark has been created, and thousands of people have run it, many posting their scores. One popular Nvidia tweak is to force VSync off and increase the number of pre-rendered frames (this helped me). Believe it or not, one of the AMD tweaks is to rename the executable to Crysis.exe. There are even settings that increase performance when graphical complexity is increased, which is totally counter-intuitive. I am still a bit perplexed myself, but I hope that game engine designers can write engines that survey system usage on the fly and dynamically change the threading so as to load balance the game. From my relatively primitive viewpoint, a FLOP is a FLOP.
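To make the "survey system usage on the fly" idea concrete, here is one possible shape such logic could take. This is purely illustrative; the function, thresholds, and the notion of a movable "GPU share" are ours, not anything from the actual engine.

```python
def rebalance(cpu_ms, gpu_ms, gpu_share, step=0.05, slack=1.1):
    """Return an updated fraction of offloadable work (shadows, particles,
    etc.) to run on the GPU, based on the previous frame's timings.

    cpu_ms, gpu_ms -- measured CPU and GPU frame times in milliseconds
    gpu_share      -- current fraction of the movable work on the GPU (0..1)
    """
    if cpu_ms > gpu_ms * slack:      # CPU-bound frame: push more work to the GPU
        gpu_share = min(1.0, gpu_share + step)
    elif gpu_ms > cpu_ms * slack:    # GPU-bound frame: pull work back to the CPU
        gpu_share = max(0.0, gpu_share - step)
    return gpu_share

# Example: a CPU-bound frame (25 ms CPU vs. 10 ms GPU) nudges the share upward.
print(rebalance(25.0, 10.0, gpu_share=0.3))
```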

Edited by TeAcH


You forgot that most of those GPU-based games out there are player-centric: everything that happens, happens around the player. In Arma, stuff can happen multiple klicks away from the player, so it's totally logical that the game needs more CPU power.


What exactly is the purpose of this review?

The conclusion seems to be that Arma 2 is mostly CPU-dependent. Well, great, as if we hadn't already known that since 2009. :rolleyes:

There are even settings that increase performance when graphical complexity is increased. Totally counter-intuitive

Counter-intuitive, perhaps, but it makes sense when you realize why:

In the case of shadow settings, "high" and "very high" utilize the GPU, while lower settings are more CPU-intensive and less GPU-intensive. When you consider that Arma 2 is overall very CPU-dependent, it makes sense that freeing up CPU cycles by calculating shadows on the GPU would improve performance.

You forgot that most of those GPU-based games out there are player-centric: everything that happens, happens around the player. In Arma, stuff can happen multiple klicks away from the player, so it's totally logical that the game needs more CPU power.

Totally true. As is the little-known fact that BI is saving itself for a monster new physics engine :bounce3:

Edit: Just being cheeky, guys; no new engine AFAIK.

Edited by froggyluv

You forgot that most of those GPU-based games out there are player-centric: everything that happens, happens around the player. In Arma, stuff can happen multiple klicks away from the player, so it's totally logical that the game needs more CPU power.

The conclusion seems to be that Arma 2 is mostly CPU-dependent. Well, great, as if we hadn't already known that since 2009. :rolleyes:

Have you never heard of OpenCL, CUDA, ATI Stream etc.?


It's still the hottest-looking game when maxed out; I would date it if I wasn't girlfriended up :D

What exactly is the purpose of this review?

The conclusion seems to be that Arma 2 is mostly CPU-dependent. Well, great, as if we hadn't already known that since 2009. :rolleyes:

Since 2001 more like ;)


He doesn't even bother to say which version he patched the game to. And why is he running 4 GB of RAM with a 980X? You'd expect a triple-channel configuration.

And while a 980X might look impressive, an i5-2500K is pretty affordable and faster in all games.

froggy - pray tell, new engine? ...or are you speculating?

Sarcastically optimistic..?


The author hopes that game code like ARMA 2's would depend more on GPU FLOPS instead of the CPU. But would that kind of game code be compatible with both AMD and NVIDIA cards? And would it stay compatible with the different frameworks as GPUs are updated?

Totally true. As is the little-known fact that BI is saving itself for a monster new physics engine :bounce3:

Proof?

Btw, the weird thing is, I can play just fine even at 35 FPS in Crysis 1, while ArmA 2 feels sluggish even at 40.

Edited by ziiip

Proof?

Btw, the weird thing is, I can play just fine even at 35 FPS in Crysis 1, while ArmA 2 feels sluggish even at 40.

I have diagnosed your problem and I have come to a conclusion: PEBKAC.

Don't worry, it is normal when people do not know how to configure their system.

I have diagnosed your problem and I have come to a conclusion: PEBKAC.

Don't worry, it is normal when people do not know how to configure their system.

No, there is a definite difference in responsiveness when you sway your weapon in the two games at low framerates.


The guy raises a valid point. There is no need to shoot him down just because he is criticising. It's a good post and it is informative.

In an ideal world you would have all these things and an evenly spread hardware load, but that just isn't possible in some games due to their very scope.


In Crysis you are not aiming with your whole body, so the sway acts completely differently; the rendering and modeling of the view are done differently.

Also, the review seems to have been cooked up fast, without much sense put into it...

There are better analyses of the CPU and GPU performance of the A2 and OA engine around.

Missing are the system hardware and software specs, driver specs, game build and so on...

He isn't able to provide complete .cfg settings, command lines, or even experience of what does what (oh wait, it's detailed on the BIKI, for example, lol).

Does the author realize when the engine was made and released?

Back then OpenCL was on the drawing board and CUDA was still in its infancy.

Also, this whole CUDA stuff works only on NVIDIA cards, so 60+% of users are out of luck?

Arguing with PlayStation computing power while "analyzing" a PC game is beyond my logic too :)

Proof?

Btw, the weird thing is, I can play just fine even at 35 FPS in Crysis 1, while ArmA 2 feels sluggish even at 40.

What is it with people wanting 35+ fps? Anything above 23 fps and the mark-one eyeball can't tell the difference... perfectly playable, in other words.

Flame away.

What is it with people wanting 35+ fps? Anything above 23 fps and the mark-one eyeball can't tell the difference... perfectly playable, in other words.

Flame away.

Agree. If you're distracted by FPS then you're not properly engaged with the game. Only when FPS drops below a rather obviously stuttery level should you be concerned.

What is it with people wanting 35+ fps? Anything above 23 fps and the mark-one eyeball can't tell the difference... perfectly playable, in other words.

Flame away.

Well, shouldn't your brain work "slightly" differently than your computer? Yes? No?


I notice a significant difference in the displayed images when they go from <20, to 20, to 30, to 60. Above 60 I notice no difference. However, I am perfectly happy playing at 15 FPS or higher. Below 15 I find I just lose all control of my character.


@dale0404

Not entirely true. If you're watching something recorded with a camera (TV, cinema) it is true that 25 FPS is enough.

But there is one thing that makes it different from computer games: on a camera, the shutter remains open for some milliseconds. Anything that moves in that time creates a motion blur effect, which we are used to seeing and which our brains expect as a source of information about movement.

In computer games this shutter time is missing; by default there is no motion blur (although games, generally speaking, try to fake it), so our brain lacks the cues needed to perceive an ongoing movement instead of a series of still images.

The effect: even though the frame rate is one you would call sufficient, it somehow still feels sluggish.
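To put a rough number on that shutter time (our illustration, using the common 180-degree film shutter convention, not anything from the post above):

```python
def exposure_ms(fps, shutter_angle_deg=180.0):
    """Per-frame exposure time of a rotary-shutter film camera, in milliseconds."""
    return (shutter_angle_deg / 360.0) / fps * 1000.0

# A 24 fps film frame integrates ~20.8 ms of real motion into blur,
# while a rendered game frame is an instantaneous sample with no blur at all.
print(exposure_ms(24))        # ~20.8
print(exposure_ms(24, 0.0))   # 0.0 -- the "no shutter time" case of a game
```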


24 fps is NOT the max fps of your eyes. It's the fps at which something appears to be moving instead of being a slideshow. You can easily see the difference between 30 and 60 fps. That's why (24 fps) movies always keep the car semi-static while the landscape is blurry as hell, to mask the low fps.

Why they didn't kick movies to 60 fps with Blu-ray is beyond me. Now all TVs are interpolating screens and sacrifice sharpness for smoothness.

24 fps is NOT the max fps of your eyes. It's the fps at which something appears to be moving instead of being a slideshow. You can easily see the difference between 30 and 60 fps. That's why (24 fps) movies always keep the car semi-static while the landscape is blurry as hell, to mask the low fps.

Why they didn't kick movies to 60 fps with Blu-ray is beyond me. Now all TVs are interpolating screens and sacrifice sharpness for smoothness.

IMO these new interpolating techniques make movies look like they were shot on video :)

