ARMA 2 Review: A tale of wasted FLOPS

Never heard of it.

And why does the author (who I've never heard of either) bring up the "renaming to crysis.exe" trick? Wasn't that for enabling SLI/Crossfire support before NVIDIA/ATI added profiles for it?

It seems like he ran a few tests and browsed the forum a bit, just to bolster his own opinion about using the GPU's spare capacity to offload work from the CPU.

Which, by the way, is a good argument, but it could have been made more convincing than by pulling a few FPS tests from a single game.

I kind of agree with the review except he was wrong about people spending $2,500 on a rig.

The performance does need improving, and so does the AI. There have been lots of improvements, so I'm not fully criticizing BI, but the game can still leave a lot to be desired in some of the simplest things.

The only sluggishness I experience is when the weapon view is zoomed.

Maybe the sluggishness some people experience has something to do with mouse polling.

I tried playing without ATOC and I just can't. AA on very high with ATOC looks too good, and I normally average 25 FPS with an E8400 @ 3.6 GHz and a GTX 460. Zooming into nearby grass or the good old orange trees sometimes drops the FPS to 17, but like I said, I can live with that, since ARMA 2 has the most beautiful graphics on the market if you're able to play it maxed out. I don't get how the article complains that much when I don't have massive problems on a mid-range rig.

Agree. If you're distracted by FPS then you're not properly engaged with the game. Only when FPS drops below a rather obviously stuttery level should you be concerned.

Yeah.

If you get what ARMA 2 is about, you don't care about FPS... so much.

P.S.

The article author is basically right that his GPU's GFLOPS were wasted, but there are VR engine specifics at play.

And that's why people buy ARMA 2 and keep it, if they like it.

P.P.S.

He mistyped "Arma 2" as "Amra 2" at least once.

I run ARMA at 1920x1080 on a $700 rig, built over a year ago. Where this guy gets his information, I do not know.

This should be required reading for anyone who wants to debate how many FPS the eye can see.

As for the game, I'm at least able to play on "Very High" default settings, with 3D res bumped up to my monitor's res (1920x1080) and get consistently smooth gameplay.

You forgot that most of the GPU-heavy games out there are player-centric: everything that happens, happens around the player. In ARMA, stuff can happen multiple klicks away from the player, so it's entirely logical that the game needs more CPU power.

Quoting for truth.
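
The world-scale argument above is easy to picture in code. Here is a minimal, purely illustrative sketch (the structures and numbers are invented, not taken from the Real Virtuality engine) of why per-frame CPU cost tracks the total simulated unit count rather than what is on screen:

```
// Purely illustrative sketch (invented names and numbers, not engine code):
// in a player-centric shooter only units near the camera get fully simulated,
// while in an ARMA-style sandbox AI several kilometres away still needs its
// update every frame, so per-frame CPU cost scales with the total unit count,
// not with what is visible.
#include <cstdio>
#include <vector>
#include <cmath>

struct Unit { float x, y; };

// Hypothetical per-unit AI step; the trig is just a stand-in for pathfinding,
// target scanning and similar work.
static void updateAI(Unit &u, float dt) {
    u.x += std::cos(u.y) * dt;
    u.y += std::sin(u.x) * dt;
}

int main() {
    const int totalUnits = 2000;   // everything placed across the whole island
    const int nearPlayer = 50;     // roughly what a player-centric engine would touch
    std::vector<Unit> units(totalUnits, Unit{0.0f, 0.0f});

    const float dt = 1.0f / 30.0f; // one 30 Hz simulation frame
    for (Unit &u : units)          // ARMA-style: every unit, every frame
        updateAI(u, dt);

    std::printf("simulated %d units this frame; a player-centric game would touch ~%d\n",
                totalUnits, nearPlayer);
    return 0;
}
```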

Using the GPU for general processing is pointless when the game doesn't make full use of the CPU.

BI should focus on current CPU technology (64-bit, multithreading). From the comments I have read, it seems that people believe the game is CPU bound - did you miss where the reviewer pointed out he was using a top-of-the-line $1000 CPU?

The game isn't CPU bound - it's bound by the technology behind the engine, not the hardware.
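
On the multithreading point, the kind of change being asked for is, in spirit, spreading the per-frame simulation work across cores. A minimal, purely illustrative sketch of the toy unit-update loop from the earlier sketch split across hardware threads, assuming the per-unit work were actually independent (the real engine's AI almost certainly has dependencies this toy version ignores):

```
// Illustrative sketch only (not BI's code): the toy per-frame unit update,
// partitioned into chunks and run on one worker per hardware thread.
#include <cstdio>
#include <thread>
#include <vector>
#include <cmath>
#include <algorithm>

struct Unit { float x, y; };

static void updateAI(Unit &u, float dt) {
    u.x += std::cos(u.y) * dt;   // stand-in for per-unit AI work
    u.y += std::sin(u.x) * dt;
}

int main() {
    std::vector<Unit> units(2000, Unit{0.0f, 0.0f});
    const float dt = 1.0f / 30.0f;

    // Split the unit list into roughly equal chunks, one per hardware thread.
    unsigned threadCount = std::max(1u, std::thread::hardware_concurrency());
    size_t chunk = (units.size() + threadCount - 1) / threadCount;
    std::vector<std::thread> workers;

    for (unsigned t = 0; t < threadCount; ++t) {
        size_t begin = t * chunk;
        size_t end   = std::min(units.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back([&units, begin, end, dt] {
            for (size_t i = begin; i < end; ++i)
                updateAI(units[i], dt);   // units are independent here, so no locking
        });
    }
    for (std::thread &w : workers) w.join();

    std::printf("updated %zu units on %u threads\n", units.size(), threadCount);
    return 0;
}
```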

Using the GPU for general processing is pointless when the game doesn't make full use of the CPU...

The point is, GPUs can't do time-critical stuff quickly enough.

Not yet.

Huge latency.

And the slow, high-latency bus (PCIe in this case) doesn't help much either.

So, in short, the BIS developers simply had virtually no choice.
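
The latency point is easy to demonstrate. Here is a minimal sketch using the standard CUDA runtime API (illustrative only, not engine code; the kernel and sizes are made up) that round-trips a tiny per-frame-sized job over PCIe and compares it with simply doing the same work on the CPU:

```
// Illustrative sketch: for a small, latency-sensitive per-frame job, the PCIe
// copies and kernel-launch overhead dominate, which is the "huge latency" point.
#include <cstdio>
#include <chrono>
#include <cuda_runtime.h>

__global__ void addOne(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] += 1.0f;
}

int main() {
    const int n = 1024;              // tiny per-frame workload
    float host[n];
    for (int i = 0; i < n; ++i) host[i] = float(i);

    float *dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));

    auto t0 = std::chrono::high_resolution_clock::now();
    // The round trip the GPU would need every frame: upload, launch, download.
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
    addOne<<<(n + 255) / 256, 256>>>(dev, n);
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost); // blocks until done
    auto t1 = std::chrono::high_resolution_clock::now();

    // The same tiny job on the CPU, with no bus transfers involved.
    for (int i = 0; i < n; ++i) host[i] += 1.0f;
    auto t2 = std::chrono::high_resolution_clock::now();

    std::printf("GPU round trip: %lld us, CPU loop: %lld us\n",
        (long long)std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count(),
        (long long)std::chrono::duration_cast<std::chrono::microseconds>(t2 - t1).count());

    cudaFree(dev);
    return 0;
}
```

On typical hardware the GPU round trip is dominated by launch and transfer overhead rather than by the arithmetic itself, which is why small, time-critical per-frame logic tends to stay on the CPU.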

Also, this whole CUDA stuff works only on NVIDIA cards, so 60+% of users are out of luck?

The latest Steam Hardware Stats [March 2011] beg to differ... 59.11% use NVIDIA cards (32.98% ATI & 6.22% Intel) :)

Source: Steam Hardware Survey [Valve Software]

So making some of the more CPU-intensive parts run via CUDA isn't a bad idea!

The latest Steam Hardware Stats [March 2011] beg to differ...

Explain that to the 32.98%, would you?
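
That objection is the practical catch: a CUDA path could only ever be optional, detected at runtime, with a CPU fallback for the ATI and Intel share. A minimal sketch of such a check, using the standard CUDA runtime API (illustrative only):

```
// Illustrative sketch: any CUDA acceleration would have to be an optional,
// runtime-detected path, with a CPU fallback for non-NVIDIA hardware.
#include <cstdio>
#include <cuda_runtime.h>

static bool cudaIsUsable() {
    int deviceCount = 0;
    cudaError_t err = cudaGetDeviceCount(&deviceCount);
    // Returns an error (or a zero count) on machines without an NVIDIA driver.
    return err == cudaSuccess && deviceCount > 0;
}

int main() {
    if (cudaIsUsable()) {
        std::printf("CUDA device found: the GPU path could be used.\n");
    } else {
        std::printf("No CUDA device: fall back to the CPU path.\n");
    }
    return 0;
}
```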

The latest Steam Hardware Stats [March 2011] beg to differ...

And every one of that 60% owns a modern NVIDIA card with CUDA cores, am I right, son?

The latest Steam Hardware Stats [March 2011] beg to differ...

And Steam somehow collects stats on every PC ever sold on Earth???

The latest Steam Hardware Stats [March 2011] beg to differ...

The Steam survey covers only part of our users.

Also, the Steam survey is done on 200-500k users out of their 20+ million...

So while it may be statistically sufficient, it's not precise enough to apply worldwide...

AMD and NVIDIA are still working on delivering fully multithreaded drivers for Windows 7...

NVIDIA has its first victory there with Civ5.

AMD plans to bring this to all DX11 titles, and universally beyond that (rumours say DX10, OpenGL and maybe DX9).
