zhrooms

Arma 3 Alpha - CPU Comparison - Helicopter Showcase


SLI puts significantly more work on your CPU, and unless you're running 2560x1600 with AA, there's no point in running SLI 680s. Try disabling or removing one card and check again.

[citation needed]

[citation needed]

http://lmgtfy.com/?q=sli+cpu+overhead

A Single-GTX680-SOC-2GB makes things even worse:

http://imageshack.us/a/img521/6016/arma32013051209314310.jpg

http://imageshack.us/a/img109/6569/arma32013051209315488.jpg

http://imageshack.us/a/img89/9231/arma32013051211205033.jpg

http://imageshack.us/a/img191/5580/arma32013051211205159.jpg

http://imageshack.us/a/img199/8782/arma32013051211205421.jpg

http://imageshack.us/a/img4/2562/arma32013051211205724.jpg

http://imageshack.us/a/img705/7404/arma32013051211231785.jpg

http://imageshack.us/a/img195/4765/arma32013051211231885.jpg

http://imageshack.us/a/img43/8837/arma32013051211231998.jpg

http://imageshack.us/a/img825/2707/arma32013051211232083.jpg

http://imageshack.us/a/img29/6198/arma32013051211232331.jpg

http://imageshack.us/a/img12/2645/arma32013051211233089.jpg

http://imageshack.us/a/img189/9120/arma32013051211233339.jpg

Besides this, my 2-way SLI is still not strong enough to drive a 1920x1080 120Hz screen when playing BF3 with Ultra settings. I would need 3-way SLI to hold a stable 120 fps, as the 2-way SLI only reaches ~90 fps on average. I didn't upgrade only for Arma 3.

:)

That is strange. I know BF3 scales really well with multi-GPU, but I get the feeling most of the GPU usage in that game comes from phony resource taxation. Meaning, going from Low shader quality to Ultra incurs a huge hit on GPU performance but doesn't really improve anything visually. Arma 3 so far is showing amazing GPU utilization for me, and I haven't even enabled AA yet. If I were to tack 2x or 4x AA on, I'm sure I'd be GPU-capped 24/7. I actually don't like doing so, however, as sitting at 99% GPU usage in most situations with a Radeon card creates horrible stuttering. For this reason I typically cap my FPS at 59 so my GPU usage stays in the 50-80% ballpark and I don't get any tearing or stuttering.
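The capping approach can be sketched as a simple frame-time limiter. This is a hypothetical illustration, not how any particular game or driver implements its cap; `render_frame` stands in for one frame's worth of work:

```python
import time

TARGET_FPS = 59
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds available per frame at the cap

def run_capped(render_frame, n_frames):
    """Render n_frames, sleeping away the unused part of each frame budget.

    The sleep is what keeps GPU usage below 100%: the card sits idle for
    part of every frame instead of racing ahead and stuttering.
    """
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        remaining = FRAME_BUDGET - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)
```

Real limiters typically busy-wait the last fraction of a millisecond instead of sleeping the whole remainder, since `sleep` can overshoot.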

Check this out Tonschuh:

That's on a single GTX 680 in Battlefield 3. He's getting insane frames per second, all Ultra, and that's while recording with FRAPS (or something else just as taxing).


I was curious about it, so I checked about 8 links; none explained or proved what you mentioned.

As for the difference in BF3, it changes considerably, especially for far-away objects, which get a higher LOD, as well as the lighting, shadow quality, and other subtle differences in textures, etc. You can perceive the difference here:

high: http://images.bit-tech.net/content_images/2011/11/battlefield-3-technical-analysis/mesh-high.jpg

ultra: http://images.bit-tech.net/content_images/2011/11/battlefield-3-technical-analysis/mesh-ultra.jpg

Here it's easier to perceive how much more blurred the shadows become as the range increases:

high: http://images.bit-tech.net/content_images/2011/11/battlefield-3-technical-analysis/preset-high.jpg

ultra: http://images.bit-tech.net/content_images/2011/11/battlefield-3-technical-analysis/preset-ultra.jpg

http://www.bit-tech.net/hardware/2011/11/10/battlefield-3-technical-analysis/1

And here you can see CrossFire scaling to almost 100% more performance on a 6990 (which has a slightly lower clock) compared to a 6970, throwing the CPU-overhead hypothesis out the window:

http://www.bit-tech.net/hardware/2011/11/10/battlefield-3-technical-analysis/4
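For what it's worth, the scaling figure from a review like that is easy to compute yourself; the numbers below are hypothetical stand-ins, not the actual bit-tech results:

```python
def scaling_efficiency(fps_single, fps_dual):
    """Fraction of the theoretical doubling a second GPU actually delivers.

    1.0 means the dual-GPU result is exactly twice the single-GPU result.
    """
    return (fps_dual - fps_single) / fps_single

# Hypothetical example: 60 FPS on one 6970 vs 115 FPS on a 6990
# comes out to roughly 92% scaling.
eff = scaling_efficiency(60, 115)
```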

Edited by white


Thanks for the link, but he is not playing with Ultra settings.

Just compare his Custom-Settings with my Ultra-Settings:

He (Custom-Settings):

http://img801.imageshack.us/img801/2138/customsettings.jpg

Me (Ultra-Settings):

http://img51.imageshack.us/img51/4373/bf32013051217270412.jpg

The fps also differ between maps and depend on how many players are on the map. My ~90 fps average is more toward the lower range of the frame rates I got on the test map (Seine), but the lower fps are more common on the map than the higher ones, so it makes little sense to just compute the average fps from the max and min frames, as that is not the reality when you play the game.
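That min/max point can be made concrete: when low frame rates dominate a run, the true average of a logged series sits well below the midpoint of min and max. A small illustration with made-up numbers (not my actual log):

```python
def midpoint_vs_mean(fps_log):
    """Compare the naive (min+max)/2 figure with the true mean of a log."""
    naive = (min(fps_log) + max(fps_log)) / 2
    true_mean = sum(fps_log) / len(fps_log)
    return naive, true_mean

# A run that mostly hovers near 80-90 fps with one brief 140 fps spike:
naive, mean = midpoint_vs_mean([78, 80, 82, 85, 88, 90, 92, 140])
# naive is 109.0, but the true mean is only about 91.9
```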

Yes, I could reduce some settings, but I play with adaptive VSync enabled anyway, get the fps I really need, and have no micro-stutters.

I assume that BF4 will demand even more from our systems, and therefore I upgraded from 1x GTX 670-OC-2GB to 2x GTX 680-SOC-2GB, even though the difference between 2x GTX 670-OC-2GB and 2x GTX 680-SOC-2GB is only around 3%.

Pics of my rig:

http://imageshack.us/a/img841/5547/wp20130423001i.jpg

http://imageshack.us/a/img14/8169/wp20130423002.jpg

:)

Edited by TONSCHUH


@zhrooms

I am very interested in total CPU usage and GPU usage in addition to your graphs. I use HWiNFO64 to log it. Maybe it would be a kind of self-explanation to prevent speculation overkill :)

@Tonschuh

Impressive rig.


I don't understand why I should even have to explain this; the year is 2013 and multi-GPU is not something new. Why do none of you know about poor SLI scaling resulting from low graphics load but high CPU load? I've done three trips down multi-GPU lane and every single time I saw the same results. Running 1920x1200 meant that most games saw 50%/50% GPU usage on both cards. It was only when I started to pump insane amounts of AA into the game that I saw proper scaling. In some games the scaling was so bad that I was getting worse frames per second than I did with a single GPU. There are thousands of posts on the net about this exact scenario. Whether you choose to ignore them or accept them is up to you. I have my experience with it and know what's what.

As for BF3, yeah, look at the difference from Low to Ultra. Not worth the massive toll it takes on your GPU usage. Sure, if you have the hardware to run it you may as well, otherwise that's wasted money, but in all honesty the only things you need to run on Ultra are Textures and Mesh quality. Everything else is a joke, and I bet the GPU companies paid them off to artificially inflate GPU usage on those settings to better sell video cards. Don't think it doesn't happen. *points to Crysis 2 tessellation patch + nvidia*


Very nice rig indeed. He doesn't use MSAA, but that's about the only performance-impacting setting he has different from you. The FOV and motion blur barely affect it. Then again, with SLI you shouldn't even feel the impact of 4x AA in that game.

My point with that link was that if he can FRAPS the game and get 100-120 fps on a single GPU, then you with SLI 680s should be getting 120+ fps with AA on. If you're not, then something's really up, because that's not normal for that setup.

I am very interested in total CPU usage and GPU usage in addition to your graphs. I use HWiNFO64 to log it. Maybe it would be a kind of self-explanation to prevent speculation overkill

It varies too much and too little; there is really no point beyond what I explained on the second page of this thread.

Anyway, to show what I mean by the GPU not being very important, I took some pictures demonstrating the actual impact of the graphics card and showing that you really do need two video cards to be able to max it out. DaRkL3AD3R really has no clue what he's talking about.

(DayZ Mod View Distance 1500/1060, Everything Highest except Post Processing Quality)

_____________________________________

One GPU in Town, GPU Settings Turned Off

http://i.imgur.com/dMSXoW0.jpg

One GPU in Town, GPU Settings Turned On

http://i.imgur.com/Wc3ooi0.jpg

Conclusion: No difference in FPS; the CPU is bottlenecking the GPU

Two GPU's in Town, GPU Settings Turned Off

http://i.imgur.com/Ws9Z5yX.jpg

Two GPU's in Town, GPU Settings Turned On

http://i.imgur.com/LdF9LIn.jpg

Conclusion: Adding another GPU makes no difference in FPS; the CPU is bottlenecking the GPUs

So what can we take from this? A used GTX 580 1.5GB that cost around €90 can literally max out the Arma 3 Alpha... or wait a moment, that's just the heavy-CPU town. What about everywhere else on the island?

_____________________________________

One GPU in Forest, GPU Settings Turned Off

http://i.imgur.com/zbvMUDp.jpg

One GPU in Forest, GPU Settings Turned On

http://i.imgur.com/EDStWYT.jpg

Conclusion: Huge difference in FPS; the GPU is not powerful enough to push 60 FPS, and SSAO isn't even turned on, which would bring it down even further

Two GPU's in Forest, GPU Settings Turned Off

http://i.imgur.com/zFqHkSH.jpg

Two GPU's in Forest, GPU Settings Turned On

http://i.imgur.com/sftbeS7.jpg

Conclusion: Double the GPUs, double the FPS; in other words, very good GPU scaling, though still not powerful enough to deliver the 109 FPS that the CPU allows

So what can we take from this? In situations where the CPU is not under heavy load, one GPU won't be able to max out the Arma 3 Alpha, so SLI/CF is very helpful in that regard. You do not lose any performance by enabling SLI or CF, you only gain, but only in situations where the CPU is not under heavy load, which is pretty much everywhere outside towns or heavy AI situations.
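The logic of those on/off tests can be written down explicitly. A minimal sketch; the 10% tolerance is my own assumption, not something taken from the screenshots:

```python
def diagnose(fps_gpu_settings_off, fps_gpu_settings_on, tolerance=0.10):
    """Infer the bottleneck from the GPU-settings on/off experiment.

    If enabling the GPU-heavy settings barely moves the frame rate, the
    GPU had headroom to spare and the CPU is the limit (the town case);
    if the frame rate drops clearly, the GPU is the limit (the forest).
    """
    drop = (fps_gpu_settings_off - fps_gpu_settings_on) / fps_gpu_settings_off
    return "CPU-bound" if drop <= tolerance else "GPU-bound"

# Illustrative numbers in the spirit of the screenshots:
town = diagnose(40, 40)      # no change -> "CPU-bound"
forest = diagnose(100, 55)   # large drop -> "GPU-bound"
```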

_____________________________________

So, the GPU is not very important at all. When I used one GPU in the forest, that was with every GPU setting turned up to highest except Post Processing Quality, because that setting doesn't necessarily make the game look better at all and puts a huge strain on the GPU for almost no visual effect. But if I had turned down the anti-aliasing effects and shadows, I should have been able to push 60 FPS in the forest with one GPU. So SLI/CF is only good when you really want to play on literally the highest settings (maxed-out AA and post-processing, or resolutions higher than 1080p).

But you can clearly see the CPU has a much bigger impact: buying an i5 3570k for €175 and a GTX Titan for €900 (total €1075) would not get you 60 FPS in towns, while you would actually get considerably more FPS in general with an i7 3930k for €450 and a GTX 660 2GB for €175 (total €625).

If I were to recommend a computer to a person who wants to max out Arma 3 Beta/Final, I would probably recommend a Haswell Intel Core i5 4670k (4 cores, 6MB cache), overclock that bad boy to 5.0GHz with 4x4GB 2400MHz memory, and maybe 2x GTX 660 2GB with aftermarket coolers for extra overclocking. That setup would easily average 80 FPS in Arma 3 and BF3 at highest settings, no doubt, for a much smaller cost than, for example, a 3930k and a GTX Titan, which would cost twice as much but only deliver 15-25% more FPS.

@JumpingHubert Is this as impressive?

7WGBwZn.jpg

Edited by zhrooms


Tack så mycket zhrooms!

Thanks zhrooms!

That's why I changed my CPU from a 4100 (3.6GHz) @ 4.3GHz to an 8350 (4.0GHz) and went to 5.0GHz

That's why I changed my CPU from a 4100 (3.6GHz) @ 4.3GHz to an 8350 (4.0GHz) and went to 5.0GHz

Yeah, as you can see in the chart in the original post, a 3570k at 3.8GHz got 35 FPS and overclocked to 4.8GHz it got 44 FPS. That's a 9 FPS increase (25.7%), which is a damn lot.

Speaking of which, Intel Haswell releases in less than a month, and early tests of the 4770k suggest that 6.0GHz is actually possible on water without much trouble at all. An early test got out a few days ago from Asia, in which they got the 4770k to 5.0GHz on air on the first try, 50x100 = 5000MHz at as low as only 1.1v. It will be very interesting to see more reviewers getting their hands on them.

So all in all it seems Haswell overclocks much better than Ivy Bridge (3570k/3770k), so I'm predicting a lot of people who play Arma will buy Haswell because of the overclocking potential. A 4770k at €275 and 5.5GHz will no doubt outperform a 3930k at €450 and 5.0GHz. If 3.8GHz gives 35 FPS and 4.8GHz gives 44 FPS from a 1GHz increase, you would assume that at 5.8GHz you would gain another 9 FPS, up to 53 FPS. Yes please!
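That 53 FPS figure is a straight line drawn through the two measured points, (3.8GHz, 35 FPS) and (4.8GHz, 44 FPS). It assumes the game stays fully CPU-bound and keeps scaling linearly with clock speed, which is a generous assumption:

```python
def extrapolate_fps(clock_ghz, p1=(3.8, 35.0), p2=(4.8, 44.0)):
    """Linearly extrapolate FPS from two measured (GHz, FPS) points."""
    (x1, y1), (x2, y2) = p1, p2
    slope = (y2 - y1) / (x2 - x1)  # 9 FPS per GHz for the 3570k data
    return y1 + slope * (clock_ghz - x1)

# extrapolate_fps(5.8) gives the 53 FPS mentioned above.
```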


WOW

1.1v!

Watercooling and one of those CPUs :), too bad it's nearly double the price of an 8350...

I'm personally curious about the Centurion from AMD. It's just a glorified 8350 that makes 5.0 on air, AFAIK.

There will be a hefty price tag on that one too.

Intel or AMD..? Decisions, decisions... I will have to change motherboard and CPU...

EDIT: Rumor has it that the Centurion will be US $795!

I don't understand why I should even have to explain this; the year is 2013 and multi-GPU is not something new.

I know it's not new; I had a 3dfx Voodoo 5500, from which nvidia took its SLI technology. (I had most of the other Voodoos as well.)

Why do none of you know about poor SLI scaling resulting from low graphics load but high CPU load?

GPU usage is only low when the CPU is the bottleneck for the game, even with one card. If your CPU doesn't let your first video card get close to 100%, of course SLI won't do much, if anything at all. But it's not SLI itself causing overhead that blocks the scaling; that's not what you see when you get almost 100% scaling from a second card with a good CPU in a properly coded game (aka not Arma). Usually scaling problems appear when drivers simply aren't optimized for a specific game, but scaling on both ATI and nvidia is amazing nowadays, and it doesn't take much for new drivers to properly support SLI in whatever is launched.

I've done three trips down multi-GPU lane and every single time I saw the same results. Running 1920x1200 meant that most games saw 50%/50% GPU usage on both cards. It was only when I started to pump insane amounts of AA into the game that I saw proper scaling. In some games the scaling was so bad that I was getting worse frames per second than I did with a single GPU. There are thousands of posts on the net about this exact scenario. Whether you choose to ignore them or accept them is up to you. I have my experience with it and know what's what.

Your single experience means nothing; it might be the exception, or simply something you did wrong, or your perception. That's why it's necessary to take several reviews into account and not trust just one source, which can be completely wrong. Reviews are as close as one can get to the scientific method here, though no reviewer goes all the way with it.

As for BF3, yeah, look at the difference from Low to Ultra. Not worth the massive toll it takes on your GPU usage.

I work in 3D archviz, dealing daily with things like texture quality and resolution, shadow quality and resolution, etc., and to me the difference from Low to Ultra is huge; not so much between High and Ultra, although there is some as well. But I agree that it doesn't look horrible like Arma on Low.

arma32013-05-0608-37-45-83_zpsf9c229b2.jpg

Sure, if you have the hardware to run it you may as well, otherwise that's wasted money, but in all honesty the only things you need to run on Ultra are Textures and Mesh quality. Everything else is a joke, and I bet the GPU companies paid them off to artificially inflate GPU usage on those settings to better sell video cards. Don't think it doesn't happen. *points to Crysis 2 tessellation patch + nvidia*

Wasted for blind people, I agree. Those people also think the game is beautiful and looks the same as on PC while running on consoles at 500x on very low with 30 fps or lower.

Edited by white


You're right when you say that BF3 doesn't look really bad on lower quality settings.

I had another test run in BF3 on different maps and on a 64/64-player server, and the maximum fps I got was 197, but I was unfortunately too slow to take the screenshot.

Server: Click

It doesn't really matter, as the following screenshots should show you the fluctuation with Ultra settings and VSync off:

http://imageshack.us/a/img221/4558/bf32013051318071333.jpg

http://imageshack.us/a/img713/4666/bf32013051318074847.jpg

http://imageshack.us/a/img577/6571/bf32013051318080350.jpg

http://imageshack.us/a/img545/2531/bf32013051318080746.jpg

http://imageshack.us/a/img842/7669/bf32013051318082184.jpg

http://imageshack.us/a/img268/6118/bf32013051318082630.jpg

Different map, same outcome:

http://imageshack.us/a/img5/9039/bf32013051318515864.jpg

http://imageshack.us/a/img825/9804/bf32013051318520514.jpg

http://imageshack.us/a/img809/4267/bf32013051318521825.jpg

http://imageshack.us/a/img163/4486/bf32013051318530945.jpg

http://imageshack.us/a/img13/5103/bf32013051318532420.jpg

http://imageshack.us/a/img547/5525/bf32013051318544687.jpg

http://imageshack.us/a/img11/4124/bf32013051318545227.jpg

http://imageshack.us/a/img90/9403/bf32013051318550883.jpg


http://imageshack.us/a/img827/2889/bf32013051318553431.jpg

http://imageshack.us/a/img706/9591/bf32013051318554100.jpg

With adaptive VSync on:

http://img28.imageshack.us/img28/3367/bf32013051318290053.jpg

I have a mate with a GTX 690 4GB and he can't drive a 120Hz screen with Ultra settings either, as he can't get a stable 120+ minimum fps.

bf3_1920_1200.gif

Source: Click

Plenty of games have trouble with SLI/CrossFire setups or scale badly.

If I were to recommend a computer to a person who wants to max out Arma 3 Beta/Final, I would probably recommend a Haswell Intel Core i5 4670k (4 cores, 6MB cache), overclock that bad boy to 5.0GHz with 4x4GB 2400MHz memory, and maybe 2x GTX 660 2GB with aftermarket coolers for extra overclocking. That setup would easily average 80 FPS in Arma 3 and BF3 at highest settings, no doubt, for a much smaller cost than, for example, a 3930k and a GTX Titan, which would cost twice as much but only deliver 15-25% more FPS.

A 4770k is only 5-10% faster than a 3770k and therefore not really worth the money or worth upgrading to.

Source: Click

wasted for blind people, i agree. those people also think that the game is beautifull and look the same as on pc while running on consoles with 500x on very low with 30fps or lower.

+1

Here is my old rig (2600k):

http://imageshack.us/a/img703/1006/img0320hs.jpg

http://imageshack.us/a/img338/3053/20120309185138.jpg

http://imageshack.us/a/img856/5762/20120309185158.jpg


http://imageshack.us/a/img825/4424/20120517221957.jpg

If someone is serious about a push-pull setup, have a look at the pictures above. That was my 2600k @ 5012MHz (!) on air (Prime95 stable => in-place large FFTs).

The setup is based on a Thermaltake Frio OCK with the stock fans removed and replaced by 2x ebm-papst 4112 NH4, combined with an NZXT Sentry Mesh fan controller.

nominaldata.jpg

Just have a look at the amazing throughput of 355 cubic meters per hour.
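To compare that against fan specs quoted in CFM, the unit conversion is simple (1 ft³ = 0.0283168 m³, so one CFM moves about 1.699 m³/h):

```python
M3H_PER_CFM = 0.0283168 * 60  # one cubic foot per minute, in m^3 per hour

def m3h_to_cfm(m3_per_hour):
    """Convert a fan's m^3/h airflow rating to cubic feet per minute."""
    return m3_per_hour / M3H_PER_CFM

# The 355 m^3/h quoted above is roughly 209 CFM.
cfm = m3h_to_cfm(355)
```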

I bought the fans from onlinecomponents.com. They're not cheap, but worth every cent.

PS: Sorry for the bad pictures. I couldn't find the data cable for my CyberShot that day and had to use my crappy Galaxy S2.

:)

Edited by TONSCHUH

Crappy Galaxy S2??!

You heathen!

Yeah, because it had no flash, the battery died after 2 weeks, and it was a bit slow as well. I now have a Lumia 920, which is much better.

:)

