What computing system do you have for the Arma 3 Beta?

Okay friends and countrymen - time for sproyd's benchmark post!! Deemed necessary by the perpetual and inevitable internet arguments with unproven claims. Of course, I haven't been part of these arguments so have nothing to prove or gain other than world peace.

To reiterate and for the benefit of future readers - my brand spanking new, mid-range Arma 3 build is as follows (important parts only):

[u]My Build[/u]
[b]CPU[/b]: Intel i5 4670K Haswell, overclocked to 4.2GHz stable. 
[i]I have received my new RAM now but don't have time to push the OC further[/i]
[b]GPU[/b]: Asus GTX 760 Direct CU II OC (running at OEM speed)
[b]RAM[/b]: 2x4GB Corsair Vengeance 1600MHz RAM
[b]SSD[/b]: Samsung 840 series
[b]MoBo[/b]: Gigabyte Z87N-WiFi

[u]My non-scientific fixed test conditions[/u]
Drivers: Latest [u]beta[/u] nVidia drivers - 326.41
Arma 3: Beta Dev build 0.77.109136
[i]I am using the dev build because it generally gets the optimisations before they are rolled out to stable, so it is the most current version of the game performance-wise[/i]
OS: Win 8 Pro 64bit
Monitoring Software: MSI Afterburner for FPS & GPU usage, HWiNFO64 for CPU Usage
Resolution: 1080p
View Distance: Fixed at 3,000 exactly with default corresponding object distance (1,736) and Shadow Draw (100)
Mission: Scenarios > Helicopter Showcase. I have picked this as there is a lot going on at the base, with lots of polygons on screen, so I think it is a good benchmark. The combined arms helicopter insertion runs rougher (for some reason?) but I'd rather have a higher FPS to benchmark variances against. I do not move for exactly 60 seconds and watch the high and low FPS.
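
For anyone who wants to reproduce this without eyeballing the overlay, here is a minimal sketch of how I'd summarise a per-second FPS log for the 60-second window. It assumes you've exported the readings to a plain CSV with an fps column - the file name and column are placeholders, since Afterburner's own hardware-monitoring log uses its own layout.

[code]
import csv
import statistics

def summarise_fps(path, column="fps"):
    """Summarise per-second FPS samples from a simple CSV export.

    Assumes a header row containing `column`; adapt the parsing if your
    logging tool (Afterburner, FRAPS, etc.) writes a different format.
    """
    samples = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                samples.append(float(row[column]))
            except (KeyError, ValueError):
                continue  # skip malformed or non-numeric rows
    if not samples:
        raise ValueError("no FPS samples found in " + path)
    return {
        "low": min(samples),
        "high": max(samples),
        "average": round(statistics.mean(samples), 1),
    }

if __name__ == "__main__":
    # hypothetical export of the 60-second Helicopter Showcase run
    print(summarise_fps("helicopter_showcase_run.csv"))
[/code]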

Test1 - Dream settings, GPU working hard

General Quality Settings: Maxed (all Ultra except particles high and HDR standard)

PP: No blur, default DOF & bloom, SSAO maxed, Caustics on, AA 8x, ATOC maxed, FXAA Ultra, AF Ultra

FPS: 43-50, mostly around 46

GPU Usage: 92-99%

CPU Usage: All 4 cores at c. 70% max

Test2 - Just dropped AA, negligible FPS improvement, GPU usage drops

General Quality Settings: Maxed (all Ultra except particles high and HDR standard)

PP: No blur, default DOF & bloom, SSAO maxed, Caustics on, AA 4x, ATOC maxed, FXAA Ultra, AF Ultra

FPS: 43-51, mostly around 46

GPU Usage: 85-99%

CPU Usage: Cores maxed at 61 to 93%

Test3 - drop quality settings by a notch, GPU usage decreases further, FPS goes up

General Quality Settings: Very High where applicable

PP: No blur, default DOF & bloom, SSAO maxed, Caustics on, AA 4x, ATOC maxed, FXAA Ultra, AF Ultra

FPS: 46-55

GPU Usage: 74-98%

CPU Usage: Cores maxed at 75 to 90%

Test4 - drop quality settings another notch, GPU usage about the same, FPS goes up, especially when GPU usage is high

General Quality Settings: High where applicable

PP: No blur, default DOF & bloom, SSAO maxed, Caustics on, AA 4x, ATOC maxed, FXAA Ultra, AF Ultra

FPS: 47-60

GPU Usage: 76-98%

CPU Usage: Cores maxed at 67 to 89%

Test5 - crap on all graphics settings to see what happens, FPS goes up but GPU usage is much less

General Quality Settings: Standard where applicable

PP: No blur, default DOF & bloom, SSAO standard, Caustics on, AA 4x, ATOC maxed, FXAA Ultra, AF Standard

FPS: 53-68

GPU Usage: 53 to 82%

CPU Usage: Cores maxed at 57 to 80%

Test6 - try to find optimal balance between performance and pretty - ideally with GPU working hard

General Quality Settings: Everything Maxed

PP: No blur, default DOF & bloom, SSAO high, Caustics on, AA 8x, ATOC maxed, FXAA Ultra, AF Ultra

FPS: 43-51

GPU Usage: 73 to 98%, averaging high 80s to low 90s

CPU Usage: Cores maxed at 78 to 90%

Half-baked Conclusions

It's hard to draw much of a conclusion, and the CPU usage, although fairly even across the cores, doesn't seem to make much sense. I think the key is having high GPU usage, as the game seems to be generally CPU bound, as most people are aware. My FPS always started rising as the GPU usage went up, which I guess happens when the CPU stops working so hard - although I never really saw 100% usage on any core, which is strange. I guess my rig is fairly well balanced between CPU and GPU - you would need a real beast of a CPU to become GPU limited on a decent card (even, say, a 770), as you can see I was pretty much maxing out graphics settings. I'll post back if I do more experimentation.

What we really need is a list of which settings are CPU bound and which are GPU bound so we can optimize better. I noticed AA didn't have a huge effect on performance but SSAO did...
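
One way to build that list would be to sweep a single setting at a time from outside the game and re-run the same 60-second scene after each change. A rough sketch of the idea is below - the profile path and key names (viewDistance, preferredObjectViewDistance) are taken from my own .Arma3Profile and may differ on your install, so check the file before pointing anything at it.

[code]
import re
from pathlib import Path

# Example path only - substitute your own profile name.
PROFILE = Path.home() / "Documents" / "Arma 3" / "sproyd.Arma3Profile"

def set_profile_value(profile: Path, key: str, value) -> None:
    """Rewrite a single `key=value;` line so the next launch picks it up.

    Assumes the key already exists in the profile file.
    """
    text = profile.read_text()
    new_text, count = re.subn(
        rf"^{re.escape(key)}=.*;$", f"{key}={value};", text, flags=re.MULTILINE
    )
    if count == 0:
        raise KeyError(f"{key} not found in {profile}")
    profile.write_text(new_text)

# Example sweep: hold everything else fixed and vary view distance only,
# re-running the Helicopter Showcase test after each change and noting
# FPS alongside GPU usage.
for vd in (1600, 2000, 3000, 3800):
    set_profile_value(PROFILE, "viewDistance", vd)
    # launch the game and run the 60-second test manually here
[/code]

If FPS moves while GPU usage stays flat, that setting is most likely leaning on the CPU; if GPU usage climbs with it, it's a GPU cost.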

Yay. 4670K and 760 with 4GB RAM is just what we need to see benchmarked.

However your methodology is strange.

The settings don’t look like default settings, right?

And anti-aliasing and ATOC are always on lol!

I would like to see everything at default settings, with view distance manually set to something like 1600/800 (is the standard default 1600/1200?), which is relevant for ground combat on Stratis, and maybe double that for something more high-end.

Even if default settings are pretty poorly optimized I think they’re a good place to start before you settle in somewhere in-between settings and start tweaking until you’ve got a balance you like.

Also, I would dispute anything you say about the graphics card being so strong that a stronger graphics card would not matter much, at least until I've seen a 4X70K with a 770 and 4GB RAM, which I'll hopefully be getting in a week ;)

Then we’ll see.

Also, you should have written down the average readings from Afterburner - they're right under the graphs. Averages are more important than min/max AFAIK.

Oh and by the way:

The CPU is working nicely at 93% and your graphics card at 100%... then you turn up your AA, which I guess is calculated by the graphics card, and the graphics card pretty much starts crying tears of blood, but the CPU actually slows way down to 70% usage? What does that even mean? Did it shift gears?

Yes, there is a lot more testing I can do - basically I always have AA and ATOC on because those are things I don't want to sacrifice, and they're why I spent £££ on my rig. Anyway - if you want to give me a list of settings to try, I'm more than happy to run the benchmarks.

In regards to the readings:

CPU Usage: this was peak over the 60 seconds

GPU Usage: I didn't want to give an average because that would include alt-tab time, menus, etc. However, I can comfortably tell you that I inspected the graphs and the averages are generally towards the upper end of the range - where you see a higher range, the average was higher too. FPS went up towards the end of the minute-long test, as I think CPU load was heavier at the start while all the AI get their waypoint instructions and have to pathfind etc.
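
For what it's worth, one way to get a fairer average without hand-inspecting the graphs is to drop the warm-up seconds and any obvious menu/alt-tab readings before averaging - something like the little helper below. The cut-offs are arbitrary assumptions on my part, not anything Afterburner does itself.

[code]
def trimmed_average(samples, warmup_seconds=10, floor=None):
    """Average per-second samples (FPS or GPU usage), skipping the first
    `warmup_seconds` (AI spin-up, loading) and, optionally, any reading
    below `floor` (a rough proxy for alt-tab or menu time)."""
    usable = [s for s in samples[warmup_seconds:]
              if floor is None or s >= floor]
    return sum(usable) / len(usable) if usable else float("nan")

# e.g. GPU usage with the first 10 s and any sub-40% readings dropped:
# trimmed_average(gpu_usage_samples, warmup_seconds=10, floor=40)
[/code]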

In regards to there being no point getting a stronger graphics card - I meant with my current CPU. A 4770K or a higher-clocked i5 will of course put more onus on the GPU. The Haswell silicon lottery is well known to produce huge variations between chips. Mine is either a bad chip, or it's just that I have a fairly crappy mITX motherboard that isn't letting me drive it. I am going to try for 4.3GHz, as every little 100MHz helps when it comes to CPU limitation. However, if you were to get a 4770K @ say 4.6GHz, I imagine a 770 or 780 would be a better balance if you were aiming for max settings at over 60FPS.

Aaah! Afterburner can't be started/paused with a hotkey? That's why Fraps is still leading I guess.

The Haswell lottery mostly only affects overclocking from what I hear, by the way, and a 4670K equals a 4770K in terms of gaming performance - ask anyone and they will tell you a 4X70K won't bottleneck any combination of cards.

I don't know the basis behind it and have never seen anyone elaborate on it, but on serious overclocking sites it's not like one or two people will say it -- everyone will.

Based on my own experience running a CPU three years older than my graphics card, and the fact that you're pairing one of the strongest CPUs currently available to consumers with the weakest card of this generation - which basically equals a year-old 660 Ti...

Well, anyway, I'm not sure about ARMA specifically... it could be a monster game that really only runs on the CPU.

But an i7-3820, which is probably 10% weaker than a 4670K/4770K, can handle tri-SLI Titans in games like Battlefield 3 and other graphics-intensive titles, though in Skyrim for example it didn't do much...

But that ARMA would be so CPU-intensive a 4670K couldn’t handle a 660 Ti... doubtful.

Anyway, I guess we could try coordinating some benchmarking soon, when I get my 4770K/770, to see if there's a lick of difference.

Maybe they'll add an official benchmark soon.

From my research I'll give a bit of clarity...

An i5 4670K is about the same as a 4770K... from what I read it sounds as though the i7s OC a bit better, but overall they aren't worth the extra money as the differences are within 2-3 frames.

As for CPU bottlenecks... that depends on the game. Planetside 2, Arma and Shogun 2 are always going to bottleneck CPUs. A 760 is just fine at 1080p because you're probably going to lose frames to the CPU anyway.

Games like Crysis 3, Tomb Raider, Max Payne and Metro: Last Light are not that CPU intensive, and SLI/CrossFire setups help there, as the GPU is bottlenecked by the visuals as opposed to the AI processing that Arma relies on.

So it depends on what you play really. Even a Phenom X4 isn't going to bottleneck SLI'd 780s, assuming the game you're playing isn't CPU intensive. But people recommend at least an i5, as games that are more graphically intensive seem to require more processing power anyway.

Yes, but wake-up call folks - we are on the BIS forums, more specifically the Arma 3 forums, so CPU bottlenecking is an unfortunate fact. Clock speed is of utmost importance, as is chip architecture, so an OC'd Haswell, or at least an OC'd Ivy Bridge, is really what will be driving your minimum FPS... not some SLI'd setup...

Which you say without having much evidence really.

I want to see numbers before I advise people to buy a 760 instead of any stronger graphics card if all they're going to do is play ARMA anyway.

Naturally everyone should get a 4670K or something similar, but the question is: if they have an extra $139 lying around, should they upgrade to a 770, or would a 760 4GB, extra RAM, a better or bigger SSD, a cooler chassis or a more efficient power supply be better value?

Or should they absolutely not spend it at all?
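
To put the $139 question into numbers: a quick cost-per-frame calculation is one way to frame it once you have before/after benchmarks. The FPS figures below are placeholders, not measurements - substitute your own results.

[code]
def cost_per_extra_fps(price_delta, fps_before, fps_after):
    """Dollars paid per additional average FPS; purely illustrative."""
    gain = fps_after - fps_before
    return price_delta / gain if gain > 0 else float("inf")

# Hypothetical example: $139 extra for a 770 that took a test scene
# from 46 to 54 average FPS would work out to roughly $17 per frame.
print(round(cost_per_extra_fps(139, 46, 54), 2))  # 17.38
[/code]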

As always, PCs and performance are damn hard to benchmark because there are so many variables - CPU, GPU, motherboard, RAM, HDD/SSD all play a role in boosting or limiting performance, and that's without factoring in drivers, OS, optimizations and in-game settings. It's impossible to know for sure.

Now now sneakson, you know as well as I do that if you want to get max performance from A3 you will need:

Intel Core i7-3970X $1000 :eek:

DDR3 3000 (for the RAM drive to house A3) $690 :eek:

and of course a Titan video card $1000

For a paltry $2700 you're rocking a solid 60FPS.... ;)

All kidding aside, BIS needs to up what they consider the "recommended" specs.

Hey, by the way, what do new roughly-$400 graphics cards and processors score in the Windows rating system? Say a GTX 770 and an i5/i7 or some such.

My system is currently 7.3 7.3 7.6 7.6 7.0 7.0.

I get 8.1 with my new i5 and 8.1 with my GTX 760. I've got SLI though, which it doesn't account for. Windows Experience Rating really doesn't tell you much, however.

I'm able to play the game fairly easily with almost all settings on ultra, especially in 1080p on 1 monitor. I can push it to 3 monitors and get pretty solid frames too. Don't have it in front of me to check right now and can't recall what I was getting. It runs better on 1 though.

Core i5-4670K @ 4.3GHz

16GB DDR3 Corsair Vengeance

Gigabyte GTX 760s OC'ed in SLI using latest nVidia beta drivers

Samsung 840 SSD with Windows 8 and Arma 3 installed on it

The point is, what about the FPS when you switch from day to night? Altis by night is really impressive, but the FPS at night is really bad.

Right now I have the following rig.

i5-2500K 3.3GHz @ 4.3GHz

8 GB RAM

GTX 570

Let's say that right now I have 50 FPS playing on Altis. How much better could it be if I buy a GTX 770? How many frames do you guys think I would gain with the same in-game settings?

Is it worth it? I also plan to buy an SSD but that's another chapter. :)

I find it interesting that the majority seem to have 8GB RAM and a quad-core CPU...

The survey is pretty much flawed/useless since important factors like 32/64-bit OS are missing, and basically any processor that's older than 2 years cannot be specified (instead of a quadrillion i-series processors, a more comprehensive listing would've been good).

Uhm, what settings do you use (in game)? I have the same CPU and RAM as your 2500K build, but with a Radeon HD8570 2GB, and I get less than 30fps.

I have the setup in my sig (below). Occasional stutter, esp when I switch back from the Map, but I can do 30 fps or better depending on settings.

Keep in mind that I overclocked my CPU a lot and that has a clear effect on frame rates.

My settings:

Resolution: 100%

Visibility

Overall: 2000

Object: 1200

Shadow: 100

Texture Quality: Very High

Object Quality: Standard

Terrain Quality: Very High

Cloud, Particle and Shadow Quality: High

Anisotropic Filtering: Ultra

AA: Off

PP: Low

FXAA: Ultra

Uhm, cool, thanks. I've now asked my GPU to give me its best and I can get quite close, though it's reaching 73°C.

Windows 8

1920x1200 (16:10)

MSI Z87-G45 Gaming

4770K @ Stock

MSI GTX 770 Gaming 2GB

16GB Vengeance Pro 1600 MHz RAM

Arma on 840 Evo SSD

Everything maxed except Object Quality on High and Visibility 2000/2000, at 50fps.

You should/must/have to overclock that CPU. It's a "K" model.

If you can upgrade your GPU to something similar to mine or better, that will clearly give you the results you want.

The game looks fine even at lower settings and the boost you can gain is very good.

Win 7 /64bit

i5 3570K OC @ 4.2GHz, Scythe Mugen cooler

MSI Z77A G43

8 GB Corsair

Sapphire Vapor X 7970 GHz 6GB

Samsung SSD

Running ARMA 3 on Ultra, sampling 150%, view distance ~5000m, object distance ~3000m gets me 35FPS when flying and 50FPS on the ground.

This topic is now closed to further replies.