UKFX

7850 recommended, I have a 6870. ARMA III on actual RELEASE?


Yeah, I adjust FOV in the configs too, as I prefer a 90deg FOV whenever possible, but I always change settings in-game first so menu items etc. appear correctly (not stretched or cut off). That 5/4 setting just jumped out at me.

Just want to add that ArmA 3 performance is much better than ArmA 2's for me on the same hardware, with the exception of hosting - the CPU hit for running AI is much higher in A3 vs A2 in my testing.

That said, over my old HD5830's lifespan the average FPS in A2 more or less doubled thanks to patches/optimisation from BIS.

I'm just worried, guys, that is all. I don't like buying things I really want to play only for them to release and disappoint. I know optimisation will come, but not everyone benefits; I just hope I'm one of the people who gain something from a 6870.

Anyway, I was playing on a mix of Standard/High/Very High, no AA, FXAA Standard, 1080p, and interestingly in many areas it ran smooth-ish. Minimum 30, maximum 55 FPS; I'd say it averaged about 42. Combat was doable. However, after a short while performance plummeted and, regardless of settings changes, things became really choppy. Even when I had reasonable frames, there were times the FPS would collapse to 0-5. Render distance was around 2200, object distance around 1300 I believe. Shadows at the default 100.

My processor peaked at 72% utilisation this time instead of 82%. There must be a very specific combination of settings to get the right balance, but the benefit doesn't seem to last for some reason. The fall back to dodgy frames was probably my 4GHz 2500K dropping back down to its usual utilisation of 52%, which it usually does when playing the game.

There is hope, but again, I hope I don't end up one of those folks who finds this so-called optimisation boosting the expensive cards or the really old cards while the one in the middle gets bugger all, or even worse performance.

Fingers crossed. Will be nice to see what the next update brings.

Oh no, wait - TOTALLY forgot. I know why my FPS improved: I changed GPUFramesRenderedAhead from 1000 (wtf?) to 3. Apparently setting both those renderahead options to 1 benefits some people.

Why did they set it to 1000? Perhaps it was making the CPU wait, hence it only using 52%, and when I changed the GPU renderahead to 3 it bumped (at least for a short while) to 72%?

I'll try 2 and 1 on both later. At the moment both renderahead settings are set to 3.
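For reference, the tweak above is an edit to Arma3.cfg (under Documents\Arma 3 on my install; your path may differ). A sketch of the relevant lines, using the setting names mentioned later in this thread - double-check them against your own file before editing:

```
// Arma3.cfg - the frames-ahead settings discussed above.
// 1000 appears to mean "defer to whatever the driver decides".
GPU_MaxFramesAhead=3;
GPU_DetectedFramesAhead=3;
```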


Just to note, those results were on CoOp, not Wasteland like before.

Also, running renderahead at 1 or 2 makes practically no difference, although CPU usage was 75%. Average 40 FPS on a mix of High and Standard, no AA. Occasional complete bog-down to about 2-5 FPS. The card was stable at 970 / 1120 (no voltage changes - I don't do voltage OCs).

Just have to wait for some updates to the Alpha so the i5-2500K can be used fully. Also, note that this time around I used the extra launch parameters in Steam, but they made no noticeable difference.

Will await updates, since tweaking doesn't work at all.



I wouldn’t spend anything on it yet. If only I could have listened to that advice myself…:rolleyes:

I bought an i7 with a 7970 for it, and I'm still going to put the proper game onto it when it's released. But looking at it now, I shouldn't have bothered for another year or two - there's still plenty of life in my A2 PC, and certainly in the card.

Still, live and learn. But no, don't spend until you feel you have to. ;)

Oh, on the A3 site it says a 7750 is recommended..

Edited by ChrisB


LOL WHAT? 7750? Well, I was getting 40-50 FPS on all Standard (Shadows High), FXAA Standard, with my card at 970 / 1100 (MSI 1GB 6870), and that was with my 4GHz i5-2500K at 62% utilisation (last time I played). I really would love to see what my machine can do when CPU utilisation is fixed.

I'm becoming hopeful for all-High settings. Assuming the game is optimised well for actual release and video drivers are good, I think I could get some solid frames. Early days yet, but yeah, all Standard is a bit too low for me - I tried it last game and things looked a bit bland. I'm not a graphics nut, but I'd like solid FPS in combat with a mix of medium/high.

Lol 7750 - If they suggest that, then... well, 6870 should do a fine job assuming they don't go "AHAHHAAHHA in your face you have an AMD card, we <3 Nvidia".


CPU utilisation probably won't change all that much, in all honesty. Games still (generally) have one or two main threads that handle the bulk of processing, with offshoot threads dealing with other things such as AI and physics. That generally means high use on one or two cores and lesser demand on the rest. The percentages you quote are your CPU overall, but that 70% could translate to 100% on core 1 with 60% on cores 2-4.
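To illustrate with made-up numbers (a toy sketch, not a real measurement): the "overall" figure most monitoring tools report is just the mean across cores, so one saturated core hides easily inside a modest-looking total:

```python
# Hypothetical per-core utilisation for a quad core:
# core 1 carries the main game thread, the rest pick up scraps.
cores = [100, 60, 60, 60]  # percent

# The "overall CPU" readout is simply the average across all cores.
overall = sum(cores) / len(cores)
print(overall)  # 70.0 - yet core 1 has no headroom left at all
```

So a 70% overall reading is perfectly compatible with the game already being bottlenecked on its main thread.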

Some good info can be read here:

http://www.techpowerup.com/forums/showthread.php?t=126091


Then in that case this game is poor. It should require a hex-core CPU at minimum: four cores to run everything and two for AI, properly optimised/threaded, and working.

And yes, I was aware that my overall CPU usage was 70% (not the first time), but none of the cores were maxed - see this from my earlier post:

CPU Total - 52.7%
Core 1 - 90.8%
Core 2 - 63.1%
Core 3 - 53.8%
Core 4 - 62.1%

It also doesn't explain why ancient dual cores perform better. The simple fact is, optimisation for quad cores is still laughably bad, and if it isn't sorted I doubt many people will be overclocking their quads to 5.2GHz just to make what the game treats as essentially dual-core chips powerful enough.

Alpha indeed. But if they don't optimise CPU usage at some point, they can refund me for purposely misleading me (yes, did anyone notice the recommended clock speeds and quad-core CPU in the recommended specs?).

I have patience. If ARMA 3 takes a year to come out, I expect that to be a year of serious optimisation, bug fixes and tweaking, nothing less. If this is supposed to be the next big thing in military simulation, the least they can do is make it a cut above ARMA 2.

Edited by UKFX


First of all, "CPU utilisation" isn't going to get "fixed".

Second, it's not good to have 100% CPU usage, contrary to what you all want to think. Want to see your entire system become unstable? If the game is using your entire CPU, Windows and other applications will not get any CPU time. Some headroom needs to be left un-utilised for the sake of stability.


Last week I had my CPU usage at 100% and every game was unplayable ;) (I had a rogue process...). As GossamerSolid says, it's not the right way.


Would the game actually run better on a dual-socket 2011 board with dual Xeons?

:)


Ok this is getting ridiculous...

Then please tell me why they put such conservative recommended requirements on their website, and then please tell me why a CPU-dependent game is inadequate on a reasonably decent processor clocked at 4GHz?

It's like all their information is contradictory, or randomly pulled out of a hat. And then I'm told that even though I meet the recommended requirements, my system is not good enough, and that this so-called (and probably unlikely) future optimisation won't really make any difference?

Now I see why so many blab on the net about ARMA having such a horrid engine.

---------- Post added at 13:06 ---------- Previous post was at 12:57 ----------


I figured 100% was okay because I doubt the I/O thread would get saturated.

It's the same way that if I run Prime95 at 100% usage, I can still browse the Internet and listen to music. But if the game uses two cores, then I'll only be content when I see those two cores at 100% first. So far it's just 90% and 60-ish.


What do you expect will happen if Arma3 starts using 100% CPU? It's the GPU that actually has to render the scene, so if your GPU is already running at 99% (as you indicated earlier) and rendering 50fps, then your framerate isn't going to get any better just because of the CPU. It is already rendering the scene as fast as it can.

The main problems with Arma3 performance are currently that multi-GPU setups aren't properly utilized and that GPU usage drops below 99/100% in many other situations, for example in MP. Other than that, if you're running around in the editor with GPU usage maxed out and getting 40fps, that's simply the framerate your GPU will deliver on those settings (barring any future improvements to the renderer, reductions in scene complexity due to LOD improvements etc.). More CPU usage won't do you any good at that point.
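A rough way to picture this (an illustrative model only, not how the engine actually schedules frames): with the CPU and GPU working as pipeline stages, frame rate is set by whichever stage takes longer per frame, so once the GPU is the slower stage, extra CPU speed buys nothing:

```python
def pipeline_fps(cpu_ms: float, gpu_ms: float) -> float:
    """Steady-state FPS of a two-stage frame pipeline:
    the slower stage dictates the frame interval."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(pipeline_fps(cpu_ms=15.0, gpu_ms=20.0))  # 50.0 (GPU-bound)
print(pipeline_fps(cpu_ms=10.0, gpu_ms=20.0))  # 50.0 - faster CPU, same FPS
```

Only when the CPU stage becomes the longer one (the MP/AI situations described above) does CPU work start to cost frames.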


Damn, sorry BI PR, I apologise, forgive me - but how do you know that? In a lot of games I play, OpenHardwareMonitor simply says 99% GPU used, but cranking up AA, AF or anything like that even further has no effect on FPS. And also, when I see some games use 99% (or at least claim to), the temps vary: particularly demanding games heat the card up significantly more, yet easy-to-run games - even at a reported 99% - leave the card nice and chilled, as if not stressed. Please explain to me how that works, then?

Also, a 7750 recommended on their site? Please explain that, and also please show me where it says: "ignore our spec sheet completely, we put it there for lulz; instead you must go buy a server processor, overclock it, and get two 7990s in CrossFire to be able to run the game." I don't expect miracles with an OC'd 6870, but I don't expect piss-poor performance at medium settings in future (after optimisation), which is what everyone says will most likely happen.

Can you go tell BI to change their system spec page? They'll probably listen to you (mindless fanboy tag under your name - nice sig btw)

How does ARMA 3 run on your i7, by the way, and your GTX560? I'm curious. The GTX560 Ti isn't exactly a dynamite card either. And the i7 makes no difference... unless ARMA 3 uses multi-threading?

---------- Post added at 14:34 ---------- Previous post was at 14:04 ----------

I just played for about 10 minutes - got flown across the map, fired at people, died, respawned, etc. - and this is what I got (everything on Medium except Shadows, which were on High):

ARMA3Usage.jpg

Edited by UKFX


I've found my sweet spot in the graphics settings that gives me nearly perfect performance most of the time, while also delivering pretty much the exact same graphic fidelity as full on Ultra.

The key thing is to keep Object Quality and Terrain Quality set to Standard - bumping these higher takes a massive toll on performance while not really giving you much in the way of improved graphics. Textures I would always advise running on Ultra, assuming you have 2GB of VRAM or more. Shadows: Standard for junky cards, Very High for better ones; don't bother with Ultra. Particle Quality I usually keep on Very High for those smoothed-out particle sprites that don't get cut off on objects. Cloud Quality is very much personal preference, but note that every step up from Low has an almost linear toll on performance; by the time you reach Ultra you will have put quite a bit of extra load on your graphics card. For those with top-end GPUs, just rock Ultra (if you like the volumetric clouds), as it won't hurt that badly - I can get away with Ultra clouds on my ancient 5870 and still get 60 FPS most of the time.

Oh, and Post Processing: DISABLED. You can get by with the Low setting, but IMHO the haze it adds just looks awful, and anything above Low creates a massive drop in performance for what amounts to practically nothing.

One last thing: view distances. These are extremely CPU-sensitive. If you have an overclocked Sandy Bridge/Ivy Bridge you can run around 3500 view distance and 2000 object distance and still maintain about a 50 FPS minimum almost all the time. Granted, this is on Stratis; we have no idea what to expect from Altis performance-wise.

Anyway I strongly recommend anyone who doesn't have an absolute top of the line GPU (7970/680/Titan) try using my settings and see how it works out for you. I think you'll be quite happy with the visual and performance results.
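If anyone prefers editing files directly, the view-distance part of the above maps onto two entries in your <name>.Arma3Profile (setting names as I understand them from my own profile; verify against your own file before changing anything):

```
// <name>.Arma3Profile - view distances matching the numbers above
viewDistance=3500;
preferredObjectViewDistance=2000;
```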

Edited by DaRkL3AD3R


Will give this a try later, thanks for posting, dude (and yes, I noticed PP having a huge effect in ARMA 2). I'll try it with PP off, run settings similar to yours, and see how it goes.

Peace :)

Edited by UKFX


I don't know if this is relevant to the conversation, but I had to RMA my GPU, so I was left with the integrated HD Graphics 4000 on my i5-3570K. I had to turn down my graphics, obviously, but with all the graphically intensive options reduced I was still running 55-60 FPS like I would with a discrete GPU. So IMO the processor is the most important aspect. The game reaches a certain level of performance based on your processor; then turning up graphical settings puts load on your GPU until it can't keep up with the processor. With my 680 on all Ultra, half the time I'm CPU-bottlenecked and drop to 30 FPS; the other half I'm GPU-bottlenecked when not much is going on - the 680 won't push more than ~59 FPS with 4xAA/16xAF on Ultra, but will pump out 120 FPS at lower settings. That description is all over the place; my apologies, too, for beating this dead horse.


That doesn't explain why no modern CPU on the market is apparently adequate for the job. 6-7-year-old dual cores are more appropriate for this game than an i5-2500K at 4GHz? That makes zero sense to me.

Out of curiosity, you two guys, what are your GPU_MaxFramesAhead and GPU_DetectedFramesAhead settings?


Both are set to 0 for me. I have no input lag at all unless my GPU is the bottleneck and pegged at 99% usage. I cap my framerate at 59 FPS using Afterburner, which gives me excellent, smooth gameplay; typically my GPU has some breathing room, so about 60-80% usage at ~60 FPS. It's dense forests with shadows that really nuke my graphics card, and that's when the stuttering and input lag come.

But yeah, just about anywhere else, even in Agia Marina, I can peg that 59 fps mark constantly. Especially when playing on a good dedicated server.
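For anyone curious what an external frame cap conceptually does, here's a sketch of the idea (a simple sleep-based limiter; Afterburner's actual mechanism may differ): each frame is padded out to the target interval, so the GPU idles briefly instead of racing ahead and queueing frames:

```python
import time

TARGET_FPS = 59
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.9 ms per frame

def cap_frame(frame_start: float) -> None:
    """Sleep off whatever remains of the frame budget, keeping the GPU
    below 100% usage rather than building up a queue of rendered frames."""
    elapsed = time.perf_counter() - frame_start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)

# Example: a frame that only takes ~5 ms still occupies the full budget.
start = time.perf_counter()
time.sleep(0.005)  # stand-in for actual rendering work
cap_frame(start)
elapsed_total = time.perf_counter() - start
print(elapsed_total >= FRAME_BUDGET)  # True
```

That idle time per frame is exactly the "breathing room" described above.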


I've tried 3, 2 and 1, but not 0. I'll give that a shot in a while and post back.


Similar to @DaRkL3AD3R.

As I showed in my video at the start of the thread (pg2), I run more or less the same settings, only with Object detail on Ultra. The only setting on that last tab I need to drop is Terrain = Standard; I can run it one step higher, but there's not really any point - it's not worth the performance drop. Everything else on that tab is at the highest it will go, and Object detail gives me no noticeable drop.

I turn off PIP - I think it's a waste of time; I'm just hoping 'Blakes Mirrors' gets ported over. Dynamic lights at Ultra, HDR at Standard, FXAA on Ultra, no PP (don't like it), AA at x4 now - it was x2, but as said in the vid, things got better, and x8 just isn't really needed for me. ATOC on trees only.

View distance was at 1500, but as I say in the video, performance has improved (for me anyway), so I've put it up to 2000 now and it runs fine (depending on AI amounts; the lowest I'd go is 1200 with lots of AI). I would, however, recommend 1500 VD for missions - it seems to give good performance with enough view distance.

As said in the vid, my HD5850 2GB is OC'ed to core 800 / memory 1200. That's reduced from my original core OC of 860, simply because the game ran at the same performance, so there's no need to overstretch the card more than necessary.

Haven't touched the game configs, so all those will be at BIS default settings.

All in all, everything has shifted upwards a little from first release; performance is solid, as good as A2.


Yeah, these custom settings make a world of difference to the game's performance. I do wonder, however, how much improvement I'd see going to a GTX 780 in a week, in terms of cranking things to Ultra. I monitor my GPU usage and frame rate constantly - it's almost getting to the point where I'm fed up and just want to turn it all off and enjoy the game, lol. But I am extremely OCD about maintaining that minimum 60 FPS mark, so I have to constantly test different scenarios and settings to get a fine-grained understanding of what my system is capable of and what the game demands. So far, from what I see, cranking Object Quality up really only hurts your CPU. And since high-end GPUs are being underworked due to CPU bottlenecking, I can't say I'd recommend setting it high.

It's like the GTA 4 PC port: people don't tweak their settings to find what's best for their PC. They just crank everything to the max and expect it to work at crazy frame rates. But that game, much like ARMA 3, doesn't constrain its settings to what will produce 60 frames per second on most hardware. Set view distance to max? Expect performance to tank hard. It's exactly the same thing here. Everyone has to be realistic with their graphics settings, and shouldn't assume that having the absolute top-of-the-line hardware means they can run the game at the developers' maximum settings.


Okay, it was meh. Looking at things at reasonably short range got me 40-50 FPS; distant stuff dropped it to around 26. Running a short distance, even while looking about 30° down from the horizontal, made the thing stutter and drop frames like mad. I'll have to test again another time, but so far it doesn't appear to have made any difference.

---------- Post added at 20:46 ---------- Previous post was at 20:45 ----------

GPU_MaxFramesAhead=0 is the same as 1000, i.e. it goes with whatever is set by the driver.

And by the way, setting GPU_DetectedFramesAhead in the config does nothing.

Thanks for letting me know BI P.R :)


What's the deal with you guys? I provide some simple, neutral information and you act like I'm attacking you. :D

Calm down yourself.

Edited by MadDogX

