pluke the 2

Will overclocking the Titan X with Precision X produce more frames for Arma 3?


Yes, you're increasing the speed at which the GPU can do its work, and with it how much work can get done per frame, up to the point where you hit the CPU limit.

I think I somehow implied that overclocking increases your GPU usage? That's not what I meant at all; pretty much the opposite. Sorry for the confusion. Maybe the 99% was also confusing, because that's full usage when you monitor with MSI Afterburner. I should perhaps have written 100% in the first place.

If you're above your target fps when you have full usage, then naturally there's no point in overclocking. But if you're ever below your target fps with full GPU usage and you don't want to lower graphics settings, then it's time for some overclocking. It helps up to the CPU limit point, which is where GPU usage actually starts to drop.

Again, still wrong.

If I have one render task per frame that takes 5 ms to compute at, say, 900 MHz but only uses 10% of the GPU, and I increase the clock to 1000 MHz so it takes 4.5 ms, then I've increased FPS by roughly 10%. The point being, usage and performance are unrelated, regardless of the math. Usage simply means that some share of the GPU's cores are in use; it has nothing to do with speed or performance. It doesn't matter whether I'm at my target FPS or not: overclocking makes a difference as far as GPU tasks are concerned, and usage is irrelevant.
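
To put numbers on that (a minimal sketch in Python; it assumes render time scales perfectly inversely with clock speed and that this one task is the whole frame, which is the best case):

```python
# All numbers are the ones from the example above, not measurements.
base_clock_mhz = 900
oc_clock_mhz = 1000
task_ms = 5.0  # render time per frame at the base clock

oc_task_ms = task_ms * base_clock_mhz / oc_clock_mhz  # -> 4.5 ms
fps_before = 1000.0 / task_ms      # 200 fps, if this task were the whole frame
fps_after = 1000.0 / oc_task_ms    # ~222 fps
print(f"{fps_before:.0f} -> {fps_after:.0f} fps, "
      f"+{(fps_after / fps_before - 1) * 100:.0f}%")  # +11%
```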

The reason it doesn't in Arma is that other threads operating within the same frame time stall things and make that frame take longer to render. In fact, overclocking your GPU does have an effect on the rendering thread's performance; you're just completely limited by the engine. Not even some "CPU limit point", but literally the engine itself.

Anyway, there's no correlation between GPU usage and some CPU limit; increasing GPU speed simply increases how fast the GPU can calculate. It doesn't decrease GPU usage either.


Arma doesn't run its CPU activities in parallel with the GPU ones; they are serial. Once the commands are passed off to the GPU, it renders, and the game continues with its CPU processing. As far as I know, overclocking the GPU does not improve the CPU time that talking to the GPU takes, and since the GPU is always underutilised it will finish the work faster (less latency); but because the GPU then sits idle waiting on the CPU to produce the next frame, the frame rate will not increase.
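
To sketch the two scheduling pictures in this thread in code (a toy model; the milliseconds are invented for illustration, not profiler data): when the CPU part of the frame dominates, shrinking the GPU part barely moves the total.

```python
def fps_serial(cpu_ms: float, gpu_ms: float) -> float:
    """CPU work and GPU rendering happen back-to-back within one frame."""
    return 1000.0 / (cpu_ms + gpu_ms)

def fps_pipelined(cpu_ms: float, gpu_ms: float) -> float:
    """CPU prepares frame N+1 while the GPU draws frame N; the slower stage wins."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms, gpu_ms = 20.0, 4.0  # a CPU-heavy frame (invented numbers)
for oc in (1.0, 1.1):       # stock clock vs. a +10% GPU overclock
    g = gpu_ms / oc
    print(f"+{(oc - 1) * 100:.0f}% GPU clock: "
          f"serial {fps_serial(cpu_ms, g):.1f} fps, "
          f"pipelined {fps_pipelined(cpu_ms, g):.1f} fps")
```

Either way the overclock buys at most a fraction of a frame per second here, because the GPU slice is a small part of the frame.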


BrightCandle already put it very briefly, but I'll still post this :p

If I have one render task per frame that takes 5 ms to compute at, say, 900 MHz but only uses 10% of the GPU, and I increase the clock to 1000 MHz so it takes 4.5 ms, then I've increased FPS by roughly 10%. It doesn't matter whether I'm at my target FPS or not: overclocking makes a difference as far as GPU tasks are concerned, and usage is irrelevant.

You're right in your situation, but it doesn't really have anything to do with what I'm saying.

When the GPU isn't fully used, you're in the CPU-limited situation in Arma; there GPU usage actually drops and fps stays the same, because the CPU limits the fps. That's what I've been trying to explain. In your situation GPU usage would drop by around 9% because of the overclock and fps would stay the same. It's not, as you said, that GPU usage stays the same and fps increases; that only happens when GPU usage is at full. Just monitor different situations with MSI Afterburner and you'll see.

The reason it doesn't in Arma is that other threads operating within the same frame time stall things and make that frame take longer to render. In fact, overclocking your GPU does have an effect on the rendering thread's performance; you're just completely limited by the engine. Not even some "CPU limit point", but literally the engine itself.

I don't know if GPU usage really even means how many GPU cores are in use. If it does, then when usage isn't at full it just means some cores that aren't needed are idling.

Well, in the end pretty much everything goes through that single CPU core (roughly speaking), so when that's the bottleneck the GPU can't work at full capacity, because it has to wait longer for the CPU. The rendering actually goes through the CPU, you know.

Yeah, I know it's really an engine limitation in the end. But it's easier to talk about the CPU limit, because that's really the only place we can overclock and have some effect. And everyone has heard that Arma is CPU-heavy and that the CPU bottlenecks it.

Anyway, there's no correlation between GPU usage and some CPU limit; increasing GPU speed simply increases how fast the GPU can calculate. It doesn't decrease GPU usage either.

Yes, increasing GPU speed simply increases how fast the GPU can calculate, but usage drops in CPU-limited situations because everything goes through the CPU. So the GPU doesn't do any extra, unneeded work; it just does what it needs to.

Maybe you haven't monitored enough, but when you're CPU-bottlenecked, overclocking does decrease GPU usage, because the GPU needs to "work less" when its clock is faster. It can't push through the CPU limit, because everything goes through the CPU, so GPU usage drops. I just tested, and GPU usage dropped 4% when I overclocked. Naturally, at full GPU usage the fps would increase and usage would stay the same, up to the point where the CPU limit is reached; then what I described previously happens. Exactly the same happens if I cap the game with vsync, because there's no reason for the GPU to work at full tilt when working less is enough.

Say A can make a box in one minute, while B takes two minutes per box. There's only room for 30 boxes, and both work a one-hour day. It takes A only half the effort to do the same job as B, so A works at 50% load while B works at 100% load.

That's the same thing that happens in Arma when you're hitting the engine's CPU limit and you start overclocking your GPU. Or if you have, say, one Titan at 80% GPU usage and you buy a second one: each would then work at ~40% load (I know it's likely not that linear).
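
The analogy in code (a sketch of the busy-time reading of "usage"; all figures invented): when fps is pinned by the CPU, a faster GPU finishes each frame sooner and idles longer, so reported usage falls while fps stays put.

```python
cpu_frame_ms = 25.0  # CPU-limited frame time, i.e. fps capped at 40 (invented)
gpu_work_ms = 20.0   # GPU render time per frame at the stock clock (invented)

for oc in (1.00, 1.10, 1.25):  # stock, +10%, +25% GPU clock
    busy_ms = gpu_work_ms / oc
    usage = 100.0 * busy_ms / cpu_frame_ms  # share of the frame the GPU is busy
    print(f"+{(oc - 1) * 100:.0f}% clock: "
          f"fps {1000.0 / cpu_frame_ms:.0f}, GPU usage {usage:.0f}%")
```

This prints 40 fps at every clock, with usage falling from 80% to 73% to 64%.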

...overclocking does decrease GPU usage, because the GPU needs to "work less" when its clock is faster.

Not true, and not related.

Not true, and not related.

How about reading and quoting the most relevant part, the one you left out of the quote?

but when you're CPU-bottlenecked, overclocking does decrease GPU usage, because the GPU needs to "work less"


..., increasing GPU speed simply increases how fast the GPU can calculate. It doesn't decrease GPU usage either.

It's simple to falsify or verify this hypothesis. Why don't you test your assumption? Just double-click arma3.exe, overclock your GPU, and watch what happens to the GPU usage. Ten minutes ago I did exactly that, and:

you are simply wrong.

@St.Jimmy

It's useless to discuss with people who are unwilling or unable to verify their assumptions, and/or who ignore your proven in-game demonstrations.

Edited by JumpingHubert

It's simple to falsify or verify this hypothesis. Why don't you test your assumption? Just double-click arma3.exe, overclock your GPU, and watch what happens to the GPU usage. Ten minutes ago I did exactly that, and:

you are simply wrong.

@St.Jimmy

It's useless to discuss with people who are unwilling or unable to verify their assumptions, and/or who ignore your proven in-game demonstrations.

I can easily prove it using a utility such as FurMark or 3DMark. No matter the GPU speed, either will run at 99-100% GPU utilization. Besides, I'm not the only one who said that "explanation" is wrong.

Also, your thinly veiled insults generally say more about you than they do about me.


Yes, perhaps in 3DMark, but we are talking Arma 3 here (or any Arma product!)... common sense has no bearing here on how a superb graphics card performs.


Windies, you double-clicked the wrong exe... it's not easy with you... and as Kremator said, the Arma series has its own logic.

How about reading and quoting the most relevant part, the one you left out of the quote?

It does not change anything; your statement is not correct. Clocks are not related to GPU load.

If you increase clocks, theoretically the GPU will work faster, but that does not affect GPU usage in any way.

Also, we are talking about overclocking through software. In that situation only the highest clock stage is overclocked while the others remain unchanged, which means the overclock can only give a benefit when the clocks are working at full speed.

If we want to make a decent overclock, we need to extract the GPU BIOS, look at the clocks of all the different stages and overclock them accordingly, and then flash the BIOS back onto the GPU. That we can call a GPU overclock, and it still has no relation to the GPU workload.

In response to the OP: GPU overclocking through software is not something I am going to do, since most of the time it gives nothing besides extra heat; all my GPUs are overclocked through a modded BIOS.


@Bratwurste

As an overclocking fanatic I am interested in your statement about software vs. BIOS overclocking. Do you have a link that illustrates and proves it?

The rest is still wrong, if I understand you correctly. If you run into the CPU limit in Arma 3, overclocking the GPU (via software or BIOS) or buying a faster card will result in the same fps and lower GPU usage.

I bought a faster card (went from a GTX 570 to an R9 290) and in CPU-limited scenarios I got the same fps but lower GPU usage.

Edited by JumpingHubert

Yes, perhaps in 3DMark, but we are talking Arma 3 here (or any Arma product!)... common sense has no bearing here on how a superb graphics card performs.

The workload is still the same; it's still rendering. The only difference is the simulation, sound, AI etc. in Arma being mitigating factors for performance. Case in point: as far as overclocking the GPU is concerned, it doesn't matter. Whether you get better performance in Arma is irrelevant to whether GPU clock speed has any correlation with GPU workload or usage, which was the argument. Overclocking is still beneficial in Arma anyway; you just don't see much of an improvement, because there's very little rendering overhead compared to the massive simulation and core-thread overhead.

To the thread topic: yes, it will produce more frames. Will it produce a lot more? Probably not. That has nothing to do with GPU workload; it's because the entire thread is stalled by things like simulation, scripting and AI taking up the bulk of the frame time. You're still increasing the speed at which the GPU can chew through the rendering workload, but if that's only 0.5-1 ms out of a 16-24 ms frame time, it's not going to make a big difference.
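
The arithmetic behind that last sentence (a sketch using the figures quoted above):

```python
frame_ms = 20.0   # mid-range of the 16-24 ms frame times above
render_ms = 1.0   # the GPU-render slice of it (upper end of the 0.5-1 ms estimate)
oc = 1.10         # a 10% GPU overclock

# Only the render slice shrinks with the clock; the rest of the frame is untouched.
new_frame_ms = frame_ms - render_ms + render_ms / oc
print(f"{1000 / frame_ms:.2f} -> {1000 / new_frame_ms:.2f} fps")  # 50.00 -> 50.23
```

About a quarter of a frame per second, which is exactly the kind of gain nobody would notice.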

---------- Post added at 17:17 ---------- Previous post was at 17:15 ----------

Windies, you double-clicked the wrong exe... it's not easy with you... and as Kremator said, the Arma series has its own logic.

At this point you have your own logic and it's extremely flawed.

Really, Arma isn't CPU-limited or GPU-limited; it's strictly engine-limited at this point.


I may have gravely misunderstood everything in this thread, but are you saying that the CPU sits idle while the GPU renders the next frame? In other words, the faster the GPU is, the sooner the CPU can get back to processing?

As opposed to: the CPU prepares the data for the GPU, feeds it to the GPU, and then instantly goes back to processing. So as long as the GPU gets its job done before the next feeding time, the CPU never has to wait for anything; i.e. the CPU alone bottlenecks the performance, and a faster GPU would give zero increase in performance. (Unless the memory is faster, so the "feeding" is faster for the CPU. Maybe.)

I'm pretty far from being an expert on these things, so I'm sorry if I'm way off.

The workload is still the same; it's still rendering. The only difference is the simulation, sound, AI etc. in Arma being mitigating factors for performance. Case in point: as far as overclocking the GPU is concerned, it doesn't matter. Whether you get better performance in Arma is irrelevant to whether GPU clock speed has any correlation with GPU workload or usage, which was the argument. Overclocking is still beneficial in Arma anyway; you just don't see much of an improvement, because there's very little rendering overhead compared to the massive simulation and core-thread overhead.

Yes, like I said some pages ago, I don't know if you can even get a 0.1 fps increase when your GPU usage is low and you overclock, but I said it's really not worth it anymore.

With full GPU usage you can get more fps by overclocking, up to the point where you hit the simulation, AI etc. bottleneck. Once that's reached, usage will just drop as you overclock more (a usage drop doesn't mean the GPU is doing less) and fps stays pretty much the same (there could be some VERY MINOR increase).

That's the reason I said that if you ever see full usage and you're not happy with the fps at those points, overclocking can be worth it. Otherwise not.

Yes, Arma is actually pretty much engine-limited, but we users can always compensate with the best hardware and the highest clocks. That's why I talk about GPU and CPU "bottlenecks".

And I'll say it again: GPU usage =/= GPU work. Usage can go down when you overclock once the limits are reached, but the GPU actually does the same amount of work or more.

I may have gravely misunderstood everything in this thread, but are you saying that the CPU sits idle while the GPU renders the next frame? In other words, the faster the GPU is, the sooner the CPU can get back to processing?

As opposed to: the CPU prepares the data for the GPU, feeds it to the GPU, and then instantly goes back to processing. So as long as the GPU gets its job done before the next feeding time, the CPU never has to wait for anything; i.e. the CPU alone bottlenecks the performance, and a faster GPU would give zero increase in performance. (Unless the memory is faster, so the "feeding" is faster for the CPU. Maybe.)

I'm pretty far from being an expert on these things, so I'm sorry if I'm way off.

The GPU is busy rendering the frame in parallel with the CPU. There is quite a sizeable period where the CPU is talking to the GPU (rendering), and while most of this is Arma code, there is some time in DirectX and some time in the drivers. Presumably a small amount of this time must be talking to the GPU, and maybe if it's overclocked that is quicker, but in my actual practical experience it makes literally 0 fps difference; overclocking didn't get me one measly fps.


So, option B it is then. That's what I thought. Thank you.


Really, Arma isn't CPU-limited or GPU-limited; it's strictly engine-limited at this point.

I like your logic. Really funny :o

Yes, like I said some pages ago, I don't know if you can even get a 0.1 fps increase when your GPU usage is low and you overclock, but I said it's really not worth it anymore.

With full GPU usage you can get more fps by overclocking, up to the point where you hit the simulation, AI etc. bottleneck. Once that's reached, usage will just drop as you overclock more (a usage drop doesn't mean the GPU is doing less) and fps stays pretty much the same (there could be some VERY MINOR increase).

That's the reason I said that if you ever see full usage and you're not happy with the fps at those points, overclocking can be worth it. Otherwise not.

Yes, Arma is actually pretty much engine-limited, but we users can always compensate with the best hardware and the highest clocks. That's why I talk about GPU and CPU "bottlenecks".

And I'll say it again: GPU usage =/= GPU work. Usage can go down when you overclock once the limits are reached, but the GPU actually does the same amount of work or more.

I agree with it not really being worth it for Arma. There are gains to be had from overclocking even in Arma, under certain circumstances where the engine isn't thread-locked to hell. You just generally won't notice it 95% of the time.

GPU usage doesn't equal anything but GPU usage. It's not a performance metric; it's not even really a metric of how hard the GPU is working. It's simply how many GPU cores, or shaders IIRC, are active. It's not even like CPU usage, where you have one unit doing the processing and the metric shows how busy that unit was during a polling interval. If rendering requires 600 shaders and you have 1000 shaders, then you're only going to use 60% of your GPU no matter the speed. How fast that work is done, how fast that frame is rendered as far as the GPU is concerned, depends on how fast those shaders can calculate, which is what clock speed is.
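
That shader-count picture of the metric, sketched (note this is the model described above; whether vendors actually compute "usage" this way, rather than as a busy-time percentage, is a separate question):

```python
total_shaders = 1000     # the example GPU above
required_shaders = 600   # shaders the rendering workload occupies

for clock_mhz in (900, 1000, 1100):
    # In this model the clock only changes how fast the work finishes,
    # never how many shaders it occupies:
    usage = 100.0 * required_shaders / total_shaders
    print(f"{clock_mhz} MHz -> usage {usage:.0f}%")  # 60% at every clock
```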

Usage doesn't change when you overclock; you're not suddenly creating more shaders on your GPU. Usage doesn't drop when you overclock either. This whole idea that overclocking is a waste below 99% usage, or that usage has anything to do with performance, is more or less BS. There's no correlation, none.

This engine outgrew the best hardware the second parallel processing became the norm and CPUs started having more than one core. That's just a fact. Having the best hardware means very little to Arma's performance anymore. A G3258 will run the game about as well as an i7-5960X, within a very comparable margin of error. Considering how "CPU-intensive" Arma is supposed to be, that's just plain sad.

---------- Post added at 03:15 ---------- Previous post was at 03:11 ----------

I like your logic. Really funny :o

Probably because you can't comprehend it. It's OK though, keep laughing. Ignorance is bliss, they say.


Usage just means how many of those shaders are used. Overclocking can decrease it, because the GPU can't push any more through the CPU and can do the same work with fewer shaders in the same time. Actually, I haven't seen usage rise when you OC the GPU, just like you said; it can rise when you OC the CPU, because then the CPU can demand more from the GPU and push it closer to full usage.

If you really don't believe that usage can decrease when you overclock the GPU in CPU-bottlenecked situations in Arma, then just check it, because that's how it is. It's a very logical thing.

The point of monitoring GPU usage is that when you see full usage, there's room to overclock the GPU and gain a couple of fps or more, until the limit is reached; that limit is where GPU usage drops from full. And when you don't see full GPU usage, you will get nothing from overclocking; the fps will stay the same. It's logical, and you can reproduce it easily and quickly. It's the easiest and quickest way to check whether there's room for fps improvement in certain places, on certain terrain etc. by overclocking the GPU.

If you don't believe it, check it yourself. Just overclock and underclock your GPU and be amazed at how you can use GPU usage as a tool to check whether the GPU is the bottleneck in some places, or whether you need to lower some graphics settings. Maybe then you'll understand what I was even talking about. I'll end this here :D
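
As a rule of thumb, that recipe boils down to something like this (a sketch; the 95% cut-off is an arbitrary choice for illustration, not anything Afterburner defines):

```python
def likely_bottleneck(gpu_usage_pct: float, fps_moved_with_gpu_oc: bool) -> str:
    """Rough reading of an Afterburner-style usage number for one scene."""
    if gpu_usage_pct >= 95 and fps_moved_with_gpu_oc:
        return "GPU-bound: overclock or lower settings to gain fps"
    if gpu_usage_pct < 95 and not fps_moved_with_gpu_oc:
        return "CPU/engine-bound: GPU overclocking won't move fps"
    return "mixed or scene-dependent: test more spots"

print(likely_bottleneck(99, True))   # GPU-bound ...
print(likely_bottleneck(60, False))  # CPU/engine-bound ...
```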


Is there any real reason for this thread and five pages of pointless discussion? I mean, the answer is simple: NO.

Edited by Nikiforos


While the answer is mostly just no, the reason is the way Arma is designed and runs, and it's not so much complex as potentially counter-intuitive. I can say as much as I like that the profile and GPUView data I produced some months ago show exactly why all this is the case, but a lot of people don't understand the diagrams even with my explanations, because they don't regularly work with profilers and debuggers; they aren't programmers.

I don't personally think the answer is just no. It's no most of the time, and it's no when the frame rate is at its worst. But it's not always no, because there are scenes in Arma that are GPU-limited.

Edited by BrightCandle


Windies, the problem is that you make theories about the weather but... you don't want to look out of the window to check whether those theories are true. I have no idea how the Arma engine works, but I can observe the correlations between a couple of parameters: GPU usage, GPU clock, fps, and GPU- or CPU-intensive in-game settings.


Luckily, we have several very well-informed and valid theories about CPU/GPU (and vice versa) operation with the Arma 3 engine. As a "forumer", and based on my experience, I also have my own theory, mainly about the so-called "CPU bottleneck", so here it goes.

With Arma 3, the relation between CPU and GPU in matters of usage/workload is no different from any other game engine; basically, the CPU workload depends on the GPU's needs.

What is the main limiting/demanding factor for GPU, and consequently CPU, usage/workload? Frames per second.

According to my theory, most "CPU bottleneck" claims come from running the game at very low frames per second (60 or below) due to GPU limitations. In these circumstances it is no surprise to see the CPU with low utilization; in fact it is quite normal and common with every game engine.

Why do I hold this theory?

Because when I run Arma 3 at a stable 144 frames per second, I see quite decent CPU (core) usage, similar to any other game engine in the same circumstances.

When I run Arma 3 at 60 frames per second, my CPU usage is low, which is quite normal and similar to any other game engine.

In conclusion, per my theory: if we have enough GPU power to unleash Arma 3, we will see the CPU "smoking hot".

I will leave AI out of this equation, because I have another theory about it, and it is not related to hardware.

So to the OP: instead of overclocking your Titan, get a second one and you will feel the power of Arma 3.

Also, if getting a second Titan, be sure you have at least 32 GB of RAM and at least two SSDs in RAID 0 (or preferably a RAMDISK).


@Bratwurste Wait, WHAT???? So now you are telling him to get a second Titan to play and enjoy Arma 3? Seriously, this is false information, and unfortunately you can get away with it.


Not what I am saying.

I enjoy Arma 3 even when I play on my laptop.

Read again.

