pluke the 2

Will overclocking the Titan X with Precision X produce more frames for Arma 3?


I know this game is CPU-hungry, but I wanted to ask whether you think it would make any difference to my FPS.

Over whatever the base clock is. Say I boosted 100+ MHz on both core and memory, or whatever I could get away with stability-wise.


I don't think so. I switched from a 580 to a 780 Ti nearly a year ago and didn't see any FPS increase. Arma 3's GPU optimization is quite good; I'd say you don't need a high-end GPU to max it out.

You're better off overclocking your RAM and CPU.


If you ever see 99% GPU usage, then yes; otherwise no. High GPU usage is achievable with high resolutions and AA, but if you can't reach max GPU usage with the heaviest GPU-bound settings alone in the editor, there's no reason to overclock. Once you understand that, you'll know whether you need to.

Some maps or places with very dense vegetation like to stress the GPU a lot, so in those spots you can likely see some increase. I suggest you try the W.I.P. Orshanets terrain from the modding-discussion part of the forums; at least I hit a GPU bottleneck there easily.
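The rule of thumb in the post above can be sketched as a tiny helper (hypothetical function, a minimal sketch; the utilization samples would come from a monitor such as MSI Afterburner or `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits`):

```python
# Sketch of the rule: if GPU utilization stays pinned near 100% while
# playing, the GPU is the limiter and overclocking can add fps; if it
# drops well below that, the CPU is holding things back instead.

def likely_gpu_bound(util_samples, threshold=97.0, fraction=0.9):
    """True when at least `fraction` of the sampled utilization
    percentages sit at or above `threshold`."""
    if not util_samples:
        return False
    pinned = sum(1 for u in util_samples if u >= threshold)
    return pinned / len(util_samples) >= fraction

print(likely_gpu_bound([99, 98, 99, 100, 99]))  # True: GPU-bound
print(likely_gpu_bound([70, 65, 80, 60, 75]))   # False: CPU-bound
```

The threshold and fraction are arbitrary illustrative choices, not values from the thread.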


Nope.

You might see 80-100 FPS in singleplayer or in the editor, but the moment more players join and/or AI is spawned, you'll have the same 25-45 FPS as everyone else.


Even a GTX 970 gets bored in Arma 3. The problem is the RAM and CPU.


Lol, no need for GPU overclocking unless you play other, GPU-demanding games. Spending money on expensive GPUs for Arma is like throwing it down the toilet :)


As all the others have mentioned, the answer is no, it won't. The GPU simply isn't the limit in this game. You want a faster CPU, and to hold 30 FPS on pretty much everything out there you'd need an Intel CPU at about 8 GHz. So even liquid-nitrogen cooling isn't enough to run the game to that minimum standard; you'd need a world-record overclock to get 30 FPS on most public servers.


Hey guys, thanks for your responses. I only play Arma 3 at the moment. I know I wasted cash on the GPU, but I had the money and I wanted to waste it.

Another question, if you guys are still around: I'm running a 5820K at 4.5 GHz and memory on XMP at 2666 MHz (the standard speed for my memory) with 15-15-15-35 timings and a T1 command rate.

What is going to give me more FPS: 10-10-10-20 at T1, or 3.6 GHz memory speed? (I know this is an exaggeration, but I'm just trying to understand what I should focus on.)

Thanks in advance.


A better place to ask is an overclocking forum; they can size up the mobo. Though I would say T1 is a tight setting and much more stable.

I see a "failure to POST" in your future. If you don't know what that is, an overclocking forum is the place to go.


Both timings and frequency matter for RAM, so it's hard to answer, and I don't think there's a benchmark that settles it. A small test was done in this topic, but only with 1333-2133 at CL9-11: http://forums.bistudio.com/showthread.php?166512-Arma-3-CPU-vs-RAM-performance-comparison-1600-2133-up-to-15-FPS-gain It kind of shows that frequency is a bit more important than timings, but both matter a lot in the end.
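For intuition on the timings-vs-frequency trade-off in the question above, the standard first-word-latency formula (not from the thread) is CAS cycles divided by the memory clock, which for DDR is half the transfer rate:

```python
# First-word latency in nanoseconds: ns = CAS * 2000 / transfer_rate (MT/s).
def first_word_latency_ns(transfer_rate_mts, cas):
    return cas * 2000 / transfer_rate_mts

# The OP's current kit vs. the two hypothetical options from the question:
for rate, cas in [(2666, 15), (2666, 10), (3600, 15)]:
    print(f"DDR4-{rate} CL{cas}: {first_word_latency_ns(rate, cas):.2f} ns")
# DDR4-2666 CL15: 11.25 ns
# DDR4-2666 CL10: 7.50 ns
# DDR4-3600 CL15: 8.33 ns
```

By raw latency the tighter timings win in this comparison, but the higher transfer rate also brings roughly 35% more bandwidth, which the linked thread suggests Arma benefits from, so neither number alone settles it.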

If you ever see 99% GPU usage then yes. Otherwise not.

Well that information is just flat out incorrect...

Well that information is just flat out incorrect...

If your GPU usage is at full, your GPU is the bottleneck at that moment, so that's not incorrect. Overclocking doesn't increase fps in places where GPU usage isn't at full, or at least not likely. It is an fps increase, but it might not be worth it at all if it only happens when you're watching the sea at 200 fps.


It's recommended to use bench tools like MaxMem to find the best combination of clock speed and timings. After that you of course have to test whether it's stable. But before overclocking, carefully read some benchmarking how-tos.

There is really only one tweak for Arma 3 that helps a lot: http://forums.bistudio.com/showthread.php?163640-Arma3-and-the-LARGEADDRESSAWARE-flag-%28memory-allocation-gt-2GB%29

@GossamerSolid

You can check what St. Jimmy wrote. Go into the empty editor, into a Stratis forest (and max out all GPU-related graphics settings like AA and resolution), then downclock your GPU. Voilà, you will see a drop in fps.


If your GPU usage is at full, your GPU is the bottleneck at that moment, so that's not incorrect. Overclocking doesn't increase fps in places where GPU usage isn't at full.

Once again, you're wrong. Feel free to find any graphics card review site that shows GPU usage graphs as well as overclocking results.

You can also try it yourself.


Upgrading the GPU may raise your FPS by a few, but no matter how good a GPU you have, Arma 3 will never run optimally, especially in object- and AI-heavy places. That's simply because the Arma 3 engine's renderer is tied to the simulation. The new DayZ engine won't have this limitation, and hopefully the same engine will be used by the next Arma game :)

Once again, you're wrong. Feel free to find any graphics card review site that shows GPU usage graphs as well as overclocking results.

You can also try it yourself.

In what part am I wrong? I'd like to hear that first.

I tried it and it goes just as I said. Maybe my English is just that bad. Overclocking = generally better fps in any game if your fps is bottlenecked by the GPU, which you get with heavy graphics settings like high resolution and lots of AA.

I just tested on Orshanets. I have 35 fps when I'm not overclocked, at full GPU usage. When I overclocked, I get 37 fps and naturally still full GPU usage, because in that case I'm not bottlenecked by the CPU.

If your GPU usage is at full, your GPU is the bottleneck at that moment.

Since we are talking about overclocking, I presume that by your "GPU usage is full" you are referring to clocks.

If that's the case, you are completely wrong. Anyway, I would like to see your explanation of the relation between full clock usage and a GPU bottleneck.


@Bratwurste

Scenario: empty Stratis editor, looking into a dense pine forest.

GPU clock 975 MHz, GPU usage 99%: 62 fps.

GPU clock 1125 MHz, GPU usage 99%: 71 fps.

If GPU usage goes below 99%, there will be no fps gain from a GPU overclock or a better GPU.
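As a sanity check on the measurements above: in a fully GPU-bound scene, fps should scale roughly linearly with core clock, and these numbers fit that model closely:

```python
# Linear-scaling check: predicted fps = base fps * (overclocked / base clock).
fps_base, clk_base, clk_oc = 62, 975, 1125
predicted = fps_base * clk_oc / clk_base
print(round(predicted, 1))  # 71.5, close to the measured 71 fps
```

The near-perfect match is itself evidence that the scene really was GPU-limited at both clocks.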


IMO that card is a waste for A3 unless you are running the game at 4K resolution. Like others suggested, a really fast CPU would matter the most.

Since we are talking about overclocking, I presume that by your "GPU usage is full" you are referring to clocks.

If that's the case, you are completely wrong.

Nope, by full GPU usage I mean the usage you can monitor with MSI Afterburner, for example, not clocks.

By a bottleneck I mean that something is always the limiting factor. If you get full GPU usage, you're bottlenecked by the GPU, because with higher clocks or a better GPU you would get even better performance, up to the point where another part of the computer becomes the bottleneck. You're not bottlenecked by the GPU when GPU usage isn't at full, and in that case overclocking or a better GPU doesn't really help.

The word "bottleneck" might be too harsh or confusing here, because GPU bottlenecks can easily be eliminated with lower graphics settings.

Is GPU overclocking worth it in Arma? Mostly not, but you can easily and quickly monitor whether you're GPU-bottlenecked at some points and improve your fps a bit in those places.

This is a very good example from Stratis where I'm getting GPU-bottlenecked:

[If you want to see how much going from a GTX 560 Ti 448 to a GTX 970 helps, you can compare the videos. (BIS did some small optimizations after that video, but it's still a very demanding example spot.)]

It shows well how demanding the vegetation in Arma is once you've turned up some AA and sampling.

I get 2-3 fps more when I mildly overclock my GTX 970 at that same point. Everyone has their own opinion on whether the overclock is worth it, but the reality is that the fps improved in that place :p

I have adaptive vsync on, so I'm capped at 60 fps in that video; GPU usage naturally isn't at full when that cap is met.

When you play without vsync or any other fps cap, so your fps can wander freely, then whenever you see full GPU usage, your GPU is the limiter at that point, because a better GPU or higher clocks would give more fps there. It doesn't matter whether the fps is 30 or 300: every time GPU usage is at full, you're capped by the GPU, and every time it isn't, you're capped by the CPU, which can't issue draw calls any faster.

This is all about the RV engine; I'm not sure how other game engines behave because I haven't monitored them much.

Damn, I'm making this a long post :D
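The rule in the post above can be condensed into a small classifier (a hypothetical helper, a sketch only; utilization would again come from a tool like MSI Afterburner):

```python
# With no fps cap active, full GPU usage means the GPU is the limiter;
# anything less means the CPU is. A met vsync/limiter cap has to be
# excluded first, since it keeps utilization below full without either
# part being the bottleneck.
def limiter(gpu_util_pct, fps, fps_cap=None):
    if fps_cap is not None and fps >= fps_cap:
        return "capped"   # fps limiter met; utilization tells you nothing
    return "GPU" if gpu_util_pct >= 99 else "CPU"

print(limiter(99, 35))               # GPU: overclocking would help here
print(limiter(60, 30))               # CPU: a faster GPU changes nothing
print(limiter(85, 60, fps_cap=60))   # capped: e.g. adaptive vsync at 60 fps
```

The 99% cutoff mirrors the thread's "full usage" reading from Afterburner; per the later posts, that tool reports full usage as 99-100%.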


There are places in the game where you can be limited by the GPU; I don't want anyone thinking that isn't the case. But you have to go somewhere with a lot of vegetation and use high AA settings, and then you will max out the GPU. Performance will be in the region of 60 fps on a decent GPU, and by really pushing the settings up you could get it down into the 45 fps range. Absolutely doable. Of course, just changing the settings will immediately fix the situation and return you to more like 70 fps.

The issue is the cases where the game drops down to 25 fps or even 15 fps, like I had in one Altis Life game with 20 players in it. Those instances are not GPU-limited, and no amount of changing settings can fix them. The major performance problem is CPU-related. That doesn't mean the game can't become GPU-limited at times; it absolutely can. It's just that the GPU-limited areas aren't the problem: they can be fixed by reducing settings, and they are nowhere near as severe as the CPU problems, where the game becomes unplayable and no settings change can ever fix it.


That's true, BrightCandle. GPU limiting is not the problem, because GPU limits can be compensated for by downgrading the graphics. When the CPU is limiting, you can do nothing settings-wise.

I just tested on Orshanets. I have 35 fps when I'm not overclocked, at full GPU usage. When I overclocked, I get 37 fps and still full GPU usage.

Utilization is a workload metric, not a measurement of the speed at which that workload is done. When you increase clock speed, you're increasing the speed at which the work can be done, not how much work there is to do. A GPU at 1 MHz can show 99% utilization at 1 fps, while the same GPU at 1000 MHz can show 99% utilization at 1000 fps. Clock speed and utilization are not necessarily indicative of each other.

Utilization is a workload metric, not a measurement of the speed at which that workload is done.

Yes, you're increasing the speed at which the work can be done, but at the same time you're increasing how much gets done per second, up to the point where you hit the CPU limit.

I think I somehow gave the impression that overclocking increases your GPU usage? That's not what I meant at all, pretty much the opposite. Sorry for the confusion. Maybe the 99% was also confusing, because that's full usage when you monitor with MSI Afterburner; I should have written 100% from the start.

If you're above your target fps at full usage, there's naturally no point in overclocking. But if you're ever below your target fps with full GPU usage and you don't want to decrease graphics settings, then it's time for some overclocking. It helps up to the CPU-limit point, which is where GPU usage actually starts to drop.

