blankko 0 Posted September 14, 2016
I'm using an AMD HD 6970 and used to get around 40 fps on high settings at 1680x1050. After the Tanoa install, my fps dropped to around 30 on Altis and 25 on Tanoa, so I'm planning to upgrade to the latest AMD GPUs. I know BIS is a Wintel/Nvidia GameWorks shop and that AMD GPU owners have to accept being treated as second-class users, but a disparity of over 50% between GPUs in a similar price/performance category is remarkable. The only explanation I can come up with is development effort being spent on a single vendor's platform while AMD users are left to fend for themselves. I will still be getting an AMD GPU, as there are other tasks I benefit from it for; I'm deciding between the RX 470 and RX 480. I hope the BIS developers will take note of this unjust disparity and work with AMD towards giving their AMD GPU customers a somewhat better experience with Arma 3.
thr0tt 12 Posted September 14, 2016
I have an R9 390. It's not the GPU that is the issue; the game is mainly CPU-bound, as I can crank everything down and up and it makes little difference.
dunedain 48 Posted September 14, 2016
Or maybe the GTX 1060 3GB is simply superior as far as DX11 games are concerned, thanks to better-optimized drivers?
austin_medic 109 Posted September 14, 2016
AMD drivers are crap, and Nvidia's are not. A card will get better performance if its drivers are better than the other vendor's. This game is mainly tied to how fast the CPU can go; if it can stomp single-threaded workloads, you'll get very high FPS, provided it is also paired with a strong GPU.
domokun 515 Posted September 15, 2016
austin_medic said: AMD drivers are crap, and Nvidia's are not. A card will get better performance if its drivers are better than the other vendor's. This game is mainly tied to how fast the CPU can go; if it can stomp single-threaded workloads, you'll get very high FPS, provided it is also paired with a strong GPU.
Err, actually that's not entirely true. You're right that IPC has a heavy influence on performance in Arma, but so does the GPU, as highlighted here: https://youtu.be/UISTSZ-fx_8?t=6m49s
Right now, to get good performance (60+ FPS @ 1080p) in Arma I suggest:
1. a fast CPU: 3.5 GHz minimum, ideally 4.0+ GHz
2. an Intel CPU: i3 OK, i5 better, i7 best but not cost-effective (50% more expensive than an i5 with no significant benefit)
3. a fast GPU, preferably Nvidia: GTX 1060 minimum, ideally a 1070, or a 1080 if you want 1440p or even 4K
4. an SSD of 256+ GB for the OS, A3 & mods
5. 8-16 GB of fast RAM: 2133 MHz or higher for DDR3, 2666 MHz or higher for DDR4
Tajin 349 Posted September 15, 2016
The 1060-1080 have a new architecture which at the moment is simply superior to anything AMD has to offer. As far as CPUs are concerned, Arma benefits from good single-core processing. Intel happens to be quite good at that (HT certainly plays its part there), hence Arma usually runs better on Intel. This has little to do with bad drivers or bad optimization (except for poor multicore support, yes).
blankko 0 Posted September 15, 2016
Yes, AMD GPUs are slower in DX11 games, and yes, Nvidia's Pascal architecture is a lot faster in DX11 than AMD's Polaris architecture. In fact, the difference between a GTX 1060 and an RX 470 can be seen in the benchmark above, showing the 1060 as 14% faster on average. Arma 3, however, is the outlier here: favoring the 1060 by 55% more fps can only mean the game is heavily optimized for Nvidia's latest architecture while AMD's architecture gets no love from the devs. As an AMD GPU user, I know what to expect when running a game that dons Nvidia's GameWorks logo at launch. I just ask the developers to also give attention to users not running Nvidia cards, so that everyone can get better performance out of Arma 3.
heavygunner 179 Posted September 15, 2016
The reason why only Arma performs that way in these benchmarks is that AMD drivers have more CPU overhead, and since Arma is already CPU-dependent, you get even lower fps than with Nvidia cards. There is nothing BIS can optimize for AMD, and it has nothing to do with GameWorks, so get that conspiracy out of your head.
blankko 0 Posted September 16, 2016
http://www.hardwareunboxed.com/gtx-1060-3gb-vs-rx-470-4gb-best-value-at-200/
The benchmark suite clearly showed Arma 3 favoring Nvidia so much more than the other games did that the reviewer had to exclude Arma 3 from the overall picture to prevent a skewed result. GameWorks titles like The Witcher 3 were only 9% faster on the 1060, but 55% faster in Arma 3 does raise eyebrows. If other game developers can get their engines running fairly well on both architectures, why can't we ask the same of BIS?
domokun 515 Posted September 16, 2016
blankko said: The benchmark suite clearly showed Arma 3 favoring Nvidia so much more than the other games did that the reviewer had to exclude Arma 3 from the overall picture to prevent a skewed result. GameWorks titles like The Witcher 3 were only 9% faster on the 1060, but 55% faster in Arma 3 does raise eyebrows. If other game developers can get their engines running fairly well on both architectures, why can't we ask the same of BIS?
You're right. As the reviewer said, there's something wrong with the driver. The driver is the responsibility of the designer of the chipset, not the developer of the game. So the ball is in AMD's court here, not BIS'. Much as I want AMD to rival Nvidia (Athlons & Radeons powered my rigs for almost a decade), you can't polish a turd.
brightcandle 114 Posted September 16, 2016
I used AMD cards back in the Arma 2 days, and they had this one bug where you would get white snow where the ATOC should be; you had to turn that setting off. It was like that for years, just broken. I used dual cards, and there were places on the map where crossfire scaling would simply disappear and the frame rate would stutter like crazy. Then Arma 3 came out and I had the same issues. I finally gave up on the 7970 and switched to 2x 680, and all my graphical glitches disappeared, my performance improved and the scaling was consistent. Considering the ATOC bug was there for years, I have to blame AMD for it, because it was a bug that they finally fixed for Arma 2 and Arma 3 at the same time, when they at last got around to doing something for Arma 3. It's just not a game on AMD's radar; they aren't doing much to help it along.
If you go look on gamegpu.ru and go through a ton of games, you see this trend where the common games all have the cards performing where you might expect, with the usual back-and-forth variances. But then you look at the rarer, less popular or older games, and Nvidia is beating AMD decisively. There are too many "did not work" results in those less common games. As far as I can tell, at the grand meta level AMD focuses on particular games to save resources: they don't generally produce broadly performant drivers, they produce drivers optimized for popular games. Arma 3 isn't on their radar, it's not benchmarked enough to affect comparisons on most major websites, and hence I suspect its support has largely been abandoned. The other possibility is that BI is doing a ton to hurt AMD performance and is refusing to cooperate with them, but considering how often this happens in other less popular games too, it would seem it remains an AMD problem. AMD has had this problem for a decade; it's been a consistent issue which, up until recently, was always blamed on AMD. Now for some reason it's been framed as everyone's problem (the game devs are shilling, Nvidia is hurting their performance with GameWorks, etc.), and yet all the same evidence of issues in the less-played games is still there.
Regardless, the answer is the same: you want an Intel CPU with the highest clock speed and latest architecture (not an -E chip; we think they get "hurt" by their memory controller) and you want an Nvidia GPU. That is the best combination to actually play the game. Whose fault it is doesn't matter, because the game runs poorly enough that if you don't build a machine with the game in mind, you will get worse performance as a result.
heavygunner 179 Posted September 17, 2016
blankko said: If other game developers can get their engines running fairly well on both architectures, why can't we ask the same of BIS?
Because other games aren't Arma and don't need as many CPU resources as Arma does?
blankko 0 Posted September 20, 2016
Isn't that down to Arma 3 being poor at multithreading? They could start optimizing there if they haven't yet. It would benefit everyone.
hcpookie 3770 Posted September 24, 2016
Arma games have never multi-threaded properly. I happen to know a few things about multi-threading, as performance optimization on servers is part of my day job. Since BIS doesn't publish performance hooks that we, the public, can use to monitor, we have to go on empirical observations alone. Perfmon collections are mostly useless while the game is running. Procmon is not much better.
The main performance hits are on the CPU and disk I/O. While an SSD helps, it only helps during disk I/O, which is usually when new files are loaded from disk to memory. With enough physical RAM you can set your paging file to zero so that all page operations stay in physical memory. It helps. The big problem, however, is that even with all these optimizations in place, a multi-core processor always has a single core doing all the work. One can confirm this easily by examining how many threads are being spawned by the Arma 3 process. The BIKI documentation indicates only a handful of threads are spawned by the process - 7 at most, I think. And only ONE thread for all the scripting, as far as I recall. Now I don't know about you, but scripting seems to be one thing that always happens in this game. :) That's why any given mod on a single-player "empty" map gets 80+ FPS and then your MP dedicated server gets 20 FPS if you're lucky. People (including myself) are getting smarter with scripting, so this is being helped somewhat by better scripts; however, they all must use that single, poor, overworked thread for processing. Meaning a single core! That work could be spread out by the OS if the engine were written in such a way as to hand it to the OS scheduler. Now, I don't know how it is coded today, so my supposition is that it is not being handed off; it appears to always be sent to the same core. A simple search of this forum shows how many people have watched a single core of their 4x or 8x systems run high while the others sit nearly completely idle, which supports this theory.
Offloading work to the GPU is "relatively" new and may not be understood yet by the BIS developers... this is the same stuff that bitcoin miners are using... It could be a huge benefit because, as perception suggests (again, we don't have published specs other than the BIKI), there are many rendering tasks that today's fancy GPUs could do better. Again, that's a perception, and if we ever see developer specs (unlikely) we could confirm it instead of making guesses :)
Without proper debug tools it is difficult to delve deeper. Computer optimization and upgrades can only go so far, I'm afraid. A finely-tuned Ferrari will still run poorly if you feed it crap gas. Faster than a VW Beetle, yes, but still poorly. That's fundamentally why other game titles seem to run better, regardless.
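For anyone who wants to check the thread count on their own machine, below is a minimal sketch of the kind of tool you could use. It is not anything official from BIS; it assumes Windows, the Toolhelp snapshot API, and that the process is named "arma3.exe". All it does is count how many threads the running Arma 3 process owns.

```cpp
// Hedged sketch: count the threads owned by a process named "arma3.exe"
// (the executable name is an assumption) using the Windows Toolhelp API.
#include <windows.h>
#include <tlhelp32.h>
#include <cstdio>
#include <wchar.h>

// Return the PID of the first process whose image name matches `name`, or 0.
static DWORD FindPidByName(const wchar_t* name) {
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPPROCESS, 0);
    if (snap == INVALID_HANDLE_VALUE) return 0;
    PROCESSENTRY32W pe{};
    pe.dwSize = sizeof(pe);
    DWORD pid = 0;
    if (Process32FirstW(snap, &pe)) {
        do {
            if (_wcsicmp(pe.szExeFile, name) == 0) { pid = pe.th32ProcessID; break; }
        } while (Process32NextW(snap, &pe));
    }
    CloseHandle(snap);
    return pid;
}

// Count every thread in the system snapshot that belongs to `pid`.
static int CountThreads(DWORD pid) {
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPTHREAD, 0);
    if (snap == INVALID_HANDLE_VALUE) return -1;
    THREADENTRY32 te{};
    te.dwSize = sizeof(te);
    int count = 0;
    if (Thread32First(snap, &te)) {
        do {
            if (te.th32OwnerProcessID == pid) ++count;
        } while (Thread32Next(snap, &te));
    }
    CloseHandle(snap);
    return count;
}

int main() {
    DWORD pid = FindPidByName(L"arma3.exe");
    if (pid == 0) { std::printf("arma3.exe is not running\n"); return 1; }
    std::printf("arma3.exe (pid %lu) currently has %d threads\n",
                (unsigned long)pid, CountThreads(pid));
    return 0;
}
```

Seeing a low thread count, plus one core pegged in Task Manager while the rest idle, is exactly the empirical observation described above.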
MONACO SLIM 1 Posted October 17, 2016
surprised AMD is still in business
fighter3005 0 Posted September 21, 2017
Hey guys, I found an interesting result when disabling cores on my i7 6700K while playing Arma 3. I tried to emulate the performance of a G4560 since I sold mine. I noticed that more cores resulted in a much more stable experience (the minimum framerate didn't drop as low). You can look at my results at this link: https://drive.google.com/open?id=0B323H8cfciypYWFFUHVuQXBEX3M or at the picture above. CPU usage is also spread more evenly across the cores (with all 4 cores enabled each sits around 40%; with 2 cores each sits around 80%).
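If you want to try a similar experiment without rebooting or touching the BIOS, one alternative is to restrict the game's processor affinity after it has started. The sketch below is just one way to do that; it is not what fighter3005 actually did, it uses the Win32 SetProcessAffinityMask call with a PID passed on the command line, and the two-core mask is an arbitrary example (Task Manager's "Set affinity" option achieves the same thing interactively).

```cpp
// Hedged sketch: restrict a running process (given its PID) to the first two
// logical cores with SetProcessAffinityMask, to roughly approximate a
// dual-core CPU. The mask 0x3 (logical cores 0 and 1) is an arbitrary example.
#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::printf("usage: %s <pid>\n", argv[0]);
        return 1;
    }
    DWORD pid = (DWORD)std::strtoul(argv[1], nullptr, 10);

    HANDLE process = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                                 FALSE, pid);
    if (!process) {
        std::printf("OpenProcess failed (error %lu)\n", GetLastError());
        return 1;
    }

    // Bit mask of allowed logical processors: 0x3 = logical cores 0 and 1 only.
    DWORD_PTR mask = 0x3;
    if (!SetProcessAffinityMask(process, mask)) {
        std::printf("SetProcessAffinityMask failed (error %lu)\n", GetLastError());
        CloseHandle(process);
        return 1;
    }

    std::printf("pid %lu is now restricted to logical cores 0 and 1\n",
                (unsigned long)pid);
    CloseHandle(process);
    return 0;
}
```

Affinity is not a perfect emulation of a cheaper CPU (cache sizes and clocks obviously don't change), but it is a quick way to reproduce the fewer-cores comparison above.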
computer 113 Posted September 21, 2017
I've had a 4870X2 and have a 7970 at the moment. I can say that I've been let down and will get Nvidia for my next card. The 770 came out about a month after I got my 7970 (same price), and I borrowed one and got ~10% better performance in Arma 3. I've never been part of any team/cult, because tribalism is stupid. But I can say I've not had a good experience with AMD, though I've had no long-term experience with Nvidia either...
Guest Posted September 22, 2017
Nvidia is better, but I love the AMD terrorists.
Grumpy Old Man 3546 Posted September 22, 2017
On 9/14/2016 at 2:21 PM, blankko said: I know BIS is a Wintel/Nvidia GameWorks shop and that AMD GPU owners have to accept being treated as second-class users
So you're blaming BI because you bought an AMD GPU?
On 9/20/2016 at 7:09 PM, blankko said: Isn't that down to Arma 3 being poor at multithreading? They could start optimizing there if they haven't yet. It would benefit everyone.
Go ahead, learn C++ and all there is to know about threads, mutexes, futures, promises and async, then come back and teach those silly devs so they can finally start optimizing their poor engine. Can't be that hard.
Cheers
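For what it's worth, here is a toy sketch of what those primitives look like in practice. It is purely illustrative (none of the names reflect the real engine): two std::async tasks update halves of a unit array, and a std::mutex serializes the single shared write. The reason this doesn't translate directly to an engine like Arma's is that real per-frame work is not independent like this; AI, physics and scripts constantly read and write the same simulation state, which is exactly where the hard engineering lives.

```cpp
// Toy illustration of threads/mutex/future/async, not engine code.
#include <future>
#include <mutex>
#include <vector>
#include <cstdio>

struct Unit { double x = 0.0, velocity = 1.0; };

std::mutex g_statsMutex;       // guards the shared accumulator below
double g_totalDistance = 0.0;  // stand-in for shared simulation state

// Advance one contiguous slice of the unit array by a single frame.
void UpdateSlice(std::vector<Unit>& units, size_t begin, size_t end, double dt) {
    double local = 0.0;
    for (size_t i = begin; i < end; ++i) {
        units[i].x += units[i].velocity * dt;
        local += units[i].velocity * dt;
    }
    std::lock_guard<std::mutex> lock(g_statsMutex);  // serialize the shared write
    g_totalDistance += local;
}

int main() {
    std::vector<Unit> units(10000);
    const double dt = 1.0 / 60.0;
    const size_t half = units.size() / 2;

    // Fan the frame's work out to two async tasks, then join via their futures.
    auto f1 = std::async(std::launch::async, UpdateSlice,
                         std::ref(units), size_t{0}, half, dt);
    auto f2 = std::async(std::launch::async, UpdateSlice,
                         std::ref(units), half, units.size(), dt);
    f1.get();
    f2.get();

    std::printf("total distance moved this frame: %f\n", g_totalDistance);
    return 0;
}
```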