Groove_C

Everything posted by Groove_C

  1. @Valken the only board that's good is the Gigabyte B550 Aorus Pro or Pro AC (WiFi). It has a 6-layer PCB (copper), memory OC is very good, the sound is very, very good, the VRM and VRM cooling are very good, there is passive cooling for the M.2 SSD, and it has plenty of USB ports and plenty of case-fan headers. And this is the RAM you should buy: 2x8 GB: https://geizhals.de/g-skill-flare-x-schwarz-dimm-kit-16gb-f4-3200c14d-16gfx-a1590064.html 2x16 GB: https://geizhals.de/g-skill-flare-x-schwarz-dimm-kit-32gb-f4-3200c14d-32gfx-a2151635.html
  2. @Valken where do you look up prices for the stuff you want to buy? What's the PSU you have now? 4000 MHz CL19 you can forget, since it's not for AMD. 3200 MHz CL16 is really bad quality. The ASUS X570-P is really bad quality: its VRMs are weaker than on some other similarly priced boards, the VRM cooling is weak, the sound is trash and there's no passive cooling (aluminium plate) for the M.2 SSDs. The 5600X has no future: as I've seen in Hardware Unboxed's Ryzen 5000 review, new games are heavy on cores (on their number) and this is only the beginning, so it should be no less than 16 threads. + you shouldn't buy now, because of the artificially inflated prices for Ryzen 5000. They should calm down by late December or early January.
  3. Well, if you play in singleplayer only and no higher than 1080p standard, then yes. Otherwise, in multiplayer and at 1080p very high/ultra, the difference is marginal at best and can only be measured, not perceived, at all, because it's then around 5 FPS average (not minimum) and only at a low view distance, like 1500 m. If you increase the view distance to 3000 m or more, the FPS difference will be <5 FPS average, not even talking about minimum FPS or 1440p/2160p resolution. + in multiplayer, your PC's performance will be limited by the server you're connected to, depending on the server's hardware, how efficiently the mission was written by its author, and the number of players/AI/vehicles. I would still do it, personally, but don't recommend it to the majority of players, since this tiny marginal gain requires a lot of knowledge, time and patience.
  4. YAAB 1.00 | Arma 2.00 | 1080p standard (CMA AVX2 malloc)
R9 5900X @ stock | 2x16 GB 3600 MHz 16-19-19-39 | GTX 970 (3.5 GB) | SSD -> FPS: 55 min | 82 avg
R9 5900X 4.6/1.8 GHz core/Infinity | 2x16 GB 3600 MHz 16-19-19-39 | GTX 970 (3.5 GB) | SSD -> FPS: 58 min | 88 avg
i7-9700K 5.1/4.8 GHz core/cache | 4x8 GB 3600 MHz 16-19-19-39 | RX 470 (4 GB) | SSD -> FPS: 48 min | 74 avg
i7-9700K 5.1/4.8 GHz core/cache | 4x8 GB 4000 MHz 15-15-15-28 | RX 470 (4 GB) | SSD -> FPS: 57 min | 83 avg
An R5 5600X, R7 5800X or R9 5900X @ 4.6 GHz with good 3800 MHz CL14/15 or, with a bit of luck, 4000 MHz CL15/16 RAM could deliver ~10 more min and avg FPS, judging by the Intel results with 4000 MHz 15-15-15-28 RAM.
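The "~10 more FPS" estimate is just the Intel RAM gain carried over to the Ryzen run. A minimal sketch of that arithmetic, using the (min, avg) FPS pairs from the results above (the projected Ryzen figures are an extrapolation, not a measurement):

```python
# (min FPS, avg FPS) from the YAAB runs above
intel_3600 = (48, 74)   # i7-9700K, 3600 MHz 16-19-19-39
intel_4000 = (57, 83)   # i7-9700K, 4000 MHz 15-15-15-28
ryzen_3600 = (58, 88)   # R9 5900X @ 4.6 GHz, 3600 MHz 16-19-19-39

# Gain the Intel system saw from faster, tighter RAM
gain = tuple(b - a for a, b in zip(intel_3600, intel_4000))
print(gain)        # (9, 9)

# Assume Ryzen would gain roughly the same from comparable RAM
projected = tuple(r + g for r, g in zip(ryzen_3600, gain))
print(projected)   # (67, 97)
```

That +9 min / +9 avg FPS is where the "could deliver ~10 more FPS" figure comes from.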
  5. Personally, I wouldn't buy anything more expensive than 3600 MHz 16-16-16-36 (G.Skill Trident Z | Samsung B-die) or 3600 MHz 16-18-18-38 (Crucial Ballistix | Micron E-die), and probably wouldn't waste any amount of my time squeezing more performance out of the purchased kit. I also wouldn't buy anything lower than 3600 MHz, because of bandwidth: current and, most importantly, upcoming games with very detailed/populated worlds, high-resolution textures and very detailed 3D models are all heavy on bandwidth.
The nice FPS differences you can all see here exist only because it's 1080p only, standard video settings only and singleplayer only. Those conditions are used to get results as fair and reproducible as possible, to see what a certain CPU and/or RAM can do without being restricted/influenced by the GPU. These settings/conditions naturally have nothing to do with the real 24/7 settings used by the majority of players, but it's done this way because not everybody has a powerful/expensive GPU and 1440p/2160p monitor(s).
At real 24/7 settings, i.e. high, very high or ultra video settings + 1440p/2160p monitor(s) + multiplayer, the FPS difference between 3600 MHz CL16 and 4000 MHz CL15, 4266 or 4400 MHz (bought for a lot of money, and with difficulties even making them work) is marginal at best and can only be measured, not perceived, at all. For Arma, people with 7th/8th/9th/10th gen Intel CPUs OC'ed to around 4.6-4.8 GHz have no reason to upgrade to Ryzen 5000, since in real 24/7 conditions/settings the performance difference is marginal at best. + in multiplayer, your PC's performance will be limited by the server you're connected to, depending on the server's hardware, how efficiently the mission was written by its author, and the number of players/AI/vehicles.
So instead of buying a new CPU + mainboard + RAM for Arma, if you have 2400/2666/2933/3000/3200 MHz RAM with terrible/high timings and only 8 or 16 GB, I would first suggest buying 32 GB of 3600 MHz CL16 RAM. And if your CPU runs at stock, OC it to at least 4.8 GHz. This will definitely and noticeably improve the gameplay performance/experience.
  6. @oldbear @Mahatma Gandhi @Hoschi_1975 @Valken Check this video, where a Russian guy tests an R9 3950X, an R5 3600X and his bad-quality R7 3700X using Clock Tuner for Ryzen (CTR). It's in Russian, but you don't have to understand anything and can simply turn off the sound. Simply observe the FPS, which remains the same despite the frequency differences, and the temperature and power draw of the CPUs. СТОК in Russian means stock, referring to the CPU state. He tests stock vs. PBO (Precision Boost Overdrive) vs. CTR vs. manual OC with a fixed all-core frequency and fixed voltage. He also bumps the CPU voltage up to 1.35 V in CTR to see if there are any performance gains to be had: none. In the video you can see Shadow of the Tomb Raider, Far Cry New Dawn and Troy: A Total War Saga.
- Far Cry New Dawn doesn't use many cores; one core sits at >90% and thus requires as high as possible single-core performance, like Arma 3
- Troy: A Total War Saga uses about 80% of the R9 3950X's 32 threads with an extreme number of soldier groups and the extreme grass setting
Conclusion:
- manual OC with an as-high-as-possible fixed all-core frequency and a fixed, higher CPU voltage doesn't increase FPS at all, but significantly increases power draw and temperature
- enabling PBO slightly increases the frequency, but FPS remains the same as stock and as CTR, with increased power draw and temperature
- stock/auto runs at a lower or the same all-core frequency vs. CTR, but with higher voltage than really required, resulting in higher power draw and temperature than necessary
- CTR manages to select a much lower voltage for the same or a higher all-core frequency, or can even raise the frequency on the better-quality cores while keeping the same low/lower voltage
So the overall performance is increased and the voltage reduced, and thus power draw and temperature as well. And this is despite slightly lower single-core scores in Cinebench and CPU-Z, while all-core scores are higher.
  7. Only uninformed people will continue to buy Intel (currently), even at a lower cost (CPU only), since at stock they are now significantly behind the new Ryzens (also at stock), Arma included. + they heat up noticeably more, consume significantly more power, and require a more expensive motherboard and cooling to squeeze out the last bits of performance, and still don't reach AMD at stock, which hasn't even been OC'ed. That makes an Intel build as a whole still not cheaper, if not more expensive, than the now much more performant and cooler/more efficient Ryzen 5000.
  8. YAAB 1.00 | Arma 2.00 | 1080p standard (CMA AVX2 malloc)
R9 5900X @ stock | 2x16 GB 3600 MHz 16-19-19-39 | GTX 970 (3.5 GB) | SSD -> FPS: 55 min | 82 avg
R9 5900X 4.6/1.8 GHz core/Infinity | 2x16 GB 3600 MHz 16-19-19-39 | GTX 970 (3.5 GB) | SSD -> FPS: 58 min | 88 avg
i7-9700K 5.1/4.8 GHz core/cache | 4x8 GB 3600 MHz 16-19-19-39 | RX 470 (4 GB) | SSD -> FPS: 48 min | 74 avg
i7-9700K 5.1/4.8 GHz core/cache | 4x8 GB 4000 MHz 15-15-15-28 | RX 470 (4 GB) | SSD -> FPS: 57 min | 83 avg
An R5 5600X, R7 5800X or R9 5900X @ 4.6 GHz with good 3800 MHz CL14/15 or, with a bit of luck, 4000 MHz CL15/16 RAM could deliver ~10 more min and avg FPS, judging by the Intel results with 4000 MHz 15-15-15-28 RAM.
The R7 5800X is the best of all the new Ryzens (for gaming). It has "only" 16 threads, not 24 like the R9 5900X, which makes it much easier to cool. + it has more better-quality cores than the R9 5900X + it's cheaper. The R5 5600X (12 threads) is even easier to cool and even cheaper, BUT! It has the lowest-quality cores (whatever couldn't become a 5950X/5800X/5900X), + powerful GPUs will already be limited by it.
  9. I asked a friend who has an i7-9700K at 5.1/4.8 GHz core/cache, 32 GB of 4000 MHz 15-15-15-28 RAM and the CMA AVX2 malloc to run YAAB at 1080p standard, and also at 3600 MHz 16-19-19-39 with the default (Intel) malloc, so we can compare against the R9 5900X. I hope he will do it all correctly, as I explained to him.
  10. @Valken I have asked this guy (with the R9 5900X) to run YAAB properly, at proper settings, at least 5 times but preferably 10, and to write down in the YAAB comment line the min, avg and max FPS for each run. The results he has shared until now are purely useless: the CPU frequency is unknown, the Infinity Fabric frequency is unknown, it's unknown whether the RAM XMP profile was enabled in the BIOS, there was only 1 YAAB run with only avg FPS, and it's not known whether he changed anything else in the video settings, whether any other programs were running, whether it was windowed or full screen, etc. Also, the GTX 970 should be run in YAAB at no higher than 1080p standard settings, to see what the CPU can do, and not at high, very high or ultra settings. Otherwise we won't be able to see what the CPU is capable of, and nobody cares what a 2015 3.5 GB GTX 970 is capable of.
  11. So YAAB is still a good indicator of your hardware's performance, but you just need to consider that in MP, past a certain level of hardware performance, thanks to Arma's netcode, there will be no further benefits.
  12. Tested today in multiplayer: my 4790K 4.8/4.4 GHz core/cache + 32 GB 2400 MHz CL10 DDR3 vs. a 9700K 5.1/4.8 GHz core/cache + 32 GB 4000 MHz CL15 DDR4. The results were the same (+-2-3 FPS difference when the view distance was 500-1500 m). So the higher the view distance, the smaller the FPS difference between newer and older CPUs. The same goes for GPUs. The only scenario where a more powerful CPU/GPU can make a visually unnoticeable, but still measurable (for the record) difference is between 500-1500 m view distance. So it's really not worth OC'ing your Intel CPUs higher than 5.0/4.7 GHz core/cache. Also, 4000 MHz RAM vs. 3200 MHz (XMP) made less than 5 FPS difference.
We both have the same Windows version and SSD model, and only Arma, Steam, Discord, Samsung Magician and MSI Afterburner were running on our computers. The GPUs weren't helping/hindering either of us, since we did the tests at 1080p standard and 1080p low + everything possible disabled. Server FPS was the same the whole time (almost 50 FPS) + hundreds of AI infantry + no more than 15 armored vehicles and no more than 10 jets/helis. We were always looking at the same spot and had the same resolution, graphics settings and view distance (tried 500-7000 m). The PvE server was full and we tested FPS as infantry at base, in Pirgos and in Kavala, with 500-1500 m view distance, and also hovering 4 km away from Pirgos and Kavala in a heli at 500 m altitude, looking at these 2 villages with 7000 m view distance.
So in MP, in Arma, you're not limited by the server hardware when the server already always runs at the max possible FPS (50 by default). The FPS you experience is instead limited by network traffic and Arma's netcode. Even if the server had been something like a new R9 5900X instead of an OC'ed i7-3930K, server FPS would have remained the same, since it's already almost 50 now, hitting the 50 FPS limit. And I doubt that with more recent/powerful hardware there would have been less/more network traffic.
So what data gets requested/received by the server from what number of clients, and what's synchronized to what number of clients, is what dictates the client FPS, provided the client already has a very good hardware basis. So past a certain client hardware performance level, there is 0 performance increase in MP, in Arma.
  13. The R7 3800XT is second from the bottom, not even talking about the 3600(X/XT), 3700(X) or 3800X. Well, it's only synthetics and not a game or rendering program, but still.
  14. @Smart Games I already posted links to the 2 best German forums concerning RAM OC (Intel-specific). There is already a lot of info there, and you can ask for advice. But 6 more FPS at 4000 MHz 15-15-15-28-300 vs. your 2666 MHz 13-13-13-35-650, considering your 4K resolution and pretty high graphics settings, is already a good improvement, especially considering how low FPS is in Arma. It's not like going from 100 to 106 FPS )))
  15. Well, this must have cost you some time to do, but I have to tell you that the primary timings are not the ones that bring the most FPS; they are only partially responsible for higher FPS. There is even more min. FPS to squeeze out (vs. XMP), despite your 4K resolution and pretty high graphics settings, once you manage to tighten every single timing to the absolute minimum stable/working values. + it makes FPS more stable = fewer and less severe dips. So while your work already shows slight improvements from better-configured RAM, despite your 4K resolution and pretty high graphics settings, it's only part of the picture, since the FPS difference can be even higher than that. I hope nobody will take your work as proof that better RAM is useless because of the "so tiny" improvements. So yes, playing at "only" 1080p and not the highest graphics settings lets the RAM make an even bigger min. FPS difference, because at higher resolution and graphics settings you're limited first and foremost by the GPU, long before the CPU and/or RAM become the limiting factor. But even in this case, without even fully tuning the RAM, there is already about a 5 FPS difference.
  16. @Smart Games they can do 4000 15-15-15, 3800 14-14-14 or 3733 14-14-14, provided your CPU's memory controller and motherboard are good + the required, but still sane, SA, IO and VDIMM voltages are applied. See here: https://www.hardwareluxx.de/community/threads/intel-ram-oc-guides-und-tipps.1230518/ https://www.computerbase.de/forum/threads/intel-core-i-serie-ram-overclocking-auswirkungen-auf-spiele.1849970/
  17. @Smart Games RipJaws V 3600 16-16-16-36 has the same memory chips as all Trident Z 3600 16-16-16-36 variants. They just look simpler and thus cost less. It's possible to OC them a lot, with very low timings. But you could have bought these, with the same specs at the same price: https://geizhals.de/g-skill-trident-z-schwarz-weiss-dimm-kit-16gb-f4-3600c16d-16gtzkw-a1505578.html They just look better and run slightly cooler. They were in the list I made a few messages higher.
  18. @Smart Games CPU and motherboard model? There are 2 different RipJaws V 3600 MHz CL16 kits -> 16-19-19-39 and 16-16-16-36, which are not the same.
  19. @Smart Games which RAM model have you ordered, and what are your CPU and motherboard models? I have found Trident Z silver/red 3600 15-15-15-35 2x8 GB; I thought it wasn't available anymore. https://geizhals.de/g-skill-trident-z-silber-rot-dimm-kit-16gb-f4-3600c15d-16gtz-a1439046.html It's slightly more responsive than 4000 MHz 17-17-17-37 and also cheaper. It's even cheaper than the 3600 MHz 16-16-16-36 Neo/RGB and even more responsive. But it's for Intel, not AMD, + the color is not that nice, if you care.
  20. G.Skill Trident Z RGB 4000 MHz 17-17-17-37, if Intel only
2x8 GB - https://geizhals.de/g-skill-trident-z-rgb-dimm-kit-16gb-f4-4000c17d-16gtzr-a1755388.html
4x8 GB - https://geizhals.de/g-skill-trident-z-rgb-dimm-kit-32gb-f4-4000c17q-32gtzr-a1755380.html
G.Skill Trident Z Neo 3600 MHz 16-16-16-36, if AMD or Intel
2x8 GB - https://geizhals.de/g-skill-trident-z-neo-dimm-kit-16gb-f4-3600c16d-16gtzn-a2099434.html
4x8 GB - https://geizhals.de/g-skill-trident-z-neo-dimm-kit-32gb-f4-3600c16q-32gtzn-a2099487.html
2x16 GB - https://geizhals.de/g-skill-trident-z-neo-dimm-kit-32gb-f4-3600c16d-32gtzn-a2099396.html
G.Skill Trident Z black/white 3600 MHz 16-16-16-36, if Intel only
2x8 GB - https://geizhals.de/g-skill-trident-z-schwarz-weiss-dimm-kit-16gb-f4-3600c16d-16gtzkw-a1505578.html
G.Skill Trident Z RGB 3600 MHz 16-16-16-36, if Intel only
2x8 GB - https://geizhals.de/g-skill-trident-z-rgb-dimm-kit-16gb-f4-3600c16d-16gtzr-a1561158.html
4x8 GB - https://geizhals.de/g-skill-trident-z-rgb-dimm-kit-32gb-f4-3600c16q-32gtzr-a1561113.html
2x16 GB - https://geizhals.de/g-skill-trident-z-rgb-dimm-kit-32gb-f4-3600c16d-32gtzr-a2152645.html
  21. 4000 MHz CL15 is, for example, just 1.79% less responsive than 3800 MHz CL14 (in nanoseconds), but has higher bandwidth, so 4000 MHz CL15 is still slightly better than 3800 MHz CL14 overall. That's why you see the rest of the frequency vs. timing combinations in this specific order. The same goes for 3600 MHz CL16, which is just slightly less responsive than 3200 MHz CL14 but has higher bandwidth, so 3600 CL16 is still slightly better than 3200 MHz CL14 overall.
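The "responsiveness" percentages above follow from first-word latency in nanoseconds. DDR transfers data on both clock edges, so the clock runs at half the data rate, and the latency is CL cycles of that clock: latency_ns = CL x 2000 / data rate (MT/s). A quick sketch reproducing the numbers:

```python
def latency_ns(data_rate_mts: int, cl: int) -> float:
    """First-word latency: CL cycles at a clock of (data rate / 2) MHz."""
    return cl * 2000 / data_rate_mts

a = latency_ns(3800, 14)   # ~7.37 ns
b = latency_ns(4000, 15)   #  7.50 ns
print(f"{(b / a - 1) * 100:.2f}%")  # 1.79% -> 4000 CL15 vs. 3800 CL14

c = latency_ns(3200, 14)   #  8.75 ns
d = latency_ns(3600, 16)   # ~8.89 ns
print(f"{(d / c - 1) * 100:.2f}%")  # 1.59% -> 3600 CL16 vs. 3200 CL14
```

So the latency penalty of the faster kits is under 2% in both cases, while the bandwidth gain from the higher data rate is 5-12%, which is why the higher-frequency kit wins overall in this ordering.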