windies 11 Posted December 6, 2013 @ Windies: you are a funny guy. You say the devs from BIS aren't working; that's an opinion. Somebody from BIS told you they are working on the way the server works, testing various fixes and changes. But then you say you want to know what exactly is modified. Of course, you have enough knowledge of the Real Virtuality engine to understand and, of course, criticize the choices being made in these fixes. The only "proof" is that in the future we will see the server working better... If they can produce a changelog of changes for a patch, it's entirely within their scope to write up a short summary of what is wrong and what they are doing to fix it. This is the umpteenth time Dwarden has come into a thread like this and said "But guys, we're working on these abstract changes that I can't tell you about, but we are doing them". Like I said, it's the boy who cried wolf at this point. This isn't some new issue that just came about in the past year; it's been known for a long, long time. The typical answer at this point is a smoke-and-mirrors response to tide things over. It's no surprise that there's barely any credibility behind "we're working on it" when that's all that's ever said and there's really nothing to show for it. What's wrong with writing a short summary about what's going on under the hood? They do it all the time with SITREPs, yet they tend to steer clear of anything regarding performance, except for the latest one, where they blamed antivirus software for conflicting with BattlEye.
ratszo 17 Posted December 6, 2013 .... Someday I would love to have everyone coming on the forums to complain about the lack of content, griefers and slight realism miscalculations rather than a huge and unwieldy thread of people unable to play a game that they've passionately supported. Can BIS ever bridge the gap between optimization and certain expectations, when the loudest hue and cry is "ULTRA or nothing!", "10k VD or rabble, rabble!", "80 fps min. or...", "100% cpu/gpu or...", "1960x1420 rez or..."? If BIS had released this game with, say, a 2k VD max, all high settings renamed "ULTRA", maybe even a sub-routine forcing CPU usage up, would this thread have 238 pages of indignation and outrage? I think not.
TSAndrey 1 Posted December 6, 2013 Can BIS ever bridge the gap between optimization and certain expectations, when the loudest hue and cry is "ULTRA or nothing!", "10k VD or rabble, rabble!", "80 fps min. or...", "100% cpu/gpu or...", "1960x1420 rez or..."? If BIS had released this game with, say, a 2k VD max, all high settings renamed "ULTRA", maybe even a sub-routine forcing CPU usage up, would this thread have 238 pages of indignation and outrage? I think not. LOL. This guy. Yes, it would. Not everyone here has "80 FPS or die" problems. MP performance concerns are legit, and so is bad SP performance in some areas.
ratszo 17 Posted December 6, 2013 LOL. This guy. Yes, it would. Not everyone here has "80 FPS or die" problems. MP performance concerns are legit, and so is bad SP performance in some areas. As is true with any new release of this scope? We all know BIS is looking towards optimization on several fronts. But serial malcontents infesting the board demand instant results based on flimsy reasoning such as "But BIS, you promised!... wha... wha... wha...!"
TSAndrey 1 Posted December 6, 2013 As is true with any new release of this scope? We all know BIS is looking towards optimization on several fronts. But serial malcontents infesting the board demand instant results based on flimsy reasoning such as "But BIS, you promised!... wha... wha... wha...!" I don't think anybody wants an optimization patch overnight, but this issue has been present since the Alpha (hell, since A1), and BI has been dodging performance questions.
[frl]myke 14 Posted December 6, 2013 That implies that ArmA3 could only run well with specific hardware, but I can't find any information about it in the system requirements! They only say: Intel or AMD dual core or newer, 2 GB RAM or more, a 5-6 year old GPU or newer! Think about it! I did, did you? Please open an online shop for computer parts. Go to CPUs. How many are there to choose from? Anyway, pick one. You will need a motherboard; check which socket you need and look for such boards. How many to choose from? Well, you'll need RAM too... and a graphics card... maybe a sound card (if onboard sound isn't for you)... a monitor, peripherals... see where this is going? It is nothing new that certain combinations of hardware can cause problems with software. This isn't limited to games, and ArmA 3 is not the first and surely will not be the last title which suffers from this simple truth. It is impossible to test every possible (and sometimes even impossible) hardware combination prior to release. And after release, new hardware can break things that worked perfectly before. That's the great benefit of consoles (no, I'm not a console player): their hardware is always and everywhere the same. In the end, there is nothing that can be done to prevent this... except stopping development for PC and concentrating on consoles, and I don't think anyone here would like that.
MavericK96 0 Posted December 7, 2013 I don't think anybody wants an optimization patch overnight, but this issue has been present since the Alpha (hell, since A1), and BI has been dodging performance questions. Also, I personally haven't observed any real performance gains since Alpha. If anything, performance has gotten worse for me.
ratszo 17 Posted December 7, 2013 (edited) I don't think anybody wants an optimization patch overnight, but this issue has been present since the Alpha (hell, since A1), and BI has been dodging performance questions. Man, looking at the specs on your card... If you can run A2 okay, why not be happy with that? A2 is years ahead of A3 for content. If your CPU is the same vintage/price-point as the card, you're flogging a long-dead horse here. It's a media card, not a gaming card, even when it was brand new; see Far Cry 2 for reference. Optimization aside, you're waiting for a miracle, not a patch. Edited December 7, 2013 by Ratszo
R1C0 10 Posted December 7, 2013 (edited) @Myke: It doesn't matter how many different CPUs, MoBos, RAM kits or GPUs are in an online shop! What matters is the hardware configurations of all the ArmA3 players here! Maybe someone from BI should set up a dedicated survey website where all ArmA3 customers can enter their configurations, and then, with this data, BI can do some analysis and maybe set up a general testing environment. Please don't open a simple forum poll! I think you would need the whole configuration: CPU + MoBo chipset (incl. drivers), GPU (incl. drivers), RAM and OS. It doesn't matter which keyboard, mouse or monitor they are using... that would be ridiculous! I think it's better if someone from BI provides such a survey website, so that they can ask about and analyse the things which fit BI's interests much better. But I'll go out on a limb and say: you will be surprised by how many ArmA3 players have (nearly) the same system configuration with no significant differences! h.a.n.d. :) I wish I could write in German, as it would be easier for me to be much more precise. :( Edited December 10, 2013 by R1C0
Instynct 1 Posted December 7, 2013 (edited) Can BIS ever bridge the gap between optimization and certain expectations, when the loudest hue and cry is "ULTRA or nothing!", "10k VD or rabble, rabble!", "80 fps min. or...", "100% cpu/gpu or...", "1960x1420 rez or..."? If BIS had released this game with, say, a 2k VD max, all high settings renamed "ULTRA", maybe even a sub-routine forcing CPU usage up, would this thread have 238 pages of indignation and outrage? I think not. Really? Because I don't think I've seen one person complaining about not being able to run on Ultra. I've seen many people complaining about performance no matter the graphical setting. Quoting Myke (post 2571680): "Nope, just yours. On a more serious note, please come back once you've read the full post. Seeing your reaction, I guess this quote nails it pretty well: But I'm absolutely sure your PC is perfectly well balanced, only the finest components, all drivers up to date and perfectly set up. There is no doubt that just the game engine is crap. Got your point." When so many people have been reporting the same issues for, what, months now, you still think it's system related? It's like a doctor telling a patient with cancer to take some Pepto-Bismol. It's almost insulting that you would insinuate that half the community for this game is computer illiterate. Edited December 7, 2013 by Instynct
TSAndrey 1 Posted December 7, 2013 Man, looking at the specs on your card... If you can run A2 okay, why not be happy with that? A2 is years ahead of A3 for content. If your CPU is the same vintage/price-point as the card, you're flogging a long-dead horse here. It's a media card, not a gaming card, even when it was brand new; see Far Cry 2 for reference. Optimization aside, you're waiting for a miracle, not a patch. My CPU is much better than my GPU. I know my GPU sucks, I'm not stupid, but that doesn't explain certain FPS problems.
mamasan8 11 Posted December 7, 2013 (edited) I have an FX-8350 and a Radeon HD 7870. Those have been on the market maybe a year in total. 8 GB of 1600 MHz RAM, Crucial Ballistix. In MP, at the starting base I get:
Low = 24 FPS
Standard = 24 FPS
High = 22 FPS
Very High = 20 FPS
Ultra = 16 FPS
Now, with my own settings, which are based on High but with some cranked higher, I get 22-24 FPS in the same spot. The only thing that actually seems to make any difference to FPS for me is anti-aliasing. I run it at 2x all the time. The Ultra setting I assume uses 8x, which would explain the huge drop in FPS. SSAO/HDR etc. do not make even a single FPS of difference; my FPS stays the same whatever I choose. Bloom and blur are turned off; I can't stand blur. Bloom I sometimes use, but there too, no FPS difference with it on or off. The problem is: a 4 FPS difference between Low and Very High doesn't cut it. It's barely noticeable in terms of playing the game, and absolutely not acceptable FPS. This is me standing stationary, not doing anything. It gets worse once I start to move. I can run 4 instances of EverQuest 2 with higher FPS. Yes, it's an old MMO, but the graphics have been upgraded many times. In EQ2, with clients running 'Balanced' or 'High Performance', I'm getting 30-40 fps per client. It is not pushing my graphics card much; utilization is maybe 50%, but my CPU sees 80% utilization. Talk about good multicore support. It scales extremely well. Edited December 7, 2013 by mamasan8
oldbear 390 Posted December 7, 2013 @ mamasan8: as you are CPU-limited in game, you should not turn down your video settings. You should look for a balance between the highest FPS and the highest visual quality in single player. With an HD 7870, all settings in the Quality section can be set to "Very High / Ultra"; you can turn down Terrain a bit [CPU impact] but should keep Shadows on the highest settings [CPU impact]. The one really important setting in the Video General section is Visibility; its impact on the FPS rate is huge, so with your rig you can probably play at a decent 30+ FPS with 2000 m global visibility in single player. On the AA & PP section, "Bloom and blur turned off, can't stand blur"... same here, and I suggest following the HardOCP recipe: "... best AA combo in this game, FXAA Ultra + 2X/4X or 8X FSAA..." One of the problems with MP is that visibility is defined in the server settings, so you can play a decent game in SP/Campaign at over 30 FPS but get a crappy 20 FPS rate on some servers because visibility is set to over 3800 m.
ratszo 17 Posted December 7, 2013 My CPU is much better than my GPU. I know my GPU sucks, I'm not stupid, but that doesn't explain certain FPS problems. Ah, yeah, it does explain your FPS problem. It's not a gaming card; nothing will change that. It's a five-year-old media card.
jumpinghubert 49 Posted December 7, 2013 @mamasan8 24 fps max on all servers?
TSAndrey 1 Posted December 7, 2013 Ah, yeah, it does explain your FPS problem. It's not a gaming card; nothing will change that. It's a five-year-old media card. It doesn't explain why performance is worse in MP than in SP.
Merkury713 10 Posted December 7, 2013 To the people saying core 1: 75%, core 2: 25%, and only getting like 25 FPS: has anyone ever tried individual per-core CPU clocking? Core 1 is clocked higher than core 2, which is higher than core 3, etc. You can get your first cores a lot higher, like 5 GHz, and then it steps down across your other cores.
MavericK96 0 Posted December 7, 2013 To the people saying core 1: 75%, core 2: 25%, and only getting like 25 FPS: has anyone ever tried individual per-core CPU clocking? Core 1 is clocked higher than core 2, which is higher than core 3, etc. You can get your first cores a lot higher, like 5 GHz, and then it steps down across your other cores. I don't think a lot of older processors support that.
kklownboy 43 Posted December 7, 2013 You can't overclock individual cores, except via Turbo, which will only boost one core unless you change that setting in the BIOS, from which you can then have all cores at the same clock, just not Turbo. On laptops you will have all sorts of issues with running multicore. Most laptop CPUs, when running with all "four" cores, will run slower than the rated speed of the CPU; a 2.3 GHz quad running 4 cores is at 1.8 GHz, and will only get the Turbo boost on one core while the other ones go below the rated 2.3. You have to proactively change the power settings in the BIOS and in Windows (core parking), because Windows is so cool it will use the unused cores and balance the load... bad for games, bad for single-core-type game engines (RV). A laptop or a budget PC doesn't have the BIOS options. Then there are the issues with slow HDDs on computers with aggressive power-saving options; it will idle some cores, maybe the core you need to shoot an evildoer with... ---------- Post added at 11:22 ---------- Previous post was at 11:10 ---------- It doesn't explain why performance is worse in MP than in SP. What needs explaining? MP runs through a server, SP does not. MP has to update all players (what they see, hear, do, etc.) over netcode; SP is just you. Map and objective choices by BIS or modders, along with the overhead of scripts, will hammer a server and the clients. You would be amazed by how much can go wrong in a mission, even a BIS mission. "Wrong" being performance issues. Good missions, good mods, good scripts work fine... there isn't a lot of that around yet. So if you like the game you will find your fun, or you will get upset and (to me) have unrealistic expectations of some "mythical" need for 60 fps or 120 fps MP gaming with all the gameplay the RV engine can provide. Wasteland is a mode that brings players in with that kind of expectation... and a lot of noise.
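The MP-versus-SP point above can be made concrete with a toy worst-case count of state updates. This is only a back-of-envelope illustration with invented numbers, not a model of Arma's actual netcode; the function name and figures are mine:

```python
# Back-of-envelope sketch: why MP costs more than SP.
# The numbers below are made up for illustration only.
def state_updates_per_second(players: int, entities: int, tick_hz: float) -> float:
    # Worst case: each tick, the server sends every connected player
    # an update for every entity in the mission.
    return players * entities * tick_hz

sp = state_updates_per_second(players=1, entities=200, tick_hz=15)
mp = state_updates_per_second(players=40, entities=200, tick_hz=15)
print(int(sp), int(mp))  # 3000 120000
```

Even in this crude sketch the server's update load scales with the player count, which is one reason a mission that runs fine in SP can choke a busy public server.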
James111333 10 Posted December 7, 2013 "mythical" need for 60fps, or 120fps Why do people STILL argue against higher frame rates? YES, it is a great game/sim. That is exactly why the frustration is so prevalent; the people complaining are complaining because they love Arma. Are you so ignorant that you can't see the night-and-day difference between 30 and 60 fps? It is not mythical, it's profound. Overclocking one core on my 4770K made a huge difference, and I was blown away by the difference it makes flying over Altis in a jet. TrackIR, Saitek X52 Pro, 100" 1080p screen and finally 60 fps. Now it's a sim. If you are happy with 20 FPS, fine, but keep your unreasonable labelling to yourself until you realise that, until the desired FPS is reached and surpassed, at least one of the main components in our systems should be at 100%. Regardless of my counter-statement above, this thread is about low utilisation, not anyone's opinion as to whether 20 FPS is okay on a £40 piece of software.
ratszo 17 Posted December 7, 2013 Why do people STILL argue against higher frame rates? YES, it is a great game/sim. That is exactly why the frustration is so prevalent; the people complaining are complaining because they love Arma. In regards to 'seeing' 60 frames: perhaps, maybe, but what we normally see is screen tearing, say from 60 to 40 or even 80 to 60. It's the fluctuation we see, not the frames in any meaningful way. That's why in the flight-sim world it is very common to lock frames to 30 or 40, since each frame is a unit of time and of distance. Sixty frames a second is one second expressed as sixty over sixty, 60/60; but if even a single frame is slower, time is warped, incorrect. If time is warped, distance is too. So screen tearing is time in flux: slower, faster, faster, slower. This we see in real time on our screens. Hence, I limit frames in game using the Nvidia control panel's 'half refresh rate' option at 75 Hz, giving 37.5 fps. Worth noting the Arma console command: L-Shift + numpad minus, then "fps", limits frames to 60/40/20/10 compounded. Chasing FPS is a fool's game. Taming time and distance makes for smooth gameplay.
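The half-refresh lock described above boils down to fixed frame pacing: give every frame the same time budget and sleep off whatever is left. A minimal sketch of the idea, not how the Nvidia driver or Arma actually implements it; the function names are mine:

```python
import time

def frame_budget(refresh_hz: float, divisor: int = 1) -> float:
    # Seconds each frame should take when locking to refresh_hz / divisor.
    # A 75 Hz monitor at half refresh -> 2/75 s (~26.7 ms), i.e. 37.5 fps.
    return divisor / refresh_hz

def limit_frame(frame_start: float, budget: float) -> None:
    # Sleep away the remainder of this frame's budget, so every frame
    # takes the same wall-clock time (constant frame pacing).
    elapsed = time.perf_counter() - frame_start
    if elapsed < budget:
        time.sleep(budget - elapsed)

budget = frame_budget(75, 2)   # half of 75 Hz
print(round(1 / budget, 1))    # effective frame rate: 37.5
```

The in-game 60/40/20/10 limiter mentioned above fits the same pattern: each cap is just a fixed per-frame budget, and a constant budget is what keeps frame times, and therefore perceived motion, even.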
mamasan8 11 Posted December 7, 2013 @ mamasan8: as you are CPU-limited in game, you should not turn down your video settings. You should look for a balance between the highest FPS and the highest visual quality in single player. With an HD 7870, all settings in the Quality section can be set to "Very High / Ultra"; you can turn down Terrain a bit [CPU impact] but should keep Shadows on the highest settings [CPU impact]. The one really important setting in the Video General section is Visibility; its impact on the FPS rate is huge, so with your rig you can probably play at a decent 30+ FPS with 2000 m global visibility in single player. One of the problems with MP is that visibility is defined in the server settings, so you can play a decent game in SP/Campaign at over 30 FPS but get a crappy 20 FPS rate on some servers because visibility is set to over 3800 m. Sure, in single player I get 40-60 fps. I might get 30-40 in MP when I'm in open terrain with no towns close, but that isn't often, since I play Domi/A&W. Generally I have 20-30 fps. I know what to put on High/Very High etc. @ JumpingHubert: 24 FPS on my fave server. I have similar FPS on all my fave servers.
Game__On 10 Posted December 8, 2013 (edited) In regards to 'seeing' 60 frames: perhaps, maybe, but what we normally see is screen tearing, say from 60 to 40 or even 80 to 60. It's the fluctuation we see, not the frames in any meaningful way. Hence, I limit frames in game using the Nvidia control panel's 'half refresh rate' option at 75 Hz, giving 37.5 fps. Chasing FPS is a fool's game. Taming time and distance makes for smooth gameplay. Hmm, 37 fps is NOT smooth, even if it's 100% stable at 37 all the time. Crappy frame rates are the fastest way to kill and bury the little immersion that is left after watching the enemy AI and the soldiers' animations. The rest of your post isn't even that bad, but keep in mind most people are on 120 Hz or even 144 Hz monitors. So, because Arma hasn't been able to actually use the hardware (50% CPU load, 50% GPU load) for the last, oh what, 10 years, while still somehow only putting out a miserable 30 fps, I now have to cripple my hardware so I can somehow drop my refresh rate to abysmal levels for "smooth gameplay" at 37 fps. NO. Just... NO. FIX THIS GAME ALREADY. Edited December 8, 2013 by Game__On
oldbear 390 Posted December 8, 2013 (edited) @ Game__On: On my main rig [i7 3770 / GTX 670 OC / 8 GB / dedicated Arma* SSD] I am playing on a regular basis at around 35-45 FPS in SP and over 30 FPS in MP. I enjoy Arma 3; the game is smooth and the environment great. I am getting 80% usage on one core and 20% to 40% on the others; GPU usage varies from 40% to 60%, so I assume it's working well. Playing on my more or less experimental Athlon II rig, I am getting over 20 FPS in single player and the game is still playable, and since I switched from a GTS 450 to an HD 7770 and added 4 GB, it's now playable AND enjoyable. I found the game NOT playable in MP with this rig. On public servers the FPS rate is too often under the 10 FPS line. On our dedicated server I can get 10-20 FPS, but the game is jerky and at the same time losing visual quality; no fun there! In SP and MP I am getting 80% to 100% usage on the 2 cores; in SP, GPU usage varies from 70% to 90%, but in MP, GPU usage falls from 50% down to 20%. Here I assume there is something wrong beyond the CPU bottleneck effect. Like most of my team mates, I am playing on a 60 Hz monitor; only one of us has a 120 Hz one. I think the assertion "keep in mind most people are on 120 hz or even 144 hz monitors" is a bit... well... over-optimistic. Of course, the game needs some clarification about the meaning of the minimum official specifications. Of course, the game needs enhancements in order to make Altis more playable and to lose less FPS in MP. But as it is, Arma 3 is quite playable and enjoyable on a rig matching the official "Recommended" specifications, so there is no need for a magical "fix". As an Operation Flashpoint veteran, I can tell you that in the past we waited for the famous patch 1.96 for a while, but we were rewarded for our patience. Edit: after some more tests in Kavala in SP...
with the experimental "Athlon II X2 250 / HD 7770 / 8 GB / dedicated Arma* SSD", missions with a lot of AI [>100] are hardly playable in Kavala [view distance 500 m, terrain grid 25] at 10-20 FPS. I am a bit worried about playing Campaign part 2 on Altis with such a rig. Edited December 8, 2013 by OldBear still doing tests...
nikiforos 450 Posted December 8, 2013 I think it's a matter of taste whether 37 FPS is smooth for someone or not. Personally, everything above 45 is smooth for me. When my FPS drops below that, I notice the difference quite clearly.