Spooner 0 Posted June 19, 2009 (edited)
The secret is to always use your TFT's native (maximum) resolution, but alter the fill rate until you have the optimal FPS:
* If you have a very weak 3D card: use native resolution with a low fill rate (< 100%). You will probably never gain anything by using FSAA.
* If you have an average 3D card: use native resolution with a 100% fill rate. Perhaps consider FSAA once it is available, since it is optimised compared with an equivalently high fill rate.
* If you have a very strong 3D card (or SLI): use native resolution with a high fill rate (> 100%) until FSAA is available, then probably use that instead.
What counts as a weak/average/strong card depends on your monitor's resolution relative to your card's power, so I can't be more precise. If you have a CRT, things are a bit different, but you are still better off using a higher resolution with a lower fill rate than a lower resolution with a higher fill rate.
EDIT: For CRTs - nevertheless, don't run it at a resolution so high that you wouldn't run your desktop on it (I once had a CRT rated for 1600x1200, but at that resolution it was so flickery and blurred I couldn't see anything, so I ran my desktop at 1280).
Edited June 19, 2009 by Spooner
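Spooner's rule of thumb above can be sketched as a simple lookup. This is only an illustration of the advice: the tier names and starting percentages are assumptions of mine, not values from the game.

```python
# Illustrative sketch of the weak/average/strong advice above.
# The tier names and starting percentages are assumptions, not game settings.
def suggested_fill_rate(gpu_tier: str) -> int:
    """Return a starting fill-rate percentage for a given GPU tier."""
    starting_points = {
        "weak": 75,      # below native 3D resolution; skip FSAA entirely
        "average": 100,  # native 3D resolution; consider FSAA when available
        "strong": 150,   # supersample until FSAA is available
    }
    try:
        return starting_points[gpu_tier]
    except KeyError:
        raise ValueError(f"unknown tier: {gpu_tier!r}")

print(suggested_fill_rate("average"))  # 100
```

As Spooner says, where your card falls between tiers depends on your monitor's resolution, so these are starting points to tune from, not final answers.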
Bigtnaples 10 Posted June 19, 2009
So the fillrate option is most likely limited by your amount of video RAM, correct? For instance, if you have a 256 MB card your fillrate would have to be low, while if you had 1 GB it could be higher? VRAM makes a big difference at high resolutions in other games, which is why I ask.
Alex72 1 Posted June 19, 2009
I am on 1600x1200 (21" CRT) and I run Windows at that with 85 Hz, so no flickering, but it is way too much for ArmA 2 with my puny 8800GTS 320MB. ArmA 1 at 1600x1200 with full AA etc. on Afghan Village with some fights is OK, though. But I'm buying a new system for ArmA 2. I tested some urban combat in ArmA 2 and it's fucking amazing. People saying otherwise are dead wrong. I need a new system though, for sure. Alex
Spooner 0 Posted June 19, 2009
Alex72 said: "I am on 1600x1200 (21" CRT) and I run Windows at that with 85 Hz, so no flickering, but it is way too much for ArmA 2 with my puny 8800GTS 320MB."
In this case, you'd run A2 at 1600x1200 [i]with a low fill-rate[/i]. The GUI would be as crisp and clear as you are used to on the desktop, but the 3D rendering would run as though you were at a lower resolution (so the game would run as fast as at 800x600 if you are at 1600x1200 with a 50% fillrate).
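The arithmetic behind Spooner's example can be sketched as follows. The helper name is mine; the only assumption is the one his 1600x1200-at-50% example implies, namely that the fill-rate percentage scales each axis of the 3D render target while the UI stays at native resolution.

```python
# Sketch of fill rate as a per-axis scale on the 3D render target.
# The UI stays at the native resolution; only the 3D scene is scaled.
def render_resolution(native_w: int, native_h: int, fill_rate_percent: float):
    """Return the (width, height) the 3D scene is rendered at."""
    scale = fill_rate_percent / 100.0
    return round(native_w * scale), round(native_h * scale)

print(render_resolution(1600, 1200, 50))   # (800, 600)  - Spooner's example
print(render_resolution(1600, 1200, 200))  # (3200, 2400) - supersampling
```

Note that because both axes scale, a 50% fill rate means only a quarter of the pixels are shaded, which is why the speedup is so large.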
Zaldronthesage 10 Posted June 19, 2009
Quote: "We will change the logic of this control in version 1.02 to hopefully be clearer to everyone. Our engine can now render the user interface and the game's 3D output at different resolutions, which is a very flexible and useful feature (we have noticed this in some games, but usually users are not able to change the settings on their own). The main reason is to allow users with hi-res displays to enjoy a hi-res UI (at the native resolution of their monitor, as they have it in Windows applications) even when their graphics card can't handle 3D rendering of a complex scene at an acceptable frame rate. It also allows image supersampling: e.g. if you have a very powerful graphics card but not that big a monitor, you can render at a very high resolution which is interpolated down to your monitor (basically the equivalent of full FSAA). Note that once we enable MSAA (see http://en.wikipedia.org/wiki/Multisample_anti-aliasing for a better explanation of these terms) in our game, the image will look very good even at lower resolutions upscaled to the higher native resolution of your monitor, and it is still a much more GPU-friendly solution than full rendering (e.g. not many cards can cope with 1920x1200 for 3D rendering)."
Not to kiss your ass too much sir, but you are a god. LOL
lowang 2 Posted July 5, 2009 (edited)
OMG, I must say the graphics settings look pretty confusing now that there are normal anti-aliasing options. :) I found that fillrate behaves interestingly - it works quite poorly as anti-aliasing, and the horizon and edges stay quite jagged, but IT ADDS DETAIL! I tested it while looking at a village from a hill, and with fillrate over 100% I saw fences in the village and some other things that were not visible at 100% fillrate. So I would really not use this as anti-aliasing, because the normal anti-aliasing options work much better and with higher FPS, but there are these other interesting image effects worth testing... Anyway, I could not tell the difference between the low, medium, high, very high and 5, 6, 7, 8 anti-aliasing options. Can someone explain them, please? Also, I would expect an option 4 there, which is not present (but maybe the "normal" setting means 4, I guess...)
Edited July 5, 2009 by LoWang
Squigibo 10 Posted July 5, 2009
I think he just didn't want to be goaded into doing stuff you're too lazy to do yourself. Bwahahaaa.
Spooner 0 Posted July 5, 2009
I assumed low, medium, high and very high referred to none, 2x, 3x and 4x AA, and thus were lower than the 5x, 6x, 7x and 8x options. Your card may not support all the options, but I guess it falls back to the next lowest setting if it doesn't. Essentially, each 'x' means the screen is drawn that many times bigger before being resized for viewing (this isn't literally what is going on, but it's good enough as a simple explanation). So with 4xAA, an 800x600 screen would be rendered at 1600x1200 and then resized to 800x600 before the user sees it. I did indeed see that MSAA was many times faster than the equivalent fill-rate. That is, 4xMSAA is vastly faster for me than a 200% fill-rate (I think because MSAA only takes extra samples along polygon edges in hardware, whereas a higher fill-rate makes the card fully shade every extra pixel).
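Under Spooner's simplified model, the pixel work of n-x AA and of an equivalent fill rate can be compared like this. It's a rough sketch only: real MSAA does far less shading work than the model suggests, which is exactly why it is so much faster in practice.

```python
# A rough comparison of pixel work, following Spooner's simplified model:
# n-x AA is treated as shading roughly n samples per pixel, and a fill
# rate of f% renders (f/100)^2 as many pixels. Real MSAA shades far less
# than this model implies, which is why it outruns an equivalent fill rate.
def aa_pixel_factor(samples: int) -> float:
    """Relative pixel work of n-x supersampling versus no AA."""
    return float(samples)

def fill_rate_pixel_factor(percent: float) -> float:
    """Relative pixel work of rendering at a given fill-rate percentage."""
    return (percent / 100.0) ** 2

# 200% fill rate does the same pixel work as 4x supersampling:
print(fill_rate_pixel_factor(200) == aa_pixel_factor(4))  # True
```

This matches Spooner's observation: 4xMSAA and 200% fill rate target similar image quality, but only the fill-rate route actually pays the full 4x shading cost.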
lowang 2 Posted July 6, 2009
"I did indeed see that MSAA was many times faster than the equivalent fill-rate. That is, 4xMSAA is vastly faster for me than a 200% fill-rate..."
And it actually works against jagged edges, unlike a higher % fillrate... OK, and you are saying those AA options are actually MSAA?
GLeek 10 Posted July 6, 2009
It's incredibly faster, yes. But I'm not sure TSAA 4x is faster than 200% + downsampled in this game. Lots of alpha stuff...
---------- Post added at 01:13 PM ---------- Previous post was at 01:11 PM ----------
"Anyway, I could not tell the difference between the low, medium, high, very high and 5, 6, 7, 8 anti-aliasing options. Can someone explain them, please?"
The 5, 6, 7, 8 AA options are Nvidia-only: CSAA or TSAA, and maybe 8x supersampling. Unusable anyway. Stay with low or normal; no need to push higher than 4x.
lowang 2 Posted July 6, 2009
And what is the graphics memory setting? I did not notice any effect when changing it. Also, what is the "default" option I have there? Should I choose default so it will use my 896 MB of VRAM?
CallMeSir 0 Posted July 6, 2009 (edited)
"And what is the graphics memory setting? I did not notice any effect when changing it..."
Running RivaTuner's hardware monitor while playing, I see a maximum VRAM use of around 530 MB when Video Memory is set to 'very high'. 'Low' limits it to around 300 MB, and 'default' gives around 360 MB. I can't say I see or feel any difference at any setting. This is with a 1 GB 9800GT with textures on Normal, and I guess increasing textures would show more memory being used.
Edited July 6, 2009 by CallMeSir
lowang 2 Posted July 6, 2009
Hmm, I will test it with RivaTuner too, but in my experience texture detail can be set to maximum in almost every game without any problem... maybe only if you had really low RAM...
C4PROOF 10 Posted August 26, 2009 (edited)
I'm running Vista Home Premium x64 on an Asus P6T Deluxe V2 with 12 GB DDR3, an Intel Core i7 920 and two Sapphire ATI Radeon HD4890 OCs in CrossFire, and the sim runs like butter; it screams at 1920x1200 with full detail and the Fillrate Optimiser at 100%. I've tried the slider to the left and it sucked major golfballs... and I tried it to the right and saw little improvement and a slight decrease in FPS in AI-intense close city combat. My advice is not to touch the slider. Yet it is fun fiddling around with it if you're a tweaky person. ;)
Edited August 26, 2009 by C4PROOF
C4PROOF 10 Posted August 28, 2009 And then there was patch 1.03. :rolleyes:
1longtime 10 Posted October 4, 2009 (edited)
I need to revive this old thread from the dead to settle a question. To review: a lot has been added to ArmA 2 since the original post; at this time we're at 1.04, with several AA settings and fillrate options. I've seen many explanations of fillrate, usually relating it to AA, but what I don't understand is this: does ArmA load a different texture image according to the selected fillrate option?
I ask because I recently bought an SSD. I could not use a high fillrate on a traditional HDD... it was a slideshow when I turned my head if I went beyond 133% fillrate. However, after changing NOTHING ELSE on my system except adding an SSD, I can now run at 200% fillrate with very smooth frames, often over 30 FPS. My natural conclusion is that ArmA must load larger texture files from disk when I increase the fillrate, but I really don't understand the mechanics of it... everything says "the image is rendered at x by y pixels", but I'm not clear on which original image is being used, or whether the loaded texture changes when the fillrate changes. If there is just one image of, for example, a tree, and it is 50 KB in size, then whether it is rendered bigger or smaller, the system needs to read 50 KB no matter what, right? BUT NOW I'm guessing there are actually MULTIPLE images of that tree, and at 200% fillrate a much larger one (perhaps 100 KB or more) is used, placing more emphasis on drive read speed. That would explain the magic improvement I've experienced with the SSD. Again, I really can't see how an SSD made such a dramatic improvement in fillrate capability if fillrate doesn't somehow relate to reading from the drive... I'm just not clear how that really occurs.
The following statements are ABSOLUTELY certain in my mind:
a) using a traditional HDD, fillrates over 100% gave very low FPS when I turned my head (below 15, usually 10 or less, unplayable stuttering)
b) I bought an SSD and moved ArmA 2 to the SSD
c) a fillrate of 200% now runs at over 25 FPS continuously, often over 30
Does anyone know the answer? The curiosity is killing me.
Edited October 4, 2009 by 1longtime
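One way to sanity-check the SSD theory above: the fill rate scales the GPU-side render target, not the texture files on disk, but a larger render target does eat VRAM, which could push textures out of memory and increase streaming from disk. The game's actual streaming behaviour isn't documented in this thread, so this is only a back-of-the-envelope sketch; the 8-bytes-per-pixel figure is an assumption (32-bit colour plus 32-bit depth, ignoring HDR and intermediate buffers).

```python
# Back-of-the-envelope estimate of render-target memory at a given fill
# rate. Assumes 4 bytes of colour + 4 bytes of depth per pixel and
# ignores HDR, post-processing and intermediate buffers, so real usage
# is higher; the point is the quadratic growth, not the exact figures.
BYTES_PER_PIXEL = 8

def render_target_mb(width: int, height: int, fill_rate_percent: float) -> float:
    """Approximate render-target size in MB for a screen and fill rate."""
    scale = fill_rate_percent / 100.0
    pixels = (width * scale) * (height * scale)
    return pixels * BYTES_PER_PIXEL / (1024 * 1024)

print(round(render_target_mb(1920, 1200, 100), 1))  # 17.6
print(round(render_target_mb(1920, 1200, 200), 1))  # 70.3
```

So going from 100% to 200% quadruples the render-target footprint. That extra VRAM pressure, rather than fillrate-specific texture files, would be one explanation for why faster disk streaming helps at high fill rates.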
MadCatChiken 10 Posted October 4, 2009 I don't even get the fillrate option.
Deadfast 43 Posted October 4, 2009 "I don't even get the fillrate option." It was renamed to 3D resolution in one of the patches.
Binkowski 26 Posted October 4, 2009 "I don't even get the fillrate option." They changed it in one of the patches to 3D Resolution.
1longtime 10 Posted October 4, 2009
So no one knows the answer definitively? Does increasing the 3D resolution to 200% change the volume of data read from the hard drive? I see no other explanation for the SSD-fillrate miracle. Um, maybe it loads the mask that is specific to the configured fillrate? (I'm not even sure what that means, but it sounded cool...)