stang725

Looking for Settings from another High-end NVIDIA based rig


Build:

4770K @ 4.3GHz
8GB RAM @ 2133
256GB SSD
2x GTX 770 4GB @ 1167MHz (about 2.3GB VRAM usage)
1080p 144Hz

Using GPU_MaxFramesAhead=3; and GPU_DetectedFramesAhead=3; in the Arma .cfg file for frames rendered ahead, no other cfg tweaks. Also using the startup parameters for 4 cores, 7 extra threads, 3GB of VRAM, and 2GB of RAM (it's a 32-bit program, so I'm not sure why people are calling out more than 3GB?), plus some of the other commonly cited startup tweaks for loading an empty world and turning off logs.

Also, can someone explain what the real optimal value for -maxMem is? 2047, right? I see people listing 4GB and 8GB. It's a 32-bit program, so how would it be able to use 4GB+? I'm running 64-bit Win 7 if that matters.

All settings are on ULTRA and all other options maxed out as well (FXAA on ULTRA, clouds on low). Draw distances are set to 1600/1200. I get close to 99% usage on both 770s with these settings, at about 2.3GB of VRAM usage. If I go ULTRA across the board with everything maxed and 3000/1800 draw distance (the Nvidia GeForce Experience settings), I only see 60% GPU usage, so I assume the CPU is limiting my FPS and it makes more sense to lower settings until I'm GPU-limited (>90% GPU usage).

Using the Altis 0.6 benchmark I get 76 FPS with my optimal settings and 43 with the Nvidia-recommended settings.

-cpuCount=4 -exThreads=7 -maxMem=2047 -maxVram=3071* -noBenchmark -noLogs -noPause -noSplash -world=empty (*someone told me 3GB was the max setting for VRAM?)
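For anyone skimming, here's my reading of what each flag does. This is based on community wiki notes, not official docs, so double-check it; I've left out -noBenchmark since I'm not sure what it actually does.

-cpuCount=4      use 4 CPU cores (the 4770K has 4 physical cores)
-exThreads=7     extra-thread bitmask: 1=file ops, 2=texture loading, 4=geometry loading; 7 = all three
-maxMem=2047     main memory allocation cap, in MB
-maxVram=3071    video memory cap, in MB
-noLogs          disables RPT log writing
-noPause         keeps the game running when alt-tabbed
-noSplash        skips the intro splash screens
-world=empty     loads an empty world at startup for faster loading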

steamLanguage="English";

language="English";

forcedAdapterId=-1;

detectedAdapterId=0;

detectedAdapterVendorId=4318;

detectedAdapterDeviceId=4484;

detectedAdapterSubSysId=906761304;

detectedAdapterRevision=161;

detectedAdapterBenchmark=1000;

displayMode=0;

winX=16;

winY=32;

winWidth=1024;

winHeight=768;

winDefWidth=1024;

winDefHeight=768;

fullScreenWidth=1920;

fullScreenHeight=1080;

refresh=144;

renderWidth=1920;

renderHeight=1080;

multiSampleCount=8;

multiSampleQuality=0;

particlesQuality=2;

GPU_MaxFramesAhead=3;

GPU_DetectedFramesAhead=3;

HDRPrecision=16;

vsync=0;

AToC=15;

cloudsQuality=0;

pipQuality=3;

dynamicLightsQuality=4;

PPAA=4;

ppSSAO=3;

ppCaustics=1;

tripleBuffering=0;

class ModLauncherList{};

serverLongitude=-97.941002;

serverLatitude=30.712999;

ppBloom=1;

ppRotBlur=0;

ppRadialBlur=0;

ppDOF=1;

ppSharpen=1;

Edited by stang725


Sorry I can't be more helpful, but expect latency and desync to be your FPS bottleneck. Welcome to Arma 3 and 30 FPS with your $3k machine.

> Sorry I can't be more helpful, but expect latency and desync to be your FPS bottleneck. Welcome to Arma 3 and 30 FPS with your $3k machine.

If you're not going to be helpful, why fucking post? And your experience is not like mine.

@stang725

GPU_MaxFramesAhead=4; GPU_DetectedFramesAhead=3; — these two should be set to the same value.

I use 1.

Detected is usually what your video driver is using; Nvidia has the setting in the driver's 3D settings.

Max is what you set in game to change the driver setting. 1000 is the default, meaning "use the driver setting". 0 is just that, though it may fall back to the driver default depending on the card. 1-8 are the usable values, though 1-4 is more typical on current hardware: AMD/ATI is in the 1-3 range and Nvidia is 1-8, but it depends on the driver and card. The setting matters most for SLI/CrossFire, but works with single cards too. I find it helps remove some "hitching". It's a GPU-to-CPU queue thing.
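So, a minimal sketch of the relevant arma3.cfg lines with the value I use (match both, then test other values for your own setup):

GPU_MaxFramesAhead=1;
GPU_DetectedFramesAhead=1;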

-maxMem=2047 is correct. The folks recommending 4GB or 8GB are wrong; anything higher just defaults back to 2047.
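The back-of-the-envelope reason, as I understand it (my reading, not official docs): Arma 3 is a 32-bit process, and Windows caps a 32-bit app's user address space, so the engine clamps its allocator just under the classic 2GB line:

2^31 bytes = 2048 MB   (the classic 32-bit user-space limit on Windows)
-maxMem=2047           (caps the engine's allocator just under that line)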

Draw distance and view distance are big CPU users, and the biggest FPS eaters.
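If you want to set these outside the game, they live in your <profilename>.Arma3Profile — I'm going from memory here, so double-check the key names in your own file. The OP's 1600/1200 would look like:

viewDistance=1600;
preferredObjectViewDistance=1200;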

Ultra settings don't really lower my FPS; good, fast cards should use them.

PPAA/FXAA/SMAA are just blur filters that hide some aliasing, and do very little in my experience. I used to use FXAA for its sharpen pass, but BIS has since given us a dedicated sharpen filter, so PPAA isn't needed, IMO.

MSAA/FSAA is the real deal for AA, but it's the second biggest FPS user. Can't go below 4x or the image just crawls and shows jaggies.

You need in-game AA (MSAA) enabled for AToC to work. It's also a large FPS user.
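In arma3.cfg terms that pairing is the multiSampleCount and AToC lines from the dump above — something like this, assuming you want 4x MSAA with AToC on for all vegetation (15, as in the OP's cfg):

multiSampleCount=4;
AToC=15;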

It's about the image quality (IQ) for me. I'll run at lower FPS for better IQ.

I set my video card's driver settings to "application preference" and use only the in-game settings.

KK

> Sorry I can't be more helpful, but expect latency and desync to be your FPS bottleneck. Welcome to Arma 3 and 30 FPS with your $3k machine.

I can get 60-80 FPS on the NoMercy Stratis Wasteland server with the near-max settings mentioned above, and my ping to the server is around 18-30ms, so I'm not really following you on that one...

> Max is what you set in game to change the driver setting. 1000 is the default, meaning "use the driver setting". 0 is just that, though it may fall back to the driver default depending on the card. 1-8 are the usable values, though 1-4 is more typical on current hardware: AMD/ATI is in the 1-3 range and Nvidia is 1-8, but it depends on the driver and card. The setting matters most for SLI/CrossFire, but works with single cards too. I find it helps remove some "hitching". It's a GPU-to-CPU queue thing. — KK

So for 770 SLI, I should just try 1-8 (both settings matched) and look for the best value for my specific setup? Also, so no PPAA? I'll try that as well, thanks. I don't see much difference in FPS between 4x and 8x AA, by the way, but maybe I will now that I'm maxing out my GPUs with the lowered draw distance and near-max settings for my CPU/GPU combination.

EDIT: Went with 3/3 for the frames-ahead settings, and kept PPAA since there was no FPS hit from toggling it on/off.

Edited by stang725

