vipermanden

Server settings on a 10/10mbit


Can anybody help me set up an ArmA dedicated server with the right settings (best performance) for a 10/10 Mbit line, Athlon64 2.4 dual core, 1 GB RAM? Best regards, Vipermanden


If he uses your tool he still needs to know how to configure your "performance options"... I think that's what he needs to know.

And maybe afterwards not use a tool, so it doesn't take up additional resources.


Thanks, very useful tool, but as Jin wrote I would still like to learn how to set the parameters for best performance for my 10/10 bandwidth and my computer's specs.


Maybe someone who has a 10/10 Mbit line to run a dedicated server has an ArmA.cfg config they can share with a noob. Our server runs great for the first 2 minutes and then it gets laggy (high ping): 10-20 ms ping at the start, and about 150-200 ms after 2 minutes.

Please help me.

There is a firewall on, and ports 2302, 2303 and 2305 are open.


What matters most when you're hosting a server is your upstream capacity. Generally speaking, consumer upstream speeds will be less than 1024 Kbps, that is to say, 1024 kilobits per second. Since there are 8 bits in a byte, you have to divide that figure by 8, which gives you 128 KB/s of upstream capacity. Since each player needs 8 KB/s to play without lag, you'll be able to host 16 players comfortably. If you're going to be hosting and playing on the same machine, then the number of slots will drop by half.
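That arithmetic can be written down as a quick sketch. The 8 KB/s-per-player figure is the rule of thumb from this post, not an official ArmA number:

```python
def max_players(upstream_kbps, per_player_kbytes=8, hosting_and_playing=False):
    """Estimate how many players an upstream link can host without lag.

    Assumes the rule of thumb above: each player needs ~8 KB/s of
    the server's upstream bandwidth.
    """
    upstream_kbytes = upstream_kbps / 8            # 8 bits per byte
    slots = int(upstream_kbytes // per_player_kbytes)
    if hosting_and_playing:                        # hosting + playing halves the slots
        slots //= 2
    return slots

print(max_players(1024))                           # 1024 kbps -> 128 KB/s -> 16 players
print(max_players(1024, hosting_and_playing=True)) # 8 players
```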

If you decide to ignore the values and host 64 players for example, then those players with the fastest connections will eat up the available bandwidth first. That will lead to considerable lag for those individuals with slower connections.

The 10Mbit value that you quoted is your own download speed. That's only relevant if you're going to be playing on somebody else's server.

Well we have 10mbit up and 10 down

Lucky you...who's your ISP?


I'll also gladly get the info.

I tested a dedicated server with default parameters on a LAN yesterday. I was alone playing London Bridge (Karillion's mission) and I had the teleporting AI effect.

I think that two parameters are important:

MaxMsgSend=<limit>;

Maximum number of messages that can be sent in one simulation cycle.

Increasing this value can decrease lag on high upload bandwidth servers.

Default: 128

MinBandwidth=<bottom_limit>;

Bandwidth the server is guaranteed to have (in bps). This value helps server

to estimate bandwidth available. Increasing it to too optimistic values can

increase lag and CPU load, as too many messages will be sent but discarded.

Default: 131072

The second value means the server is assumed to have a 128 Kbps upload speed (about twice the speed of an analog phone line), which is pretty poor. I assume this must be changed to a far higher value.

The first one seems directly connected to lag (and is probably causing the teleporting effect). Here I have no clue how to choose the right value.

We can do it by trial and error, using #monitor to check the bandwidth and CPU used, but that's not very satisfactory...



BTW, here's the advice from BIS for a 1Mb upload server:

The greatest level of optimization can be achieved by setting the MaxMsgSend

and MinBandwidth parameters. For a server with 1024 kbps we recommend the

following values:

MaxMsgSend = 256;

MinBandwidth = 768000;



So I have a question for SUMA or anyone else that feel extremely confident in their answer.

Can I assume the following recommended calculations?

MaxMsgSend = ?

MinBandwidth = 75% of Upstream (in bps)

So in other words, if we have a server with an upstream of 2048 Kbps (2,048,000 bps), then BIS recommends the following?

MaxMsgSend = ?;

MinBandwidth = 1536000;

Is this a correct assumption?

What formula can we use to set the MaxMsgSend??
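The 75% rule suggested above can be sketched as follows. This assumes MinBandwidth takes plain bits per second with 1 kbps = 1000 bps, which matches the BIS example of 768000 for a 1024 kbps line; there is still no known formula for MaxMsgSend:

```python
def min_bandwidth(upstream_kbps):
    """MinBandwidth at 75% of measured upstream, converted from kbps to bps.

    This mirrors the 75% rule of thumb discussed in the thread; it is
    not an official BIS formula.
    """
    return int(upstream_kbps * 1000 * 0.75)

print(min_bandwidth(1024))   # 768000, matching the BIS recommendation
print(min_bandwidth(2048))   # 1536000, matching the question above
```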


ViperMaul,

I think the formula is OK for MinBandwidth. Assuming your ArmA server has a dedicated link, it's a conservative setting to specify 75% of your upload.

As for MaxMsgSend, it's the big unknown! It's most probably related to the number of used player slots and maybe to the number of user-generated AI (e.g. AI recruited in Evolution). As this will change, we need to find some kind of formula and apply it assuming the maximum number of connected players.

Googling "MaxMsgSend", I've seen values as high as 16384 for LAN servers!

Besides the bandwidth usage, I also think the more messages you send, the more CPU it costs.

As you said, we really need official BIS feedback here.


PS: I'll try to talk with an ArmA server admin tonight and post his settings if I can have them.


I'd also be interested in some detailed information from BIS about what MaxMsgSend actually does, what it affects and what its optimal setting is on various setups.

A formula would also be good.


I have been testing server settings on my LAN. It's funny, I can't seem to stop the damn units from jumping/warping. This bugs me because just about any other FPS performs excellently on a LAN.

To start, I tested the default settings and then the recommended settings for a 1024 Kbit connection, then I doubled, tripled, 10-folded and 100-folded the MinBandwidth.

I ran several test matrices adjusting MaxMsgSend, MaxSizeGuaranteed, MaxSizeNonguaranteed and of course MinBandwidth. I also tried changing MinErrorToSend from the default to both lower and higher values.

I have a gigabit-capable LAN. The host machine is an AMD 4000 with 2 GB RAM. Using #monitor and testing normal-sized missions, I can maintain 35 to 45 FPS.

Just for fun I set up my primary PC, a Core 2 Duo running at 3 GHz with 2 GB RAM, and tested for jumping/warping units... same result.

And yeah, it was always dedicated, running patch 1.05, and my LAN IS communicating at 1 Gbit.

I never did get Opflash to run any smoother (non-jumping units), so why should ArmA be any different?

Anyone have good settings for a dedicated server on a LAN?


Oh, I should read more carefully... I was changing the settings in the server.cfg file in the root directory. Guess all those years with Opflash messed me up.

I should have modified the settings in the ArmA.cfg file. Found the link in another forum post, but anyway, here is the link:

http://community.bistudio.com/wiki/Armed_Assault:Dedicated_Server

Damn, guess I will need to retest my settings.

Will report back with some settings... wish me luck

Oh, I should read more carefully... I was changing the settings in the server.cfg file in the root directory. Guess all those years with Opflash messed me up.

That might be why you never saw any differences in your attempts to tweak OFP either - the MaxMsgSend and similar parameters go into flashpoint.cfg in OFP and into ArmA.cfg for Armed Assault.


Sorry to resurrect this thread - but we just got a dedicated server on 10Mbit, and I'm looking for server configs - did Zito or Pong ever get anywhere with their experiments?


For a 10 Mbit line I suggest you try this. But first test your 1.15 beta server's performance WITHOUT any of these in your basic config: delete all these lines, give it a try, and then change them one by one.

What's good for one server may not be good for another because of connection quality, server strength, type of gameplay etc...

Code Sample:

MinBandwidth = 262144;

//* Bandwidth the server is guaranteed to have (in bps). This value helps server to estimate bandwidth available. Increasing it to too optimistic values can increase lag and CPU load, as too many messages will be sent but discarded. Default: 131072

MaxBandwidth = 100000000;

//Bandwidth the server is guaranteed to never have. This value helps the server to estimate bandwidth available.

MaxMsgSend = 256;

//* Maximum number of messages that can be sent in one simulation cycle. Increasing this value can decrease lag on high upload bandwidth servers. Default: 128

MaxSizeGuaranteed = 512;

//Maximum size of guaranteed packet in bytes (without headers). Small messages are packed to larger frames. Guaranteed messages are used for non-repetitive events like shooting. Default: 512

MaxSizeNonguaranteed = 128;

//Maximum size of non-guaranteed packet in bytes (without headers). Non-guaranteed messages are used for repetitive updates like soldier or vehicle position. Increasing this value may improve bandwidth requirement, but it may increase lag. Default: 256

MinErrorToSend = 0.005;

//Minimal error to send updates across network. Using a smaller value can make units observed by binoculars or sniper rifle to move smoother. Default: 0.01


Wolfy... the info from BIS gets a little confusing, and the information is geared to servers that are staying in spec (50 players max).

When you hit 70 players your bandwidth usage increases sharply, not quite exponentially, but close to that.

With 70 players it is not unusual to see 14 Mb/s out and 10 Mb/s in. While our settings are not the best for every server, and not set in stone for us (we tweak them almost weekly), here is what we use as the baseline.
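One plausible reason for that sharp growth, sketched as a toy model: if the server relays each player's state updates to every other player, relayed traffic scales with n*(n-1). The constants and the model itself are illustrative assumptions, not measured ArmA figures:

```python
def relayed_messages(players):
    """Toy model: each player's updates are relayed to every other player,
    so per-cycle relayed messages grow with n*(n-1)."""
    return players * (players - 1)

for n in (50, 70):
    print(n, relayed_messages(n))

# Going from 50 to 70 players (a 1.4x head count) roughly doubles
# the relay load under this model:
print(relayed_messages(70) / relayed_messages(50))  # ~1.97
```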

We are on a 100Mb/s circuit based out of UK.

Code Sample:

//Data TX/RX configuration - On a 100Mb/s pipe

MaxMsgSend=512;        //can be increased on a high performance server (512)

MinErrorToSend=0.008;      //if you're seeing jumping from a long range (11 for linux box)

MaxSizeGuaranteed=512;       //should be as low as the server can handle (512)

MaxSizeNonguaranteed=384;   //higher bandwidth usage but better fps (384)

// Bandwidth Settings

MinBandwidth=65536000;       // set to 65mb

MaxBandwidth=104800000;    // set to 100mb/s

// no custom faces

MaxCustomFileSize=0;      

Some of these get set differently at times; MaxSizeGuaranteed, for example, is a number we play with a lot.

I hope this is of use.


You guys are mixing something up here!

"MaxMsgSend" is not a "size" in bytes or whatever; it's the number of messages in one "simulation cycle" (known in the ArmA engine as "server FPS") that the server will send out during that cycle.

Just read what BI writes about it:

Maximum number of messages that can be sent in one simulation cycle. Increasing this value can decrease lag on high upload bandwidth servers. Default: 128

"512" is the value we currently use, and this has reduced rpt-log entries about "network messages pending" by around 90% compared to the default value of "128".

Brit-XR from Hotshots once told me that he tried a value of 1024 for MaxMsgSend, but it led to greatly reduced server FPS as well as to crashes happening quickly, so we never tried a value higher than 512 for fear of crashes.

"MinErrorToSend" should, in our experience, be at least halved compared to the default value (0.005 -> 0.0025).

This greatly reduces that odd "jumping" or "beaming" of enemies when looking through scopes or binoculars.

But I wouldn't reduce it too much, as according to our tests this has a large impact on server FPS... So it's not a good idea to use, for instance, "MinErrorToSend=0.0001;"

Bandwidth Values should be clear... we at 100MBit are using these values:

minbandwidth = 10000000; //10Mbit

maxbandwidth = 100000000; //100Mbit

These are the most tricky ones:

MaxSizeGuaranteed = 512;

//Maximum size of guaranteed packet in bytes (without headers). Small messages are packed to larger frames. Guaranteed messages are used for non-repetitive events like shooting. Default: 512

MaxSizeNonguaranteed = 128;

//Maximum size of non-guaranteed packet in bytes (without headers). Non-guaranteed messages are used for repetitive updates like soldier or vehicle position. Increasing this value may improve bandwidth requirement, but it may increase lag. Default: 256

One must calculate the average packet/datagram sizes of Ethernet and the Internet (WAN), as well as the protocols used, what headers they add, how much user-data space they offer etc., to avoid as much packet fragmentation as possible.
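A back-of-the-envelope version of that calculation: standard Ethernet has a 1500-byte MTU, an IPv4 header takes 20 bytes and a UDP header 8, leaving 1472 bytes of payload before a datagram fragments. How ArmA frames its own messages inside that payload is not documented, so this only puts an upper bound on useful packet sizes:

```python
# Fixed header sizes for IPv4 over standard Ethernet (no IP options).
ETHERNET_MTU = 1500   # bytes of IP packet per Ethernet frame
IPV4_HEADER = 20      # minimal IPv4 header
UDP_HEADER = 8        # UDP header

max_payload = ETHERNET_MTU - IPV4_HEADER - UDP_HEADER
print(max_payload)    # 1472 bytes of UDP payload per unfragmented datagram

# A MaxSizeGuaranteed of 1024 therefore still fits in a single
# unfragmented datagram, with headroom for the engine's own framing.
print(1024 <= max_payload)  # True
```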

I don't have a perfect value here, but for

"MaxSizeGuaranteed" we use "1024".

Common sense suggests you should not reduce it below the default 512, after reading the BIS explanation that "small messages are packed to larger frames" anyway.

"MaxSizeNonguaranteed" could be halved from its default value ("256" -> "128") as well. We found that, especially when someone with a ping higher than 100 drives a car and you are a passenger in it, this reduced that odd "sync problem" where for brief moments you see the car driving through houses and such.

I was nagging Suma some time ago about the "optimal" values for the dedicated server and he answered me something like this (not word-for-word, but very similar):

"There are no optimal values; that's why we provide the dedicated server as it is."

