Nick (SS)

JIP vs Server Output


Now, we all know these servers have terrible lag while JIP is in use. What I don't understand is why our 100 Mbit line servers simply do not utilize the lines. I run a lot of these servers, have a good grasp of the bandwidth settings, and can even explain how they work,

but the problem is the ArmA/OFP server itself: it just does not use the line speed it has been blessed with, and nothing any of us does stops the lag from dropped players or JIP.

I would like to know if BIS has other options for what the server can use in the future; there has to be some way of letting the server actually use the lines we put it on.

For instance, say I have a 64-slot server with 30 clients on it, outputting 1.5 Mbit, and 2 players connect with custom files (sound/XML). The server halts what it is doing and stays at 1.5 Mbit until those players join. Why can't the server make better use of its 100 Mbit line and update all the clients by pushing to, say, 5 Mbit for the few seconds that is all it really needs?

There's really no reason for this nowadays, and it's really frustrating for everyone.
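For reference: the line speed the engine will actually try to use is capped by the bandwidth estimates in the networking config (typically arma.cfg, passed with -cfg). A minimal sketch follows, with purely illustrative values rather than recommendations:

// arma.cfg (networking config) - illustrative values only, not tuning advice
MinBandwidth = 10000000;     // bits per second the engine may assume is always available (~10 Mbit)
MaxBandwidth = 100000000;    // upper estimate of the line, here the full 100 Mbit

Raising these only changes what the engine thinks it has available; as the rest of this thread shows, it does not by itself cure the JIP halt.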


I agree, but it might be harder to do than it sounds. Who knows what's involved.


It shouldn't be that hard to optimize the net-code to prevent this from happening.

Take MMOGs, for example; imagine the mess "JIP lag" would cause in that kind of game.

Quote: It shouldn't be that hard to optimize the net-code to prevent this from happening. Take MMOGs, for example; imagine the mess "JIP lag" would cause in that kind of game.

Yes, because MMOs have to send an incremental update for every object that has changed state between mission.sqm and JIP time (fallen trees, destroyed houses, broken wall sections, on a map something like 100 square km in size), synchronize running scripts and variables, as well as triggers and all the other nice stuff the ArmA editor gives players. Because we all know how precise MMO physics is, how much you can do to the environment, and how open those games are to scripting and the like.

Or perhaps an MMO only has to update player and bot positions, in fixed scenarios where all the bot positions are completely predictable and always the same?

Just try to imagine the effect of an M1 going through the edge of a forest, into a town, across several gardens, shooting at 1 or 2 vehicles (whose explosions will themselves blow a few things around them), then count the number of objects (trees, bushes, houses, walls, vehicles) that are affected and for which connecting clients need a state update.

That is just 30 seconds of action by a single vehicle over a tiny portion of the total possible map.

And that is just updating object states.

Add to that the fact that the game keeps running during the update process, generating new updates to be sent. In fact, I believe the additional "receiving mission" phase after connection is the post-connection sync with everything that happened during the connection itself; I sometimes even see this phase twice, meaning that during the post-sync, too many other things happened to update them in-game, requiring another sync phase.

Take Warfare: units are created all the time, and the units present mid-game have nothing in common with what was originally described in mission.sqm. That means models to load, variables to attach, etc. on the client.

After 1 hour of Warfare, count the number of updates that need to be sent for all actions that happened over South Sahrani.

Put things in perspective.

"shouldn't be that hard to optimize"? Be my guest, I'm eagerly awaiting your technical proposals for a simple JiP process, given the scale and openness of ArmA smile_o.gif


I do understand there is a large amount of data a client has to receive (and the server has to send) for every new client.

But let me use an example.

A server on a 100 Mbit line, 30 players playing, 2 new players join, and both have good connections. If you check server throughput, the bandwidth used is small compared with the server's limit. Then suddenly everything stops: players and AI run in place on the same spot.

If the problem is not bandwidth, then it has to be a CPU bottleneck, but the server specs are really good.

What kind of CPU is needed to run a 32-player Warfare map if an Intel QX9650 is not enough?



As we do not know how the ArmA netcode/engine works in this regard, we can only guess.

From my understanding, when a new player joins, all data is immediately transferred from the server to the new guy. No idea whether the other clients are involved in this process, but they could be.

The more players you have, the more data has to be sent (global variables, setVariable data, position and state of units, etc.) and the more the server has to process.

To me it seems the server halts the game until all data is transferred. That might make sense, to sync the new client with the rest.

So it's not only about the amount of data, but also about the delays (confirmation of received data), packet receive time and server processing power.

Again, only my guesswork.

PS: I was told that Warfare isn't optimized well in terms of JIP and network traffic. Not sure if that's true.

What is true, however, is that the mission has a HUGE influence on the JIP lag. The more data, the more lag.

Quote: To me it seems the server halts the game until all data is transferred. That might make sense, to sync the new client with the rest.

!!!

I sincerely hope not!

Seeing how long the sync process can take on a client, that would sometimes mean stopping the mission for minutes. Simply not possible, imho.

Plus, you don't know how fast a client can respond and handle the sync process itself; it can take a long time simply because of the client, which is far too unpredictable to justify forcing the server to halt mission processing in the meantime.

I think the server continues with the mission at the same time as it sends this information (taking more CPU to collect the sync data and send it, which is why you can see desync appearing on clients already connected, as the server takes more time to process the data sent to them).

Once the new client is synced, the server checks what happened in the mission during the sync process. If the amount of "actions" (i.e. data to send to be perfectly synced) is too high, a second sync pass is done (the filling bar just before entering the game). Otherwise, it is done "in game", as the amount of data to send is manageable while playing (the new client will just have a bit more data sent to him for a little while at the beginning).

A very wild guess there, but... well, it makes sense, to me at least.


The way I see it, it has to happen in at least two stages, the same way starting a regular MP mission works:

The first stage is everything that goes on while players are in the briefing screen, including organizing groups and vehicle roles etc., and executing all the script commands encountered up to the point where a script is paused with Sleep etc.

The second stage starts once the players move into the mission from the briefing screen, finishing off any scripts that were paused during the first stage etc.

JIP looks very similar:

The first stage is probably as above, with the addition of processing all the SetVehicleInit commands executed on the server since mission start. Scripts launched with SetVehicleInit are also subject to the Sleep command. It looks like the processing during this stage is represented on screen as a progress bar.

The second stage, again, includes all the SetVehicleInit commands still to be finished.

All of this can take an undetermined amount of time, so at some point, as Whisper says, the client has to be re-synched, as no doubt something has changed since startup.

Quote: I was told that Warfare isn't optimized well in terms of JIP and network traffic. Not sure if that's true.

How is this measured? Are there any concrete guidelines for optimising JIP? I would certainly be interested in seeing them, as long as the info was gained through experimentation rather than personal preference.

I guess some things are just common sense; this must be bad:

Code Sample:

For "_i" From 0 To 1000 Do
{
	_Object SetVehicleInit "[""This is a very long string designed to waste space""] ExecVM ""AReallyLongNameUsedToDescribeAScript.sqf""";
};
ProcessInitCommands;

Compared with:

Init.sqf:

Code Sample:

S1 = "This is a long string designed to waste space";
S2 = Compile PreProcessFile "AReallyLongNameUsedToDescribeAScript.sqf";

Code Sample:

For "_i" From 0 To 1000 Do
{
	_Object SetVehicleInit "[S1] Spawn S2";
};
ProcessInitCommands;

OK, it might not save a massive amount of time, but the difference in size must have some effect when executed a thousand times?


I do not mean that the server waits until the client confirms it is in sync. That would be crazy. But the server does send all required data to the new client, and maybe even needs to gather that data first.

Anyway, in the end, a large amount of data to be transferred, whether from a high player count or from lots of script data or both, can make the server sort of halt, and all clients get the yellow/red chain etc. If the server is not at very low FPS (below 3 FPS), it should start working again after a few seconds.

Sometimes the server is unable to resync: endless yellow chain, and it eventually crashes.


My Experience:

Bandwidth:

- The connection between your network card and the network switch might be 100 Mbit/s or even 1 Gbit/s. The problem, however, is that this doesn't mean you have that bandwidth guaranteed available end to end.

- You can set the bandwidth limits etc. in arma.cfg (check the Biki). The default limit IIRC is 20 Mbit, so by default it will never use the full line.

Lag at JIP:

- Too much text sent over the network, as UNN says.

- Too much use of setVehicleInit, even where you don't need the command re-executed on JIP player machines. A better solution is a single publicVariable with a publicVariableEventHandler, used to execute non-persistent code (see the sketch below).

- Generally MP-unoptimized code. When you know how the engine works and where the problems are, you can compensate for that during development.
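A rough sketch of that last setVehicleInit point, assuming only stock commands (the variable name TAG_remoteCode, the script string and someTarget are made up for illustration): broadcast one publicVariable and let an addPublicVariableEventHandler on the clients run the non-persistent code.

// init.sqf (runs on every machine) - hedged sketch, names are hypothetical
if (!isServer) then
{
	"TAG_remoteCode" addPublicVariableEventHandler
	{
		// _this select 1 is the broadcast value: [arguments, code as string]
		private ["_args", "_code"];
		_args = (_this select 1) select 0;
		_code = (_this select 1) select 1;
		_args spawn (compile _code);
	};
};

// somewhere server-side: one broadcast instead of a setVehicleInit per object
TAG_remoteCode = [[someTarget], "hint format [""Doing something to %1"", _this select 0]"];
publicVariable "TAG_remoteCode";

As I understand it, a JIP client only receives the latest value of the variable, whereas every accumulated setVehicleInit statement has to be replayed for them, which is the difference being pointed at here.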


The real key is that we have tried optimizing these servers to use more than they currently do, but they never do when a new client connects or drops. When a client drops, I would call that a CPU issue and even a bug; I know the server needs to figure out whether the client is still there, but why does it need to kill everyone on the server? BIS has to fix this in ArmA 2, because even I have stopped playing ArmA.


Yes, the ArmA server is probably the most unoptimized piece of work I have ever used. It can lag even the fastest PC on this planet to death.

Yes, the JIP lag is a very, very, very big problem (I can't say that often enough), but I also see that abandoned cars, wreckage, dropped guns and the like reduce the server FPS tremendously.

The question is: why?

Why does removing wreckage and abandoned cars during longer sessions increase the server FPS so tremendously (up to 100%)?

This really must be fixed for ArmA 2.

BIS must spend more work on optimizing the ArmA dedicated server and netcode, and this time deliver a valid, easy-to-use user guide that explains in detail what the values do; or better yet, the server should measure the available CPU/RAM and bandwidth and decide on its optimal values by itself.

@Sickboy: the default value is, as far as I know, optimized for 1 Mbps, according to the Biki entries.

Anyway, when I read values like "MaxSizeNonguaranteed=" with a description like "Maximum size of non-guaranteed packet in bytes", I have to put my head in my hands.

What does "non-guaranteed" mean? Packets that are not guaranteed to reach their destination? That's one of the real gaps in having an easy-to-understand user guide for tuning these values.

Also, I have so far seen no official word on how to set these values to an optimal level for a common dual-core with a 100 Mbps internet connection, which is probably the most-used configuration (at least in terms of bandwidth).

I hope parts of these improvements will be delivered in an extra dedicated-server update for ArmA 1, but I won't hold my breath.

Regards, Christian
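On the "non-guaranteed" question above: as the Biki describes it, guaranteed messages carry state changes and are re-sent until the receiver acknowledges them, while non-guaranteed messages carry the frequent position/aim updates and are simply dropped if lost, because the next update supersedes them anyway. A hedged sketch of the related arma.cfg entries follows; the numbers are the commonly cited defaults and should be read as assumptions, not tuning advice:

// arma.cfg packet sizing - values are the commonly cited defaults, not recommendations
MaxMsgSend = 128;              // maximum messages the server sends per simulation cycle
MaxSizeGuaranteed = 512;       // max bytes of a guaranteed packet (re-sent until acknowledged)
MaxSizeNonguaranteed = 256;    // max bytes of a non-guaranteed packet (not re-sent if lost;
                               // the next position/aim update simply replaces it)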

Quote: Why does removing wreckage and abandoned cars during longer sessions increase the server FPS so tremendously (up to 100%)?

Removing wreckage etc. sounds like something done with scripts? How do you know the scripts aren't the cause of the drop in FPS?

Quote: Easy-to-use user guide

I suspect there is no such thing. Considering the complexity of the game even without it, JIP is a brave undertaking compared to any other game. I'm sure BI are still working on improving it.

There was an obscure bug that caused desync with JIP if you had your squad in a vehicle when you disconnected and reconnected. It has been fixed since 1.14, though I can't remember seeing it in the changelog.

Quote: Removing wreckage etc. sounds like something done with scripts? How do you know the scripts aren't the cause of the drop in FPS?

When I do a fresh server start, the first MP mission played seems to keep all dead bodies and wreckage, but on a mission restart, or on the second MP mission, they disappear after a short time.


I had a little time to start doing some math on the servers' output.

These numbers are in no way 100% accurate, but they show the difference that the amount of scripts/AI/players on the server can make.

I do have some reports of 100 players using only 4 Mbit out, but I think that's down to a bandwidth setting.

I'm going to try to get more real-world numbers, then produce a formula that will give us more precise bandwidth settings, and then ask BIS to use that formula in the game, since probably only 2% of us even know how to really use these settings, lol.

[Image: usages.gif]
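As a back-of-the-envelope illustration of what such a formula would start from (numbers taken from the posts above, so treat them as assumptions rather than measurements):

// Rough per-client output, SQF-style arithmetic - illustrative only
_outputBps = 1.5 * 1000 * 1000;       // ~1.5 Mbit/s server output from the opening post
_clients   = 30;                      // clients connected at the time
_perClient = _outputBps / _clients;   // ~50 kbit/s per client
hint format ["Approx. per-client output: %1 kbit/s", _perClient / 1000];

The 100-player / 4 Mbit report works out to roughly 40 kbit/s per client, the same ballpark, which is why it looks more like a bandwidth-setting ceiling than real demand.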


Most of you will fall into the 2nd chart; other CTF servers will match the top chart, while massive EVO/CTI/LIFE maps will be a mixture of the 2nd and 3rd, as not every client will have max AI, so adding 2 and 3 together will give a better average for some of you.

The next part is figuring out the math BIS used for the bandwidth settings.

