Arma 3 on AWS EC2 - how a large scale MilSim shaved 30% off their server costs


The preface

 

A couple of points before we get into it:

  1. I moved this post from r/arma to these forums since this place is much better suited for it, and I feel the story is worth sharing.
  2. This forum post is just as much a discouragement as it is an encouragement. I'm absolutely not claiming that this is the best solution for all units. If you run a small group, or you're just not comfortable with AWS, don't even consider something like this.
  3. The time cost of setting all of this up easily outweighs the potential savings. This started out mostly as an educational exercise to see if it was possible, but it turned out to be the best solution for us, and we've been running it for about 1.5 years now. Your mileage may vary!
  4. Any links in this post are purely to show the underlying resources at work. I'm sorry if a link points to our unit or shows our unit's branding; I'm not recruiting anyone or even remotely trying to advertise. It's just the easiest way for me to share.

 

About me

Hello A3 server admins/unit leaders. I am currently the server administrator/developer for a decently sized MilSim A3 unit. In real life I'm a full stack developer who's gone balls deep into corporate-grade AWS infrastructure, and I kept asking myself: why the heck are we still using these stupidly overpriced dedicated servers in gaming communities when cloud computing has come so far in recent years? With Arma MilSim being what it is, these powerful dedicated boxes just collect dust until it's time for those one or two weekly Zeus missions.

So I did the research, called up some work colleagues, checked and triple-checked all the numbers (AWS billing is fucking scary; press one wrong button and boom, you've got a 1k bill) and went straight in.

 

About our unit (tried to keep it relevant for server infra)

We are a decently sized MilSim unit with ~50 weekly active players and a decently sized modpack. We host 2 mandatory Zeus missions a week with the entire unit, and run courses and classes throughout the week. Our usage totals roughly 200 hours per month.

 

Motivation

I'm not going to go too in-depth into configuration; most resources are available online. This is also not a tutorial (maybe I could make one if there's demand for it); it's more to show that cloud computing is feasible for Arma 3 servers. I was told multiple times by the Arma community, on platforms such as Discord and Reddit, that this was a bad idea. Well. It's not. And here's the proof.

 

The requirements

  • 2 public servers
    • one running our own modpack
    • one running Creator DLC
  • 2 password-protected servers
    • one main server for all of our planned operations
    • one backup server in case the main server was in use
  • 24/7 TeamSpeak!

 

The old

We used to rent a dedicated server from NFOServers. This was some old dedicated box with 4 cores / 8 vCPUs, 64GB of RAM, a 2TB SSD and a shit ton of bandwidth (the CPU would literally die if you ran a speedtest... That either tells us the CPU was dogshit or something else was going on; I'd like to think it was the former).

This single box ran 6 Arma servers (max 2 at the same time) and TeamSpeak. No room for headless clients (HCs) whatsoever. Also, it was Windows. YUCK.

 

Price for this beauty? ~150 USD/mo.

 

The new

Well, this post is about AWS, so... Yeah, we use AWS EC2 instances now.

 

TeamSpeak

TeamSpeak was an easy pick. We used a t2.micro (free tier) for a year, and once that expired, we slightly upgraded to a t3.micro. TS needs very little to run; its 2 vCPUs and 1GB of RAM handle 50 concurrent users in TFAR, so we just bought a 3-year reserved instance for a one-time payment of 100 USD.
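For anyone curious what that purchase looks like outside the console, here's a hedged CLI sketch (the offering ID is a placeholder; given how scary AWS billing is, double-check everything in the console before buying):

```bash
# Find 3-year all-upfront t3.micro reserved-instance offerings
aws ec2 describe-reserved-instances-offerings \
    --instance-type t3.micro \
    --product-description "Linux/UNIX" \
    --offering-type "All Upfront" \
    --min-duration 94608000 --max-duration 94608000   # 3 years, in seconds

# Then purchase one of them (offering ID is a placeholder)
aws ec2 purchase-reserved-instances-offering \
    --reserved-instances-offering-id <offering-id-from-above> \
    --instance-count 1
```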

 

Arma

First up, I had to pick a general instance type. Two stood out to me: z1d and c5. At first I was worried about memory; remember how I said we run a big modpack? Mods add significantly to server memory usage. So I went with the z1d's, setting up a simple server by hand on a z1d.large and later scaling up to a z1d.2xlarge. This ran "fine", but my initial fears were unjustified: we were CPU-bottlenecked, not memory-bottlenecked. z1d's also come with fast on-instance M.2 storage, something Arma doesn't really need.

 

So after testing, I swapped to c5. Initially I set up all 4 servers as c5.xlarge instances with 1 HC each. As we got into more AI-populated areas, our Zeuses noticed that the AI on the HC was struggling a lot, so we upgraded the 2 servers that run our mandatory events to c5.2xlarge and added 2 more HCs, for a total of 3.
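Upgrading an instance type like that doesn't mean rebuilding anything; it's a stop/modify/start cycle. A sketch with a placeholder instance ID:

```bash
# The instance must be stopped before its type can be changed
aws ec2 stop-instances --instance-ids i-0123456789abcdef0
aws ec2 wait instance-stopped --instance-ids i-0123456789abcdef0

# Swap c5.xlarge -> c5.2xlarge in place
aws ec2 modify-instance-attribute \
    --instance-id i-0123456789abcdef0 \
    --instance-type Value=c5.2xlarge

aws ec2 start-instances --instance-ids i-0123456789abcdef0
```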

 

Conclusion

So we essentially went from 1 dedicated machine with 8 available vCPUs and 64GB of memory to 5 virtual machines with a total of 26 vCPUs (8 × 2 + 4 × 2 + 2) and 49GB of memory (16 × 2 + 8 × 2 + 1). Not a bad upgrade... But now the real question: what does all of this cost?

Well, AWS is dynamic: you pay for what you use. I averaged out the bills of the months on our current setup, and it came to a grand total of... ~110 USD/mo.

 

Yeah, you read that right, we got a massive infrastructure improvement and save 40 bucks per month at the same time!

 

The hurdles

 

So now that the general scope is defined, I'd like to show a couple of parts of this setup that solve some of the issues we ran into. I've made most of it publicly available so it can be useful to other A3 sysadmins running into the same issues.

 

Hurdle 1: Players just want a server without contacting a server admin to start it for them

The first hurdle I had to cross was a change in how we use our servers. Previously they were just always there, ready to be used; now they had to be started on demand. Initially this caused quite some problems, since people just wanted to hop on Arma and play the game.

 

Our website runs Invision Community, so I made an Invision Community application to manage servers straight from our website. Just put in the configuration, set the permissions for who can start/stop/restart, and *bam*, players can now request a server on the fly!

 

Now, players are forgetful and don't turn off the server when they leave. So I added a feature to the forum application that checks the player count against the Steam servers every 5 minutes and, if the server is empty, shuts it down after a configurable amount of time.
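The real version lives inside the Invision Community app, but the core idea fits in a few lines of shell. A hedged sketch of the 5-minute idle check; the host, instance ID and threshold are placeholders, and using the gamedig CLI (npm i -g gamedig) plus jq for the Steam query is just one way to do it, not what our app actually uses:

```bash
#!/usr/bin/env bash
# idle-check.sh - cron this every 5 minutes:
#   */5 * * * * /opt/scripts/idle-check.sh
HOST="arma.example.com:2303"      # Arma 3 answers queries on game port + 1
INSTANCE_ID="i-0123456789abcdef0"
STATE_FILE="/tmp/arma-empty-checks"
MAX_EMPTY_CHECKS=6                # 6 checks x 5 min = 30 min grace period

PLAYERS=$(gamedig --type arma3 "$HOST" | jq '.players | length' || echo 0)

if [ "$PLAYERS" -gt 0 ]; then
    echo 0 > "$STATE_FILE"        # someone is on; reset the counter
    exit 0
fi

EMPTY=$(( $(cat "$STATE_FILE" 2>/dev/null || echo 0) + 1 ))
echo "$EMPTY" > "$STATE_FILE"

if [ "$EMPTY" -ge "$MAX_EMPTY_CHECKS" ]; then
    aws ec2 stop-instances --instance-ids "$INSTANCE_ID"
    echo 0 > "$STATE_FILE"
fi
```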

 

This application is not public, mostly because of liability issues. Although it hasn't failed us yet and no significant bugs have led to increased AWS costs, I just don't feel comfortable having it out there in public. If you are interested, shoot me a PM and I'm sure we can work something out 😉

 

Hurdle 2: Automatic Steam Workshop mod updates

Don't you hate it when your players contact you with "we can't join the server because of mismatches pls help"? Yeah, I hated it too. Since I knew I was pretty much out in the ocean with this completely exotic infrastructure, I decided not to look around for existing scripts or tools and just made my own. It's open source and you can find it here! Just let it run in the cronjob scheduled at startup that also launches the Arma processes and you're in the money. It can be a bit funky with the bigger mods because of SteamCMD limitations, but hey, I can't completely automate my own work. A good programmer ensures job security :))).
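For those who'd rather roll their own: the underlying SteamCMD calls that an updater like this wraps look roughly like the sketch below. Paths and credentials are placeholders, the two mod IDs are just examples (CBA_A3 and ACE), 107410 is Arma 3's Steam app ID, and the account has to own Arma 3 to fetch its workshop content:

```bash
# Download/update workshop items into the server's install dir
steamcmd +force_install_dir /opt/arma3 \
         +login "$STEAM_USER" "$STEAM_PASS" \
         +workshop_download_item 107410 450814997 \
         +workshop_download_item 107410 463939057 \
         +quit

# Downloads land in steamapps/workshop/content/107410/<mod_id>;
# symlink them to the usual @mod folders the server expects.
ln -sfn /opt/arma3/steamapps/workshop/content/107410/450814997 \
        /opt/arma3/mods/@CBA_A3
```

The SteamCMD timeout on very large mods is the "funky" behaviour mentioned above; rerunning the same command resumes the download.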

 

The reason this is a Node.js script is that I initially built it to run under the Windows Task Scheduler, but I already had plans to move to Linux regardless of this AWS story. So I went for a cross-platform solution; that way I wouldn't have to redo the entire thing when we eventually made the switch to Linux.

 

Hurdle 3: Mission files

Our mission development team used to have FTP access to upload their missions; however, since the Arma servers are now turned off most of the time, they can't just connect and upload files whenever they like. My solution? Teach the mission development team Git. Yeah, I'm a masochist, if you hadn't noticed by now. Now the server fetches the latest version of the repository on launch and *bam*: every server boot, all their missions are there for them to enjoy.
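In practice that's just a couple of git commands near the top of the launch script. A sketch with a placeholder repo URL and paths (the hard reset is deliberate: the repo is the single source of truth, so anything edited directly on the server gets thrown away):

```bash
MISSIONS=/opt/arma3/mpmissions
REPO=git@github.com:example-unit/missions.git

if [ -d "$MISSIONS/.git" ]; then
    git -C "$MISSIONS" fetch origin
    git -C "$MISSIONS" reset --hard origin/main
else
    git clone "$REPO" "$MISSIONS"
fi
```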

 

Hurdle 4: Automatic launch script & config changes?

Okay, I admit, I went way too far with all of this....

 

I was sick of having to copy every little change I made to the launch script (scheduled with a cronjob on startup) to all 4 servers. So I thought to myself: "We have a website, websites have file storage, and websites are accessible from everywhere." In hindsight I should've done the mission files like this too, but fetching a whole folder is a bit more of a hassle than fetching single files. So I modified the cronjob to set some environment variables, download the latest version of the launch script, and run it. If you're interested in seeing our Linux launch script, you can find it here (note it's a .sh file, so just right-click and open it with VS Code or whatever editor you like).
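Boiled down, the whole startup chain on each instance is tiny. A sketch with placeholder file names and env vars (the real launch script is the one linked above):

```bash
#!/usr/bin/env bash
# bootstrap.sh - the only thing that lives on each instance,
# run on boot from cron:
#   @reboot /opt/scripts/bootstrap.sh

export ARMA_SERVER_NAME="main-op-server"   # per-server identity
export ARMA_PORT=2302

# Always fetch the latest launch script from the website, then hand over
curl -fsSL https://3rdinf.us/gameservers/launch.sh -o /tmp/launch.sh
chmod +x /tmp/launch.sh
exec /tmp/launch.sh
```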

 

Then I figured: okay, I've got this script downloading now, why not download all of our configs like that too? So I also moved our cba_settings.sqf, basic.cfg, mod-config.json, server.cfg and optional mod keys onto our website for the script to download. If you want to see any of them, you can do so by going to https://3rdinf.us/gameservers/<previously mentioned file here>. Since they're publicly available, they're free of any passwords or other crucial info; those are filled in on the server using environment variables.
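Schematically, that step looks something like the sketch below. The file list is the real one; using envsubst (from GNU gettext) and ${VAR}-style placeholders for the fill-in is an assumption on my part, just one way to do it:

```bash
BASE=https://3rdinf.us/gameservers

# Download each config from the website
for f in cba_settings.sqf basic.cfg mod-config.json server.cfg; do
    curl -fsSL "$BASE/$f" -o "/opt/arma3/$f"
done

# The public server.cfg ships with placeholders like ${SERVER_PASSWORD};
# substitute them from the instance's environment before launch.
envsubst < /opt/arma3/server.cfg > /tmp/server.cfg \
    && mv /tmp/server.cfg /opt/arma3/server.cfg
```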

 

The future

So... what's left to do? One thing I'm experimenting with at the moment is moving the mods from each individual server onto a central EBS volume that can attach to multiple instances. EBS is quite expensive at about 10 cents per GB-month, and with 80GB allocated per server, we could probably save ~20 USD/mo with just that.
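The moving parts, sketched with placeholder IDs (note that EBS Multi-Attach only works on io1/io2 Provisioned IOPS volumes, and the volume and every instance must live in the same Availability Zone):

```bash
# Create one shared 80GB mod volume with Multi-Attach enabled
aws ec2 create-volume \
    --volume-type io2 --size 80 --iops 1000 \
    --multi-attach-enabled \
    --availability-zone us-east-1a

# Attach the same volume to each Arma instance
aws ec2 attach-volume \
    --volume-id vol-0123456789abcdef0 \
    --instance-id i-0123456789abcdef0 \
    --device /dev/sdf
```

One caveat I'm aware of: Multi-Attach hands every instance the raw block device but does nothing to coordinate filesystem access, which is exactly the open question below.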

 

The only thing I'm struggling with at the moment is that we'd need an additional server that launches on a schedule, at a time when nobody is online, to update the mod EBS volume. I don't know what the consequences are of multiple Arma instances accessing the same mod drive, and I also don't know what happens if players are using the servers while the mod update server modifies the volume.

 

The conclusion

This was quite the experience... We learnt a lot from this transition: from setting up a server on Linux, to cloud computing, to Invision Community applications and, most importantly, AWS pricing.

 

This infrastructure gives us so much flexibility and scalability. Before, our server hardware was locked in for 1- or 3-year periods because we could save a couple of bucks that way, so if the unit were to suddenly grow or shrink significantly, it would have been a big financial hit. Now we can just downscale if the unit downscales, or massively upscale for a single event if we want to host a joint op with another unit, and then immediately scale back down to our regular size with the click of a few buttons.

 

The pros of "pay for what you use" and the extreme scalability allow us to save roughly 500 USD per year, which is quite a considerable amount for a gaming community. All units have 2 big pillars holding them up: finances and leadership. And we server admins can have a big impact on one of those.

I hope this was useful to anyone looking to cut costs for their unit, or anyone just generally interested in AWS. I'm open to answering any questions you've got and to feedback/constructive criticism.
