Alabatross

When will the nVidia Arma 3 optimized drivers come?


(generally better than Arma 2) so I am probably not suffering the way some other users may be, hard to tell.

That's comforting.

Pretty sure your engine is using only a couple of percent of its potential at the moment.

Well, I guess I should just repeat the same blog post, so people bother to read it before they get involved in the threading discussion: http://www.bistudio.com/english/company/developers-blog/91-real-virtuality-going-multicore

In short: path finding has been multithreaded since Arma 2 (in that blog you can hover your mouse over the image to get a user-friendly description of what each thread is doing in the situation profiled there).

Well, I guess something is locking the worker thread, because as soon as AI are added my FPS goes from 60+ down to 20-30 in a locally hosted MP game.

---------- Post added at 20:47 ---------- Previous post was at 20:25 ----------

when next gen of consoles come out and set the standard to 60fps even for casual players.

This made me think of the Oculus VR headset: we are going to need a high frame rate for that to work. It will never be on the cards for Arma if we can't get a stable 60+ FPS at peak times.

To be honest, CPU optimization, i.e. a multithreaded implementation, was the first thing they should have worked on for the game. I'm seeing almost no improvement over Arma 2: OA.

Do you realize how complex multithreading is? Very few games actually utilize multithreading well, because the overhead is NEVER developer-friendly. This is why Maruk is getting frustrated: multithreading is being blamed for no reason. The game just needs to be optimized. Rewriting the entire core engine is not worth the small performance boost it would give.

This post, though:

Well, I guess something is locking the worker thread because as soon as AI are added my fps goes from 60+fps to 20-30fps in a local hosted MP game

is interesting, because Maruk just said path finding is multithreaded, and it's quite obvious the AI is causing a lot of our issues.

Edited by Alabatross


Thank you for your responses Maruk, this is really nice to see. :) I definitely understand your points about the limitations in what you guys can do concerning major performance optimization and CPU / GPU utilization.

But is the RV engine really handling everything (loading, processing, rendering, etc) the way that it should, and as well as it should, for a modern (2013) game on modern PCs? When you guys created the first iteration of this engine in the late 1990s, PC technology was quite different, yet so much of how the engine functions is anchored in those origins. Would you agree that if you were to have developed a similar engine in today's tech climate, it would draw upon CPU/GPU/RAM resources differently and in a more modern fashion, leading to better utilization and multithreaded performance as in many other recent game engines? Aren't some aspects of this seasoned engine's resource use a bit outdated?

Current performance is not where it should be, especially in terms of pushing the scale of missions to the level and beyond of the past games.

Now, as someone who has been playing OFP/Arma since 2001 and has great love for the RV engine's versatility, I certainly wouldn't propose you abandon or totally re-create the engine. Yet at the same time, as the years pass, it increasingly feels like the engine is being held back by its archaic limitations. So despite the great difficulty, if it's technically possible, I think it needs an overhaul, and some parts may need to be rewritten. This would allow the series to reach new heights. Otherwise it will be very difficult for all of us to deal with the same traditional issues in Arma 4 in 2018, especially since IIRC there isn't projected to be any major increase in CPU power. Regarding the original subject, I think there was a dev post before about Nvidia reps wanting to work with you guys to see where they could increase performance, and I hope you take them up on their offer and let them take a look.

That said, I still love Arma 3, and Altis is a masterpiece! ;)

(quoting Maruk's post above about the "Real Virtuality going multicore" blog and multithreaded path finding)

What about the AI behaviors? The FSMs and scripts that control what the AI actually does and how it responds to situations: is that all multithreaded? And sound and rendering: are those separate threads, or tied to the main processing thread?

Do you think that if the way these "systems" work were rewritten from the ground up, we would see better performance versus what we see right now? Is it a time-versus-money thing? A difficulty-versus-time thing? I'm very curious, because I fail to see any reason the game should run the way it does. Aside from having very large terrains, there's nothing this engine does spectacularly that most other engines cannot do.

Edited by Windies


While you're here Maruk, I have a few questions I'd like to ask:

  1. How is AI processed currently? It seems that with significant AI numbers the main process becomes starved and waits for the AI, which in turn affects everything else in the main process (rendering). Has the idea of creating an entirely separate AI process been considered? Is anything preventing this (inter-core bandwidth, etc.)?

  2. Stencil shadows: any progress regarding their removal?

  3. Has there been any investigation into threading the server exe? Specifically, AI separation.

  4. The DayZ client-server architecture: what's your opinion of that approach, and is there any possibility of sharing it with Arma 3?

  5. Object rendering is extremely taxing on the CPU; could you discuss that at all? What level of multithreading is occurring regarding objects?

  6. Is the AI dev-blog still planned?

Edited by Furret


Preface: Not all details are probably correct, yet overall most things should be OK. :rolleyes:

Let me attempt to explain how it works on a basic level:

(even though it's pointless for the most part :p)

FPS = Frames Per Second.

How many images are shown per second.

The more shown, the more fluid the image.

Generally 60 FPS is needed for a really good experience, 120 for an excellent one.

On the flip side it also determines how long a frame is.

In other words, more FPS means less time per frame.

60 FPS = one frame lasts 1/60 s

30 FPS = one frame lasts 2/60 s - twice as long, so the engine can do twice as much simulation per frame.
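To put numbers on that frame budget (a back-of-the-envelope sketch, not engine code):

```python
def frame_budget_ms(fps):
    """Milliseconds of CPU/GPU work that fit into a single frame at a given FPS."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.2f} ms per frame")
# 30 FPS -> 33.33 ms, 60 FPS -> 16.67 ms, 120 FPS -> 8.33 ms
```

So halving the frame rate doubles the simulation time available per frame, which is exactly the trade-off described above.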

Why is that important?

The game, or more precisely the CPU, does not only have to do graphics calculations,

but also handle sound, physics, player input, scripting, AI, and network data.

Now there are two options: at times you can let something slip to the next frame (computed later),

yet most things need to be done within a given frame (especially anything related to sync between the different components).

Here is the main problem: If these take too long, the engine needs to prolong a frame.

Which in turn means you will see lower FPS.

If the CPU (more precisely, the main syncing thread) needs to do certain work first, or in a specific order,

all other "devices" can't progress and sit idle - hence you can see low GPU or CPU core use.
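A toy sketch (my own illustration, not engine code) of why utilization can look low: if the frame cannot end until one slow job finishes, the main thread blocks on it and every other core idles in the meantime.

```python
import threading
import time

def ai_update(done):
    # Stand-in for a job the frame cannot complete without (e.g. an AI/sim step).
    time.sleep(0.05)            # 50 ms of "work"
    done["ai"] = True

def run_frame():
    done = {}
    worker = threading.Thread(target=ai_update, args=(done,))
    worker.start()
    # The main thread could do other work here, but once it runs out,
    # it simply blocks on join() - and the other cores sit idle.
    worker.join()               # the frame can't be presented until this returns
    return done

start = time.monotonic()
run_frame()
frame_ms = (time.monotonic() - start) * 1000
print(f"frame took ~{frame_ms:.0f} ms -> ~{1000 / frame_ms:.0f} FPS")
```

With a 50 ms blocking job, this loop can never do better than about 20 FPS, no matter how many cores are free.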

Now, what are the actual sources of CPU-limited FPS in the Arma engine (warning: lots of speculation):

1) AI to some degree

While there is optimization of various sorts, each AI unit needs to do work in "each" frame.

So when idling, an AI unit does not have to do many calculations,

yet when moving in combat it has to evaluate a lot of things, for example.

Of course the more AI there are, the more the CPU has to compute overall.

2) Number of entities (players, AI, mission created objects (vehicles, weapons, wrecks, misc stuff, etc))

The more entities there are, the more simulation (of different sorts) the engine/CPU has to do.

If we get above (estimates) 100 players, or 500-1000 AI, or LOTS of objects spawned across the whole terrain,

the numbers will matter in terms of FPS.

3) (Network) synchronization

The game needs to keep the game state of all clients and the server in sync - otherwise everything would break apart (game stops).

Now if the server is too weak for the given mission/number of entities, or there is "too much traffic",

the engine/CPU has to spend a lot of the frame keeping everything in sync and working.

(2 and 3 seem to be the main problem of Wasteland among other problems in that mission)

Now there can of course be other sources of low FPS, like the GPU as the limiting factor, the HDD not supplying data fast enough,

or too much data to cache in RAM (a problem of Altis for 32-bit OS users - data is moved from RAM to HDD and back all the time).

The most feasible "solution"/approach in my view:

Give modders the tools to identify and avoid/solve performance bottlenecks

Edited by .kju [PvPscene]


Forget about CPU/GPU usage. Most bottlenecks in Arma come from slow HDDs, slow RAM, and low QPI/FSB clocks (L2 cache comes into play too).

I had to reduce the view distance to 3 km because my SSD just isn't fast enough, and yet I had blamed my RAM for the FPS bottleneck...

You should have seen my face when I saw the amount of data that was punishing my SSD... Since the SSD can't process that monstrous amount of data, like kju said, frames get prolonged, reducing FPS.


The most feasible "solution"/approach in my view:

Visitor 4 with small map terrain? ;)

(quoting kju's full explanation above)

That makes a lot of sense, kju.

So basically what we are seeing is that frames are held up by other processing work, whether sound, AI, physics, etc. I can see this happening with the AI, because that is probably the #1 cause of bad performance. My question is: why does the AI processing have to be tied to the rendering thread? Can it not run independently of the rendering/main thread and still function?
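For what it's worth, one common way engines decouple simulation from rendering (purely an illustration of the general technique, not how Real Virtuality actually works) is to have the AI thread publish snapshots of world state that the render thread reads, so rendering never waits on a full AI update:

```python
import copy
import threading

class Snapshot:
    """Holds the last consistent world state published by the simulation thread."""
    def __init__(self, state):
        self._lock = threading.Lock()
        self._published = copy.deepcopy(state)

    def publish(self, state):
        # Called by the AI/sim thread after each completed update.
        with self._lock:
            self._published = copy.deepcopy(state)

    def read(self):
        # Called by the render thread; never blocks on an in-progress AI step.
        with self._lock:
            return self._published

world = {"units": 0}
snap = Snapshot(world)

def ai_loop():
    for _ in range(5):
        world["units"] += 1     # stand-in for an expensive AI update
        snap.publish(world)     # hand a consistent copy to the renderer

t = threading.Thread(target=ai_loop)
t.start()
t.join()
print(snap.read())              # renderer sees the last published state
```

The cost is that the renderer may draw a slightly stale world, which is exactly the kind of sync trade-off that makes decoupling hard in practice.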

The problem here is that you are simply using the wrong logic. I am not sure how exactly you measure GPU usage and what it technically means, but as I said before: the bottleneck is most likely the main thread on the CPU in any such scenario. You may try lowering settings that are heavy on the CPU main thread (view distance could be one of the most important here), and you can safely raise some of the graphics settings that don't have that big an impact on your CPU usage (e.g. texture resolution, supersampling, post-process quality, etc.).

In any case, I am running the game on three different nvidia GPU computers and I am getting performance that is quite good for me (generally better than Arma 2) so I am probably not suffering the way some other users may be, hard to tell.

I monitor my GPU usage with my on-screen display (along with FPS, GPU temperature, and video memory usage). In most games, if I'm not using vsync or otherwise limiting the FPS, my GPU usage stays at 99%. I understand there are reasons for what we see in Arma 3, but I'm trying to explain why people are frustrated when they're getting bad performance and see their hardware not being fully utilized... and that using "dummy loops" to increase CPU usage probably wouldn't "please" anyone, nor help them understand why they're getting bad performance.

For the most part, I play my own and others' (simple) single-player missions, and the performance is pretty good. I actually recently started using MSAA again to bolster my GPU usage. The problem with doing that is that when you do hit a low point in performance, you start thinking you might pick up a few frames if you put everything back to low settings. And when that doesn't help, it is frustrating. I hope things get better.


The id Tech engine analyses give a decent idea of how things work at the low level:

http://fabiensanglard.net

You can also find some more about performance profiling in the VR engine and sample outputs here:

http://community.bistudio.com/wiki/Performance_Profiling

@ Windies

Listening to John Carmack's talks, it often seems very hard to decouple the different processing threads.

There are probably no easy solutions; often they would require significant effort and larger engine rewrites, I presume.

Edited by .kju [PvPscene]


If you are running into server-side or CPU limitations, then GPU drivers can't really do much to improve that. In SP, if you don't exaggerate the view and object draw distances, it should run pretty well even with the game maxed out.

For the future, without Tiled Resources, AI moved onto the GPU (at least some civilians in the beginning, while experimenting with the technology, to make the locations look less dead and also bring new gameplay mechanics and mods to the game), a 64-bit application, and other "smart" ways of doing the job, we probably won't see much evolution on the performance side or in NPC populations.

// http://www.tweaktown.com/news/10612/amd_releases_dx10_1_tech_demo_froblins/index.html
The Froblins demo is designed to showcase many of the new techniques for character-centric entertainment made possible by the massively parallel compute available on the ATI Radeon HD 4800 GPU series. In our large-scale environment with thousands of highly detailed, intelligent characters, the Froblins (frog goblins), are concurrently simulated, animated and rendered entirely on the GPU. The individual character logic for each froblin creature is controlled via a complex shader - 3200 shader instructions for each froblin. We are utilizing the latest functionality available with the DirectX® 10.1 API, hardware tessellation, high fidelity rendering with 4X MSAA settings, at HD resolution with gamma-correct rendering, full HDR FP16 pipeline and advanced post-processing effects.

In this interactive environment, thousands of animated, intelligent characters are rendered from a variety of viewpoints ranging from extreme close-ups to far away "bird's eye" views of the entire system (over three thousands characters at the same time). The demo combines state-of-the-art parallel artificial intelligence computation for dynamic pathfinding and local avoidance on the GPU, massive crowd rendering with LOD management with high-end rendering capabilities such as GPU tessellation for high-quality close-ups and stable performance, terrain system, cascaded shadows for large-range environments, and an advanced global illumination system.

Physics wouldn't hurt to improve as well, but for that they need to move away from PhysX or any other proprietary tech that doesn't run on all hardware - PhysX does all the calculations on the CPU, minus the advanced stuff, which isn't in the game anyway. This also needs to be done on the GPU - through OpenCL - in order to see some truly wonderful instances.

Anyway, the bottom line is not to expect miracles from the GPU side if your CPU isn't up to the job - an overclock would help A LOT, at least in SP. We'll see what the future brings, and when people move to newer hardware, so that Bohemia sees a critical mass worth developing for. Happy gaming! :)


Interesting stuff, keep it coming! ;)

/KC


The AI had their own "jobs", like finding stuff to survive, taking shelter, running, and so on. For civilians, when you start to play with it, it shouldn't need more than that, to be honest - and I think that would help a LOT to bring life to the islands. Don't you think it would be quite awesome, when you drive from one side of the island to the other, to encounter vehicles on the road going about their business, people in villages or towns moving about, and so on? To see them gather around trucks or cars to move away from the front line of the conflict; police, firemen, and ambulance vehicles and "crews" doing their jobs? After that, the technology could be used for other games they have in mind, better campaigns, and so on. :)



The problem I have with all the excuses is this: if I create an empty mission with just players, I can host it locally, have players connect, and we can run around Altis with good FPS. As soon as AI are added to the mix, it is guaranteed to drop to 20 FPS for all players. That, to me, clearly shows what the problem is. Why we are going in circles discussing possibilities while ignoring this one simple fact is a mystery to me.

(quoting the post above about FPS dropping to 20 as soon as AI are added)

Nobody is ignoring it, though they may not be very clear about it. I know the largest cause of performance issues is the AI, and the largest share of the slowdown from the AI is their behavior routines and FSMs. Then you run into things like hitting the 32-bit limit of memory addressing within about 10 minutes of gameplay - and actually exceeding it, but they use the file mapping API to map a lot of data, which is how they get around it. That means very large I/O reads and writes, which is what causes the stuttering as data streams into physical memory.

There's more than one reason why we get poor performance, but yes, AI is one of the biggest.
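The file-mapping workaround mentioned above can be sketched with the OS's memory-mapping facility (a minimal illustration using Python's mmap wrapper; the file name and sizes here are made up): a 32-bit process can map a window of a file at an offset, instead of pulling the whole file into its limited address space.

```python
import mmap
import os
import tempfile

# Create a data file standing in for game content on disk.
path = os.path.join(tempfile.mkdtemp(), "data.bin")
with open(path, "wb") as f:
    f.write(b"A" * mmap.ALLOCATIONGRANULARITY + b"payload")

with open(path, "rb") as f:
    # Map only a 7-byte window of the file starting at an offset
    # (offsets must be multiples of the OS allocation granularity),
    # rather than reading the whole file into address space.
    view = mmap.mmap(f.fileno(), length=7,
                     offset=mmap.ALLOCATIONGRANULARITY,
                     access=mmap.ACCESS_READ)
    print(view[:7])   # -> b'payload'
    view.close()
```

By mapping and unmapping windows like this, a process can work through a data set far larger than its address space, at the cost of the I/O traffic described above.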


Well, I am still at the training stage of the game, and what I find bizarre is that the cores of my i5 quad-core

only run at between 22% and 25% of their capacity, and the GPU on my Gv-660OC-2GB doesn't even work up a sweat.

All that with 12 GB of RAM.

The processors run at a surprisingly cool 42°C in-game, and when the computer is idle, doing nothing at all, they are at 37°C.

In World of Tanks it doesn't take them long to get up to 55-60°C with the game fully optimized,

and the processors run really high in WoT, fluctuating constantly - not so in Arma 3.

The GPU runs at a cool 40°C in Arma 3, while in World of Tanks, optimized, it easily goes up to 56-60°C in-game.

The frame rate in WoT runs around 87-95 FPS on average.

Just pointing it out.

Edited by Wtornado


Lol, WoT uses mainly one core and a little bit of a second one. You are comparing a relatively simple game, with small maps and gameplay, to one such as Arma 3.

(quoting the post above about WoT using mainly one core)

I don't know what type of system you have or what program you use to log your i5 quad-core data, but my core 0 is running at 60-62°C and my cores 1, 2, and 3 are running at 58-60°C.

All 4 cores vary and fluctuate equally from 40% to 70% while in game, according to my monitoring system.

If you're running just one core and a bit of your second core on a quad-core, you are far from being optimized in World of Tanks. You'd better check your system settings: you might be using XP and not have activated all 4 cores in your system configuration.

With the new "Nvidia Experience", games such as Battlefield 3, World of Tanks, etc. are automatically optimized... personally I don't like the new system; I prefer my old Nvidia control panel.

Edited by Wtornado


Some people mentioned that the client gets a bad frame rate because the server has a bad frame rate. What makes you think that? If a server had a bad frame rate in Arma 2, the client didn't necessarily get that frame rate as well. What happened was that the AI would move very slowly, just like when you see a player with low FPS: they do the same run/walk animation as everybody else but move at a slower speed. This also made the AI very unresponsive, and it caused other issues with things happening on time in the mission. That's what I noticed from playing on servers with bad FPS in Arma 2, anyway.


Yes, I agree: if the server is slow, it will affect AI and player positioning horribly.

I don't know if they can do this on their servers, but let's say you join a server with a ping of 30 and other players join with 180+ pings. Can the server auto-kick high pings, like in the old games, if you set it with a maximum ping of, say, 80?
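Server-side, the kick logic itself would be trivial (a hypothetical sketch; the MAX_PING_MS setting and all names here are my own invention, not an actual Arma 3 server option):

```python
MAX_PING_MS = 80   # hypothetical configured limit

def should_kick(ping_ms, limit=MAX_PING_MS):
    # A server would periodically measure each client's ping
    # and kick anyone exceeding the configured maximum.
    return ping_ms > limit

players = {"host": 30, "remote": 184}
kicked = [name for name, ping in players.items() if should_kick(ping)]
print(kicked)   # -> ['remote']
```

In practice you'd want to average ping over a window before kicking, so a single lag spike doesn't boot an otherwise fine player.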

