k3lt

Low CPU utilization & Low FPS


Yeah this Mantle talk is quite off-topic...has nothing to do with CPU utilization.


Polymath820's Request:

OK, first off, this is weird: Arma 3 is using little to no GPU. Process Explorer from Microsoft Sysinternals (find it here: http://technet.microsoft.com/en-au/sysinternals/bb896653.aspx) shows that under MAXIMUM load Arma 3 is using 6.57% of the GPU in total... I mean, what the heck. See Figure 1 for the details of the GPU engines: I have 11 GPU engines, 10 of them are sitting idle, and Arma 3 is putting GPU engine 0 under maximum load. Why is it not doing what a GPU should in an ideal world, which on Nvidia's Fermi/Kepler architectures means spreading the work across all the engines? Figure 1 shows the general outline of how Nvidia's GPUs are designed.

Another question: refer to Figure 2.

Figure 1

(GPU Engine 0) | GPU Cores 0 | GPU Cores 1 | GPU Cores 2 | DRAM Register 0 (this engine is under 99% of the load)

(GPU Engine 1) | GPU Cores 0 | GPU Cores 1 | GPU Cores 2 | DRAM Register 1 (the engines from here on are at 0%, doing absolutely no work apart from the Sysinternals explorer, csrss.exe, using 0.01% of the GPU)

(GPU Engine 2) | GPU Cores 0 | GPU Cores 1 | GPU Cores 2 | DRAM Register 2

(GPU Engine 3) | GPU Cores 0 | GPU Cores 1 | GPU Cores 2 | DRAM Register 3

(GPU Engine 4) | GPU Cores 0 | GPU Cores 1 | GPU Cores 2 | DRAM Register 4

(GPU Engine 5) | GPU Cores 0 | GPU Cores 1 | GPU Cores 2 | DRAM Register 5

(GPU Engine 6) | GPU Cores 0 | GPU Cores 1 | GPU Cores 2 | DRAM Register 6

(GPU Engine 7) | GPU Cores 0 | GPU Cores 1 | GPU Cores 2 | DRAM Register 7

etc.

Arma 3 is also committing 1.5 GB of GDDR5 RAM.

Figure 2:

PID(7812) CPU(14.90%) Cycles(1,047,366,185) arma3.exe!??_B?1???7Quaternion@math@simul@@QAE?AV012@XZ@51+0x315a36

Why is a mathematical operations thread taking up most of the CPU time? I thought the GPU was better suited for those kinds of operations, since it handles both single and double precision floating point values? I would have thought raw number crunching like that would all be done on the GPU.

Other useful information:

CPU call stack of the thread above:

ntoskrnl.exe!KiCpuId+0x2c8

ntoskrnl.exe!KeSynchronizeExecution+0xa78

arma3.exe!?GetOrdinalOfNoiseOctaves@CellularCloudGrid@clouds@simul@@SAIXZ+0x22e28

arma3.exe!?SetInt@BaseKeyframe@sky@simul@@UAEXPBDH@Z+0xd67e

ntdll.dll!ZwDelayExecution+0xc

KERNELBASE.dll!Sleep+0xf

arma3.exe!?NeedsRecalculation@CellularCloudNode@clouds@simul@@UBE_NXZ+0x96667

Any explanation would be nice, Bohemia...

-----------------------------------------------------------------------------------------------

Some links Polymath820 has provided:

http://imgur.com/AnyHObK

http://imgur.com/l6u5wyy

------------------------------------------------------------------------------------------------

Edited by Ranwer

Yeah this Mantle talk is quite off-topic...has nothing to do with CPU utilization.

Not really off topic. The reason Mantle was mentioned is that one of the causes of the low CPU utilization is bad multithreading, and one cause of that is draw call overhead: draw calls for the GPU are single-threaded because that's how current APIs deal with them, as serialized batches. It also affects how the game bottlenecks on the first core, which piles up with bad optimizations, unnecessary simulation syncing, etc., stopping all other cores from doing anything while they wait for the first core, and thus causing the low usage once you throw more things into a mission. Not the only reason, but one of them.

But that's a dream anyway; they might only think about implementing such a thing in DayZ SA, since that's the game they consider worth rewriting the engine for. Maybe ArmA 4.

Edited by Th4d


Gahh, can't post links -_-

P.S. Thank you for the help, Ranwer.

Mantle won't help. BF4 supports Mantle and it doesn't help; the game is still as unstable and as buggy, with no real performance increase in sight. Mantle is also very much in its infancy, while DirectX took from 1994 until today to mature; that's 19 years of development from DX6 to DX11.2, and Mantle has had about 1-2 months at most...

Mantle: immature, prepubescent child.

DirectX: sophisticated, wise old man.

Edited by Polymath820
Clarify

Gahh, can't post links -_-

P.S. Thank you for the help, Ranwer.

Mantle won't help. BF4 supports Mantle and it doesn't help; the game is still as unstable and as buggy, with no real performance increase in sight. Mantle is also very much in its infancy, while DirectX took from 1994 until today to mature; that's 19 years of development from DX6 to DX11.2, and Mantle has had about 1-2 months at most...

Mantle: immature, prepubescent child.

DirectX: sophisticated, wise old man.

DX is more like a sick, blind old man holding back on the gas pedal; it wasn't made for the current hardware architecture. BF4 currently does not support Mantle, and not that it needs it; it's pretty easy to get 60 fps in that game, which does have a ton of bugs that affect gameplay.

You should watch the already posted mantle video presentation.


I reserve judgement

From an objective point of view, DX has done things most people could never have dreamed of. When I was about 16-17 I was tracking the work CryEngine was doing with linear lighting calculation and DX11 image tessellation. I also got a taste of how anti-aliasing actually works and how big the mathematical function to process it is. No wonder PCs have trouble with it; it is a mixture of trigonometric functions and base-2 logarithms.

As stated in multiple PC magazines (PCAuthority, PCMag, APCMag, Atomic): the jury is out on whether Mantle will mean anything.

The DirectX API was used to do CryEngine's linear light calculation, and it was not easy, considering they had to calculate ray tracing in a manner efficient enough to be fast and stable enough to run on PCs without a hitch. But then you look at the "hacker central" that is all the Crysis games... Sorry, but I respect Microsoft and DirectX. How can we just insult Microsoft to their face over DirectX, after 19 years of it providing a framework for all PC games since 1994? And it would have taken millions and millions of dollars just to develop DirectX.

All the hype about Mantle is a "marketing" gimmick, just like 100Hz TVs (which were really subfield engines), 200Hz TVs and 120Hz monitors. I maintain objectivity and say that if people start saying Mantle works, then OK, but it needs to be a large enough statistical quantity to warrant being looked at. Other than that, sit on the fence and observe. Waiting. Watching. Studying. Then and only then make a judgement. But even after you make a judgement, things continue to change, and maintaining a solid, absolute judgement would be flawed for its lack of adaptability and failure to move with the updates.

What I think would be truly "revolutionary" would be a "universal graphics development platform" that allowed both AMD and Nvidia to play nice. Sure they can have the proprietary hardware components and software, but have a universal platform in which games can use. Yet that's easier said than done.

Just like I postponed purchasing BF4 while I wait for the class action to play out and the number of people complaining about crashing and malfunctioning to decrease.

Edited by Polymath820
Additional info


All the hype about Mantle is a "marketing" gimmick, just like 100Hz TVs (which were really subfield engines), 200Hz TVs and 120Hz monitors. I maintain objectivity and say that if people start saying Mantle works, then OK, but it needs to be a large enough statistical quantity to warrant being looked at. Other than that, sit on the fence and observe. Waiting. Watching. Studying. Then and only then make a judgement. But even after you make a judgement, things continue to change, and maintaining a solid, absolute judgement would be flawed for its lack of adaptability and failure to move with the updates.

Just like I postponed purchasing BF4 while I wait for the class action to play out and the number of people complaining about crashing and malfunctioning to decrease.

There's no gimmick in 120Hz monitors; I have no idea what you are talking about, unless you are one of those who can't see the difference between framerates higher than 30 and think the human eye is stuck at 24 fps and all that crap. But yeah, TVs do use fake rates with blurred interpolation to "improve" motion; PC gaming monitors don't. It appears you already made your judgement despite not even watching the practical usage in the presentation shown in the video, a live test with benchmarks on a game engine even. But yes, its performance will depend on how it is implemented in each game that uses it.

There are those that had no issues with BF4, and the crashes stopped for me a couple of patches (weeks) ago, but yes, they were very annoying, and it was a very bad launch. My main issue with that game now is gameplay balance and how they deal with netcode, meaning how other players' lag (high ping) affects how and when you get shot. They chose to make the game playable for high-ping players, but that causes weird behaviour and glitches in the outcome of firefights. That is something that will never be perfect, but in BF4 it is worse than in BF3. I just find it funny that you first mentioned how bad BF4 was with Mantle support, when in fact it has none, and now it turns out you didn't even play it. It seems there is a lot of negative prejudgement lacking better information going on; good luck with that.

Edited by Th4d


How do you know the 120Hz monitors are not a gimmick? What solid information have you been given about 120Hz monitors other than the companies' specification documentation? A condescending attitude is not going to resolve issues. It's not negative prejudgement; it's just that, looking over the BF4 forums, people are very disgruntled with the program. And we are now way off topic, and I mean way off; this is not a "debate forum where nothing gets done". CPU and FPS issues only.

Not some, all: 7XXX upwards and every card from now on. If you buy a new AMD card today it will support it, and AMD has a big market share, something like 40%; it's a matter of time. Who develops new technology and software and simply refuses to innovate with future hardware in mind, getting stuck with horrible bottlenecks simply because old, soon-to-be-obsolete cards aren't compatible? ArmA 3 runs badly (less than 20 fps) on old hardware anyway (hell, it runs at 20 fps even on high-end gear in some scenarios). Your only leg to stand on in this argument is Nvidia support, which does hold a bigger part of the market.

Also, there is word that in fact any card from both ATI and Nvidia is Mantle compatible, but I haven't seen the actual statement for it: http://www.dsogaming.com/news/amds-mantle-does-not-require-gpus-with-gcn-architecture/

We are not missing your point; you simply are trying to create an excuse for it not to be worth it. Even worse, on a game that clearly would benefit tremendously from anything that gave significant gains and lessened bottleneck issues; in short, it could finally make the game perform acceptably. Anyway, EA and others already disagree with you: they think it will be worth it and have already announced titles with it, and the BF4 patch for Mantle is soon to be released.

I've said all I wanted, best of luck.

After reading your explanation here, I am sure we can appreciate your vast experience in this area, which will help BI make the decision whether to support Mantle in the future. [/sarcasm]

Incidentally, to those who feel that the discussion of Mantle is off-topic, I would argue that it is relevant to "performance" as a whole, and to the "low FPS" portion of the discussion specifically, as a subject to look at, anyway.

To understand what Dwarven is trying to tell you, you need to understand what Mantle actually is. Mantle is a framework that lets developers go "to the source", as it were, and access very fast, lower-level functions and pipelines on a graphics card. What makes it different from DirectX is that DirectX (and OpenGL, for that matter) implements high-level functions. Instead of telling the card how to move data between memory sets or how to implement shader functions, you tell OpenGL or DirectX how to light or manipulate higher-level objects like surfaces, vertices, etc. The graphics library, DirectX or OpenGL, has been written to support world + dog.

From a commercial standpoint this is critically important. For a developer like BI, putting an application like ARMA 3 together is expensive enough: producing an engine, producing assets, keeping tooling up to date and usable, completing game design, then balancing the whole thing out and eventually moving it into the channel. To say nothing of things like keeping a roof over the heads of the people who are doing this, keeping the framework of people who hire and fire and clean the floors, buying the paper, the servers and cables and the myriad hundreds of other ridiculous things it takes to build a game these days. All before you publish into the channel and start throwing gobs of money into buckets labeled with fuzzy words like "marketing".

Anyway... yeah, expensive.

Which means that a firm like BI almost has to use de facto industry standard graphics libraries, so that nearly all mainstream graphics options are supported without chipset-specific code.

There is a tradeoff to this decision. Over time, DirectX and the other libraries have accumulated support code for a lot of hardware. And I mean a gigantic, ridiculous lot. It impairs performance. To put it simply, suppose I had a relatively basic high-level function that draws a triangle. No matter what that function does internally, all BI needs to know is how to call my triangle code when the engine wants a triangle somewhere on the screen. When I originally wrote the code, it was pretty straightforward: calculate some points, create a surface spanning the area between them based on the point data passed in the call, then initialize a framebuffer object so that the next frame includes the triangle, plus whatever additional passes I do on the triangle to create its final view.

Except along comes Vendor_X, which does something slightly different for how that surface has to be created if it's an _X series card. So I check whether it's _X and then do my triangle.

And then there is a whole class of cards where mobility means I have to slightly reduce the size of my points because of bezel prediction, so I add an _M case check.

And then there is a whole class of embedded video devices from days of yore that are still technically in the WHQL BOM, which means DirectX has to have an _OLD case check too, just in case you want to draw a triangle on a REALLY old graphics display. Because DirectX doesn't know it's running a high-performance game. It only knows it's providing a function that could draw a triangle in any number of situations. For all DirectX knows, you are drawing a triangle for Pong.

And here you have the challenge Mantle is "solving". AMD comes along and says: "Look, the old ways don't take advantage of new flexible pipelining and massively parallel processing infrastructures, because doing so explicitly would leave out a lot of hardware that still needs to be supported. Plus it's relatively slow, because there is a lot of accumulated case checking that makes sure your old TNT card is still supported, even though it couldn't even run your new game."

So implementing Mantle is not a simple decision where someone slaps their forehead, and the choice is not between "best possible graphics performance" and "hating innovation" or some such, as some here seem to think. The choice is between two imperfect options that have their own costs and tradeoffs. Adopting Mantle is non-trivial. A couple of tiny development houses with small games built some apps from the ground up with Mantle support. EA got on board with early API access and dev support for Frostbite. This is no different from what Nvidia provided to Eidos for Hitman, or to the CryEngine team it helped along with GeForce-specific optimizations. Well, actually it is a little different: AMD was willing to make a higher level of investment to get a big-name partner (EA) on board with the Mantle tech. Adopting Mantle means making a lot of investment that will leave a lot of other graphics options "out in the cold", getting no benefit from all of those dev dollars.

BI would be moving forward with Mantle without a lot of that support: building an engine update at significant expense to implement and test, in a world where there is little reference data for the engine rebuild, to realize performance improvements ONLY on a fraction of the available install base, with no clear roadmap for how that install base broadens. It's one thing for AMD to say anyone can get on board with Mantle. It's another thing for Nvidia to make the decision to get on board with Mantle :)

So let's look at what the Steam hardware survey tells us the install-base narrowing looks like:

http://store.steampowered.com/hwsurvey?platform=pc

Scroll down and click on "Video Card Description (PC)".

Several observations.

1) Look at how fragmented the video card install base is, even one as huge as Steam's! Look at how many awful cards are on that list! Realize DirectX and OpenGL support them all, but Mantle doesn't.

2) You have to go down to 13th place to find the first card with Mantle support today. The next nearest is in 20th place! The next in 29th! Between the three, you get about 3% of the install base. Finishing the first page, a scan and some mental math puts my total at about 5.5% total penetration for Mantle support, and even that I think is a bit generous on my part, to be honest.

3) If you were a senior producer or director or Executive Vice President of Cool Stuff at BI, looking at this list, would you invest in an engine update for Mantle support, or in Nvidia OEM support for Nvidia optimizations? The market right now clearly indicates which is the better investment.

And that last point is where I think Dwarven's short posts are leading (because he doesn't have time to go into all of this). Understanding what Mantle is trying to fix, it ONLY works when more of the market is supported. And it's an open question whether the efficiencies remain when that larger market is supported (does Mantle scale well to supporting multiple vendors' hardware?). There are a lot of unknowns about the magnitude and value of the investment. And the market data RIGHT NOW does not show a strong enough trend to say it's worth betting that Nvidia will get on board any time soon. Without Nvidia on board, the better market-driven investment is probably Nvidia OEM support, or hiring a performance-focused dev in-house.

So, Mantle is relevant to the discussion as a possible consideration for addressing engine graphical performance. But frankly, it's irrelevant to the market until Nvidia gets on board or ATI/AMD gets back near 50% market ownership on the back of new card sales (excluding dedicated litecoin/bitcoin sales).

I would also argue it's irrelevant for the specific performance challenge of Arma 3... but that's a post for another night. Most of what I am seeing is a low-scale app that is mostly CPU and throughput bound, not necessarily GPU bound, until you have addressed the CPU anyway!

Edited by OddballSix

After reading your explanation here, I am sure we can appreciate your vast experience in this area, which will help BI make the decision whether to support Mantle in the future. [/sarcasm]

Ad hominem fallacies don't do anything for me; it just makes you sound stupid, imho.

Anyway, I agree for the most part, but here are my points:

1- Even if we ignore that the percentage of compatible cards will increase a lot because from now on all AMD cards sold will be compatible, 3% of the Steam userbase as potential buyers is nothing to ignore; Steam is huge, and that would be enough for Bohemia to increase their sales tenfold. And I also bet that 90% on that chart already cannot play ArmA online in its current state because of how bad the performance is; there are daily complaints about performance in pretty much every ArmA discussion anywhere.

2- Star Citizen is being made by a small independent studio backed by Kickstarter money, and they have already announced they will use Mantle as well; it's a no-brainer if you care about delivering the best product you can. It's not just EA announcing titles with it. Apparently, for those "inexperienced" people that pretty much dominate the gaming industry, it is worth it, and I bet EA more than anything only considers how much money they will gain from implementing it.

3- It's a fact that ArmA performance is bad no matter what hardware you use; simply blaming servers and user-made missions doesn't solve anything and isn't a successful way of shifting blame; just read this 200+ page topic. It isn't even worth mentioning the necessity of legacy compatibility with older hardware, because the game is barely playable on it; no one can play multiplayer with the minimum recommended specs, no one. ArmA fans have spent a fortune trying to run this game at acceptable fps with overclocks, and I'm sure they would welcome and even switch to AMD cards if that meant significantly better performance. Hell, I've upgraded my PC in the past solely because of ArmA. Some people buy Nvidia cards for the APEX PhysX they get in some games; this would be the same.

My reasoning is very simple: "Can I make the game I want with current hardware using DX and OpenGL, and give out a great experience with great performance, delivering what I promise?" No? Then Mantle it is, unless you can do it some other way, but apparently they can't. On the official webpage for this game they claim you can play a 60 vs 60 player Warfare game, and they give out those ridiculous minimum hardware specs; do you believe EA could get away with that? They got pretty beaten up over what they did with SimCity and now with BF4; the fallout from bad design choices and skimping on development costs is worse than simply doing it right in the first place.

I understand the development cost of adapting the engine isn't small, but if it's essential for them to deliver what they promised in the first place, then there you go. Even DICE was humble enough to admit they screwed up, and that made EA's stock price fall, losing a lot of money; but there is no going around it when something is wrong. Excuses aren't enough; money will have to be spent one way or another, otherwise the franchise suffers and risks dying out, which is a far bigger price. I also understand that they simply might not have the money or technical ability to do so; if so, that's fine, but don't tell me that 3% of Steam users is something to ignore when 90% of them can't run the game as is anyway.

Edited by Th4d


2- Star Citizen is being made by a small independent studio backed by Kickstarter money, and they have already announced they will use Mantle as well; it's a no-brainer if you care about delivering the best product you can. It's not just EA announcing titles with it. Apparently, for those "inexperienced" people that pretty much dominate the gaming industry, it is worth it, and I bet EA more than anything only considers how much money they will gain from implementing it.

It's one thing to develop a new game (what is the engine used in SC?) on a new API; it is a completely different thing to include a new API in a fully completed engine.

It's like "hey, here's the GAU-8, let's build a plane around it" versus "can you include the GAU-8 into <insert name of any existing fighter>?". Okay, maybe not the best example but you surely get the point.

Myke said:

It's one thing to develop a new game (what is the engine used in SC?) on a new API; it is a completely different thing to include a new API in a fully completed engine.

It's like "hey, here's the GAU-8, let's build a plane around it" versus "can you include the GAU-8 into <insert name of any existing fighter>?". Okay, maybe not the best example, but you surely get the point.

I agree.

Star Citizen uses CryEngine, and it's looking awesome:

https://robertsspaceindustries.com/comm-link/transmission/13362-Star-Citizen-To-Include-Mantle-Support

How do you know the 120Hz monitors are not a gimmick? What solid information have you been given about 120Hz monitors other than the companies' specification documentation? A condescending attitude is not going to resolve issues. It's not negative prejudgement; it's just that, looking over the BF4 forums, people are very disgruntled with the program. And we are now way off topic, and I mean way off; this is not a "debate forum where nothing gets done". CPU and FPS issues only.

:butbut:

Are you serious? Man, going from 60Hz to 120Hz was the best hardware decision I ever made. Not so much for ArmA multiplayer, because that never runs even above 40 fps or so, but for all other games, my god, it was beautiful, SO AMAZINGLY SMOOTH.

Really man, you ought to try it. You will not be disappointed, I promise. Then, when you enable "LightBoost", you will be in awe even more. CRT fluidity. For real.

So, 60Hz vs 120Hz: not even a question. 60Hz is medieval tech now, for old people stuck in the past.

120Hz vs 144Hz, though: if that were the discussion, you might have a point. I personally cannot see a real difference between 120Hz and 144Hz; HOWEVER, some people might.


AMD-Desktop-GPU-Market-Share:

The desktop market shrank by 4.1% in 2012, and shipments continued to decline in 2013.

They are trying to reach a total market share of 40%, but mainly via consoles / Macs / pro GPUs:

AMD derives approximately 7% of its valuation from the professional graphics business segment, as per our estimate. The company currently accounts for 21% of the professional GPU market, which is dominated by Nvidia. However, the Mac Pro deal can considerably elevate AMD’s standing in the professional GPU market. Digitimes reports the deal could increase AMD’s professional GPU market share to 30% by the end of 2014. [4]

As part of its restructuring initiative, AMD aims to derive 40%-50% of its revenue from high growth businesses, including professional graphics and semi custom devices, in the next two to three years. In addition to scoring the Mac Pro deal, AMD's GPUs power all three next-generation consoles (Nintendo Wii U, PlayStation 4, and Xbox One) and together this helps the company move closer to its target.

Source: Click

There is no point in investing in Mantle implementation right now if you are a small company with a niche product like ARMA.

:)


1- Even if we ignore that the percentage of compatible cards will increase a lot because from now on all AMD cards sold will be compatible, 3% of the Steam userbase as potential buyers is nothing to ignore; Steam is huge, and that would be enough for Bohemia to increase their sales tenfold. And I also bet that 90% on that chart already cannot play ArmA online in its current state because of how bad the performance is; there are daily complaints about performance in pretty much every ArmA discussion anywhere.

Which gets us to part two of my participation in this thread, the part that I left off last time saying needed to be saved for another day.

Let's assume, arguendo, that you were 100% right and everyone else on earth is wrong. That every possible favorable assumption could be applied to mantle happens inside of 6 months (a logical fallacy in an industry where driver cycles are 2 years, hardware cycles are 12 to 18 months per generation with replacement cycles of 3 to 5 years for end users, and game engine / development cycles are usually 18 to 36 months). Anyway, in our world of pretend, Mantle becomes king within 6 months. Like Quake 3 did in showing people a new way of programming shaders and using 'true' curves, Mantle shows people the light for how to more efficiently access lower level capabilities of modern video card pipelines.

Here is the thing, with the challenges that the Arma 3 engine actually has creating the problems today, it would mean very little! Before people sound off, let me explain.

First, realize that writing a game engine from the ground up to handle units, AI, netcode and all the rest of it is really freaking hard. Like P vs NP hard. I know this because I have worked for a game developer, I have been elbow deep in the guts of multiple game engines, I have been fortunate to be a tiny cog in the works of a team of really smart people. I have even given a press conference and several interviews at E3. A bit part from a second or third tier indie dev, sure, but anyway, I have been fortunate enough to work with the old GameSpy tech, the Unreal engine, CryEngine, plus Quake engine back in the day, etc. My day work these days pays better and doesn't have the same awful work environment that the poor maturity of the game industry maintains for some dev houses, but I still like to look at things like this because it's something I will probably always care about if nothing else but because I find it interesting. Some people wonder why stars burn yellow or give off x-rays. I wonder why frame rates on an engine from a second tier dev house are challenging.

All that to say, writing an engine from scratch is really freaking hard. The actual render of the frame itself - whether using OpenGL or DirectX or Mantle - is only part of the performance equation.

The hardest damn thing in the world to troubleshoot is multiplayer performance. You have so many variables for what is going on to get represented in the frame. Does the engine have a lag in processing the information back and forth from the server? Does the engine have a lag in how it moves that representation to an update in animations? Have the clients triggered damage factor communication, etc, to other players back through the servers synchronized in time with the animation being displayed? Is the server lagging in how it validates the input from a given player before providing a game snapshot to the other clients? It goes on and on. All kinds of factors to look at when you talk about something as generic as "multiplayer performance". What to you feels like slow rendering or low framerate in the multiplayer client could be as likely bad net code as it could be inefficient rendering.

So, the best way to open the discussion is to take multiplayer off the table first, and then come back to it.

A lot of the perceived (and measurable for that matter) performance is driven as much by how the frame drawing is triggered (or held up) and how it is synchronized with the engine's understanding of the game world at the split second that the frame is intended to represent. The relationship that the rendering code has to the rest of the game can be complicated OR it can (should) be highly abstracted, potentially at a slight performance penalty. Still, implementing support for some new technology (cough, mantle, cough) is a heck of a lot easier when you are building a new module or a new game than it is as a retrofit where you already have assets and animations and mechanics that have been implemented and tested on the existing render mechanics.

I did a quick audit on a few game sessions. Frankly I don't have the time or inclination to do really deep runtime monitoring beyond a couple of perf captures I've already run. A few observations:

1) It's clear to me that rendering triggers are in some cases dependent on engine events / engine cycle completions like world updates.

2) In some cases we are seeing circumstantial evidence that in multiplayer cases high latency events are a modal part of that cycle (like logging disk writes).

3) The AI implementation here is monolithic from a processing perspective. This has tradeoffs. Minimized conflicts and synchronization "misses" make for a tighter/better AI first-flight, but the result is a really "heavy" thread that is killing low-clock-frequency CPUs like i5s and pegging one entire core on higher-frequency i7s while leaving spare capacity on other cores unused.

4) There is currently no way for editors to get in-game advice for "weight" assessment of missions, resulting in some inefficient designs. (E.g. some single player custom mission profiles are seriously heavy performance profiles compared to the campaign mission releases).

5) There is legitimate beef to be had in the GPU area once these other areas are addressed. It's a demanding game once view distances, etc, are pushed up significantly. To me, though, it's clear that a lot of the observed performance issues come from 1, 2, and 3 WAY more than from anything GPU-based.
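On observation 2 above: the reason a modal (blocking) disk write inside the engine cycle hurts is that the whole cycle stalls for the duration of the write. A minimal sketch of the standard fix, a background writer fed by a queue, follows (hypothetical Python, names invented for illustration; this is the general pattern, not BI's code):

```python
import queue
import threading

class AsyncLogWriter:
    """Buffers log lines in memory and flushes them on a background
    thread, so the simulation/render loop never blocks on disk I/O."""

    def __init__(self, write_fn):
        self._queue = queue.Queue()
        self._write_fn = write_fn   # e.g. a file object's write method
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def log(self, line):
        # Called from the hot loop: just enqueue, never touch the disk.
        self._queue.put(line)

    def _drain(self):
        while True:
            line = self._queue.get()
            if line is None:        # shutdown sentinel
                break
            self._write_fn(line)    # the slow part happens off the hot loop

    def close(self):
        self._queue.put(None)
        self._worker.join()

# Usage: the game loop calls log() and keeps running; the worker
# thread does the (potentially slow) writes in the background.
written = []
writer = AsyncLogWriter(written.append)
for i in range(3):
    writer.log(f"frame {i} event")
writer.close()
print(written)   # all three lines flushed, in order
```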

But here is the rub, and here is where I kind of part ways with a lot of this thread. Remember what I said about engines being really hard to write? A lot of really smart people work together to put these things together. They make decisions that have trade-offs. One of the hardest things to do in engines these days is to make AI work halfway decently. As it is, ARMA 3's AI has mixed results. Let me say that again: AI is really hard. Whether you realize it or not, when you choose to take an action, your brain often does a lot of pre-calculation and analysis that is never consciously noted in order to establish a conscious outcome and take action. On a computer, the ability to do this for a mass of individual units who all have individual circumstances, and to stay near-real-time in their reactions, is very challenging indeed at current processing scale - so long as it is done linearly.

The solutions would seem to be obvious and simple, would they not? Any armchair gamer could get on the forum and say:

1) Audit the code, no disk write should ever be modal. EVER. Server or client.

2) Isolate the render loop to the extent possible to permit dynamic update while the renderer generates new frames. E.g. no dependency on explicit "complete" world AI/state cycle updates to create a new frame.

3) Modularize the AI to allow instantiation of separate AI threads for group- or "side"-level AI cycles. The cost is the potential need for synchronization between threads, plus some additional overhead, since each thread carries a substantial set of identical code run in identical but parallel tracks.

4) Consider investing in some consulting from NV or AMD for graphics engine optimization if you just don't have confidence to make this a "hero" effort in-house - AFTER the other stuff is done.
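To make suggestion 3 concrete, here's a toy sketch (hypothetical Python, nothing to do with BI's actual AI code) of one AI thread per side. The synchronization cost the suggestion mentions shows up as the lock around shared world state, and the duplication cost shows up as every thread running the same decision code:

```python
import threading

# One hypothetical AI update thread per faction. Each thread runs the
# same decision loop over its own units; writes back to shared world
# state must be synchronized, which is the cost of splitting the AI up.
SIDES = ["opfor", "blufor", "independent", "civilian"]

def update_side_ai(side, world, lock):
    # Identical decision code, run in parallel per side.
    decisions = [f"{side}: unit {u} holds position"
                 for u in world["units"][side]]
    with lock:  # synchronize only when touching shared state
        world["orders"].extend(decisions)

world = {
    "units": {s: range(2) for s in SIDES},  # 2 toy units per side
    "orders": [],
}
lock = threading.Lock()
threads = [threading.Thread(target=update_side_ai, args=(s, world, lock))
           for s in SIDES]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(world["orders"]))   # 4 sides x 2 units = 8 orders
```

Whether the real engine's AI could be partitioned this cleanly is exactly the open question; units on one side react to units on another, so the lock contention in a real implementation would be far heavier than this toy suggests.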

Here is the rub. We (or at least I) don't know enough to know how far 1 or 2 would even matter in this case. It could be that they have made decisions that already do this but because of the way that elements relate to each other, they have to do what they are doing today or face major re-work.

3 is a son-of-a-bitch to implement. Like I said, AI is hard. Enhancing AI is hard. I don't know enough about why they implemented monolithic AI in the first place, or how it cycles through the decision set and the units that make choices, to know how much resource duplication threading would create. Would creating 2 to 4 separate threads (one each for Opfor, Blufor, Independent, and potentially one for the civilian side) cause some ridiculous amount of additional RAM usage? What about processing on older systems? What about dual core? It looks like the current setup is built for dual core "max". Is creating instantiation for an AI instance per side such an expensive amount of work that it requires a new revenue stream to sustain the expense? E.g. would an "Arma 3: Operation SuperFriends" for $9 bring in the money to do the AI work, plus a campaign and a few editing assets, to justify the dollars?

We don't know what we don't know.

I know what the behaviors and performance profiles suggest to me. I know what reading 30+ pages of this thread illuminates to me. I can make some guesses at possible solutions. At the end of the day, that's all they are: educated guesses informed by a little bit of experience. It's not my money riding on taking action on these guesses, and I don't have the complete commercial and technical picture to really drive the solution. That's what some senior producer at BI is getting paid to do.

At a certain point, all we can do is provide the best, most complete picture of our experience and the data we can gather around it, and trust the devs to either take action, or "vote with our wallets" for those of us who don't feel they see adequate progress on this issue to support follow-up expansions or next-gen product. My hope in hammering out my couple of posts in the thread is to tie together some concepts, because it's clear that people don't understand how complex tackling some of these performance issues is, and how much of an investment it represents for BI to make in the product.

2- Star Citizen is being made by a small independent studio backed by Kickstarter money, and they already announced they will use Mantle as well; it's a no-brainer if you care about delivering the best product you can. It's not just EA announcing titles with it. Apparently, for those inexperienced people that pretty much dominate the gaming industry, it is worth it, and I bet EA more than anything only considers how much money they will gain from implementing it.

EA was "bought" onto the platform with free support from AMD / ATI development teams directly for frostbite. From EA's perspective, building a new engine, with the opportunity to get "free"/subsidized opportunity to extend performance gains on a subset of the industry at little cost when they are building a new engine for a slate of games anyway is a no-brainer business decision. It's night and day different from what BI would have to consider in implementing Mantle.

If BI were just now building ARMA 3 from the ground up starting today, Mantle would stand a much better chance for adoption (yet still not clearly a yes decision, and also not clearly the performance boon you appear to believe it would be). The fact remains that there is no parallel for the effort BI would have to undertake to implement it today and precious little commercial imperative to drive the effort.

3- It's a fact that ArmA performance is bad, no matter what hardware you use; simply blaming servers and user-made missions doesn't solve anything and isn't a successful way of shifting blame, just read this 200+ page topic. It isn't even worth mentioning the necessity of legacy compatibility with older hardware, because the game is barely playable on it; no one can play multiplayer with the minimum recommended specs, no one. ArmA fans have spent a fortune trying to run this game at acceptable fps with overclocks, and I'm sure they would welcome and even switch to AMD cards if that meant significantly better performance. Hell, I've upgraded my PC in the past solely because of ArmA. Some people buy Nvidia cards for the APEX PhysX effects they get in some games; it would be the same thing.

My reasoning is very simple: "Can I make the game I want with current hardware using DX and OpenGL, and deliver a great experience with great performance, delivering what I promise?" No? Then Mantle it is, unless you can do it some other way, but apparently they can't. On the official webpage for this game they claim you can play a 60-vs-60-player warfare game, and they give out those ridiculous minimum hardware specs; do you believe EA could get away with that? They got pretty beaten up over what they did with SimCity and now with BF4; the fallout from bad design choices and skimping on development costs is worse than simply doing it right in the first place.

I understand the development cost of adapting the engine isn't small, but if it's essential for them to deliver what they promised in the first place, then there you go. Even DICE was humble enough to admit they screwed up, and that made EA stock prices fall, losing them a lot of money; but there is no going around it when something is wrong. Excuses aren't enough; money will have to be spent one way or another, otherwise the franchise suffers and risks dying out, which is a far bigger price. I also understand that they simply might not have the money or technical ability to do so; if so, that's fine, but don't tell me that 3% of Steam users is something to ignore when 90% of them can't run the game as is anyway.

I am sorry, but I don't know that performance is bad. That's subjective. Mine is actually quite good on 12GB RAM, an i7 930, and 2 x GTX 460. For me, the promise was delivered. And for many others, as evidenced by the many playing the game. Sure, some folks aren't having a great experience, but many others are. While you seem to think it's a significant fraction, all it would take is a couple percent of the users of a game of this class to create a thread with this duration and participation level.

Your simple reasoning is, indeed, quite simple. I can understand how it would seem like a yes/no decision, but nothing in business is ever as simple as you seem to think. I also think that it's important to avoid undermining your argument with preposterous statistics.

Asserting that "90% of them can't run the game as is anyway" is absurd. My own scan of the stats would indicate the reverse is true, if anything. Adding up the models on that video card list that cannot run the game even on low settings, I come away with something like 10 or 15% on the first page. And that's ok. Not everyone on Steam is in the ARMA 3 target market. Some people on Steam buy Plants vs Zombies. Some people want puzzle games. Some people want find-the-object games. And that's ok.

It's also important to note that "3% of steam users" are NOT being ignored. The game has DirectX support which serves those users (the mantle supported users). Perhaps not as fast as you may think it may do so with Mantle support, but it's not like the choice is Mantle or the game just does not run for those users.

I think that we generally agree that Mantle is worth watching. Will Nvidia get on board? Do Devs see enough value that more dev houses start playing with it in new products?

The point I would leave you with is this: be reasonable in your expectations when you look at your subjective expectations for the ARMA 3 performance issues. Commercial considerations have to be included in what is reasonable here. BI has to run a business to pay the developers to make these changes. They care about the experience you have. They care enough to try and respond to what they are looking at and whether they are looking at Mantle or not. It may not be the answer you want right this second, but they care and they are listening. That should be a great sign. Keep an open mind and keep sharing your thoughts, with an eye towards understanding WHY they do the things they do (or do not do things you may want).

Edited by OddballSix


Another excellent post. I'm sure if BI did ever write that performance blog post they promised they'd corroborate what's written here.



Great post, thanks for taking the time.

And I agree that Bohemia might not even have the technical ability to do so; I was reading Maruk's interview about Operation Flashpoint, in which he said that people even forgot how things were implemented in the engine for lack of documentation, and that was during its development. But as is, I simply can't play the game the way I would want, big vehicle battles with a lot of people online, because I cannot stand playing at 20fps or less, and I find 20-30-player missions against AI very boring. So to me the game is broken and needs a fix, since it does not deliver what it promises for me, and my gear is far superior to the minimum specs (FX-8350, R9 280X, 16GB, 256GB SSD). But I can also accept that you can enjoy the game at 20fps, or at higher fps on smaller missions with fewer than the officially supported number of players (60 vs 60). People's expectations are very different indeed.

Let's just remember what the minimum specs are:

Minimum:

OS: Windows Vista SP2 or Windows 7 SP1

Processor: Intel Dual-Core 2.4 GHz or AMD Dual-Core Athlon 2.5 GHz

Memory: 2 GB RAM

There is no way someone with that can play online at more than 15fps. I was watching JackFrags' video on DayZ SA (which supposedly received a massive rework of ArmA's engine); he has a 290X GPU, and I saw his framerate dipping into the low 15s while he played, and he even commented on it. "It's alpha" might be the stock response to my comment, which is exactly what I heard about ArmA 3 before launch. Not very promising, but let's hope; if they finally optimize their engine somehow for that game, I might buy it. But after ArmA 3, this time around I'll simply wait and see.

But again, I do agree with pretty much all you said.

And, so far, the ones that have announced Mantle support: Cloud Imperium (Star Citizen with CryEngine), Eidos (Thief), Oxide (their own engine), EA (several titles) and Square Enix. I'm guessing they know something people here don't in order to jump on the Mantle bandwagon, since AMD users are supposedly so irrelevant in the desktop market (and Mantle apparently isn't supported on consoles); after all, there is an added cost and development time for it, and Cloud Imperium is a small independent developer working with an already "done" engine.

Edit:

"Delving deeper into AMD's Mantle API" (more information about the mantle presentation)

http://techreport.com/review/25683/delving-deeper-into-amd-mantle-api

Edit 2: Slide presentation:

http://www.slideshare.net/DICEStudio/mantle-for-developers

Edited by Th4d


Modularising the AI is already possible, to some extent, by using a headless client and a dedicated server. Assign one side of AI to the HC and one to the dedicated server. If you are a single player with a hexacore box, you could do this on the same box as your client.


...Or the game could just do a better job at multithreading.


Hi all

I only just got this game, and only two weeks ago upgraded my computer. Problem is, the FPS is terrible, from startup through gameplay. I was able to play for some time, but after removing an anti-virus program that was supposed to have an effect on this game, it's now just totally unplayable. While fiddling with the settings I was able to get up to 70 fps in the lobby (looking over water), but that only lasted a few minutes. Now it's sitting at 12 FPS, but sometimes goes as high as 25. (In a test on a server (Invade & Annex co-op) with only 5 people, I get 12 FPS.)

My specs are (as said brand new)

i5 3760k (not overclocked... yet)

ATI 280X 3 GB GPU

240 GB solid-state drive (where it's installed)

(I figure these are the relevant specs; I can detail more if needed.)

I've tried everything, from driver updates (now up to date) to switching off anti-aliasing and all the other settings.

Can someone please help me? I love the gameplay but am so annoyed at the issues. I would have thought a brand-new computer would be able to play this game?!

Hawk


In MP, FPS depends heavily on the server: how long the server has been up, and how many FPS the server itself is running at (usually fewer the longer it has been up).

Try another server.

Also run Shadows = High, which moves them to the GPU. Set Object detail to Standard to start with; it draws fewer objects. HDR and SSAO, play with those. View distance 1500-2000 as infantry. Turn off grass. Usually there's a scroll-wheel menu with 'Settings' or similar; if that doesn't work, try holding down the 'U' key, choose show status, and go to settings.


I'd say something is wrong with your configuration there. I've got a similar system and I run at 45 frames in multiplayer; in the very worst case it drops down to 25 when the server gets bad. And if you upgraded your system two weeks ago, did you make sure to install the chipset, ATI Catalyst and all the other drivers? Also make sure to play around with your graphics settings, like disabling vertical sync, setting most options to medium/high, and lowering your view distance. Last but not least, did you format your hard drives before upgrading? Old chipset drivers might be interfering with your current chipset.

My system:

i5 3570k (Stock clock at 3.4ghz)

HD 7750 1GB GDDR5

Standard 7200rpm HD

Dual channel Crucial Ballistix 8GB DDR3 at 1600Mhz


I didn't mind the framerate much, but it seemed sluggish. So I got a program to show FPS... turns out my framerate is 10-20 on average in combat, even on Very Low graphics settings at 1024x768 resolution!

Vanilla game (dev build and non-dev build), a 3-vs-3 group match in an Altis town. CPU utilization is 30-80%, though.

Specs:

Windows 7

I5 quadcore @ 3.8Ghz

HD7850 2gb (newest drivers)

8GB ram

SSD

I will go do some tests now, but I doubt the problem is on my end.

Edited by b0s

