frawo 10 Posted February 7, 2014

Could also make an OpenGL renderer with proprietary extensions. But why bother when it won't really help performance in a big battle and is not supported for a large part of the user base.

If no one bothers with new technology, how can it ever become standard? And where is our future then? Take me for a fool, but I don't want to be sitting here for the next 20 years, constantly buying Windows bloatware just to get the latest DX, always keeping in mind that my computer could do much better with another renderer...

So, why does DICE bother with Mantle when they have an engine that runs incredibly well even without it? Just for fun, because they're rich? Oh, don't make me laugh!
nikiforos 450 Posted February 7, 2014

It's great to see that kind of interest in Mantle, but it will be more interesting to hear what the devs and BIS think about it. Yes, no or maybe?
Leon86 13 Posted February 7, 2014

So, why does DICE bother with Mantle when they have an engine that runs incredibly well even without it?

Because they're being paid to do so.
frawo 10 Posted February 7, 2014

Unlikely. The battle simulation is what's holding back performance, not the rendering overhead. On an empty map performance is fine. Or, make a big battle in the editor, get low FPS, press escape to freeze the battle simulation, and watch the FPS go up dramatically. It'll only allow for higher view distance at roughly the same performance, not increase performance.

So the correlation between view distance and performance will be gone with Mantle? Hmm, I highly doubt that... And why does performance in the Star Swarm benchmark really explode with Mantle? It is also a battle simulation, similar to Arma in many aspects.
hardsiesta 1 Posted February 7, 2014

Well, where have you been the last few Arma 3 patches? Performance has been degrading across the board, not improving. Decisions were made against performance. Also, I'm not buying the "BI is poor" argument. You could have talked about this some years ago, but DayZ and Arma 3 have made BI a lot of money now. They're also rich now.

Playing the game. I failed to see any dramatic decrease in performance, although it may well be that the patches have given or taken some frames. Overall, playability and performance have increased over the years, including in A3, for myself at least. So you're not buying the straw man you just made yourself? Cool. They still both haven't received a million from AMD, or made the game for benchmark enthusiasts. It's ArmA, even if its visuals have been brought to a competitive level.

I don't think this is a reasonable answer. I see his point, BUT... They have a product they want to sell. The buyers, however, will have to have the hardware infrastructure to run their product, which is, according to Steam statistics, probably in the hands of the upper 10% of all Steam users. If I look around, a lot of friends would go and buy a used Radeon 7850 to play ArmA3, but they are also sitting on Phenom II X2/X3, Core2Duo and at best Q6600 systems, so upgrading the graphics card won't change it. They just don't buy ArmA3 because it sucks on those systems. If Mantle support were available, the framerate of ArmA3 on these systems would probably increase by 30-50%, which would make it playable. Instead of $59 for Arma + $100 for a used 7850 (the latter pays off in other games too), they need to invest about $500 to play Arma without Mantle support, and that is just too much for a game, as even a hardcore ArmA fan like me knows. So spend 3 devs, 2 months on Mantle (price tag: X). Make money and share by getting new customers. (Priceless) I think this is a very reasonable decision... Ned

We have ArmA because they're making the game first, business next. Not the other way around. This isn't EA, and that's a pretty damn good thing. Going for Mantle at its experimental stage is hardly any business move in any case. You're free to believe anything else, though. With those CPUs, A3 isn't the only modern game your many friends might want to skip. Although I myself can play A2, A3 and DayZ on low-end hardware just fine. Besides, people don't generally buy computers just to play one game. That's kind of a console thing. So they're not buying/upgrading their puters just for A3? Wow, I'm sure BIS can rest assured and invest in Mantle to get them to do so. Right. What's your "probably 30-50%" based on? All of your numbers are imaginary. Another reason why I shouldn't expect you to see that it's obvious that BI or any smaller producer would rather concentrate on making the game run better on DX for everyone, and see about Mantle once there's more than one broken game trying to use it, potentially yielding some extra frames for <10% of players, sponsored by AMD, which isn't A3. You should go tell AMD to sponsor Mantle for ArmA instead of trying to convince anyone that BIS should take it up on their own, with such arguments.
Leon86 13 Posted February 7, 2014

And why does performance in the Star Swarm benchmark really explode with Mantle? It is also a battle simulation, similar to Arma in many aspects.

They probably have a fully multithreaded simulation, so every bit of CPU time saved on the DirectX overhead can be put to use making the simulation go faster. Or they have a simpler simulation, meaning the rendering overhead is more of a bottleneck. Or both, which is most likely.
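A toy Python sketch of that reasoning (every frame-time number below is invented purely for illustration, not measured from either game):

```python
# Toy model: a frame is simulation time plus render/API overhead, run back to back.
def fps(sim_ms, render_overhead_ms):
    return 1000.0 / (sim_ms + render_overhead_ms)

# Hypothetical Arma-like case: a heavy, single-threaded simulation dominates the frame.
print(fps(28.0, 8.0))   # ~27.8 FPS with a fat API overhead
print(fps(28.0, 2.0))   # ~33.3 FPS with a thin API -> only a modest gain

# Hypothetical Star-Swarm-like case: light/well-threaded simulation, overhead dominates.
print(fps(6.0, 18.0))   # ~41.7 FPS
print(fps(6.0, 4.0))    # 100.0 FPS -> the API overhead really was the bottleneck
```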
hardsiesta 1 Posted February 7, 2014

So, why does DICE bother with Mantle when they have an engine that runs incredibly well even without it? Just for fun, because they're rich? Oh, don't make me laugh!

That's a joke, right? Well, I guess... when it actually runs. Go ahead and laugh: they were paid by AMD to do it, and it was just another (free) bit of bait to attract benchmark enthusiasts and media coverage in general for their PR department, which already makes up most of EA's game development. So yeah, for fun and for more money, which isn't a problem for them anyway.
duffbeeer 10 Posted February 7, 2014

That's a joke, right? Well, I guess... when it actually runs. Go ahead and laugh: they were paid by AMD to do it, and it was just another (free) bit of bait to attract benchmark enthusiasts and media coverage in general for their PR department, which already makes up most of EA's game development. So yeah, for fun and for more money, which isn't a problem for them anyway.

What? Star Swarm is running flawlessly; issues are only occurring in BF4. Also, this is still a beta driver. Actually, people like you BI fanboys denying simple facts like worse performance since the last patch isn't particularly helping BI either. Please take a look at the most popular Arma 3 missions. You will quickly realize how mission makers are struggling with performance since the latest stable patch. EUTW warfare is running at 20-25 FPS despite having almost no AI at all. Breaking Point went from playable to unplayable for a lot of users. And it's all due to the changes BI made to the AI in the last patch; Dwarden himself even admitted that. Multi-core CPUs are idle most of the time except for one core. And you still keep telling the fairy tale about BI actually caring about performance.
hardsiesta 1 Posted February 7, 2014

What? Star Swarm is running flawlessly; issues are only occurring in BF4. Also, this is still a beta driver. Actually, people like you BI fanboys denying simple facts like worse performance since the last patch isn't particularly helping BI either. Please take a look at the most popular Arma 3 missions. You will quickly realize how mission makers are struggling with performance since the latest stable patch. EUTW warfare is running at 20-25 FPS despite having almost no AI at all. Breaking Point went from playable to unplayable for a lot of users. And it's all due to the changes BI made to the AI in the last patch; Dwarden himself even admitted that. Multi-core CPUs are idle most of the time except for one core. And you still keep telling the fairy tale about BI actually caring about performance.

Just that the last time I checked, Star Swarm was on Nitrous, not Frostbite, and wasn't made by DICE.

So your favorite custom missions went unstable with a patch that changed something, and you'd want BI to be one of the first to invest in Mantle because you'd really, really want it? My condolences. I still don't have your problems, and I don't even have to lie about it. But I might have got an improved AI. Core utilization is not a real problem for myself either, but I do recognize it as a potential area for improvement. It doesn't break my game either way. But you go on with the drama, straw men and "fanboys", girl. I'll try not to stand in your way any further, with my obvious fanaticism about the subject. Have fun.
Leon86 13 Posted February 7, 2014

Multi-core CPUs are idle most of the time except for one core.

With Mantle it'd be the same; it's pointless to implement it until the simulation runs faster.
calin_banc 19 Posted February 7, 2014 (edited)

Unlikely. The battle simulation is what's holding back performance, not the rendering overhead. On an empty map performance is fine. Or, make a big battle in the editor, get low FPS, press escape to freeze the battle simulation, and watch the FPS go up dramatically. It'll only allow for higher view distance at roughly the same performance, not increase performance.

It depends on the CPU and the scene you are testing in. More soldiers or "stuff" on the map, which is exactly what you have in campaigns or missions, means higher scene complexity, and that is where Mantle kicks in.

Because they're being paid to do so.

And kudos to them; for me the performance jump was HUGE (50%+) at the minimums and the median, and that's what matters in the end. Although it's worth mentioning they were "co-developers" with AMD. They stand to gain as well, because in the end their engine will perform better on lower-end systems too = more customers!

So the correlation between view distance and performance will be gone with Mantle? Hmm, I highly doubt that...

It doesn't go away completely, but it is significantly reduced, to the point of irrelevance in some cases; depending on scene complexity, the gain can go from a few percent to about 2x or more.

Edited February 7, 2014 by calin_banc
duffbeeer 10 Posted February 7, 2014

Just that the last time I checked, Star Swarm was on Nitrous, not Frostbite, and wasn't made by DICE. So your favorite custom missions went unstable with a patch that changed something, and you'd want BI to be one of the first to invest in Mantle because you'd really, really want it? My condolences. I still don't have your problems, and I don't even have to lie about it. But I might have got an improved AI. Core utilization is not a real problem for myself either, but I do recognize it as a potential area for improvement. It doesn't break my game either way. But you go on with the drama, straw men and "fanboys", girl. I'll try not to stand in your way any further, with my obvious fanaticism about the subject. Have fun.

Thanks for confirming that you simply don't care, because you've obviously got an i7 or something. Also, there are tons of other missions having issues since the last patch, not only my favourites, like Domination or Warfare BE. You shouldn't take your very own situation as a given. A lot of people are having problems even though your setup is working fine.
mamasan8 11 Posted February 7, 2014

With Mantle it'd be the same; it's pointless to implement it until the simulation runs faster.

I don't see it that way. DirectX draw calls happen on that one core. Freeing up that time for the simulation would benefit the simulation and should free up more CPU time for feeding the GPU as well.
Leon86 13 Posted February 7, 2014

DirectX draw calls happen on that one core.

I don't think so. Arma uses multiple cores just fine on an empty map; it's just that a large part of the battle simulation is single-threaded.
calin_banc 19 Posted February 7, 2014

Even so, there are points where GPU usage drops. Mantle is a must, multithreaded physics is a must (thankfully, it is, though I don't know how many threads it uses), multithreaded AI (or even AI accelerated on the GPU), etc. All of those need to be implemented in time for the series to move forward.
kklownboy 43 Posted February 7, 2014

I don't think so. Arma uses multiple cores just fine on an empty map; it's just that a large part of the battle simulation is single-threaded.

Really? That would be a no. Or do you mean two cores as "multiple"? Draw calls are the number one issue for stalls on the CPU. There are several tools out there; you can check it out yourself. Even Frostbite and Crytek stopped their optimization for multithreading, due to the ever-increasing issue of calls (draw and cull) when using more cores. It wasn't worth the effort, so they only took it so far and started up Mantle. It would be awesome if BIS went with some lower-level API, if only for cooler effects. Use the video card power we have fully. Basically, lighten the load.
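As a rough illustration of that claim, CPU time spent in the API per frame scales with the number of draw calls times the per-call cost. The per-call costs in this sketch are made-up values, not measurements of DirectX or Mantle:

```python
# Made-up per-call costs, only to show how draw-call counts turn into CPU milliseconds.
def api_cpu_ms(draw_calls, cost_per_call_us):
    return draw_calls * cost_per_call_us / 1000.0

for calls in (1_000, 5_000, 20_000):
    thick = api_cpu_ms(calls, cost_per_call_us=3.0)   # hypothetical high-overhead path
    thin = api_cpu_ms(calls, cost_per_call_us=0.5)    # hypothetical low-overhead path
    print(f"{calls:6d} calls: {thick:5.1f} ms vs {thin:4.1f} ms of CPU per frame")
# At 20,000 calls the high-overhead path alone would blow a 60 FPS (16.7 ms) frame budget.
```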
nedflanders 12 Posted February 7, 2014 (edited)

Battlefield simulation is the bottleneck and this is why Mantle won't help, right? Let's see the facts: http://www.computerbase.de/artikel/grafikkarten/2014/erste-eindruecke-zu-amds-mantle/3/

BF4, singleplayer, on an AMD 7850K APU (mid-range CPU, low-end graphics):
Mantle 33.9 FPS
DirectX 33.5 FPS
No advantage from Mantle... purely GPU-limited.

So let's add a battlefield simulation and go multiplayer:
Mantle 34.6 FPS
DirectX 25.8 FPS
That makes a 34% advantage... and this is not with a dedicated high-end GPU, where the difference would be even higher.

I really don't understand why someone would oppose Mantle in ArmA3. It's the easiest and cheapest way to increase performance in ArmA3 without reducing complexity. It's even coded in the same language: DirectX and Mantle use the same High Level Shading Language (HLSL).

Edited February 7, 2014 by nedflanders
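The 34% figure follows directly from the quoted numbers; a quick Python check:

```python
def gain(mantle_fps, dx_fps):
    """Percentage FPS gain of Mantle over DirectX."""
    return (mantle_fps / dx_fps - 1.0) * 100.0

print(f"Singleplayer: {gain(33.9, 33.5):.1f}%")  # ~1.2%  -> GPU-limited, Mantle barely helps
print(f"Multiplayer:  {gain(34.6, 25.8):.1f}%")  # ~34.1% -> CPU-limited, Mantle pays off
```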
Leon86 13 Posted February 7, 2014 (edited)

Really? That would be a no. Or do you mean two cores as "multiple"? Draw calls are the number one issue for stalls on the CPU. There are several tools out there; you can check it out yourself. Even Frostbite and Crytek stopped their optimization for multithreading, due to the ever-increasing issue of calls (draw and cull) when using more cores. It wasn't worth the effort, so they only took it so far and started up Mantle. It would be awesome if BIS went with some lower-level API, if only for cooler effects. Use the video card power we have fully. Basically, lighten the load.

Meh, I get 60% CPU use with peaks to 80% on my quad core in Agia Marina without AI, at about 80 FPS. With a big battle you can get it to stay under 30, but if you then press escape, freezing the simulation, you get something like 60 FPS, even though the draw calls remain the same. It would be cool to have some low-level API for massive view distance, but I'd rather have the simulation go faster first (although some people here have already lost hope that will ever happen).

Edited February 7, 2014 by Leon86
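A back-of-the-envelope reading of that pause test, assuming (as a simplification) that simulation and rendering run one after the other on the main thread:

```python
def frame_ms(fps):
    return 1000.0 / fps

full = frame_ms(30)         # big battle, simulation running: ~33.3 ms per frame
render_only = frame_ms(60)  # same scene with the simulation frozen: ~16.7 ms per frame
sim = full - render_only    # the simulation's share: ~16.7 ms

# Even if a thinner API halved the rendering half, the ~16.7 ms of simulation stays,
# so the frame could only shrink to roughly 25 ms (~40 FPS) in this toy model.
print(f"render ~{render_only:.1f} ms, simulation ~{sim:.1f} ms")
print(f"halved rendering -> ~{1000.0 / (render_only / 2 + sim):.0f} FPS")
```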
calin_banc 19 Posted February 7, 2014 (edited)

You are missing the point, Leon. The extra CPU time wasted on that inefficient rendering path can be used to speed up the simulation even in the state it is in now.

Edited February 7, 2014 by calin_banc
Leon86 13 Posted February 7, 2014

But how? Moving from 4 to 6 cores gets you loads of extra CPU time and does very little for performance :(
calin_banc 19 Posted February 7, 2014

Simple: instead of spending 15-20 ms of a 33.33 ms frame (presuming you're running at 30 FPS) on the render, you end up spending only 4-5 ms, and you can put the extra ~10 ms toward simulation time. That means 33.3 ms - 10 ms = 23.3 ms per frame, which means about 42 FPS instead of 30. Also, because you are doing the work much more efficiently, input lag is lowered = better aim control and precision. Of course, I've pulled those numbers out of nowhere, since I don't know how much time each part of a frame actually takes, but it's a good example of what you can expect. They put everything into perspective here [embedded video]. Long watch time, but worth it.
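Spelling that arithmetic out (the 15 ms / 5 ms render split is calin_banc's own illustrative guess, not a measurement):

```python
frame = 1000.0 / 30                     # ~33.3 ms per frame at 30 FPS
render_before, render_after = 15.0, 5.0
saved = render_before - render_after    # ~10 ms freed per frame

new_frame = frame - saved               # ~23.3 ms
print(f"{new_frame:.1f} ms -> {1000.0 / new_frame:.1f} FPS")  # ~42.9 FPS, up from 30
```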
Leon86 13 Posted February 7, 2014

It would help with mouse-to-screen latency, yes.
mamasan8 11 Posted February 7, 2014 (edited)

But how? Moving from 4 to 6 cores gets you loads of extra CPU time and does very little for performance :(

Arma 3 doesn't really seem to be made for more than dual cores. That's why you see no difference no matter how many cores you throw at the problem.

But... I just tried the Star Swarm demo and one interesting thing popped out. I ran it in both DX and Mantle mode. In DX mode, when there were maybe 5000 units on screen, my FPS plummeted to single figures, CPU usage went to a third and GPU usage halved. Sound familiar? This was not the case with Mantle.

Settings: Follow mode, Timed Run, Extreme detail

First, the Mantle test results:

===========================================================
Oxide Games Star Swarm Stress Test - ©2013
C:\Users\so\Documents\Star Swarm\Output_14_02_08_0103.txt
Version 1.00 02/08/2014 01:03
===========================================================
== Hardware Configuration =================================
GPU: AMD Radeon HD 7800 Series
CPU: AuthenticAMD AMD FX-8350 Eight-Core Processor
Physical Cores: 4
Logical Cores: 8
Physical Memory: 12849831936
Allocatable Memory: 8796092891136
===========================================================
== Configuration ==========================================
API: Mantle
Scenario: ScenarioFollow.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Deferred Contexts: Disabled
Temporal AA Duration: 16
Temporal AA Time Slice: 2
Detailed Frame Info: Off
===========================================================
== Results ================================================
Test Duration: 360 Seconds
Total Frames: 12929
Average FPS: 35.91
Average Unit Count: 4070
Maximum Unit Count: 5415
Average Batches/MS: 538.07
Maximum Batches/MS: 2749.41
Average Batch Count: 17290
Maximum Batch Count: 157669
===========================================================

Lowest FPS I saw was 16 FPS.

Now the DX results:

===========================================================
Oxide Games Star Swarm Stress Test - ©2013
C:\Users\so\Documents\Star Swarm\Output_14_02_08_0051.txt
Version 1.00 02/08/2014 00:51
===========================================================
== Hardware Configuration =================================
GPU: AMD Radeon HD 7800 Series
CPU: AuthenticAMD AMD FX-8350 Eight-Core Processor
Physical Cores: 4
Logical Cores: 8
Physical Memory: 12849831936
Allocatable Memory: 8796092891136
===========================================================
== Configuration ==========================================
API: DirectX
Scenario: ScenarioFollow.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Deferred Contexts: Disabled
Temporal AA Duration: 16
Temporal AA Time Slice: 2
Detailed Frame Info: Off
===========================================================
== Results ================================================
Test Duration: 360 Seconds
Total Frames: 7548
Average FPS: 20.96
Average Unit Count: 4068
Maximum Unit Count: 5675
Average Batches/MS: 320.53
Maximum Batches/MS: 899.06
Average Batch Count: 17483
Maximum Batch Count: 110843
===========================================================

Lowest FPS was 6 FPS.

Quite a difference, wouldn't you say? 36 avg FPS vs 21 avg FPS. This is on Win 7 64-bit.

Edited February 8, 2014 by mamasan8
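For what it's worth, the speedup implied by those two runs:

```python
mantle_avg, dx_avg = 35.91, 20.96   # average FPS from the two logs above
mantle_min, dx_min = 16, 6          # lowest FPS the poster observed in each run

print(f"average: {mantle_avg / dx_avg:.2f}x faster with Mantle")  # ~1.71x
print(f"minimum: {mantle_min / dx_min:.2f}x faster with Mantle")  # ~2.67x
```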
calin_banc 19 Posted February 9, 2014

http://www.techradar.com/news/computing/pc/amd-on-mantle-we-want-our-gaming-api-to-become-the-industry-standard-1218560

AMD told TechRadar that it'd be willing to make Mantle, or an API based on it, available across the industry. Even, the company said, if it means Mantle is adopted by competitors like Nvidia.

"Mantle for now is straight up in a closed ecosystem, a closed beta, which you have to do in a complicated project like this to get it off the ground. It's us and a few key game developers," Robert Hallock, technical communications, AMD Graphics & Gaming, told us in a recent interview. "After that phase is done, we do hope that Mantle becomes an industry standard. We'll be releasing a public SDK later this year, and hope that others adopt it. If they don't adopt it itself, then we hope they adopt APIs similar to it that become an industry standard for PC gaming."
mamasan8 11 Posted February 9, 2014

Let's hope this comes to fruition so all games get the fix. Who doesn't want 3 times better FPS in certain situations (when it actually counts)? Hands up. And 10-50% better in every other situation.