Posts posted by dnk


  1. Yeah, but I just don't see my pagefile being read all that much. Ten minutes of intense gameplay over 1 km with a 2700 m view distance resulted in 434 reads and 691 writes to the pagefile, for a total of 13 MB read and 723 MB written. Obviously a lot is being written, likely at game start, though I had other programs running in the background, including Firefox, so it could have been W7 moving their RAM pages onto the pagefile. Immediately after stopping play and resuming the other programs, I got a ton of new reads (10k+ for over 200 MB in 4 min), which supports my idea that a lot of those writes were moving lower-priority stuff out of the way, not moving game-related files around during play.

    By comparison, reading of "anims_f_data.pbo" was at 1,600 reads for 44 MB; "sounds_f.pbo" had 1,370 reads for 47 MB; "map_stratis.pbo" was 1,333 for 27 MB. The pagefile just isn't being read that much compared to the disk, and I severely doubt the game or Windows is placing such heavily used data as the FSMs on the pagefile and reading it from there often. If it were, it would be read every frame, perhaps multiple times, which would mean over 10,000 reads in 10 minutes. The same goes for textures/models referenced every frame or every few seconds. I don't know how much time needs to elapse between when data is last used and when W7 shoves it off to the pagefile, but it can't account for frame-by-frame low performance when your view isn't changing that much. That's more likely the latency/bandwidth between main RAM and the GPU or CPU.
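    To put those read counts in perspective, here's a rough per-frame calculation (the 30 fps frame rate is an assumption for illustration, not a measurement; the 434 reads are my numbers from above):

```python
# If the engine really pulled FSM data from the pagefile every frame, the
# read count over 10 minutes would dwarf what the disk monitoring showed.
minutes = 10
fps = 30                                   # assumed average frame rate
frames = minutes * 60 * fps                # 18,000 frames in the session
observed_pagefile_reads = 434              # measured over the same 10 minutes

print(frames)                                        # expected reads at one per frame
print(round(observed_pagefile_reads / frames, 3))    # observed reads per frame
```

    At one read per frame you'd expect ~18,000 reads; the observed count is two orders of magnitude lower.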

    Additionally, the game is only using about 1.4GB of 2.5 available, so the Prius is hardly full even with a 2GB limit.


  2. ""It largely depends on the size of the program you're running and the size of the instructions being performed as well as the amount of instructions there could be.""
    Right, I'm not being exact in my wording. The point is that if you only have to read the FSM to get the instruction set for the AI routines once per simulation step, you only have to read from RAM that one time for the instructions from that file, which shouldn't cause many issues and wouldn't increase latency in itself when scaling up the amount of AI running. I would assume this is what happens at the least (if not that the instructions are being kept permanently in L3, moving to L2/1 as the engine switches from other threads/instructions to the AI execution).
    ""The FSM's being an addon would make no difference""
    You didn't understand my point. Making them an addon allows you to monitor their reads; keeping them packed in a PBO doesn't let procmon or other programs track when they're read from disk, as it will only say that "characters.pbo" (or whatever file it is) was read. That said, I haven't seen mass reading from that or any AI-related PBOs, so it seems they get loaded to RAM at game start at the least and stay there.
    ""Also larger texture files and models and the graphical stuff that generally takes the most space, would preferably be stored in RAM or mostly all in RAM because the disk access would be ungodly if it wasn't.""
    Then what's with millisecond-lag pop-in and the constant reads from the model/texture files? They are being read from disk regularly during gameplay, as my own monitoring has shown. In ~15 min of gameplay, I had 1,200 reads from just the structures PBOs. I had 2,300 reads in another 15 min of play from an addon's environmental sounds (which did add a large percentage of extra I/O).
    ""So it's basically like a very very fast game of musical chairs with 3,000 chairs and 6-7,000 people.""
    Yes, my point as well. What "optimizations" are going to change this to drastically improve performance (well, 64-bit RAM usage could help a lot)? It might be they haven't implemented everything, and BIS has never been a graphics/rendering-focused developer (the simulation is the core of their engine and military contracting business) so that might just be the case.

  3. ""It then has to read from the pagefile.""

    ""Even if you sat with an API monitor and watched hard drive usage, you wouldn't see a lot of usage because you're measuring bandwidth not transfer latency.""

    It's possible that highly used FSMs are being kept on the pagefile, but I highly doubt such a mistake wouldn't have been corrected long ago by the devs. Those are fairly small files, I believe, and could easily fit in even L1 cache, easily in L3, and certainly in main RAM. Now, the latency between RAM and the CPU may very well cause performance issues when they're referenced 80 times per simulation step, if they aren't being stored on-die.

    We've talked about this before. If you can isolate the FSMs from their PBOs and use them as an addon, it's not too hard to figure out if they're really being read often or not from disk (but not RAM - not sure how to check if they're on cache or not).

    Otherwise, I agree with the rest (if true). Perhaps it's not the FSMs but some other, larger data referenced through the FSMs that's causing the hangup. There's a lot of data that needs to be saved for each unit (time duration values, states, etc.) that might not fit on-die. If that's the case, there's little BIS can do about hardware limitations (assuming, as I would, that they've already optimized cache usage). Many other games wouldn't have this issue, since much simpler AI requires far fewer reads from off-die. I'm not qualified to say, though; this is something we need a dev to answer.

    ""...a 6 year old video card as the minimum?!?!?""
    That's not so uncommon. The last STALKER title (which was a bit similar, with lots of semi-competent open-world AI and a decent draw distance over complex terrain) had an old FX-series nV card for its minimum. I think those came out around 2003. By comparison, A3's min specs are much higher, and tbh the game doesn't look half as nice as the STALKER series.

  4. @Insanatrix, etc

    Disk streaming only seems to affect pop-in, not actual performance. It seems the engine loads the lowest LOD/mipmap first, displays it fairly quickly, and then waits for the larger file to be streamed into RAM and then to the GPU for display. Certainly, being able to address more than 2-3 GB of RAM would help reduce pop-in a lot, since we could keep LODs/textures on hand, but it wouldn't improve FPS much if at all, as the SSD/RAMdisk users have discovered.
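    A toy sketch of that load-lowest-first pattern, purely illustrative (this is my assumed behavior, not BIS's actual streaming code; the class and names are hypothetical):

```python
# Toy model of progressive asset streaming: show the lowest LOD/mipmap
# immediately, stream the full-resolution asset in the background, and
# swap it in once it arrives. The renderer never blocks on the disk,
# so FPS is unaffected - you just see low-res "pop-in" briefly.
import threading
import time

class StreamedTexture:
    def __init__(self, name):
        self.name = name
        self.current = f"{name}@lowest_mip"          # cheap placeholder, ready at once
        self._lock = threading.Lock()
        threading.Thread(target=self._stream_full, daemon=True).start()

    def _stream_full(self):
        time.sleep(0.05)                             # stand-in for disk -> RAM -> GPU latency
        with self._lock:
            self.current = f"{self.name}@full_res"   # the "pop-in" moment: swap placeholder out

    def sample(self):
        with self._lock:
            return self.current                      # whatever is loaded right now

tex = StreamedTexture("map_stratis")
first = tex.sample()        # usually the low-res placeholder
time.sleep(0.2)
later = tex.sample()        # full-res version once streaming finished
print(first, later)
```

    The point of the sketch: the slow path (streaming) is off the render loop, which matches the observation that faster disks reduce pop-in but barely move FPS.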

    I think the GPU underutilization issue is clearly (ignoring bugs/poor "optimization" for certain setups) an issue of limited bandwidth and/or latency. You agree, yes?

    Ultimately, I think it comes down to:

    The Arma series has a similar amount (maybe more?) of textures to constantly load into the GPU compared with other games (assumed)

    The Arma series does less image processing compared with other games

    That's how it makes sense to me that the quality of graphics rendering in this game is clearly much less advanced than in other current games (even ones years old), yet we have low performance and GPU core underutilization. Other games are also affected by bandwidth/latency issues, but it isn't as prominent in their overall performance because all the eye candy keeps the core running at full steam and often becomes the limiting factor itself.


  5. Anywhere with:

    Ocean

    Mountains

    Good rainfall

    Human habitation

    Unfortunately, the last few BIS maps have been:

    Mountains

    Little rainfall

    Minimal human habitation

    Zargabad was perhaps an exception, as much of the map was well inhabited with farms and such, but it was fairly small.

    Which is boring as hell and leaves the maps feeling fairly empty. Chernarus was nice, but it too suffered from limited habitation: only a few farms, and most of the map was just endless forests and plains with a few small towns sprinkled throughout. Yeah, it runs faster and is easier to make, but it's not as realistic or interesting as a proper map reflecting real places, where people have modified their entire surroundings and crammed themselves into every little nook, and nature has been relegated to the margins - between farms, or in the swampy/rocky/sloped areas that can't be properly cultivated or developed. That makes for more interesting tactics and a more "alive" map. A "giant empty plains" map makes for dull tactics and a sense of unfinished work.


  6. ^^^

    ""he was running at the same timings, but he did a final run with 1333 tighter timings.""

    Interesting, because it might just be about latency then, not bandwidth.

    The 9/1600 has about 7% higher latency than the 7/1333 and 20% higher bandwidth, yet the two show statistically equal performance. If bandwidth were the primary issue, the 1333 should still perform considerably worse regardless of timings, yet it's basically the same. Clearly bandwidth has some importance, since the lower-latency 7/1333 isn't actually the better performer, but it's clearly less important than overall latency right now. That's for the CPU, though, which is likely reading lots of small packets of data to run the simulation. I'd be interested in seeing whether this turns into a bandwidth issue for the GPU, which has to stream lots of larger texture files.
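    Those 7%/20% figures can be sanity-checked from the timings alone (back-of-the-envelope: first-word CAS latency only, ignoring the other timings):

```python
# CAS 9 @ DDR3-1600 vs CAS 7 @ DDR3-1333. First-word CAS latency in
# nanoseconds is CL cycles divided by the memory clock, and DDR's clock
# is half the transfer rate (two transfers per clock).
def cas_latency_ns(cl, megatransfers_per_sec):
    clock_mhz = megatransfers_per_sec / 2
    return cl / clock_mhz * 1000        # cycles / MHz -> ns

lat_1600 = cas_latency_ns(9, 1600)      # 11.25 ns
lat_1333 = cas_latency_ns(7, 1333)      # ~10.50 ns

latency_penalty = lat_1600 / lat_1333 - 1    # ~0.07 -> "7% higher latency"
bandwidth_gain = 1600 / 1333 - 1             # ~0.20 -> "20% higher bandwidth"
print(f"{lat_1600:.2f} ns vs {lat_1333:.2f} ns")
print(f"latency +{latency_penalty:.0%}, bandwidth +{bandwidth_gain:.0%}")
```

    So the 9/1600 kit trades roughly 7% worse first-word latency for 20% more raw bandwidth, which is exactly the comparison above.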


  7. @OP

    Did you adjust timings for the memory with each increase or were they set at a single CAS/etc?

    I was considering doing something like this myself, but sort of got tired of all this. Anyway, thanks for it!

    @Dwarden, White, Insanatrix, etc

    I've certainly noticed one AI underutilization bug (whatever it was, the devs haven't explained it) that got fixed regarding AI in cities, so it's clear these bugs are at least part of the issue. But are you saying that all this time (since A1) the AI issues have been due to these bugs, or has the engine improved somehow to better handle AI and simply has some new bugs?

    I think if you could explain how the engine has improved regarding AI processing and bottlenecking and threading, it would clear things up (or not).


  8. ""While discussion is good, keep in mind that the devs don't always participate here. For all we know, they may already have seen this and are discussing it now...""
    The modding community does participate here. They have at least as much of an effect on the game that I play as the devs do.

    I mean, I normally play a totally differently configured game - with a different weapon/model set, different sounds, different AI, different maps, and different comms - than what OA shipped with...

    These discussions aren't just for the devs, so whether they read them doesn't much matter for the end product the community will get.


  9. @Lucas

    It's been increased a lot in the alpha. OA had far more reasonably skilled AI, though it could be turned up to super-sniper levels if desired (Vet/Expert servers tended to have this effect, as I think those difficulty settings overrode whatever skill settings mission editors had given the AI, unless scripted in).

    I would suggest choosing values between 0.1 and 0.25 for both aimaccuracy and aimshake for the AI. You can add a simple script to any mission to change that. I think Zeus AI has something that does it, too (check the addons forums).


  10. Set a waypoint with the combat mode set to "never fire", which puts the troops in position. Then have a trigger synchronized to a second waypoint that switches them to "engage/fire at will". That's probably the easiest way, though I'm not sure.

    Additionally, you could just not spawn the units until a trigger is activated. That doesn't take much work - just a simple .sqf and some markers.

    Also, you can try using "disableAI" and "enableAI" via script in a similar manner to the first method: just disable everything, then re-enable it when the trigger fires.


  11. Re: the claims about PVP ("thinking/tactics game, not a twitch shooter after all")... :rolleyes: The reality is that "viable or not?" depends on the server and the mission.
    Eh, I've not played Wasteland, but I've done plenty of CTI/WF and DayZ with 200ms+ pings. Sure, I've died a few times from it, but it's hardly made it "unplayable" or made me uncompetitive. For COOP it's not even an issue: Fish in a barrel any way you slice it.

  12. ""Also I get lag with high AI counts, but if I set waypoints and then have a trigger that disables All AI parameters and turn them all off, I don't experience an increase in performance between AI on/off, it's a fluctuation of 2-3 fps at most. I use http://community.bistudio.com/wiki/disableAI . It's a larger issue with the resources the AI take up than the pathfinding itself.""
    You're right, but you weren't before. What I mean is that after doing some testing today with the newest build, I don't have this pathfinding issue anymore... I did have it with an older build, though. The same setup now yields drastically different results in terms of FPS.

    The SPOTREP only mentions some minor AI changes... I'm at a loss for why this change happened. I will stop going on about pathfinding now, at least.

    Could be due to:

    ""Geometry fixing (AI collisions with buildings tweaked)""

    My "pathfinding" issue was when I placed the AI in cities and forced them to march through. It used to result in a halving of FPS with 80 AI - now it has maybe a 5% reduction.

    Could also be:

    ""Fixed: AI no longer fires on targets it does not see (but which are reported by other group members)""

    But I took away guns for both tests, so I'm not sure how that figures in (though this is a very nice fix anyway).
