

Posts posted by dnk


  1. So, what number of AI are we entitled to with a modern CPU? I guess 100 isn't enough. So is it 200? 500? 1,000? How many virtual companies and brigades should we be expected to play with on a home computer?

    And how good of AI are we entitled to with each number? How much will we complain when our 500 AI move like headless chickens and struggle to follow basic formations?

    And which should we prefer: 500 crappy, stupid AI or 100 mediocre, silly AI? I suppose 1,000 near-human AI, right?

    ***

    Agreed that the series can't really go one more iteration without multithreaded AI. Actually, it's probably necessary for the current iteration (at least it'd be very nice to have). I still think the biggest, and thorniest, issue is that the render and sim steps aren't parallel, but God knows how hard that is to pull off...


  2. Honestly, the directional/environmental/distance-based aspects of the sound engine in this game would be disappointing by 2001's standards. It's abysmal. By far the worst I've seen in an FPS pretty much ever, or at least in the past 15 years (including A2, which was better by a good bit).

    I know they've got a lot on their plates, but... this robs you of so much immersion, not to mention it makes MP quite a bit more challenging.


  3. If you're a noob, don't overclock, certainly not by 15-20%. You might be able to get a 5% bump without problems. Might. You'll probably mess up your BIOS settings in the process. You can easily fry a chip/motherboard with one wrong click, so ignore that 'advice'.

    ffs, why does everyone tell people who know almost nothing about computers to "just OC it", as if there are no risks for a total novice or a generic computer?

    Short answer: I can't speculate without full specs.


  4. Well, I stand firmly corrected on the 3GB thing :P

    I don't think SSDs can improve stutter, just load times. Once you play a game, the CPU moves all the files into your RAM because it has a faster access time, I believe. At that point your SSD is not doing much unless you don't have enough RAM, in which case the CPU needs to get those files from the SSD. That would be where the stutters would come from. More RAM will fix the stutters unless the graphics settings are too much for your GPU. Try lowering your settings to make sure it's not your GPU first.
    Well, you can argue whatever you want, lots of SSD folks claim it totally fixes stutters. I'm buying one in the next week, I'll let you know also (and also more RAM, and a new GPU, fun week). I plan to do a pretty full performance writeup with all the components.
    Get yourself some more RAM. 24GB - 32GB is best.
    Yeah, just spend like $300? How does that actually compare to an SSD (which has way more storage space to boot) at $100?

  5. Personally, I'd like 2/3rds realism here, which is to say model reality in terms of armor, then apply a 67% modifier to its protection. Where 5.56 took 7-8 shots, it's now down to 5 or so. Where 7.62 took 3-4, it's now 2-3. That's roughly what we had before (I think 5.56 was a wee bit more lethal), and it felt like a good balance.
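
    The shots-to-kill arithmetic above can be sketched quickly (a toy illustration using the rough figures from this post, not actual game values; `shots_to_kill` is a made-up helper):

```python
import math

# Toy illustration of scaling shots-to-kill by an armor protection modifier.
# Baseline figures are the rough numbers from the post, not real game data.
def shots_to_kill(baseline_shots, protection_modifier):
    # Weaker effective protection -> fewer shots needed, never below one.
    return max(1, math.ceil(baseline_shots * protection_modifier))

print(shots_to_kill(7, 0.67))  # 5.56mm: ~7 shots before -> 5 after
print(shots_to_kill(4, 0.67))  # 7.62mm: ~4 shots before -> 3 after
```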


  6. You really need to learn how to give specs. These are very minimal. I have no idea:

    1) how much RAM on your graphics card

    2) how many sticks of system RAM

    3) speed of system RAM

    4) CAS latency of system RAM

    5) hard drive (how fast, what brand)

    6) brands in general

    IQ settings are also good to list, as well as actual specifics about your performance.

    You will only get back as much as you give with these posts, and you have given basically nothing.


  7. Again, Arma can only use up to 3GB of system RAM. When pressed for RAM, my system can reduce all other uses to well under 1GB. So how would adding over 4GB help?

    It might be that your extra RAM was also faster?

    I'm genuinely curious, because right now I'm considering upgrading my RAM, but I'll probably go with an SSD (not just for this game, but for other apps also). I figure the SSD will do more to remove stutters and "big frames" in this game than faster/more RAM (currently 1066/7 2x2 DC).


  8. I really don't remember ghost recon having realistic armour. It was just super unforgiving. Not exactly realistic.

    But I did like the fact that GR wasn't so hitpointish. No matter what the gun, there was always a chance of the guy dying instantly or taking several rounds. Better guns tended to perform better, but there was no guarantee. Really added to the tension. That aspect was pretty realistic imo. But not the armour. The hit reactions were pretty damn awesome too.

    Here's the thing about realism: with modern armor (not so much earlier kinds) you can sort of choose to have either realistic armor modeling OR realistic gameplay.

    What I mean is that in GR1 (and in Arma when it's got weaker armor) I was a lot more worried about exposing myself to shots. I knew I couldn't just twitch-reflex my way out of a snafu, because I'd be dead before I turned 90 degrees. With the current armor system, though, I can just run through a hail of bullets, come out the other side, and probably kill whoever was silly enough to shoot at me in the meantime. If you're playing on any sort of game mode that allows "healing", you can end up living through many, many hits without issue.

    So, OK, armor is as "realistic" in its protection as ever, but the gameplay has suffered greatly. Now, if armor were properly modeled, so that a shot to an unprotected area would hurt much more, or be immediately incapacitating, I'd adjust my gameplay back towards "realistic". Or even if bullet impact had a great enough effect (a 2-second ragdoll for some shots, more pp effects, or damn near anything negative other than a minor flinch sometimes).

    Ghost Recon had good wounding effects, where you would get sort of spun around a bit, and the AI would react in a visually pleasing manner when you shot them.

    Visually, though, GR had a lot of great effects, such as fairly precise wound textures that appeared more or less where you shot the guy (Rogue Spear had pinpoint accuracy in this regard). Also a different set of textures for grenade wounds.

    Yeah, I think I've always sort of measured Arma's quality in this realm against GR1, which was my first "realistic"(ish) mil game. It's always been a disappointing comparison. Some of those wound textures, for the time, were quite dreadful looking. The animations were top-notch for the day, and since Arma has no injury animations, I guess they far surpass even A3's... by default.

  9. The game doesn't use more than 2 or 3GB of RAM (I forget which), so you'd see no gains from increasing it. I have 4GB and never see my total use rise past about 3.6GB. Most of the RAM usage seems to be on the graphics card.

    An SSD would be the way to reduce stuttering, but it seems you've got one already. To be frank, a couple seconds of stuttering once every 5 minutes is nothing; I get constant ministutters every time I turn my head. What might be the issue is your graphics RAM. If your IQ settings are too high, occasionally it will max out and have to clear out a large part of the memory and start reloading data, at which point you might see the stuttering. Using some sort of benchmarking program (like HWiNFO) that measures both VRAM usage and FPS or frametimes will help a lot in finding out whether that's the issue. The easy fix is to reduce IQ, particularly texture quality.

    Also, your specs are very basic and you didn't mention your game settings, so it's hard to give any further information.

    Also, this is probably best put in the Troubleshooting Forum.


  10. ACE and mods make it a milsim. It's always been a very weak milsim by comparison to non-FPS milsims (flight sims in particular) where attention to detail, simulation, and realism are far greater than this. The campaigns speak for themselves on the overall tone being struck.

    Now, yeah, it's a very "hardcore" milsim by comparison with just about every other FPS ever made, but almost every other FPS ever made hasn't put much effort into simulation.


  11. Just play with the settings yourself and find out. Takes like 5 minutes. You need a thread for that?

    Everyone will have different results anyway due to different hardware. Some people might not see any change with AA settings, while others will see massive FPS shifts.

    It also depends on situation/scene/AI in use/etc.


  12. It would make a difference because you would be doing the rendering and simulation in parallel, instead of one after the other. Perhaps that was not clear from my [bracketing]. Also running AI (in more than one thread, perhaps) and other sim aspects in parallel (if possible).

    The idea is that you render the last frame while simulating/updating the next at the same time. There is some added overhead, of course, and I'm sure it causes more issues for cache/memory management that will slow the two parallel processes down compared to their serial execution, but given my CPU runs at like 40% overall even at just 3.3GHz, I doubt that's going to negate the increased speed/CPU usage from this.

    Also, you allow the player's inputs to be read and applied at the very beginning of the new frame, before this two-part process starts but right after the previous frame has been fully rendered, so input lag is reduced (which is what that specific post was trying to address).

    All that said, I'm well out of my depth on this, so I'm really just trying to grapple with why we're stuck in serial with these two parts of each frame, and why a better, faster solution hasn't been implemented (time/money, yes, but what are the technical hurdles and how much time/money do they require?).


  13. [frame ended, all sent to GPU]

    [player input adjusts simulation state]

    [start to render new frame from simulation state, and start to advance rest of simulation to new state, ballistics/damage/object position changes are computed on one thread, while AI decisions, pathfinding, and actions are computed on multiple other threads]

    [finish render/simulation, all scene has been sent to GPU]

    [player input adjusts simulation state]

    and so on....

    Why is this not possible then, having the player's input taken into account for the next frame, but then handling the rest of the sim after the rendering starts?

    Like, if I'm driving a tank and shooting, the tank's position is adjusted at the beginning of the new frame, and if I fire the cannon, that is simulated, but otherwise nothing else gets touched. Then the current state is sent to render while the rest of the world is simulated around the player (perhaps affecting the player, yes).

    There would at least be no more lag than in the current engine, perhaps less since the player's actions are being taken into account at the very last moment possible before rendering starts.
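
    The bracketed sequence above can be sketched as code (a toy two-thread sketch of the idea, not how the actual engine works; every function name here is a hypothetical stand-in):

```python
# Toy sketch of the frame pipeline described above: apply player input
# first, then render the current state while the rest of the simulation
# advances on another thread. All names are hypothetical stand-ins.
import threading

def apply_player_input(state, inputs):
    # Player-driven changes (tank position, cannon fire) happen first,
    # so the frame about to be rendered already reflects them.
    return state + [f"input:{i}" for i in inputs]

def render_frame(state):
    return f"rendered {len(state)} objects"  # stands in for draw calls

def simulate_world(state):
    # AI, ballistics, pathfinding for everything except the player.
    return state + ["ai-update"]

def run_frame(state, inputs):
    state = apply_player_input(state, inputs)
    result = {}
    render = threading.Thread(target=lambda: result.update(frame=render_frame(state)))
    sim = threading.Thread(target=lambda: result.update(next_state=simulate_world(state)))
    render.start(); sim.start()     # render and simulation run in parallel
    render.join(); sim.join()
    return result["frame"], result["next_state"]

state = []
for tick in range(3):
    frame, state = run_frame(state, inputs=[tick])
print(frame)  # -> rendered 5 objects
```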


  14. Draw calls... Nobody in the engine biz is doing any "REAL" multithreading. The big devs have abandoned it. The biggest engines have to remove a lot of planned features due to draw calls/performance. There are tools you can test for yourself. You can move a lot of stuff to different cores/threads, but the overhead to the API/DX is a killer. Let alone the netcode that EVERY dev is having to deal with. 64-bit is for what? Using 8GB of RAM? To cache/stream large (4K) textures? Sounds good... how's that working out for Frostbite in an MP situation? About the only new idea is Mantle... hope to see that working on a complex MP game.
    Still, couldn't we at least run the sim and render steps in parallel, and the AI and other sim stuff also in parallel? Surely that would decrease frame times significantly, even if we're still bottlenecked at the back end in terms of scene complexity, and indeed it could allow for more draw calls, since we'd no longer be waiting for the simulation each frame (we'd be render-bottlenecked, so to speak).

    I suppose I don't understand, if API/driver overhead is what's keeping the render half of the problem from performing better, why that work can't be done in parallel. Is there a reason these translation steps need a single thread on a single core?
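
    As for the "AI in more than one thread" half: per-unit AI updates are largely independent within a tick, so they can fan out across a worker pool (again a toy Python sketch with made-up names, standing in for what would be engine-side work):

```python
# Sketch: fan independent per-unit AI updates across a thread pool while
# a single thread would handle the serialized render/draw-call side.
# Function and variable names are hypothetical illustrations, not engine code.
from concurrent.futures import ThreadPoolExecutor

def update_ai(unit):
    # Pathfinding/decision step for one unit; independent of other units
    # within a tick, so these can run concurrently.
    return {"unit": unit, "action": "move" if unit % 2 == 0 else "hold"}

units = list(range(8))
with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves input order, so decisions line up with units.
    decisions = list(pool.map(update_ai, units))

print(len(decisions))  # -> 8
```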
