joejoe87577

Why Bohemia Why?

Recommended Posts

This is about right for ultra settings without any tweaks on Stratis in a Littlebird, in places. Performance really is at those levels. I recommend everyone set everything to ultra, with no tweaks at all, then use the editor to place a unit in a Littlebird on Stratis and on Altis, fly around, and see the performance for yourself. You should also drop down and walk around the main towns. I have been saying this for a long time: you can get Arma 3 under 30 fps on a blank Altis map just by going into the towns. SP/MP and all the other performance impacts they can bring aren't even necessary to bring this game to the unplayable point.

---

An understandably frustrated guy asks why his fps drops by 20 when going from 3rd person to 1st person view, and you people start arguing about what's the correct frame rate for games? :j:

JoeJoe87577, it's the PiP. With High settings and your 3000/2500/100 VD I get 65 fps in 3rd person and 45 fps in 1st. Disabling PiP gives a constant 65.

---

My main problem is the fact that this happens.

NVidia control panel has default settings.

I'm used to crappy performance and stuttering games, but the problem is that even with good hardware (which I would consider high end) you can't get good, stable performance out of Arma.

And yes, I like 60 FPS, that's why.

---
This is about right for ultra settings without any tweaks on Stratis in a Littlebird, in places. [...] you can get Arma 3 under 30 fps on a blank Altis map just by going into the towns.

I would say it would be hard to do that, certainly with his setup. If you're pulling frame rates that low, there is something wrong at your end; it's not completely the game's fault. Yes, the game is poorly optimised, although better than it was years ago. But pulling such low performance in the editor on an empty map with a decent system is just not normal. That is why, compared to the number of units sold, there is a very small number of players having issues.

Some of those having issues will be players that have put together their own build (not very well, through inexperience). Some of it will come down to what they have running while trying to game. There are lots of reasons a system will perform badly, but very few of them are completely the game's fault; and yes, this series is poorly optimised, we know that.

When I first got A3 (when it was first released in Alpha) I put it onto my old A2 gaming system, an Athlon II X4 640 with a 2 GB 5850 card. The results are on my YT channel. It ran fine, with or without AI present, even with the drop I was getting at the time from using MSI (Afterburner) for recording, which is also shown in the videos there. Without AI present the system ran at high rates; with AI present it was lower, but still impressive for the setup. That is partly because it was a dedicated gaming PC with nothing on it other than a few games; it wasn't used as a daily PC, so there was no unneeded rubbish on it at all. I knew it would run fine, because it ran A2 fine.

So for this player to be getting such bad results does say to me that there is something wrong at his end.

It's not very pleasant for players to be told that their system might not be working right, so the first reaction is for those players to blame the game. But unfortunately half the time, if not a lot more than half the time, the problem is at their end, not the game's.

It's a sad fact that systems put together by inexperienced gamers often turn out badly. Players knock the big online 'full build' PC retailers, but the fact is those retailers put together thousands of gaming PCs, while many players put together one or two, spaced out over several years. It is very easy to make what could be simple mistakes while doing it, and then not be able to undo them, or have any recourse. It's unfortunate, but that's the way it is.

Upgrading a system halfway through its life can be problematic and is best done with the help of someone who knows what they're doing.

Oh, and I don't know what I'm doing where that is concerned, so I usually leave it to others. I can build a PC for daily use, because for me that is nowhere near as important as my gaming. For my gaming system, I leave it to those who can put together a decent system. It may cost slightly more, but nowadays it is only slightly more to have it built: the cost of buying the parts and doing it yourself is converging with the cost of buying it complete (pre-built). There is not much difference, and on the whole the builders will be honest with you about the performance you want or need and how much they think it will cost, so you're prepared.

---

I've got your point. I've been building PCs for more than six years now, and not only for myself.

And I really don't see the problem at my end: every game that I've tried runs fine. Every bit of software that I use runs fine. Visual Studio compiling complex programs runs fine.

But not Arma.

Why is it like this? And don't give me the old line; they can improve their engine. I know it, because I've done it myself with an old enterprise application that has been running for the last 15 years. Without any change to the UI, just to the code between the database and the UI, and the result was a big improvement in performance and stability...

---
My main problem is the fact that this happens. [...] And yes, I like 60 FPS, that's why.

I don't know what possessed you to get a 980 Ti for a 1080p setup. Assuming you're going to a higher resolution later; ah alas, that's for another day.

It's already been noted that PiP is an fps killer. Your fps reflect expected performance in Arma 3. Sucks, but unfortunately that is how it is. Fancy GPUs don't solve Arma's shortcomings in this department.

I take it your GPU is chewing up other games.

Maybe go to the development branch and try the same tests to see what your fps is at.

---
I don't know what possessed you to get a 980 Ti for a 1080p setup. [...] Maybe go to the development branch and try the same tests to see what your fps is at.

His problem is not related to Arma 3; he stated that he has 20 fps in singleplayer on an empty map. That can't be right. I've got a GTX 670 and I get a stable 60 fps on an empty Stratis, with equal settings. In addition, PiP does not cause fps drops for me.

To the OP: what about virus & malware problems, or a bad antivirus program? These can cause severe fps issues.

---
His problem is not related to Arma 3; he stated that he has 20 fps in singleplayer on an empty map. [...]

He also went on to say he is getting 50 to 70 and wondering where the 20 fps loss comes from when going from 3rd person to 1st.

I generally don't notice huge fps drops with PiP, but that might be just me. I only really notice fps when it gets down to the low 20s, so it's not something I constantly monitor.

I asked how his card runs other games to find out whether his card is having problems.

---

He didn't say at any point that he had 20fps. His rate drops BY 20. And all other games run fine. Read the first post again.

---
Why is it like this?

A potential (likely even) explanation has been posited. Have you compared the FPS change between internal/external perspective with PiP disabled?

---
It'd be PIP on the mirrors, PIP disables in third person

CONGRATULATIONS!!!!!:yay::yay::yay::yay::yay::yay:

All 15 posts before this were an epic fail on an obvious problem, and most of the 270,000 posts after it too. Now we are on page 4 LOL

Edited by JumpingHubert

---
Just curious. How many would you want? :confused:

50 FPS is an outstanding rate.

Traditional cinema runs at 24 FPS, and high-speed cinema at 48 FPS.

What? This is not cinema.

In computer games, the minimum frames per second needed to avoid input lag or choppiness depends on the engine architecture.

I mean, with some engines we have smooth (free of input lag) gameplay at 60 fps (Arma 3), while with others we need 90 fps (UE3 or Source) to have decent gameplay.

Obviously this is with Vsync disabled; otherwise we need to reach at least the same value as our screen's refresh rate.

For instance, with Vsync enabled on a screen with a refresh rate of 60 Hz, we need to hold 60 fps or we start to face severe input lag, delay and choppiness, with whatever game or engine.

In Arma 3 we can have clean, lag-free gameplay at 60 fps; below that we start to get input lag, delay and choppiness (with Vsync enabled or disabled). In my case, if I were forced to play Arma 3 below 60 fps, I would rather uninstall.
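To put rough numbers on the Vsync case, here's a minimal sketch, assuming plain double-buffered Vsync where a finished frame waits for the next refresh tick (pure arithmetic, not an in-game measurement):

```python
import math

# Rough 60 Hz Vsync arithmetic: with double buffering, a finished frame
# waits for the next refresh tick, so its display time is rounded up to
# a whole number of ~16.7 ms refresh intervals.
REFRESH_HZ = 60
budget_ms = 1000 / REFRESH_HZ          # ~16.7 ms per refresh

for render_ms in (10, 16, 20, 30):     # hypothetical render times
    shown_ms = math.ceil(render_ms / budget_ms) * budget_ms
    print(f"render {render_ms:2d} ms -> shown {shown_ms:.1f} ms "
          f"-> effective {1000 / shown_ms:.0f} fps")
```

Note the cliff: a frame that takes even slightly longer than 16.7 ms drops you straight from 60 to an effective 30 fps, which is why falling just below 60 under Vsync feels so abrupt.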

To the OP: as others have stated, your issue is probably PiP, since it is practically broken performance-wise.

If you have it on Ultra it will surely be stealing at least 30 fps. Put it on Medium or Low, or even disable it (the best option).

In my case (with 3 GPUs), PiP costs me 20-30 fps. I have it on Low and it still costs me about 10 fps.

---

Bratwurste, no offense, but what you said makes no sense.

A3 at a constant 24 fps has no input lag, no choppiness, no delay. And it has nothing to do with the game engine, but with your psychology and visual capacities. Most people can't even perceive the difference above 20-odd fps in normal circumstances.

The only point where you would need high fps is for explosions/fireworks, which in this game isn't even important, as you won't be standing there observing the detail.

BTW, you only need the same fps as your screen's refresh rate to avoid eye strain.

That kind of reasoning reminds me of the 90s, when people argued the same way about sound cards; nowadays it's the same with graphics. But hey, hardware companies have to sell. Soon it's going to be all about virtual reality goggles.

Just a question for you, Bratwurste: how can people need 90 FPS for UE3 or Source if most people have 60 Hz screens (which means the game can only display up to 60 fps; anything beyond that is never shown)?

Edited by MistyRonin

---

As I have said previously, frames per second are "linked" to the refresh rate only when Vsync is enabled.

Vsync is nothing more than Vertical Synchronization: it synchronizes the frames your GPU is drawing with your monitor's refresh rate, and to achieve something acceptable both values need to be the same.

With Vsync enabled, when you have more fps than the refresh rate your GPU will drop frames and you will have what is commonly called "screen tearing", also commonly known as "frame overwrite".

When you have fewer fps than the refresh rate, your GPU will drop frames between screen updates, and in reality you will have fewer frames per second than those you are seeing. That's when input lag, delay and choppiness start.

There is another situation: as I said previously, with some game engines we need more than 60 fps to get smoothness and a clean response time. That's why, for people with 60 Hz screens, it is highly advisable to have Vsync disabled, just to avoid screen tearing.

These days 144 Hz screens are already pretty common, which means we can play any game on any engine with Vsync enabled, if your GPU can handle 144 fps.

And believe me: in CS:GO, for instance, a player at 144 fps has a tremendous advantage over a player at 60 fps; he will see everything first.

The screen refresh rate is the number of times per second that your screen updates what you see displayed; obviously higher is better.

Frame times are measured in milliseconds, which means more fps equals less delay between the real-time action and what you see displayed. Obviously higher is better; however, beyond certain values there is no benefit, depending on the game engine's capabilities and architecture.

Obviously none of this happens with Vsync disabled; in that case we just have to stick to what the game engine requires as a minimum in terms of fps (or whatever our GPU can give).

Also, for cinema, the theory of 25 fps ("our eyes can't see more") is completely superseded; these days LED TV screens at 200 Hz are common, and the 59 fps limitation for video will soon be a thing of the past. In video/cinema more fps also means more quality and definition, but that is another subject, related to interlaced vs. progressive methods, and has nothing to do with video games.
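For reference, the conversion behind those numbers is just frame time = 1000 / fps; a quick sketch:

```python
# fps-to-frame-time conversion: time between frames = 1000 / fps (ms).
for fps in (24, 30, 60, 144):
    print(f"{fps:3d} fps -> {1000 / fps:5.1f} ms between frames")
# 24 -> 41.7 ms, 30 -> 33.3 ms, 60 -> 16.7 ms, 144 -> 6.9 ms
```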

---

With Vsync enabled, when you have more fps than the refresh rate your GPU will drop frames and you will have what is commonly called "screen tearing", also commonly known as "frame overwrite".

Vsync prevents screen tearing, it doesn't cause it. And I've never heard the expression "frame overwrite", and apparently neither has Google. Other than that, you're absolutely right: vsync drops frames.

And believe me: in CS:GO, for instance, a player at 144 fps has a tremendous advantage over a player at 60 fps; he will see everything first.

Yes, he sees things a whole 9 milliseconds earlier. I really don't think that's anywhere near "a tremendous advantage". :)
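The 9 ms figure is just the frame-time difference:

```python
# Where the ~9 ms figure comes from: the frame-time difference.
t60, t144 = 1000 / 60, 1000 / 144      # 16.7 ms vs 6.9 ms per frame
print(f"{t60 - t144:.1f} ms")          # ~9.7 ms sooner per frame at 144 fps
```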

Edited by Greenfist

---
Yes, he sees things a whole 9 milliseconds earlier. I really don't think that's anywhere near "a tremendous advantage". :) [...]

Milliseconds have a "special" meaning in video games; I guess that's why screens with more than 1 millisecond of response time are considered garbage.

And in a game that requires good reflexes, fast reactions and extreme accuracy, yes, 9 milliseconds is an eternity.

For Arma 3, however, the same does not apply: the game works smoothly at 60 fps; the engine provides what is needed at that rate.

But there is no comparing CS and Arma; we would be comparing rabbits with turtles.

---
A3 at a constant 24 fps has no input lag, no choppiness, no delay. And it has nothing to do with the game engine, but with your psychology and visual capacities. Most people can't even perceive the difference above 20-odd fps in normal circumstances.

No offense, but have you actually researched any of this stuff or are you just going by experience? If 24 FPS looks okay to you, then great, but frame rate can and does affect input as well as simulation. Haven't you ever noticed that your fire rate slows down when you have very low FPS in Arma?

Edited by roshnak

---
Have you actually researched any of this stuff or are you just going by experience?

It was part of the mandatory subjects when I got my Computer Science degree a few years ago.

If 24 FPS looks okay to you, then great

Yes, a stable 24 FPS looks fine to me. In fact a lot of game developers lock the FPS at 30, tops.

In a high speed racing game or in a fireworks simulator it would be different.

but frame rate can and does affect input as well as simulation. Haven't you ever noticed that your fire rate slows down when you have very low FPS in Arma?

Well, it really depends on how the game has been coded. BUT in SP it doesn't matter, as the AI will have the same limitation, and in MP ping latency will make the real difference. The only situation where it may be a minor issue is on LAN, but practically no one I know plays on LAN.

(I'm talking about a decent stable frame rate, hence > 20 FPS)

---
Yes, a stable 24 FPS looks fine to me. In fact a lot of game developers lock the FPS at 30, tops.

It's not really about looks, even though 24 FPS is pretty jarring, even in Arma. At 24 FPS all of the nice fluid A3 motion goes out the window; it's clunky, slow to respond and reminiscent of Arma 2. A stable 35-45 is pretty comfortable and anything on top of that is a bonus; 60+ fps isn't really a requirement and is fairly unrealistic in anything simulation-based (e.g. DCS, which has the same fps issues in busy situations).

Developers locking at 30 fps is primarily about using shitty console hardware that can't run anything faster; that's not a design choice, even though they'd like to pretend it is.

---
It was part of the mandatory subjects when I got my Computer Science degree a few years ago. [...] (I'm talking about a decent stable frame rate, hence > 20 FPS)

Consoles lock to 30 fps because they cannot handle higher fps, and most users prefer IQ/resolution over fps. Plus they have auto-aim for FPS-type games.

Higher fps is objectively better than lower fps in ALL FPS games.

You cannot compare how a movie is displayed on screen with how a game renders images.

Lower fps means higher input lag and less responsiveness from the controls (mouse + keyboard).

Lower FPS (especially under 30 fps) can give you headaches and puts more strain on the eyes.

etc.

The only objective reasons to have lower fps in your game are hardware limitations or poor coding.

---
Consoles lock to 30 fps because they cannot handle higher fps, and most users prefer IQ/resolution over fps. Plus they have auto-aim for FPS-type games.

In fact this doesn't only happen on consoles, but in PC games too. The main reason is to keep a stable fps and avoid drops.

Also for other efficiency reasons, like power saving, etc.

Higher fps is objectively better than lower fps in ALL FPS games.

Define "objectively better"? The only real "improvement" is reduced eye strain, and only as long as the fps is kept exactly stable at the screen's refresh rate. On a 60 Hz screen it's better for your eyes to have either 60 fps or 30 fps, but no values in between.
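As a minimal sketch of why the in-between values are worse, assuming plain double-buffered Vsync (pure arithmetic, not a measurement): at 45 fps on a 60 Hz screen, frames land on an uneven, repeating 16.7/16.7/33.3 ms display pattern, which shows up as judder.

```python
# Why 45 fps judders on a 60 Hz Vsync'd display: frame k finishes at
# k/45 s and is shown at the next refresh tick n/60 s, n = ceil(4k/3).
ticks = [(4 * k + 2) // 3 for k in range(1, 8)]       # integer ceil(4k/3)
gaps_ms = [(b - a) * 1000 / 60 for a, b in zip(ticks, ticks[1:])]
print([round(g, 1) for g in gaps_ms])  # [16.7, 16.7, 33.3, 16.7, 16.7, 33.3]
```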

You cannot compare how a movie is displayed on screen with how a game renders images.

Yes, you can. The effect on our brains and eyes is quite similar.

Lower fps means higher input lag and less responsiveness from the controls (mouse + keyboard).

Already commented on before.

Lower FPS (especially under 30 fps) can give you headaches and puts more strain on the eyes.

etc.

Already commented on before. In fact it's far worse for your eyes to process fps drops at high fps than a stable low fps.

Basically, what the eye and brain do is process the differences between frames in order to generate motion. Constantly changing speed will tire you out more easily than even watching stop motion at a stable 10 fps.

The only objective reasons to have lower fps in your game are hardware limitations or poor coding.

Contrary to what the hardware sellers say: no. There are a good number of reasons to keep a medium fps rate (20-30), from purely economic ones to health ones.

---
NVidia control panel has default settings.

Set the "Power Management Mode" setting in your NI (NVidia Inspector) or your NVidia Control Panel to "Prefer Maximum Performance".

---
In fact this doesn't only happen on consoles, but in PC games too. The main reason is to keep a stable fps and avoid drops. [...] There are a good number of reasons to keep a medium fps rate (20-30), from purely economic ones to health ones.

Any developer worth their weight in code has a vsync option in the menu that can target either 30 fps or 60 fps. The ones using 30 fps are just cutting corners while porting console games locked at 30 fps. Also, FPS drops occur mostly in half-baked ports; a good game usually stays around the same value, or the drops are progressive, depending on the scene.

Objectively better as in lower lag: the game world updates faster than every 33.3 ms (30 fps), and overall there is less responsiveness lag.

I've played King of the Hill. The mod is basically what Ubisoft would call a "cinematic experience" at 22-24 fps. Rubbish; I got headaches and my eyes started to pinch.

Each frame of a movie is intertwined with the previous and next one, which gives it a "natural" motion blur, and in most cases that works well; some fast panning scenes might not be great, but it's fine overall. That's why when you pause a movie you can get a blurry image instead of a sharp, clear one. In a game the frames simply come one after another, so every pause shows a clear, sharp image. Because of this difference, games cannot be as fluid as a movie at lower FPS.

Then you have the biggest difference: you watch a movie, while you play a game. The "low fps syndrome" comes into play and it's annoying because of the lag and the quick loss of fluidity of motion (especially at 24 fps). 30 fps might be OK (acceptable) in other games/genres (like TW3, where you just spam the click in some direction), while in A3 you need a steady and precise aim, so higher FPS is paramount.

30 fps is the "sweet spot" for most games on consoles, like I've said (the ones on PC are just bad ports), due to performance reasons. If you don't see any difference between 24 fps and 60 fps, or 120/144 fps, then you must be trolling or your computer has some issues. The fact that you're subjectively OK with 24/30 fps is one thing, but to say it's enough for everyone takes it somewhat too far. I already disproved this "myth" when I had the chance, with a similarly minded buddy: he played first on a worse machine, then at 60+ fps on another. :)

Edited by calin_banc

---

Each frame of a movie is intertwined with the previous and next one, which gives it a "natural" motion blur [...] Because of this difference, games cannot be as fluid as a movie at lower FPS.

Came here to post exactly this. Games render a series of still images; film captures movement.

I made this gif real quick to demonstrate the concept: http://i.imgur.com/jy0NflW.gif

Both boxes are moving at 24 fps, but the bottom one appears to move more smoothly, with some slight motion blurring. When you focus on the edges of the top box, you can see each individual jump in position between frames. Not so with the motion blur: instead of pixels going instantly from grey to blue, there is a fade. But more than that, the slight elongation and blurring of the shape in the direction of movement simply conveys motion better. Our eyes have an easier time interpreting it as motion, and our brains fill in the gaps.
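For anyone curious, that kind of blur is easy to fake: render several sub-frame positions within one frame interval and average them, which roughly mimics a camera's open shutter. A minimal sketch with Pillow; the box size, colours and speed are made up for illustration:

```python
from PIL import Image, ImageDraw

W, H, FPS = 320, 80, 24
SUBSAMPLES = 8                    # sub-positions averaged per frame
SPEED_PX_S = 600                  # box speed in pixels per second (made up)

def frame_at(t):
    """Render one sharp frame of the moving box at time t (seconds)."""
    img = Image.new("RGB", (W, H), "grey")
    x = (t * SPEED_PX_S) % W
    ImageDraw.Draw(img).rectangle([x, 20, x + 40, 60], fill="blue")
    return img

def blurred_frame(t):
    """Average SUBSAMPLES positions across one frame interval ('open shutter')."""
    step = 1 / FPS / SUBSAMPLES
    acc = None
    for i in range(SUBSAMPLES):
        f = frame_at(t + i * step)
        # Running mean: blend the new sub-frame in with weight 1/(i+1).
        acc = f if acc is None else Image.blend(acc, f, 1 / (i + 1))
    return acc

frame_at(0.2).save("sharp.png")         # hard-edged box, like a game frame
blurred_frame(0.2).save("blurred.png")  # elongated, faded box, like film
```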

Then you have the biggest difference: you watch a movie, while you play a game. [...]

Another excellent point. FPS as a simple number is definitely not a consistent indication of fluidity across games. Hell, even introducing 0.2 seconds of input lag can make a 60 fps game feel considerably worse than a 20 fps game.
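The arithmetic behind that comparison, using the simplification that the only delays are the added input lag plus one frame time:

```python
# Simplified model: end-to-end delay ~ added input lag + one frame time.
lag_60 = 200 + 1000 / 60    # 60 fps plus 0.2 s input lag -> ~216.7 ms
lag_20 = 1000 / 20          # 20 fps, no extra lag        ->   50.0 ms
print(f"{lag_60:.1f} ms vs {lag_20:.1f} ms")
```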

---
In fact this doesn't only happen on consoles, but in PC games too.

Uh, in which games that are not bad console ports is this true?

Define "objectively better"? The only real "improvement" is reduced eye strain, and only as long as the fps is kept exactly stable at the screen's refresh rate. On a 60 Hz screen it's better for your eyes to have either 60 fps or 30 fps, but no values in between.

Even if it were true that the only improvement is reduced eye strain, how is that not objectively better? It's certainly not worse.

Also, here's a study that shows increased player performance (shooting) all the way up through 60 FPS:

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.217.393&rep=rep1&type=pdf

Already commented before.

You did comment before but, again, input is absolutely affected by frame rate, especially in first-person shooters. Your mouse controls the camera, and if the camera position is only updated 24 times per second, it is by definition less accurate and responsive than a camera whose position is updated 60 times per second.

Claiming otherwise would be like claiming that reducing the mouse polling rate to 24 Hz wouldn't impact responsiveness.
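The sampling argument can be sketched in a few lines; the mouse motion below is made up and purely illustrative:

```python
import math

def mouse_x(t):
    """Hypothetical continuous mouse position over time (arbitrary units)."""
    return math.sin(2 * math.pi * 2 * t)   # a quick 2 Hz flick back and forth

def worst_error(sample_hz, duration=1.0):
    """Largest gap between the true position and the last sampled one."""
    err, t = 0.0, 0.0
    while t < duration:
        frame_start = t - (t % (1 / sample_hz))   # most recent camera update
        err = max(err, abs(mouse_x(t) - mouse_x(frame_start)))
        t += 0.001                                # check every millisecond
    return err

for hz in (24, 60, 144):
    print(f"camera updated {hz:3d}x/s -> worst lag error {worst_error(hz):.2f}")
```

The coarser the update rate, the further the displayed camera can lag behind where the mouse actually is mid-motion, which is exactly the responsiveness difference being described.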

Edited by roshnak
