joejoe87577

Why Bohemia Why?


Alright...

But all this doesn't explain why I have 50 FPS in the cockpit view and 70 FPS in the third-person view.

[Image: not-sure-if-trolling-or-being-serious.jpg]


Seems like I'm late to the party, but still, answering the OP:

LOD switching, seems like too slow / unoptimized HDD handling

Seems like I'm late to the party, but still, answering the OP:

LOD switching, seems like too slow / unoptimized HDD handling

LOD switching wouldn't cause a persistent lowering of frame rate as was described. The issue was almost certainly PIP.

The issue was almost certainly PIP.
Ah, I forgot about this one. True.
LOD switching wouldn't cause a persistent lowering of frame rate as was described.
Not true. It's been like that since ArmA2 and not only on my low-end PC.

If you don't see any difference between 24fps and 60fps or 120/144fps, then you must be trolling or your computer has some issues.

Where have I said that I don't see any difference? In fact, I even described a difference. :j:

I simply gave you the scientific point of view from someone who had to study it professionally.

As I said, you can play at 2500 fps if you want; on a conventional 60Hz screen it will make no difference compared to playing at 60 fps.

And your eye strain will be worse at a fluctuating 50-60 fps than at a stable 20 fps.

But if you'd rather trust your "feelings/perceptions", hardware companies' propaganda and the placebo effect, that's fine with me.

Also, here's a study that shows increased player performance (shooting) all the way up through 60 FPS.

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.217.393&rep=rep1&type=pdf

A study paid for by the tech giant Oracle, which has been partnering with Nvidia and has shown a clear intention of buying AMD... Really objective. :rolleyes:

- - -

To give both of you a simple example:

You can buy a Ferrari as a utilitarian car to get to work every day inside the city. It has objective advantages in acceleration over your average hatchback.

Would you really notice them inside the city? No.

Are they worth the cost/efficiency? No.

But of course you can strut around, showing how much money you spent on it...

Edited by MistyRonin


But if you'd rather trust your "feelings/perceptions", hardware companies' propaganda

Tin foil hat on. :rolleyes:

Yeah, I'd rather trust my decreased eye fatigue, better experience and better results while playing at 50-60 fps compared to 24-30 fps than some funny conspiracy theory on the web.


Talking to MistyRonin is pointless. He is always the only one who is correct about everything. Don't waste your time.

Define objectively better? The only real "improvement" is that it reduces eye strain (as long as it's kept at the same exact and stable fps as the screen refresh).

It IS objectively better in response time. I played Planetside 2 with an old card and got 25-30 fps. K/D ratio about 1.3; I always lost in close encounters because aiming / putting the crosshair where I wanted felt difficult somehow. Got a new card, got 50-60 fps, and immediately my K/D ratio increased to 2 in a play session and eventually leveled out at 2.3. Same playstyle, same character classes, same ping, etc., because the increased response allowed for more accurate movement; the difficulties/sluggishness during aiming were gone.

Edited by X3KJ

Where have I said that I don't see any difference? In fact, I even described a difference. :j:

I simply gave you the scientific point of view from someone who had to study it professionally.

As I said, you can play at 2500 fps if you want; on a conventional 60Hz screen it will make no difference compared to playing at 60 fps.

And your eye strain will be worse at a fluctuating 50-60 fps than at a stable 20 fps.

But if you'd rather trust your "feelings/perceptions", hardware companies' propaganda and the placebo effect, that's fine with me.

Mate,

When we speak about PC games, 60 fps (in general) is the ideal; it is the sweet spot.

However, some games like Civilization V can be played at 30 fps (I play it at 30 fps mainly to save CPU power) without issues, but these types of games are not first-person shooters.

You spoke about racing games. I play Assetto Corsa, which is a racing simulator (very demanding on hardware), at 60 fps, because there is no point or need for more; however, below 60 fps we start to get the so-called "choppiness".

But, as I have said, some game engines require a bit more to achieve the sweet spot. Unreal Engine 3 and Source are two of them. It is not by accident that UE3 games have, by default, frames per second capped at 90 fps and CS:GO capped at 300 fps (120 fps in the menu).

Now, we need to find (for every game) a spot in matters of frames per second where we can have a stable value, without frame rate drops, and that is mainly dependent on our hardware. If we can't achieve at least a stable 30 fps (with all the choppiness that comes with it) it is better to uninstall the game, otherwise some health issues may arise as a consequence.

Films, unlike games, run at 24 frames per second; however, 48p is currently being used in the film industry and 72p is in experimental stages.

The main reason why these formats are taking some time to become standard, even for TV, is broadcast constraints due to the increased data flow in transmissions.

It IS objectively better in response time. I played Planetside 2 with an old card and got 25-30 fps. K/D ratio about 1.3; I always lost in close encounters because aiming / putting the crosshair where I wanted felt difficult somehow. Got a new card, got 50-60 fps, and immediately my K/D ratio increased to 2 in a play session and eventually leveled out at 2.3. Same playstyle, same character classes, same ping, etc., because the increased response allowed for more accurate movement; the difficulties/sluggishness during aiming were gone.

I don't want to sink much time into it. But games are programmed in cycles. In each cycle the input is processed, as well as the picture / frame to display. That's why the response is exactly the same. (In fact, for FPS games a trick I used 10 years ago was to limit the fps in order to gain reaction time.)
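
For anyone wondering what "programmed in cycles" means in practice, here is a rough sketch of that kind of single-threaded game loop (the function names are made up for illustration, not taken from any real engine). Input is read once per cycle, so it can only be sampled as often as frames are produced.

```python
import time

def poll_input():
    # Illustrative stand-in: a real engine reads keyboard/mouse state here.
    return {}

def update(state, inputs, dt):
    # Illustrative stand-in: advance the simulation by dt seconds.
    return state

def render(state):
    # Illustrative stand-in: build and present one frame.
    pass

state = {}
previous = time.perf_counter()
for _ in range(600):                 # run ~600 cycles instead of forever
    now = time.perf_counter()
    dt = now - previous              # frame time: ~33 ms at 30 fps, ~17 ms at 60 fps
    previous = now
    inputs = poll_input()            # input is read once per cycle...
    state = update(state, inputs, dt)
    render(state)                    # ...so input is sampled at most once per frame
```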

When I studied computer science I had to program a few games as projects.

- - -

When we speak about PC games, 60 fps (in general) is the ideal; it is the sweet spot.

In fact, that's not true. The sweet spot is when the fps matches the screen refresh rate and, most importantly, is stable.

You spoke about racing games. I play Assetto Corsa, which is a racing simulator (very demanding on hardware), at 60 fps, because there is no point or need for more; however, below 60 fps we start to get the so-called "choppiness".

As I said previously, high-speed games like racing are where fps could make a significant difference. Arma is not a high-speed game.

It is not by accident that UE3 games have, by default, frames per second capped at 90 fps and CS:GO capped at 300 fps (120 fps in the menu).

It doesn't make much sense to generate more frames than the ones you can see. Hence on a 60Hz screen, which is the average nowadays, 90 fps has the same exact effect as 60 fps (as your screen will only show 60 fps). Well, your computer would just use more resources for something that won't be displayed on screen.
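
A back-of-the-envelope sketch of the arithmetic behind that claim (the 60Hz / 90 fps figures are just the ones from the post):

```python
refresh_hz = 60                     # a typical 60Hz monitor
render_fps = 90                     # what the GPU happens to produce

refresh_interval_ms = 1000 / refresh_hz   # ~16.7 ms between display refreshes
frame_time_ms = 1000 / render_fps         # ~11.1 ms to render each frame

# The display can only show refresh_hz distinct images per second,
# no matter how many frames the GPU renders.
shown = min(render_fps, refresh_hz)
surplus = max(0, render_fps - refresh_hz)
print(f"display refreshes every {refresh_interval_ms:.1f} ms, frame rendered every {frame_time_ms:.1f} ms")
print(f"rendered {render_fps}/s, displayed at most {shown}/s, surplus {surplus}/s")
```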

Now, we need to find (for every game) a spot in matters of frames per second where we can have a stable value, without frame rate drops, and that is mainly dependent on our hardware.

In that I agree. Stable is the keyword in this debate.

If we can't achieve at least a stable 30 fps (with all the choppiness that comes with it) it is better to uninstall the game, otherwise some health issues may arise as a consequence.

Not really. Feel free to ask your favorite ophthalmologist. The main causes of eye strain are listed here: (MD) Eye strain

As you can see, frame rate does not even appear in the list, and its "main" issue is fps drops. As long as the fps is stable, it won't take too much of a toll on your eyes.

So health-wise, what matters is achieving a stable frame rate, not a higher one.

Films, unlike games, run at 24 frames per second; however, 48p is currently being used in the film industry and 72p is in experimental stages.

The main reason why these formats are taking some time to become standard, even for TV, is broadcast constraints due to the increased data flow in transmissions.

48p is starting to be experimented with in fast-action movies. And it makes sense, because the frame rate is stable, hence you can push further without issues (well, besides the economic cost). Though in normal movies it's not needed, as the action will never be "fast" enough to really need those extra frames.

Edited by MistyRonin

As I said previously, high-speed games like racing are where fps could make a significant difference. Arma is not a high-speed game.

Very much agreed. While I would love a stable 60 FPS in Arma just as much as anyone else, I just know it isn't going to happen on my hardware with this engine. As far as playability at low frame rates goes, Arma is pretty high up there; I could not imagine myself playing most other games at the frame rates which Arma can dip down to. But pushing Arma's performance and stability is still just as important, regardless of this.

Back when my computer was shittier, I used to cap my FPS at 30 in games to get a smoother experience. The only thing worse than shitty FPS is choppy and "hiccup-y" FPS, at least in my opinion; it can be quite jarring and irritating. It's one of the primary reasons I have stopped playing Fallout: New Vegas: while I could get a full 75 FPS at times, it would become choppy and hiccup in high-stress areas. (As many games do, Arma can go to shit during big firefights for me.)

Edited by MikeTim


@MistyRonin

Now I see some contradictions in your statements.

First you say that 60 fps is not the sweet spot, then you say the sweet spot is when the fps meets the screen refresh rate, and then you say that 60Hz screens are the average these days. I am confused here.

If 60Hz is the average, shouldn't 60 fps be the sweet spot and where game developers are aiming?

Another thing: the frames rendered by your graphics card and displayed on your screen are linked to the screen refresh rate only when you have Vertical Synchronization enabled; with Vsync disabled your screen will display the frames that your graphics card can render (even if more than 60 fps on a 60Hz screen).

If your graphics card is not able to render 60 fps, obviously your screen will display less, with Vsync on or off.

In fact, these are the reasons why gamers prefer to have Vsync disabled: some because they can have more than 60 fps and some because they can't reach 60 fps.

High-speed games do not necessarily need more fps just because they are high-speed games; one thing is not linked with the other. I have several racing games with different engines in my Steam account and not a single one meets that requirement.

The only thing that can demand more fps for better gameplay is the engine itself. Like I said, some engines require more than 60 fps and almost all require more than 30 fps for smooth gameplay, that's for sure.

Mate, 60 fps is not the same thing as 90 fps, and by the way, my screen shows 90 fps even with Vsync enabled (maybe because I have a 144Hz screen).

But, as I have also said, of all the games that I play the only ones that require more than 60 fps are the UE3- and Source-based ones.

I play all UE3 games locked at 90 fps and CS:GO locked at 144 fps, and it is not because I want to, it is because these engines need those fps.

With UE3 the frames per second are displayed with milliseconds; there we can see clearly (with a few tools available over the web) where the input lag starts. Try it and you will see that it starts as soon as you go under 90 fps.

With Arma 3, I play with fps locked at 60, and again it is not because I want to have 60 fps, it is because with the Arma 3 engine there is no benefit in having more than 60 fps, and because below 60 fps we start to get input lag, delay and choppiness.

Also (and this applies to all game engines, depending on each one), above some value fps stops making a difference; you can have even 1000 fps and you will see no difference. That's the case with UE3, where above 90 fps makes no difference, the same for Arma 3 above 60 fps, or even the same for CS:GO above 128, but CS:GO is a particular case because fps is linked to the tickrate.
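
A frame cap like the ones mentioned above usually boils down to a simple limiter in the main loop; here is a rough, hypothetical sketch of the idea (not actual UE3, Source or Arma code):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.7 ms per frame at a 60 fps cap

def run_one_frame():
    # Illustrative stand-in for input + simulation + rendering of one frame.
    pass

for _ in range(600):                 # a bounded run instead of an endless loop
    start = time.perf_counter()
    run_one_frame()
    elapsed = time.perf_counter() - start
    # Sleep off whatever is left of the frame budget so the loop never runs
    # faster than the cap; if the frame already took too long, don't sleep.
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```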

Again, video or films cannot be compared with games in these matters, since they are two completely different concepts and not related.

Now I see some contradictions in your statements. First you say that 60 fps is not the sweet spot, then you say the sweet spot is when the fps meets the screen refresh rate, and then you say that 60Hz screens are the average these days. I am confused here.

You should probably read my message again then; what I said was:

The sweet spot is when the fps matches the screen refresh rate and, most importantly, is stable.

So no, the sweet spot is not an exact value for everyone. It depends mainly on whether the fps is stable (in addition to other variables like the screen, the hardware, etc.). On a 75Hz monitor, the sweet spot will be a stable 75 fps (or, if that's not possible, a fraction of 75, like 1/2 or 1/4). But being stable is the main condition.

For a 60Hz screen (which is the most common nowadays), the sweet spot would be a stable 60 fps, or, if that's not possible, again 1/2 of that: 30 fps. But that's only for 60Hz, not for everyone else.
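
As a quick illustration of that rule of thumb (the helper function is made up for this example):

```python
def cap_candidates(refresh_hz):
    # The caps described above: the refresh rate itself, then simple
    # fractions of it (1/2, 1/3, 1/4), so each displayed frame is held
    # for a whole number of refresh cycles.
    return [refresh_hz / d for d in (1, 2, 3, 4)]

print(cap_candidates(60))   # [60.0, 30.0, 20.0, 15.0]
print(cap_candidates(75))   # [75.0, 37.5, 25.0, 18.75]
```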

Another thing: the frames rendered by your graphics card and displayed on your screen are linked to the screen refresh rate only when you have Vertical Synchronization enabled; with Vsync disabled your screen will display the frames that your graphics card can render (even if more than 60 fps on a 60Hz screen).

Of course. You can technically "try to display" more fps than your Hz, which will produce nice screen tearing, because you would be trying to display more frames than your screen can draw per refresh. Hence it would not be playable, and quite absurd.

Again, video or films cannot be compared with games in these matters, since they are two completely different concepts and not related.

Frames per second are frames per second, be it in a game or in a movie (or in a flipbook drawn in a notebook). In a game, or in certain movies, there is not much motion blur, hence the perception may be slightly different. But the strain is the same. The main difference is that in badly optimized games the frame rate may vary, and in movies it doesn't.

---------- Post added at 21:58 ---------- Previous post was at 21:35 ----------

As I can't explain it to you any better, check this article:

(Ign) Understanding the importance of frame rate

One of the reasons developers lock a game at 30 FPS is due to such dips - it's better to have a solid game running at 30 FPS than one that can run at 60 FPS but doesn't always.

I hope that will calm the "higher! higher! fps" obsession. :D

Edited by MistyRonin


And what about G-Sync? Does it really help when the fps is not stable?


@MistyRonin

Mate, video games are not films; it is not the same concept.

We are speaking about online PC games, in particular first-person shooters. Do you know that frames per second are also connected with the netcode, in order to achieve the best synchronization possible between the client and the server?

Do you know that earlier versions of UE3 had fps capped at 62? Do you know why? Because the tickrate of the UE3 netcode was lower.

When you go below (or above) certain fps limits it also affects your online gameplay experience, for instance with hit registration.

Mate, fps in PC games is not the fps from movies; it's a bit more complex. But if you are OK with your 24/30 fps in games, I am not going to argue about that. It's a matter of personal preference or perception, I guess.

We are speaking about online PC games, in particular first-person shooters. Do you know that frames per second are also connected with the netcode, in order to achieve the best synchronization possible between the client and the server?

No, we (at least I) were talking about games in general, using Arma 3 as an example.

Arma 3 is not an online first-person shooter; it is a tactical shooter "sandbox" that has an MP option.

And of course, if we go to the MP part, there are a lot more variables involved, including the most important one: the internet connection. And of course how the data transfer code was designed.

In that context, fps on the client doesn't matter that much.

And what about G-Sync? Does it really help when the fps is not stable?

G-Sync only works with Nvidia cards and on certain screens. Basically what it does is sync the display with the GPU output, hence screen tearing disappears, allowing you to play at variable fps (the display refresh rate matches the GPU fps).

Edited by MistyRonin


To me Arma is a first-person shooter; I don't use the third-person view gimmick.

And Arma is an online first/third-person shooter. Tactical shooter is something that does not exist as a game concept. It is a gameplay style.

Also, as with many other online games, Arma has some options for single player.

Anyway, we aren't going to get anywhere with this; we have a different perception of PC games.

To me Arma is a first-person shooter; I don't use the third-person view gimmick.

And Arma is an online first/third-person shooter. Tactical shooter is something that does not exist as a game concept. It is a gameplay style.

Also, as with many other online games, Arma has some options for single player.

Heck, if the OFP / Arma series is an online first-person shooter, then the BI devs have really strange priorities, given that for more than a year after the release of Arma 3 the MP part was practically unplayable.

BTW, that would also mean that all the magazine and web reviews lie about its genre. Even BI lies about it:

Experience true combat gameplay in a massive military sandbox. Deploying a wide variety of single- and multiplayer content, over 20 vehicles and 40 weapons, and limitless opportunities for content creation, this is the PC’s premier military game. Authentic, diverse, open - Arma 3 sends you to war.

Heck, if the OFP / Arma series is an online first-person shooter, then the BI devs have really strange priorities, given that for more than a year after the release of Arma 3 the MP part was practically unplayable.

BTW, that would also mean that all the magazine and web reviews lie about its genre. Even BI lies about it:

I can't even tell where the goal posts are anymore.

I don't want to sink much time into it. But games are programmed in cycles. In each cycle the input is processed, as well as the picture / frame to display. That's why the response is exactly the same. (In fact, for FPS games a trick I used 10 years ago was to limit the fps in order to gain reaction time.)

No, the responsiveness is not the same if you get 30 updates per second compared to 60 per second. It can't be. The time between each update is doubled at 30, ergo it is half as responsive. 60 fps allows higher movement precision. Or how else do you explain the sudden change in in-game success I experienced? Even now in Arma I notice the same sluggishness in response when it's running at 30 or lower (again).
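
Spelling out the arithmetic behind that point (illustrative only):

```python
for fps in (30, 60, 120):
    interval_ms = 1000 / fps
    print(f"{fps} fps -> a new simulation/input cycle every {interval_ms:.1f} ms")

# 30 fps  -> every 33.3 ms
# 60 fps  -> every 16.7 ms
# 120 fps -> every 8.3 ms
```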

Why do you think they use 120 fps for VR if not for the increased responsiveness?

Edited by X3KJ


Of course more fps is always better. Not sure why this is debated. Although you sometimes go for less fps in exchange for more graphical detail. If you guys didn't care about graphics you'd still be playing OFP... ;)


The human eye is designed in such a way that it can only distinguish 24 different TV channels per second.


I've always been able to tell the difference between 60 fps and anything lower than that, especially anything under 30 fps, even if it is constant and stable. Heck, I can tell when a monitor is 120Hz instead of 60Hz; I would hope that other people can too. :rolleyes:

I've always been able to tell the difference between 60 fps and anything lower than that, especially anything under 30 fps, even if it is constant and stable. Heck, I can tell when a monitor is 120Hz instead of 60Hz; I would hope that other people can too. :rolleyes:

It's extremely obvious to anyone who owns a 120Hz monitor. Tests done on random people show that basically everyone can tell the difference with a short amount of exposure to it; see the Linus test for example, where he gets it right 100% of the time on a test to see if he can tell the difference between 60 and 120.

60Hz is far from the limit of the human eye. The current estimate, based on tests of pilots spotting silhouettes of planes, showed we can do that with just a 1 ms flash; any shorter than that and we can't tell it was there. So it looks like the point where we can't tell a screen from reality in terms of update rate would be around 1000Hz.

