Aurvan

How VSync works, and why people loathe it.


I stole this from another forum I found using Google. I posted this because there has been some discussion about VSync and performance, and how disabling it improves your FPS. Original thread here:

http://www.hardforum.com/showthread.php?t=928593

I recently learned that how I thought VSync worked was wrong, and now that I know how it really works, I think it's worthwhile to make sure everyone here understands it.

What is VSync? VSync stands for Vertical Synchronization. The basic idea is that it synchronizes your FPS with your monitor's refresh rate. The purpose is to eliminate something called "tearing". I will describe all these things here.

Every CRT monitor has a refresh rate, specified in Hz (Hertz, cycles per second). It is the number of times the monitor updates the display per second. Different monitors support different refresh rates at different resolutions, ranging from 60Hz at the low end up to 100Hz and higher. Note that this is not the FPS your games report. If your monitor is set at a specific refresh rate, it always updates the screen at that rate, even if nothing on it is changing. On an LCD, things work differently: pixels on an LCD stay lit until they are told to change, so they don't have to be refreshed. However, because of how VGA (and DVI) works, the LCD must still poll the video card for new frames at a fixed rate. This is why LCDs still have a "refresh rate" even though they don't actually have to refresh.
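To put numbers on it, the refresh rate fixes how much time the monitor spends on each screen update. A quick calculation (a minimal Python sketch, nothing game-specific):

```python
# Time per refresh cycle for a few common refresh rates.
for hz in (60, 75, 85, 100):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per screen update")
# At 75 Hz the monitor grabs a new image every ~13.3 ms,
# no matter how fast or slow the video card is rendering.
```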

I think everyone here understands FPS. It's how many frames the video card can draw per second. Higher is obviously better. However, during a fast-paced game, your FPS rarely stays the same. It moves around as the complexity of the image the video card has to draw changes with what you are seeing. This is where tearing comes in.

Tearing is a phenomenon that gives a disjointed image. Imagine you took a photograph of something, then rotated your view just one degree to the left and took another photograph, then cut the two pictures in half and taped the top half of one to the bottom half of the other. The images would be similar, but there would be a noticeable mismatch between the top half and the bottom half. This is what is called tearing on a visual display. It doesn't always cut right through the middle: it can be near the top or the bottom, and the separation point can move up or down the screen, or seem to jump back and forth between two points.

Why does this happen? Let's take a specific example. Say your monitor is set to a refresh rate of 75Hz, you're playing your favorite game, and you're getting 100FPS. That means the monitor is updating itself 75 times per second, but the video card is updating the display 100 times per second, 33% faster than the monitor. So in the time between screen updates, the video card draws one frame and a third of the next one. That third of the next frame overwrites the top third of the previous frame and gets drawn on the screen. The video card then finishes the rest of that frame and starts on the next one before the screen updates again. As you can see, this causes a tearing effect, as most screen updates show an image that is partly disjointed from the rest of the display. This won't really be noticeable if what is on the screen isn't changing much, but if you're looking around quickly the effect will be very apparent.
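If you want to see where the tear line actually lands, you can simulate the example (a Python sketch using the 75Hz / 100FPS numbers from above; it works in exact integer ticks so the arithmetic stays clean):

```python
# Where does the tear line land at 100 FPS on a 75 Hz monitor?
# Work in ticks of 1/300 s so the arithmetic is exact:
# one refresh = 4 ticks (75 Hz), one rendered frame = 3 ticks (100 FPS).
REFRESH_TICKS, FRAME_TICKS = 4, 3

for n in range(1, 7):
    # Fraction of the current refresh already scanned out when frame n lands.
    pos = (n * FRAME_TICKS) % REFRESH_TICKS / REFRESH_TICKS
    print(f"frame {n} arrives {pos:.0%} of the way down the screen")
# Output: 75%, 50%, 25%, 0% (a clean swap), then the cycle repeats --
# the tear line walks up the screen and most refreshes show a torn image.
```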

Now this is where the common misconception comes in. Some people think that the solution to this problem is to simply create an FPS cap equal to the refresh rate. So long as the video card doesn't go faster than 75 FPS, everything is fine, right? Wrong.

Before I explain why, let me talk about double-buffering. Double-buffering is a technique that mitigates the tearing problem somewhat, but not entirely. Basically you have a frame buffer and a back buffer. Whenever the monitor grabs a frame to refresh with, it pulls it from the frame buffer. The video card draws new frames in the back buffer, then copies each one to the frame buffer when it's done. However, the copy operation still takes time, so if the monitor refreshes in the middle of the copy, it will still show a torn image.

VSync solves this problem by creating a rule that says the back buffer can't copy to the frame buffer until right after the monitor refreshes. With a framerate higher than the refresh rate, this is fine. The back buffer is filled with a frame, the system waits, and after the refresh, the back buffer is copied to the frame buffer and a new frame is drawn in the back buffer, effectively capping your framerate at the refresh rate.
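In code terms, a double-buffered loop with VSync looks roughly like this (a runnable Python sketch; render and wait_for_vblank are stand-ins for what the driver and GPU actually do, not a real API):

```python
import time

REFRESH_HZ = 75                 # monitor refresh rate from the example

def render(n):
    time.sleep(0.010)           # pretend drawing takes 10 ms (a 100 FPS card)
    return f"frame {n}"

def wait_for_vblank(start):
    # Sleep until the next 1/75 s refresh boundary -- the VSync rule.
    period = 1.0 / REFRESH_HZ
    elapsed = time.perf_counter() - start
    time.sleep(period - (elapsed % period))

start = time.perf_counter()
for n in range(5):
    back_buffer = render(n)     # draw the next frame in the back buffer
    wait_for_vblank(start)      # copy to the frame buffer only after a refresh
    print(f"displayed {back_buffer} at {time.perf_counter() - start:.4f} s")
# Frames appear at multiples of ~13.3 ms: the 100 FPS card is capped at 75 FPS.
```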

That's all well and good, but now let's look at a different example. Say you're playing the sequel to your favorite game, which has better graphics. You're still at a 75Hz refresh rate, but now you're only getting 50FPS, 33% slower than the refresh rate. That means every time the monitor updates the screen, the video card draws 2/3 of the next frame. So let's track how this works. The monitor just refreshed, and frame 1 is copied into the frame buffer. 2/3 of frame 2 gets drawn in the back buffer, and the monitor refreshes again, grabbing frame 1 from the frame buffer for the first time. Now the video card finishes the last third of frame 2, but it has to wait, because it can't update the frame buffer until right after a refresh. The monitor refreshes, grabbing frame 1 for the second time, and frame 2 is put in the frame buffer. The video card draws 2/3 of frame 3 in the back buffer, and a refresh happens, grabbing frame 2 for the first time. The last third of frame 3 is drawn, and again we must wait for the refresh; when it happens, frame 2 is grabbed for the second time, and frame 3 is copied in. We went through 4 refresh cycles but only 2 frames were drawn. At a refresh rate of 75Hz, that means we see 37.5FPS. That's noticeably less than the 50FPS the video card is capable of. This happens because the video card is forced to waste time after finishing a frame in the back buffer: it can't copy it out, and it has nowhere else to draw frames.
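You can get the same result without walking through the buffers by hand: round each frame's render time up to a whole number of refresh intervals (that rounding is the forced wait) and invert. A sketch using exact fractions so the numbers come out clean:

```python
from fractions import Fraction
import math

def effective_fps(refresh_hz, card_fps):
    """Framerate you actually see with double-buffered VSync."""
    refresh_t = Fraction(1, refresh_hz)  # seconds per refresh
    frame_t = Fraction(1, card_fps)      # seconds to render one frame
    # Each frame costs its render time rounded UP to whole refreshes,
    # because the finished frame must wait for the next refresh to swap.
    refreshes = math.ceil(frame_t / refresh_t)
    return float(1 / (refreshes * refresh_t))

print(effective_fps(75, 100))  # 75.0  -> capped at the refresh rate
print(effective_fps(75, 50))   # 37.5  -> every frame now spans two refreshes
print(effective_fps(75, 60))   # 37.5  -> even 60 FPS collapses to 37.5
```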

Essentially this means that with double-buffered VSync, the framerate is limited to a discrete set of values: Refresh / N, where N is some positive integer. If you're talking about a 60Hz refresh rate, the only framerates you can get are 60, 30, 20, 15, 12, 10, and so on. You can see the big gap between 60 and 30 there. Any framerate between 60 and 30 your video card would normally put out gets dropped to 30.
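Enumerating the list for a 60Hz monitor is a one-liner:

```python
print([60 / n for n in range(1, 7)])  # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0]
```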

Now maybe you can see why people loathe it. Let's go back to the original example. You're playing your favorite game at a 75Hz refresh rate and 100FPS. You turn VSync on, and the game limits you to 75FPS. No problem, right? Fixed the tearing issue, it looks better. Then you get to an area that's particularly graphically intensive, one that would drop your FPS to about 60 without VSync. Now your card cannot do the 75FPS it was doing before, and since VSync is on, it has to fall to the next value down the list, which is 37.5FPS. So your game, which was running at 75FPS, just halved its framerate instantly. Whether or not you find 37.5FPS smooth doesn't change the fact that the framerate just cut in half suddenly, which you would notice. This is what people hate about it.

If you're playing a game whose framerate routinely stays above your refresh rate, then VSync is generally a good thing. However, if it's a game that moves above and below it, then VSync can become annoying. Even worse, if the game runs at an FPS just below the refresh rate (say you get 65FPS most of the time at a 75Hz refresh rate), the video card has to settle for putting out much less FPS than it could (37.5FPS in that instance). This second example is where the perceived drop in performance comes in. It looks like VSync just killed your framerate. Technically it did, but not because it's a graphically intensive operation. It's simply the way it works.

All hope is not lost, however. There is a technique called triple-buffering that solves this VSync problem. Let's go back to our 50FPS, 75Hz example. Frame 1 is in the frame buffer, and 2/3 of frame 2 are drawn in the back buffer. The refresh happens and frame 1 is grabbed for the first time. The last third of frame 2 is drawn in the back buffer, and the first third of frame 3 is drawn in the second back buffer (hence the term triple-buffering). The refresh happens, frame 1 is grabbed for the second time, frame 2 is copied into the frame buffer, and the first part of frame 3 goes into the back buffer. The last 2/3 of frame 3 are drawn in the back buffer, the refresh happens, frame 2 is grabbed for the first time, and frame 3 is copied to the frame buffer. The process starts over. This time we still got 2 frames, but in only 3 refresh cycles. That's 2/3 of the refresh rate, which is 50FPS, exactly what we would have gotten without VSync. Triple-buffering essentially gives the video card somewhere to keep doing work while it waits to transfer the back buffer to the frame buffer, so it doesn't have to waste time. Unfortunately, triple-buffering isn't available in every game, and in fact it isn't too common. It can also cost a little performance, as it requires extra VRAM for the buffers and time spent copying them around. However, triple-buffered VSync really is the key to the best experience, as you eliminate tearing without the downsides of normal VSync (unless you consider the fact that your FPS is capped a downside... which is silly, because you can't see an FPS higher than your refresh rate anyway).
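To check that triple-buffering really recovers the full 50FPS, you can count how many distinct frames reach the screen during one second of refreshes when the card renders continuously (a Python sketch with the same 50FPS / 75Hz numbers, using exact fractions to avoid float noise):

```python
from fractions import Fraction

REFRESH_HZ, CARD_FPS = 75, 50
refresh_t = Fraction(1, REFRESH_HZ)  # time between refreshes
frame_t = Fraction(1, CARD_FPS)      # the card renders continuously, never stalls

# At each refresh the monitor shows the newest frame completed so far.
shown = {(r + 1) * refresh_t // frame_t for r in range(REFRESH_HZ)}
shown.discard(0)                     # refreshes before the first frame is ready
print(len(shown), "distinct frames shown in one second")  # 50
# Some frames are shown twice (75 refreshes, 50 frames), but none are
# skipped and the card never idles -- the full 50 FPS reaches the screen.
```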

I hope this was informative, and that it helps people understand the intricacies of VSync (and hopefully curbs the "VSync, yes or no?" debates!). Generally, if triple buffering isn't available, you have to decide whether the discrete framerate limitations of VSync, and the issues they can cause, are worth the visual improvement of eliminating tearing. It's a personal preference, and it's entirely up to you.


Good post, hopefully it'll help people understand the workings of the dreaded VSync better. Thanks for sharing, matey!


The question is... does ArmA support triple buffering (if it also depends on the game software, as suggested here)? Guess only BI can answer this.


Well, would you mind stating your advice to ArmA players, considering this knowledge and the game's properties?

What should they do to get a good FPS?

Monk.


Awesome post, good job :-)

Sticky?

Quote: The question is... does ArmA support triple buffering (if it also depends on the game software, as suggested here)? Guess only BI can answer this.

ArmA uses another technique, which is a variant of triple buffering - all rendering is done into an auxiliary buffer, which is copied into the back buffer once the scene is complete (during the copy a final post-process effect is done as well).

From the VSync point of view this should work the same as triple buffering - as rendering is not done into the back buffer, there is no need for the rendering to wait until the back buffer is switched.

Of course, buggy or poorly implemented drivers may behave differently in this respect, waiting even when not necessary, but that is a different story.
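To illustrate the shape of what's being described - all slow rendering goes to an off-screen auxiliary buffer, and only a short copy-plus-postprocess ever touches the back buffer - here is a rough Python sketch (draw_scene and copy_with_postfx are hypothetical stand-ins, not ArmA's actual renderer):

```python
# Rough sketch of the auxiliary-buffer scheme described above.
aux_buffer = {}   # off-screen render target: all the slow drawing lands here
back_buffer = {}  # the buffer VSync actually swaps to the screen

def draw_scene(target, frame):
    # The expensive part. It never touches the back buffer,
    # so it is never blocked waiting for a refresh.
    target["image"] = f"scene {frame}"

def copy_with_postfx(src, dst):
    # The cheap part: one quick copy applying the final post-process
    # effect. Only this step can ever have to wait for VSync.
    dst["image"] = src["image"] + " + post fx"

for frame in range(3):
    draw_scene(aux_buffer, frame)
    copy_with_postfx(aux_buffer, back_buffer)
    print("present:", back_buffer["image"])
```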


Good info. So many don't know what VSync is for.

Anyway, I must have one of those "buggy drivers", since I get either 60FPS or 30. No 40 at all. ArmA's "other technique" isn't working here.

Trying to force triple buffering in the NVIDIA control panel doesn't help either.

I'd much rather have 45 (or even 40) FPS than 30, so VSync has to stay off for now.


Very informative post!

Being a non-technical person, I'd like to ask one question to make matters clearer:

Can tearing happen when the FPS is below the refresh rate, with VSync turned off?

From what I understand, if VSync is OFF, tearing disappears only when the FPS and refresh rate are equal.

Please correct me if I'm wrong.


Correct, tearing happens when the FPS is below the refresh rate as well. Drawing 2/3 of a screen in one refresh cycle has the same effect as drawing 1 and 1/3 screens in one refresh cycle.


I'd just advise everyone to disable the darn thing. If it bothers you, you'll get used to it quite soon. I couldn't stand tearing before; now I hardly notice it, and the games run much smoother. Cheers.

Quote: The question is... does ArmA support triple buffering (if it also depends on the game software, as suggested here)? Guess only BI can answer this.

Quote: ArmA uses another technique, which is a variant of triple buffering - all rendering is done into an auxiliary buffer, which is copied into the back buffer once the scene is complete (during the copy a final post-process effect is done as well).

Quote: From the VSync point of view this should work the same as triple buffering - as rendering is not done into the back buffer, there is no need for the rendering to wait until the back buffer is switched.

Quote: Of course, buggy or poorly implemented drivers may behave differently in this respect, waiting even when not necessary, but that is a different story.

Suma, in light of your information I've got a question about the situation where the driver default is VSync = always on:

What happens if triple buffering is forced for ArmA (e.g. via ATI Tray Tools)?

Thanks for the answer. In short, should I bother with forcing TB or not?


If the drivers are well written, by using triple buffering you are only wasting video memory. If they are not well written, this might convince them that they really should not wait for the VSync.

However, as many people claim turning VSync off helps their frame rate, I guess either most drivers are not well written, or there is something strange going on.


When I first started reading about VSync I didn't know what tearing was, so I didn't get a whole lot of it. I notice it if I look for it, but it doesn't bother me that much. My FPS sometimes drops all the way below 20 in graphically intensive areas, so VSync would probably not be a good idea for me, since it won't stay stable.

Anyway, here is a screenshot of what tearing looks like in case you're not sure:

[Screenshot of tearing: GGDSG_19.jpg]

Quote: If the drivers are well written, by using triple buffering you are only wasting video memory. If they are not well written, this might convince them that they really should not wait for the VSync.

Quote: However, as many people claim turning VSync off helps their frame rate, I guess either most drivers are not well written, or there is something strange going on.

I assume you mean using TB with ArmA is wasted memory, due to your different buffer approach...


Very good post. I have been using the final release of RivaTuner to force triple buffering in ArmA, to very good effect: my frames are not bottoming out as badly as without triple buffering, and the framerate also seems more stable with TB on.

The triple buffering option can be found under %dir/Program Files/RivaTuner v2.0 Final Release/Tools/D3DOverRider.


I don't get it. Now that I've turned VSync ON, I get much smoother FPS and overall game experience ...

It used to be pretty choppy, so for a laugh I thought "if it runs things this badly, how badly would it run if I turned everything on?". So I went into the ATI settings and turned everything ON: 4xAA, 16xAF, VSync always on, adaptive AA on performance, geometry instancing and triple buffering on - and now I get much smoother and faster performance than with all of these things turned off!


If I remember correctly, Suma made a post stating that forcing options on or off could have very negative effects on the FPS or rendering of ArmA... Maybe forcing VSync off is causing issues for you, while forcing it on does not?


Thanks Aurvan, nice post!

Quote: That means if you're talking about a 60Hz refresh rate, the only framerates you can get are 60, 30, 20, 15, 12, 10, and so on. You can see the big gap between 60 and 30 there. Any framerate between 60 and 30 your video card would normally put out gets dropped to 30.

This is exactly what happens: there is no framerate between 60 and 30, and so on. I can live with tearing, which wasn't really noticeable for me, better than with the big frame jumps. As I stated in my post about VSync and Vista, I had no idea how it really worked, but I played ArmA on my old XP system with VSync off, and on my new Vista system you can't turn it off via the driver. The difference is more than remarkable, and it is one of the biggest problems I have to deal with on my Vista system with ArmA.

So thumbs up for a VSync option in ArmA!

Quote: I don't get it. Now that I've turned VSync ON, I get much smoother FPS and overall game experience...

Quote: It used to be pretty choppy, so for a laugh I thought "if it runs things this badly, how badly would it run if I turned everything on?". So I went into the ATI settings and turned everything ON: 4xAA, 16xAF, VSync always on, adaptive AA on performance, geometry instancing and triple buffering on - and now I get much smoother and faster performance than with all of these things turned off!

Holy hammer! I did the same thing... I toggled VSync to application controlled, and then triple buffering to ON.

On a particularly bushy mission I used to get like 10-15 FPS. Now I suddenly get a constant 30! Obviously it won't go above 30 unless it can do 60, which is the monitor limit. So when I look directly down, it jumps between 30 and 60 instead of stopping at 40 and 50 on the way. But I don't really care. Even if it shows 30 instead of 35 or whatever, that's way better than 15 FPS anyway!

Baffled! What's going on here?

Also, even with the way ArmA handles the buffering (which doesn't seem to work for me), how about an option to enable triple buffering within the game?


Where do you guys enable triple buffering? ArmA is a DirectX game, isn't it?

The triple buffering options in our drivers (ATI and NVIDIA) only affect OpenGL games, and hence shouldn't make a difference to ArmA.

Quote: ArmA uses another technique, which is a variant of triple buffering - all rendering is done into an auxiliary buffer, which is copied into the back buffer once the scene is complete (during the copy a final post-process effect is done as well).

Quote: From the VSync point of view this should work the same as triple buffering.

Does this work for anyone? Is there anyone with VSync and a refresh rate of 60 who ever sees 40 FPS, as should be possible with triple buffering? Or do we all have 'buggy drivers'?


Well, I tried all the different VSync options, with triple buffering enabled and disabled for each - it made no difference in performance whatsoever for my X1800 running Catalyst 7.2.

As for buggy drivers, there are only two - NVIDIA's and ATI's. If the developers of ArmA know ATI/NVIDIA have buggy drivers, they should contact them and warn them beforehand that "ArmA in 2 months will use a technique that might interfere with your drivers" - not just assume that ATI/NVIDIA will someday think "hey guys, what would happen if a game started using an auxiliary buffer instead of regular triple buffering?"

Quote: ArmA uses another technique, which is a variant of triple buffering - all rendering is done into an auxiliary buffer, which is copied into the back buffer once the scene is complete (during the copy a final post-process effect is done as well).

Quote: From the VSync point of view this should work the same as triple buffering.

Quote: Does this work for anyone? Is there anyone with VSync and a refresh rate of 60 who ever sees 40 FPS, as should be possible with triple buffering? Or do we all have 'buggy drivers'?

I tried to enable it in the driver settings, but it doesn't seem to make a difference. I jump between 30 and 60 FPS (not 60 very often, as I rarely reach it). So I have no idea where I'm supposed to be. Maybe even above 40!


DOH! Now I missed the answer to my Q by 6 days!

Quote: ArmA uses another technique, which is a variant of triple buffering - all rendering is done into an auxiliary buffer, which is copied into the back buffer once the scene is complete (during the copy a final post-process effect is done as well).

So to make Suma's suggestion short:

VSync = on
TripleB = off

...brings the best result, but some are getting better results with

TripleB = on

even though Suma mentioned triple buffering would not be used, since ArmA forces a different way of memory management, and hence

TripleB = on

...would just waste VRAM?

Did I reflect the issue summary correctly, with my small knowledge of this?

