james2464

New anti-aliasing algorithm TXAA, Arma 3 potential?


With the Kepler architecture, NVIDIA has three new star technologies to help it this round: TXAA, a new anti-aliasing algorithm that offers image quality comparable to 16x MSAA with the performance penalty of 2x MSAA (if not less); Adaptive V-Sync, which is sure to win over gamers by the millions; and redesigned display logic that supports up to four displays from a single GPU.

TXAA, which we talked about a little earlier, turns out to be a super-efficient temporal anti-aliasing algorithm. It has two levels: TXAA 1 and TXAA 2. TXAA 1 provides image quality comparable to 16x MSAA with the performance penalty of 2x MSAA, while TXAA 2 offers image quality higher than 16x MSAA (unlike anything you've seen) with the performance penalty of 4x MSAA. Since few games natively support it, you will be able to enable it through the NVIDIA Control Panel, in the application profiles, provided you have a Kepler-architecture GPU.

Source
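NVIDIA hasn't published how TXAA works internally, but since it's described as temporal AA, the general shape of such a resolve pass looks roughly like this. An illustrative sketch only, not NVIDIA's code:

```cpp
// Rough sketch of a temporal AA resolve, the family TXAA belongs to.
// This only illustrates the general technique: blend the current frame
// with a reprojected history buffer, clamping history to the local
// neighbourhood to limit ghosting.
#include <algorithm>

struct Color { float r, g, b; };

// Clamp the history sample to the min/max of the current frame's 3x3
// neighbourhood so stale colours can't "ghost" through moving edges.
static Color clampToNeighbourhood(Color h, Color nmin, Color nmax) {
    return { std::clamp(h.r, nmin.r, nmax.r),
             std::clamp(h.g, nmin.g, nmax.g),
             std::clamp(h.b, nmin.b, nmax.b) };
}

// One pixel of the resolve pass. The camera is jittered by a sub-pixel
// offset every frame, so this moving average accumulates extra effective
// samples per pixel over time: MSAA-like quality at shader-pass cost.
Color temporalResolve(Color current, Color history,
                      Color nmin, Color nmax, float historyWeight = 0.9f) {
    Color h = clampToNeighbourhood(history, nmin, nmax);
    float w = historyWeight;
    return { h.r * w + current.r * (1.0f - w),
             h.g * w + current.g * (1.0f - w),
             h.b * w + current.b * (1.0f - w) };
}
```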

According to Nvidia, Arma 3 won't necessarily have to support TXAA natively, as the AA can be enabled via the Nvidia Control Panel (driver); theoretically you could run with 0x AA in-game and have the driver apply TXAA. Ideally, though, it could become a native feature, as BIS are always looking for ways to balance visuals with performance.

In summary:

TXAA 1 ≈ 16x MSAA image quality at a ~2x MSAA performance cost

Screenshot comparison:

MSAA: http://img857.imageshack.us/img857/1284/183br.jpg

vs

TXAA: http://img714.imageshack.us/img714/1230/183aon.jpg

This technology will initially be available on the Kepler architecture, i.e. the GeForce GTX 680, releasing 22nd March.

Edited by James2464


Link to the original quote? This is awesome news, BTW. I haven't had an nVidia card for a few years now, and I'm really tempted to jump on a GTX 680 at release, since my 6970 is currently being a little PITA.


So who is going to have a 680GTX when Arma 3 is out?

That is incredible.

What does the TX stand for?

I cannot seem to find it, unless I'm blind...

Edited by Flash Thunder

What does the TX stand for?

In the article it seems to mean temporal AA.


These are driver-enforced, driver-run, GPU brand- and generation-specific anti-aliasing methods...

It's similar to the new AA methods in the new AMD cards and drivers...

If there is shared source, and if it's usable and offers gains over the driver method, then it might be directly supported by the engine.

For the moment I think FXAA 3.x and SMAA are enough; MLAA 2.0 is a big question and FXAA 4 is expected soon...


Do you mean shared source as in the code is readily available for you to tinker with and implement into the RV engine? Also, according to Timothy Lottes, FXAA 4.0 is just the base for TXAA (Source).

I believe MLAA 2.0 is an ATI driver-enforced solution; here are some comparisons for anyone interested: MLAA 2.0 vs FXAA

However, FXAA 4.0 also looks very promising. It has been developed by Timothy Lottes at NVIDIA.

FXAA 4 is described as "a combination of a demosaicing/deinterleaving, super-resolution, and anti-aliasing algorithm all in one tiny full screen pass." Its goals are as follows:

1. Aid in the support of film-quality visuals for games.

2. Ultra high quality/cost anti-aliasing.

3. Decouple rendering resolution from display resolution (see the sketch below). Source

Three very ambitious goals for an AA solution, one of which, "film quality", is particularly exciting for Arma 3.

Timothy Lottes Blog
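Goal 3 is the easiest one to picture. As a hypothetical sketch (names are mine, not from Lottes' post or any real engine):

```cpp
// Hypothetical sketch of goal 3: the scene is drawn into an off-screen
// target whose size is independent of the display, and one full-screen
// pass filters/upscales it to the back buffer.
struct Resolution { int width, height; };

Resolution renderResolution(Resolution display, float renderScale) {
    // renderScale 0.5 shades a quarter of the display's pixels;
    // renderScale 2.0 is effectively 4x ordered-grid supersampling.
    return { static_cast<int>(display.width  * renderScale),
             static_cast<int>(display.height * renderScale) };
}

// Per frame (pseudo):
//   drawScene(offscreenTarget, renderResolution(display, scale));
//   fullScreenPass(offscreenTarget, backBuffer);  // reconstruct + AA
```

Arma 2's separate "3D resolution" setting already does the first half of this; the interesting part of the FXAA 4 claim is doing the reconstruction well enough in that single pass that a lower render resolution still looks sharp.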

Interesting info.

Wonder what Adaptive V sync is also?

Basically it acts like vsync: whenever your frame rate goes above the refresh rate, it is limited to the refresh rate of your monitor, but when gameplay slows down for whatever reason the GPU is temporarily overclocked to reduce choppiness.

Also, according to Timothy Lottes, FXAA 4.0 is just the base for TXAA (Source).

I believe MLAA 2.0 is an ATI driver-enforced solution...

MLAA is nothing new; in fact the original MLAA is from 2009... and it's purely hardware-independent.

But compared to FXAA and SMAA it's visually inferior; MLAA 2.0 is still a big question.

Also, I don't see anything like TXAA source or details, nor Timothy saying it's based off FXAA...

Wishful thinking?


MLAA (the previous version) had a serious problem because it couldn't work in most games, even though it's driver-overridden. Apparently it's fixed in v2.0, and the guys at Tom's claim it has large potential... I don't think that BIS has to "support" MLAA 2.0; it's enforced via the driver anyway. Most importantly, MLAA is apparently visually superior to FXAA, although it appears to be more taxing on performance (Source).


I'm not aware of many games MLAA can't work with; most work fine...

Also, MLAA is easily adoptable, like FXAA and SMAA (as source exists).

The real problem is MLAA 1.0's inferior approach (loss of detail),

and the resulting quality being worse than FXAA + sharpen filter or SMAA.

And from the screens I would say that MLAA 2.0 isn't that much better than SMAA 1x or 2x modes.



Well, in reality the difference is hardly noticeable during play, but it is noticeable if you focus on a particular object. If you take a close look at that sample you'll see that MLAA 2.0 indeed gives a crisper picture than FXAA, which makes it slightly blurry.

BTW, I suppose you're speaking of MSAA when you say SMAA... I don't really know what SMAA stands for :)

As for TXAA, I'm really impressed if it can provide 16x MSAA quality at the cost of 2x MSAA... however, I'm not running Nvidia cards. On the other hand, truth be told, playing at 1920x1080 and very high details there's not much difference between 8x and 16x. In fact, I doubt that difference is visible at all during play.


I'm really curious what this TXAA will be about. If those performance penalties are accurate, they'd be practically unnoticeable on modern video cards. Those are really bold claims. Image quality like nothing we've ever seen? Better than OGSSAA and SGSSAA? I have my doubts about that, but we'll see. I would like to see at least FXAA implemented in ArmA 3, since it improves performance over MSAA while not looking much worse. The blur filter it applies actually helps visuals in some cases.


ATI Catalyst 12.3 Beta has the new supersampling option. Unfortunately it's highly unstable: it crashes BF3/Skyrim and makes Arma play like a literal slideshow for me, and many others have also retreated to previous versions...


Yeah, supersampling is not for extremely intensive games like ArmA.
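The arithmetic makes the cost obvious. A rough sketch, with illustrative numbers only:

```cpp
// Why driver supersampling chokes a game like ArmA: every on-screen
// pixel is shaded N times, so a scene that is already shader- and
// fill-rate-heavy gets several times heavier.
#include <cstdio>

int main() {
    const long long displayPixels = 1920LL * 1080LL;  // ~2.07 million
    for (int samples : {2, 4, 8}) {
        // Fill-rate-bound frame time scales roughly with sample count.
        std::printf("%dx SSAA shades about %lld samples per frame\n",
                    samples, displayPixels * samples);
    }
    return 0;
}
```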

...MLAA 2.0 indeed gives a crisper picture than FXAA, which makes it slightly blurry.

BTW, I suppose you're speaking of MSAA when you say SMAA... I don't really know what SMAA stands for :)

SMAA is http://www.iryoku.com/smaa/

http://community.bistudio.com/wiki/arma2.cfg#ARMA_2:_Operation_Arrowhead

The FXAA I speak about is with a sharpen filter, which negates most of the blur issue.
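To illustrate, such a sharpen pass is just a tiny convolution run after the AA resolve. A rough sketch, not the actual RV engine filter:

```cpp
// Sketch of a 3x3 unsharp-style sharpen of the kind often paired with
// FXAA to counter its blur. Illustrative only. Assumes (x, y) is an
// interior pixel of a width-wide luma buffer.
float sharpen(const float* luma, int x, int y, int width, float amount) {
    float centre = luma[y * width + x];
    // 4-neighbour Laplacian: centre minus the average of its neighbours.
    float neighbours = luma[(y - 1) * width + x] + luma[(y + 1) * width + x] +
                       luma[y * width + (x - 1)] + luma[y * width + (x + 1)];
    float edge = centre - 0.25f * neighbours;
    // Adding the edge term back boosts local contrast, restoring detail
    // that the AA pass softened.
    return centre + amount * edge;
}
```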

According to Nvidia, Arma 3 won't necessarily have to support TXAA natively, as the AA can be enabled via the Nvidia Control Panel (driver)...

TXAA 1 ≈ 16x MSAA image quality at a ~2x MSAA performance cost

This technology will initially be available on the Kepler architecture, i.e. the GeForce GTX 680, releasing 22nd March.

I just want to correct some things. First of all, TXAA has to be supported by a game for you to actually benefit from it, and it will indeed only be available on the GTX 600 series, of which the GTX 680 will be launched and available on the 22nd. According to information from nVidia, it is not possible to enable TXAA for a game without native support.

Secondly, TXAA (1x) has quality similar to 8x MSAA, not 16x, with a performance hit of only 2x multisample anti-aliasing. Very interesting, therefore, but so far not a lot of developers (two large ones, last time I checked) have said they will implement it in their next games. It is attractive, but for game developers only a minor share of the population would benefit from it. Seeing that Arma 3 gets PhysX included, and PhysX is not happy on non-nVidia hardware (the CPU fallback is single-threaded, poorly written, and uses a very old instruction set), I fear most dedicated Arma 3 players are going to favour nVidia hardware, and the GTX 680 will be one of the few graphics cards capable of running Arma 3 with the best possible settings. But this is another discussion.

As for Adaptive V-Sync, which will also launch with the next generation, it is simply a logical improvement of vsync. Currently, vsync on nVidia hardware either gives 60 fps with a 60 Hz monitor, or switches back to 30 fps if it can't achieve 60 fps. This switching is always very annoying and noticeable; adaptive vsync therefore either limits the fps at 60, or disables the sync and renders as many frames as possible below 60, so you don't get a large noticeable drop anymore. This is handled by the driver and hardware, so it is not something Bohemia needs to implement.
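In rough pseudocode, the driver-side decision amounts to this (a conceptual sketch of the behaviour described above, not NVIDIA's actual implementation):

```cpp
// Conceptual sketch of adaptive vsync: sync to the display only while
// the game can hold the refresh rate; below that, disable the sync
// instead of snapping down to half rate.
bool shouldSyncThisFrame(double lastFrameMs, double refreshIntervalMs) {
    // With plain vsync on a 60 Hz display (16.7 ms interval), a 20 ms
    // frame misses one vblank and waits for the next: a hard 30 fps.
    // Adaptive vsync skips the wait in exactly that case.
    return lastFrameMs <= refreshIntervalMs;
}

// Example at 60 Hz (refreshIntervalMs ~ 16.7):
//   14 ms frame -> true  (capped at 60 fps, no tearing)
//   20 ms frame -> false (runs at ~50 fps instead of dropping to 30)
```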

SMAA is http://www.iryoku.com/smaa/

OK. Is SMAA available for AMD 6xxx cards?

Seeing that Arma 3 gets PhysX included... I fear most dedicated Arma 3 players are going to favour nVidia hardware.

Hmmm... I wouldn't be surprised, when I launch Arma 3, to see that sign: "Nvidia: The way it's meant to be played" :) :)

Who knows, maybe BI got some sponsoring from Nvidia :cool:

Just kidding...

Edited by Spotter


SMAA is available for everyone. You can already use it in Take On Helicopters 1.05 and in the latest ARMA 2: OA 1.60 betas.


TXAA is nVidia proprietary tech. Why would BIS even take it into consideration, when there are other similar approaches that work on both AMD and nVidia graphics cards? BIS have taken the same route with PhysX, so I would expect them to do the same with this (note that I am planning to replace my AMD card with an nVidia one, most likely before the A3 alpha gets released...).


Me too, mate, although be careful: there are rumors of the 700 series in the second half of 2012.

It's most likely to be a GTX 685 and a dual-GPU GTX 695.

It is expected that the GeForce Kepler GK110 may have up to 2304 CUDA cores (up from GK104's 1536 CUDA cores), thanks to its bigger die, which allows more SMX units (Streaming Multiprocessors x 2) to be placed on silicon. The new chip will also contain a total of nearly 6 billion transistors, all drawing a total TDP between 250 and 300 watts.

Why would BIS even take it into consideration, when there are other similar approaches that work on both AMD and nVidia graphics cards?

Quality and efficiency. Even if NVIDIA only accounted for half of BI's prospective player base (and there are credible statistics suggesting it's more than half), high-fidelity realism simulation and gaming lives and breathes on sharp edge definition.

Pixelation limitations in games only allow us to see and identify targets at a fraction of the distance we can in the real world; added to that are other render and display limitations on seeing to realistic distances, further limiting realistic ranged engagements... All of this conspires to make the sharpest, lowest-artifact edge definition that can be rendered at the lowest cost a very desirable outcome for our pet genre.

NVIDIA and AMD/ATI have competed with proprietary render technologies for the better part of a decade, and on occasion each comes up with something that's implemented, documented and supported to a degree that makes adoption desirable and/or easy for developers, even if it is exclusive. While it's too soon to tell if TXAA will deliver on the advertised 'features & benefits', it looks promising from the demos.

:)

Edited by Hoak


Not to unnecessarily necro a thread, but the latest nVidia beta driver (304.79) introduces TXAA support for the GTX 600 series.

Here's a blog post about it from the guy who works on FXAA:

http://timothylottes.blogspot.com/2012/07/txaa-out-30479-driver.html

I'd love to see this in ArmA 2 and ArmA 3... How hard would it be to implement?


You should be able to try TXAA in ArmA 2 now (as long as you have a 600+ card :p), just be aware that it might cause graphical issues.

