

Marioysikax

AA, AF, Vsync and forcing through GPU

Recommended Posts

http://pcgamingwiki.com/w/index.php?title=Recettear:_An_Item_Shop%27s_Tale&curid=224&diff=232273&oldid=231165 

 

So technically this is correct: you can force at least basic FXAA and AF, and force-enable/disable vsync, in 99% of cases. The question is whether this should be considered a hackable method at all. I would say it should only count if it's for some reason not possible to use the GPU control panel, or if it needs extra steps like a different DX mode or compatibility flags; otherwise the field should be left either true/false unless the game has some game-specific methods. If this were the standard for all articles, no article would ever have a false status.

 

Another one is borderless fullscreen mode. It's nice to have that kind of info available, but in my testing with borderless fullscreen software, basically all games that support windowed mode work with it, so it might be better to only mention those programs when they don't work for some reason.


Pretty sure it's like that since the video section lists whether the OPTION is AVAILABLE, not whether the feature itself is present in the game. So if there is no AA option in the game and your only resort is a third-party program, then it's hackable.


http://community.pcgamingwiki.com/topic/1120-editing-guide/page-2?do=findComment&comment=5431

Check point 10.

 

And, of course, here we enter into philosophical territory.

What should hackable mean? What about false?

 

If I put everything else aside and just focus on the AA row, I see its importance comes down to being able to note compatibility bits for Nvidia Inspector and AMD's equivalent (aside from mentioning the supported in-game modes, of course).

Those are specific and tested as working for that game. Similar considerations also apply to windowed mode, at least for hackable.

 

When one of them is false, though, I notice an asymmetry: in the latter case you'd just put false with no notes (correct me if the contrary ever happened). Even if you tried with DxWnd, who knows? Perhaps it was just a missed checkbox?

In the AA case you'd explicitly state: can't be forced from drivers.

 

A question then arises: how do we hint (or not) at such implicit things for readers?

We already have those cool tooltips in the game data, and we already have pretty icons for DRM (and hopefully one day a big tag at the beginning of every Issues fixed section).

But what about AA injectors, borderless fullscreen tools, custom resolutions for 4:3 only games, FPS limiting or Vsync handling?

OK, in fact we already have each of those (more or less) in the row hyperlinks. But I think you get what I mean.

 

Last but not least, speaking of AF: I guess we should set a rule. Is it just about the ability to enable it at all?

Because in that case 2x is already enough to deserve "true". But personally I'd still think that sucks.

What if 16x was the hackable thing?

Should we give that tag only if the hack is "inside the game" (i.e. no video driver involved)?

Once we finally answer these questions, we can ask the biggest one: how do we treat RAGE?


Pretty sure it's like that since video section lists if the OPTION is AVAILABLE. Not if the feature itself is present in the game. So if there is no AA option in the game and your only resort is 3rd-party program then it's hackable

Yes, that's exactly right, but the question is whether the GPU panel should be treated as hackable, or whether hackable should be reserved for game-specific solutions only. Forcing vsync either on or off has basically always worked for me through the GPU control panel, but I haven't put it as hackable on articles, because then every article would have it as hackable.

 

 

http://community.pcgamingwiki.com/topic/1120-editing-guide/page-2?do=findComment&comment=5431

Check point 10.

And, of course, here we enter into philosophical territory. What should hackable mean? What about false?

[...]

Last but not least, speaking of AF: I guess we should set a rule. Is it just about the ability to enable it at all? Because in that case 2x is already enough to deserve "true".

What if 16x was the hackable thing? Should we give that tag only if the hack is "inside the game" (i.e. no video driver involved)?

Once we finally answer these questions, we can ask the biggest one: how do we treat RAGE?

Yeah, AA and windowed modes are a bit harder, because they sometimes flat-out fail or can't do more than basic FXAA; but forcing FXAA still counts as hackable under the current formatting system.

 

 

AF should still be true even if it's 2x only; just note that it's 2x and, for higher values, point to forcing via config files, the GPU panel, etc.


Yes, that's exactly right, but the question is whether the GPU panel should be treated as hackable, or whether hackable should be reserved for game-specific solutions only. Forcing vsync either on or off has basically always worked for me through the GPU control panel, but I haven't put it as hackable on articles, because then every article would have it as hackable.

Actually, forcing vsync off didn't work on one game I was making an article for. I don't remember which game it was.

Dunno, but honestly forcing AA/AF through the control panel is still "hacky" IMO, even if it works 99% of the time.


I think that anything requiring actions outside of the game itself, regardless of how mundane and/or safe those actions are, should be considered "hackable" and not "true".

But I guess that decision is ultimately up to the higher authorities of this website.


I think that anything requiring actions outside of the game itself, regardless of how mundane and/or safe those actions are, should be considered "hackable" and not "true".

 

That was already standard.

 

But I guess that decision is ultimately up to the higher authorities of this website.

They are not gods, jeez.


But then why are we having this discussion?

Because repeating "you can use the GPU control panel" all over the place seems a bit redundant.

 

I'll add another point then, less "OMG" and more practical: if I have an AMD card, how could I speak for Nvidia users?

And we cannot expect editors to own cards from two different vendors, or to rely on "googling people" either (especially considering how "profound" an understanding some have). Personally, I wouldn't even consider external post-processing AA.

Not to mention that vendors aren't even the greatest common divisor here. Sometimes it's just a matter of drivers, or even card generations.

 

For this reason, I definitely see a problem with "hackable: check graphics driver" or "hackable: use injectSMAA".

But even sticking with a "just if present in the game" rule and totally omitting stuff like Nvidia Inspector bits seems dumb.

 

At the same time, I'd find it lame to put them blatantly there too.

I mean: I quite like how we've managed to keep all the vendor-specific stuff for vsync, FPS limiting, scaling, and custom resolutions (not to mention audio fixing/testing) on "external pages" [i.e. outside the game's own page]. It's much more professional, IMO.

And neutral (especially in this case, where one might only be able to test and write about a single brand).

 

So... I'd start from this last point: how do we properly present compatibility bits?


It seems redundant because that's exactly what it is: a redundant solution.

 

As for your other point, I don't think it would be such a bad thing to add sub-sections below the Video settings table named "For AMD GPU users", "For Nvidia GPU users", or "For Intel GPU users", since those compatibility issues are game-specific most of the time.

It wouldn't look fancy, but I don't think it would bloat the page either. It's important information after all.


It seems redundant because that's exactly what it is: a redundant solution.

It's still a solution though.

 

As for your other point, I don't think it would be such a bad thing to add sub-sections below the Video settings table named "For AMD GPU users", "For Nvidia GPU users", or "For Intel GPU users", since those compatibility issues are game-specific most of the time.

It wouldn't look fancy, but I don't think it would bloat the page either. It's important information after all.

If I wanted to be finicky, technically this could also apply to games where AA is true.

Enhancing built-in support isn't a crime.

 

Aside from this: is it only about whether a method works at all? What if my computer were slow and, for as much as it sucks, the only acceptable AA was the post-processing kind?

What if CSAA were faster than MSAA and I was missing out on that?

 

So... I'd like to write out all the mental flip-flops, but I'll cut to the end result (for AA, and hopefully the rest too). Please be as critical as possible:

  • Unknown: the editor couldn't test all of the "expected to check" stuff*. Otherwise:
    • True: the "expected to check" stuff has been tested and feature X works out of the box. Otherwise:
      • Hackable: X isn't supported out of the box, but there are general (as in "works for everybody") game-specific fixes**. Otherwise:
        • False. What to write in the notes here is probably even more important than in the bullet points above, but I'd better think that through in a future post.

Speaking of compatibility bits instead, I really like the generic-list concept, like the one linked in the previous post.

We might ask the OP whether he could maintain his profile list on, say, GitHub. Or I guess BONKERS could arrange these things for us, and mention it only on the Nvidia page, or a separate page.

 

This way, users who "appreciate Inspector" would still benefit, while those who give no damns have no additional "reading burden" (especially when you consider that really good "coverage" isn't only about AA).

I've thought a lot about it, and I believe somebody who goes as far as externally enabling AA (or windowed mode, or the others) is very likely to care about that in all of their games.

 

But I guess it's not impossible for someone to care about one and only one game either (e.g. I especially cared about AA in Mass Effect).

So I wonder: with all these "not necessarily better" tools, what about a "tested working" tooltip when the mouse is over the option name?

This could also apply to [borderless] windowed methods and all the other things that are quite "random" in success rate.

 

 

*= For starters, of course, not every editor can be expected to have the necessary time/hardware to check everything (just think of the hypothetical "input lag").

But aside from that, my message would be: if you can't 100% test it, then don't write anything about it.

This is to avoid situations (which I've seen in the past) where people tested 5.1 surround sound and put a "5.1 works" note. Yes, fine. But if the game also supported 7.1, you're giving information that is too easy to mistake for "[only] 5.1 works".

 

**= Notice how this wording manages to exclude vendor-specific stuff like control panel methods, as well as generic solutions like "use SweetFX".

But I'd like some feedback on how we should further refine it to either definitively include or exclude stuff like DxWnd.

 


It seems redundant because that's exactly what it is: a redundant solution.

This was exactly my point. It's so redundant that, had I included it in my edits, all but maybe one or two articles would have those sections as hackable! That would make the false status redundant as well.

 

Actually, forcing vsync off didn't work on one game I was making an article for. I don't remember which game it was.

Dunno, but honestly forcing AA/AF through the control panel is still "hacky" IMO, even if it works 99% of the time.

It is hacky, but it's also always the same for every game, while something like editing a configuration file or using game-specific mods differs from game to game. That's why the GPU panel stuff should be turned upside down: note when, for some reason, forcing things through the GPU panel has no effect.

So having the GPU forcing on a generic troubleshooting article would be much better, and it would also streamline compatibility-flag noting on articles; I remember several times seeing instructions on game-specific articles for how to use Inspector.

 

If I wanted to be finicky, technically this could also apply to games where AA is true.

Enhancing built-in support isn't a crime.

 

Aside from this: is it only about whether a method works at all? What if my computer were slow and, for as much as it sucks, the only acceptable AA was the post-processing kind?

What if CSAA were faster than MSAA and I was missing out on that?

I disagree with this one. First of all, that becomes a subjective matter; there are people who will refuse to use post-processing versions of AA. If your machine is weak, you most likely either don't use AA at all or use FXAA from the GPU panel. And nothing stops anyone from simply noting how to use better versions of AA; many do prefer the Nvidia Inspector methods.


Can you quickly describe when I'm able to force AA from the drivers (Nvidia)? I know forcing anti-aliasing doesn't work with the newer DirectX games, but I've had major success with older games.

Linux uses OpenGL — does this mean anything potentially changes?

be3IEzX.jpg

Postal 2: Paradise Lost


Can you quickly describe when I'm able to force AA from the drivers (Nvidia)? I know forcing anti-aliasing doesn't work with the newer DirectX games, but I've had major success with older games.

Linux uses OpenGL — does this mean anything potentially changes?

Postal 2: Paradise Lost

It varies greatly, but mostly you need to take the Nvidia Control Panel and chuck it out the window, as it doesn't do the job.

For forcing AA, Nvidia Inspector is what is really necessary.

http://forums.guru3d.com/showthread.php?p=5183388

 

As for what works when, as I said it varies greatly. Generally games running on DX7/DX8 don't need special flags to force basic things like MSAA, sometimes SGSSAA/OGSSAA too.

Sometimes, flags are needed for DX8 games, such as Unreal Engine 2.

 

With DX9, 8/10 times you need a special flag to force AA of any kind. Hence the lists on Guru3D/3DCenter.

Even further, many games may require more specific instructions based on resolution, aspect ratio, whether the AA fix is needed or causes problems, etc. One specific case: Unreal Engine 3 games of the last 5 years have had a very annoying trait.

They ship with FXAA only, and there is no way to turn it off in the menus — or if there is, sometimes it doesn't work. Then your only other option is to disable something like Depth of Field in Engine.ini, which acts as a master switch for ALL post-processing. So you either get FXAA interfering with forced AA and post-processing, or no (or only some) post-processing with forced AA working correctly.

I try to keep on top of this in the list on Guru3D.

 

Everything after DX9 is a crapshoot, unfortunately. I don't know if it was because of changes Microsoft made to the standard and DXGI, but in DX10 specifically the number of functions available is vastly smaller, and there isn't enough to produce any results.

So with those it sometimes works without any bits, but it's very rare. You can enhance AA too, but that becomes dependent on the MSAA implementation in the game itself, which with DX10/11 becomes very problematic — especially with the rise of non-forward rendering and the increased performance cost of MSAA in deferred lighting/rendering.

Shortcuts are taken or hacky methods are concocted to gain back performance or some other objective. Which results in lesser quality MSAA or very poor enhancing capabilities.

Even in DX9 this can be a problem. But more often than not, things can be forced anyway, or specific wacky workarounds come into play. (Read the AA section of the Inspector link above for examples with Red Faction 3 and FEAR 2.)
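Purely as a condensed restatement of the rules of thumb in this post (tendencies I'm summarizing from the text above, not guarantees for any specific game or driver version), something like:

```python
# Rough summary of the forcing-AA rules of thumb from this post.
# Values are tendencies, not guarantees for any specific game.
AA_FORCING = {
    "DX7":    "usually works without compatibility bits",
    "DX8":    "usually works; some engines (e.g. Unreal Engine 2) need flags",
    "DX9":    "roughly 8/10 games need a compatibility flag (see Guru3D/3DCenter lists)",
    "DX10":   "rarely works; far fewer driver functions available, no official bits",
    "DX11":   "rarely works; enhancing depends on the game's own MSAA implementation",
    "OpenGL": "no compatibility bits; depends entirely on how the engine is set up",
}

def forcing_outlook(api):
    """Look up the rough expectation for forcing AA under a given API."""
    return AA_FORCING.get(api, "unknown API")

print(forcing_outlook("DX9"))
```

A table like this is exactly the kind of generic, non-game-specific information the thread suggests keeping on a shared page rather than repeating per article.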

 

Once upon a time, people set up a petition on Nvidia's forum asking for DX10/11 AA compatibility bits. It was stickied in their driver forum, with people responding for over 2 years, with no communication from Nvidia nor a final answer other than the thread being unstickied at one point.

We begged and begged, but nothing came. It was a shame because, personally, Nvidia's driver-level AA capabilities are a huge selling point and probably the only thing that has really kept me coming back to Nvidia.

If AMD had the same level and quality built in, I'd probably have cards from both vendors. I know that at some point AMD did have some kind of AA-forcing functionality, but it was spotty and the quality wasn't great.

 

 

Nowadays we are entirely dependent on developers giving a rat's ass about putting some effort into AA. More often than not, it doesn't happen. And even when it does, you still have to essentially brute-force ordered-grid AA (DSR/VSR/custom resolutions/GeDoSaTo) on top to get the quality to a very good point.

DSR/VSR are only saved by their choice of good variable resampling functions for the final resolved image. But it's still only OGSSAA — not the most elegant solution performance-wise. That's not to say forcing is any better; SGSSAA in some modern DX9 games has become obscenely expensive due to its hacky nature combined with how games are made.

If a game uses Unity in DX10+, you are basically screwed (in DX9 it can often be forced, but it's generally more performance-hungry than it should be — whether because Unity is unoptimized crap, as PS4 game performance suggests 7 times out of 10, or for some other reason). Their idea of AA only extends to basic FXAA/MLAA/SMAA1x.

And from what I understand, modifying the engine directly to implement their own either isn't possible or isn't easy. There was a tech demo a while ago that seemed to have some kind of temporal AA, but dear god, it was total garbage in every way.

Unreal 4's TAA is equally poor, for varying reasons. But Epic doesn't make it impossible for developers to implement their own. (See: Ethan Carter Redux on UE4.)

 

I digress; moving on to the forgotten child, OpenGL. There are no compatibility flags like with DX8/9, so you are at the mercy of the feature level and how the engine is set up by the developers of each game.

When it does work, you just need to set it to override and it just works — with one caveat: for SGSSAA specifically, the option labeled "Transparency Supersampling" is fundamentally different than in DirectX.

At the core level, both the TrSSAA and SGSSAA options are actually SGSSAA. What separates them is how they work: TrSSAA (SGSSAA) replays the pixel shading of only alpha-test/coverage materials to AA them, while SGSSAA (full-scene SGSSAA) replays the pixel shading for the entire scene unless told to ignore something.

In OGL there is only the one, full-scene SGSSAA. This is exposed as the "Transparency Supersampling" option, and the sample counts need to match (8xQ + 8xTr, or 2x + 2xTr).

 

Doom 3 (original) and the Riddick games (I think Riddick even works on AMD cards: https://www.youtube.com/watch?v=gn8EPiiPpMQ) are examples off the top of my head of AAA games that work with forced AA. Doom 3 BFG does not, nor does any other game made on id Tech after Brink (which ran in DX9, IIRC).

I wish we could have compatibility bits for OGL and Vulkan too, this would greatly benefit Linux and SteamOS.

 

AMD several years ago published a sample and source for their own SGSSAA implementation called "SSAA SF" (sample frequency), geared towards a Forward+ renderer.

https://github.com/GPUOpen-LibrariesAndSDKs/SSAA11/releases

The code sample is vendor agnostic and works just fine on Nvidia cards.

It includes a presentation and documentation on how it works ("SSAA11/Doc"). It does automatic LOD adjustment, though if one implemented it in their game it could be made adjustable.

But it works very well and is very performant. I wish this were used in games; even where performance is a concern, it's available as a future-proof option.
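As an aside, the automatic LOD adjustment mentioned above conventionally applies a texture LOD bias of -0.5 * log2(n) for n-sample supersampling; a quick sketch, assuming that standard convention (not code from the AMD sample itself):

```python
import math

# Conventional texture LOD bias when supersampling with n samples:
# bias = -0.5 * log2(n). E.g. 4 samples -> -1.0, 8 samples -> -1.5.
def ssaa_lod_bias(samples):
    """Return the conventional negative LOD bias for n-sample SSAA."""
    if samples < 1:
        raise ValueError("sample count must be >= 1")
    return -0.5 * math.log2(samples)

print(ssaa_lod_bias(4))  # -> -1.0
```

Without this bias, supersampled textures look blurrier than they should, because the base mip selection doesn't know about the extra samples per pixel.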

 

 

 

This was exactly my point. It's so redundant that, had I included it in my edits, all but maybe one or two articles would have those sections as hackable! That would make the false status redundant as well.

 

It is hacky, but it's also always the same for every game, while something like editing a configuration file or using game-specific mods differs from game to game. That's why the GPU panel stuff should be turned upside down: note when, for some reason, forcing things through the GPU panel has no effect.

So having the GPU forcing on a generic troubleshooting article would be much better, and it would also streamline compatibility-flag noting on articles; I remember several times seeing instructions on game-specific articles for how to use Inspector.

 

I disagree with this one. First of all, that becomes a subjective matter; there are people who will refuse to use post-processing versions of AA. If your machine is weak, you most likely either don't use AA at all or use FXAA from the GPU panel. And nothing stops anyone from simply noting how to use better versions of AA; many do prefer the Nvidia Inspector methods.

Anything that isn't in the game itself should be marked "hackable", IMHO. Or a new icon and setting should be devised to replace it, like "Driver Override Nvidia", "Driver Override AMD", or "Driver Override Both".

If the driver offers a higher-quality or generally better method than what's available in-game, then it should be noted what's in-game but that higher quality is available.


Pretty detailed message. 

 

Anything that isn't in the game itself should be marked "hackable", IMHO. Or a new icon and setting should be devised to replace it, like "Driver Override Nvidia", "Driver Override AMD", or "Driver Override Both".
If the driver offers a higher-quality or generally better method than what's available in-game, then it should be noted what's in-game but that higher quality is available.

Yes, but that still isn't what I was proposing. Of course, if a feature requires actions outside of the game itself, then it's hackable.

 

I wasn't trying to change GPU forcing to true/false; I was suggesting leaving it out completely unless there's game-specific stuff like compatibility flags, or it doesn't work with that particular game for some reason.


It varies greatly, but mostly you need to take the Nvidia Control Panel and chuck it out the window, as it doesn't do the job. For forcing AA, Nvidia Inspector is what is really necessary.

[...]

Anything that isn't in the game itself should be marked "hackable", IMHO. Or a new icon and setting should be devised to replace it, like "Driver Override Nvidia", "Driver Override AMD", or "Driver Override Both".

If the driver offers a higher-quality or generally better method than what's available in-game, then it should be noted what's in-game but that higher quality is available.

That was beautiful. I had always assumed that AMD has similar driver-level AA-forcing capabilities; are they completely lacking on that front?

I want to start playing games in 4K. When using Nvidia DSR, the results are excellent; to me, downsampling looks better than 4x MSAA — I see almost no aliasing. Can you tell me if actual 4K resolution will look as good as the downsampled image?


Technically, no. Especially with modern games using complex lighting and shading. There was a great video some time last year of Epic showing off UE4's Infiltrator demo with specific rendering techniques toggled; one of them was temporal AA, and with it off you can see how hideously harsh the raw shading is.

That's going to look aliased at just about any resolution.

 

On a high-PPI, high-resolution device, in a game with simpler visuals, it will be a bit less noticeable — at least in edge quality. But it won't look as polished as an AA'd image.

 

 

When 1080p was newish, people used to say the same thing: that they couldn't see aliasing any more, or that they couldn't see any aliasing when using 4xMSAA. (Which might have been somewhat correct, considering visuals were less complex and edge/geometry aliasing were the predominantly known forms of "jaggies".)

 

I haven't been able to test it, but an HQ post-process/temporal solution that isn't actually terrible might look "good enough" at 4K native.


I would never have believed it, but I have fallen in love with FXAA, specifically Nvidia's driver-level FXAA. I used to dislike post-processing anti-aliasing because of negative, blurry experiences in some games, but now I understand it's all about the implementation. According to Nvidia, FXAA can be forced in all applications and the performance hit should be near 0%. I will always use FXAA when I have no performance to spare!


PVdqIRK.jpg

Enemy Front with FXAA


I believe "Hackable" should range from anything between an INI tweak to external program with Batchfiles.  It only gets to be "true" if the game supports it, with either an option in-game (Settings Menu), console command (if the console can be opened without tweaking the game), or keyboard shortcut (e.g. Alt-Enter for Window/Fullscreen toggle).

 

Drivers, external programs, INI tweaks, and all other kinds of tweaks that involve changing the game in a way it wasn't meant to be changed count as "Hackable."

 

For example, using AMD Crimson to force vertical sync in Borderlands doesn't mean the game supports Vsync. INI tweaks don't count, either: if someone were on Intel HD they couldn't tweak like that, and you have to open up the whole thing in a notepad editor (or Notepad++ if you value your sanity) to get the option working.
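For what it's worth, the kind of INI tweak being discussed usually amounts to flipping a single flag in a text config. A sketch of the idea in Python; the section and key names ("Video", "VSync") are hypothetical, not from any particular game:

```python
import configparser

# Hypothetical game config; section/key names are made up for illustration.
cfg = configparser.ConfigParser()
cfg.read_string("[Video]\nVSync=0\nFullscreen=1\n")

cfg["Video"]["VSync"] = "1"  # the "tweak": force vsync on

print(cfg["Video"]["VSync"])  # 1
```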


I disagree with this one. First of all, it becomes a subjective matter; there are people who will refuse to use post-processing versions of AA. If your machine is weak, you most likely either don't use AA at all or use FXAA from the GPU panel. And nothing stops us from simply noting how to use better versions of AA; many do prefer the Nvidia Inspector methods.

That was indeed -sort of- my point.

AA isn't just "enable/disable". There are loads of details that couldn't be explained on game pages.

 

If AMD had the same level and quality built in, i'd have cards from both vendors probably. I knew that at some point AMD did have some kind of AA forcing functionality, but it was spotty and the quality wasn't that great.

I don't know what you're on about, but for the bunch of DX9 games I tried back in the day, I never had problems on my 5770.

It was as simple as opening the control panel and enabling it. And quality improvement was pretty noticeable.

 

I only discovered last year that this was due to every game having specific profiles, but I don't think the normal user cares as long as it's this straightforward.

This is also the reason there isn't really all that fuss around AA for AMD like there is for Nvidia cards, imo, albeit I'm fairly sure you can tinker and fine-tune there too.

 

That was beautiful. I always assumed that AMD has similar driver-level AA forcing capabilities, are they completely lacking on that front?

I want to start playing games in 4K; when using Nvidia DSR the results are excellent. To me downsampling looks better than 4x MSAA, I see almost no aliasing. Can you tell me if actual 4K resolution will look as good as a downsampled image?

If we're talking about DSR (≈SSAA≈FSAA), it's not like both vendors haven't supported it for the last 16 years.

 

I believe "Hackable" should range from anything between an INI tweak to external program with Batchfiles.

Yes, but you do see that by this logic even controller support becomes universally "hackable" with no special difficulty.

And given that something as "crude" as post-processing AA or DSR can be forced almost everywhere, you see people (like the sensible Mairo) starting to feel sorry for the poor, useless false attribute.

 

Also, I'm not sure mentioning a specific vendor (when it's not specifically tied to the game like, say, Sound Blasters or Oculus) is at all fair and/or neutral.

You are expected to evaluate the game's intrinsic problems, not the outside world.

 

Which in turn doesn't mean you should just give up, period.

But you should point back to stuff like this.

 

 

It only gets to be "true" if the game supports it, with either an option in-game (Settings Menu), console command (if the console can be opened without tweaking the game), or keyboard shortcut (e.g. Alt-Enter for Window/Fullscreen toggle).

Console commands are already considered hackable iirc.

I can't say I find this sensible, but I can't say it doesn't make sense either.

 

I hope all my double negations aren't giving somebody cancer.

 

 

For example, using AMD Crimson to force Vertical Sync in Borderlands doesn't mean the game supports Vsync.  INI tweaks don't count, either. 

Mhh, no, really. Thanks for the example and for pointing out how they are entirely different things.

If you [have to] force vsync in the driver, then the game definitively doesn't support it.

 

But if it's just a matter of editing .ini files, the game supports it; the "hackable" adjective is qualifying the game, after all.

If someone were on Intel HD, they couldn't tweak like that, and you have to open up the whole thing in a notepad editor (or Notepad++ if you value your sanity) to get the option working.

They would be screwed, so you can't actually say the game is "hackable".


I'm going to weigh in on this issue as I've been thinking of adding something to the Editing guide to address this.

 

Video settings that can be forced through a video card's drivers can be considered general fixes that are application-agnostic. It's the type of solution that can be applied to practically every 3D game the wiki covers.

 

Based on that knowledge, I think we should not consider forcible video settings to be "hackable" in the context of any specific game. An article should be dedicated to game-exclusive fixes only. Allowing very general solutions tends to make an option field always "hackable", which I personally think does not add anything of value to the page.

 

The only exception where adding a general fix would make sense is if in-game support for the setting is fundamentally broken with no other possible workaround (ex. game has broken V-sync support). Otherwise, I would add general steps to the relevant video setting glossary pages on how to force each setting.


Maybe we should add a tag named "Generic" (along with True, False, N/A, etc.) that links to relevant pages/page sections when clicked on?


A distinct symbol for AA/AF/Vsync generic states is certainly an option. This could be handled by the template (so pages would specify "false" to get this result for those rows).

 

I don't know what sort of symbol would work best for that. Many mobile apps use a group of vertical dots or lines as a standard indicator for more options, so maybe something like that would be recognisable?


It would need to be a symbol that tells the reader instinctively what to do, and in this case, they would need to click on the symbol for more info.

So yes, I think a series of dots would work. But then dots don't really remind people of the idea of "generic". Maybe something like the recycling symbol (with the three arrows) would be more appropriate.


Maybe we should add a tag named "Generic" (along with True, False, N/A, etc.) that links to relevant pages/page sections when clicked on?

The only way I would see that working is if the Video Settings table had a dedicated field to enable showing generic instructions ("show_generic"). If the field is set to "true", then add a blurb to the relevant video settings along the lines of:

 

"Generic instructions for forcing <VIDEO SETTING> can be found in <LINK TO GLOSSARY SECTION>"

 

Even then, I don't like the idea of having a dedicated element in game articles for general fixes like that. I want to remove general solutions, not highlight them.

 

I already mentioned the best approach to this: Add one set of generic instructions to the glossary pages (Anisotropic filtering, Anti-aliasing, Vertical sync) and remove said instructions from specific game pages. All tables are already linked to these glossary pages. If someone absolutely wants to force a specific video setting for a game, they can look there.



Oh, I forgot that tables are linked to glossary pages. I agree with your method; having the instructions on glossary pages would be the most efficient.

However, I still don't think it would be right to simply set the field to true when it requires a general fix, because in that case it's not strictly true.


The special icon idea seems pretty... awful.
And again the "what would you write there" question arises. There's simply too much stuff going on.
 
We already have the glossary links, for what it matters. I'd just love people to go there when needed.
But I do see a point: they aren't as noticeable as they need to be.
 
In my ideal world, we'd just find a way to make people realize they can go there (i.e. those pages aren't just jargon with useless "acronym explanations").
Or, instead of making the thing universal, let's keep everything as it is and just do something when the status is false.
 

Think: the game isn't even 16:9, you open the widescreen resolution page, and something refers you to the Custom resolution page.
This is something not even power users usually know for 4:3-only games (I mean, you have a 1080p display and by default you can play only up to 1600x1200 or 1280x1024).

 

I'm out of ideas atm (an arrow in notes that points to the link? A note? A bigger button? A different color?) but I'm really looking forward to it.


I have made an example implementation of a default note that is displayed when AA/AF/Vsync is false and has no note provided.

 

This is not a perfect solution since it would be shown for games where there really is no option (hence the tentative phrasing).


@BONKERS How does forcing AA through drivers work with the GoldSrc engine? I've enabled AA in Half-Life and there's a notable improvement with 32x CSAA, but aliasing is still noticeable at 1920x1080 and it looks more like 4x MSAA to me.

 

I found a HL screenshot, downscaled from 4K:

 

iRFdRbE.jpg



Unless you have a pre-Maxwell GPU, CSAA doesn't do anything. And all it really is, is more memory-efficient MSAA. Supposedly quality was supposed to be improved (like 8xCSAA giving you 8xMSAA quality at 4xMSAA cost), but I don't recall it ever making a significant difference (since there are more problems than just geometry aliasing). Obviously, from a cost standpoint it made sense, even more so on consoles (EQAA).

 

The GoldSrc engine runs on DX7, doesn't it? Or has that been updated to DX8/9 by now? (I haven't played vanilla HL in a long, long time, lol)

I'd imagine the UI at 4K would become painfully small.

Using Nvidia Inspector, I'd try setting it up like this at native res instead of downsampling and see how it looks. (Probably don't need any compatibility bits.)

649Untitled1.jpg



The UI is small to the point of being nearly unplayable; see that tiny "Steve Bond" near the left corner?

I find it odd that some objects (above the doorway) are more aliased than others (HL with 32x CSAA through Nvidia CP at 1920x1080; 8x MSAA yields similar results):

 

W0qlK5D.jpg


OK, so I guess they really donked things up when they moved to OpenGL.
Forcing AA is impossible (if you have MSAA or any kind of AA override set up, it's literally doing nothing; placebo), and the game has 4xMSAA on by default. The only way to turn it off is via the "-nomsaa" launch command.
Even with that, forcing is still not possible.

Enhancing, however, is possible.

It's as good as it will get.

So you set up inspector like this,
eafsdfe.jpg

-nomsaa command (no AA): http://u.cubeupload.com/MrBonk/hl20160727235703500.png
In-game MSAA (I assume 4x based on how it looks): http://u.cubeupload.com/MrBonk/hl20160727235835997.png
8xMSAA + 8xTrSSAA enhanced (in OGL, 8xTrSSAA is equivalent to the SGSSAA setting in D3D): http://u.cubeupload.com/MrBonk/hl20160728000803666.png


Thanks BONKERS. I found out later that forcing AA is not possible; I toggled between downscaling and other methods and likely got confused.

 

Your screenshots look great. How much time does adjusting Nvidia Inspector generally take? Do you create every rule on a per-game basis?


So.. BONKERS, what about that thing I mentioned here?

Does Inspector support something like a "master list"?

 

 

 

Latest Steam version uses OpenGL. D3D was removed in OS X/Linux update.

And Aureal and EAX ;(

 


Thanks BONKERS. I found out later that forcing AA is not possible; I toggled between downscaling and other methods and likely got confused.

 

Your screenshots look great. How much time does adjusting Nvidia Inspector generally take? Do you create every rule on a per-game basis?

Adjusting inspector, once you have your base profile set, is pretty quick.

When it comes to AA, however, there is no catch-all. You can go by what is on the list (which sometimes has recommendations, covers whether a game needs special bits, and has notes to go with it, usually useful information in the reference posts). So that adds a bit of time, and sometimes you need to tweak things to your preference, and maybe get creative (like combining multiple methods together, as you have to with Red Faction Guerrilla). To quote myself:

 

Red Faction Guerrilla: you can't force AA in this game. However, you can enhance the in-game MSAA with various methods of AA to some decent results. But it shines when you combine the in-game AA + enhanced AA and then downsampling, while also enabling FXAA in the game profile.

(FXAA works when enhancing in-game AA. It used to work when overriding as well, but has been broken since after 331.82. It is applied last in the chain so it doesn't conflict with other AA, though it's not recommended to use it at native resolution over enhanced AA, if that makes sense. Oversampling from downsampling negates any smoothing issues.)

 

This is a rather unique exception as most games don't yield this good of results.

 

Here are a few comparisons showing it off.

http://screenshotcomparison.com/comp....php?id=103126

This first one shows no AA by default | Vs | The game running at 2x2 Native resolution with 2xMSAA enabled in game with "Enhance Application Setting" enabled and set to 4xS (1x2 OGSSAA + 2xMSAA) together with 2xSGSSAA. Finally with FXAA enabled on the profile.

 

http://screenshotcomparison.com/comp....php?id=103127

This second one is cropped from the native 3200x1800 buffer with 2xMSAA+4xS+2xSGSSAA |Vs| That with FXAA also enabled showing that there are still some rough edges that FXAA cleans up before it is downsampled back to native 1600x900

 

 

The 3rd comparison shows 2x2 native resolution + 2xMSAA | Vs | 2x2 Native + 2xMSAA+4xS+2xSGSSAA+FXAA cropped and upsampled with point filtering by 2x2 to show how much more aliasing is tackled and resolved.

http://screenshotcomparison.com/comparison/161297

 

Inspector might be confusing at first, but once you get used to it you can't go back to NVCP, and things become very quick once you're accustomed to what is what.

 

 

I usually adjust Inspector on a per-game basis for DX9 and below, since there isn't anything that helps with AA in DX10+ or in newer OpenGL games. Though since I have "Adaptive" set as my Power Management Mode, I have to manually set each game to "Prefer Maximum Performance", and create profiles for games that don't have one (Earth Defense Force 4.1 is a recent example), because otherwise the Adaptive mode causes the GPU to downclock while playing games, which you do not want. But you have to have Adaptive set in order for your GPU to idle properly when not gaming (depending on the card).

 

http://forums.guru3d.com/showthread.php?p=5183388

 

So.. BONKERS, what about that thing I mentioned here?

Does Inspector support something like a "master list"?

 

 

 

And Aureal and EAX ;(

 

Oh sorry, must have missed this!

Inspector doesn't have anything like a master list of user-specified values. By that I mean something like this list: https://docs.google.com/spreadsheets/d/1ekUZsK2YXgd5XjjH1M7QkHIQgKO_i4bHCUdPeAd6OCo/pubhtml#

where the values (the custom AA flags) are not supported directly in Inspector unless they are part of the lists that Inspector gets from the drivers.

 

Profile Inspector is open source now, but I don't know how difficult it would be to directly integrate these flags into predefined profiles, which opens up another set of problems:

 

A.) Not all games have a profile in the driver; the user has to make one themselves.

B.) Many games have multiple flags available, and many have game-specific requirements, recommendations, or trade-offs. To the average user, without the notes on usage as written in the reference posts or Column D, things might not work correctly or as expected.

 

 

You mentioned

 

And mention it only in nvidia page, or a separate page.

I think this is a decent idea and something I recall maybe mentioning in the past.

Though ideally, a separate page like "Nvidia Anti-Aliasing Flags" would either:

 

A.) Directly embed the Google spreadsheet, like in the thread here: http://forums.guru3d.com/showthread.php?t=357956 This would not create any additional work for me (i.e. I can just update and maintain the document/list and it's kept up to date in both places).

 

B.) Create a similar page that is just formatted in columns like the document above (easy to search and easy to read), and I would just need to update both as I go.

(This method does open up the possibility of someone else editing the page and messing something up, though.)

Over the last year or so I've been adding more games to the list that don't necessarily need bits to force AA (because that's still useful information).

I'd like to continue adding to that, so we have a comprehensive list of DX9-and-below games (and some OpenGL games if they work, usually older ones).

 

 

 

 

In either case, at the top of such a page would be some basic instructions on using Nvidia Profile Inspector, or a redirect to my thread, Nvidia Inspector introduction and Guide - Guru3D.com Forums (I have been wanting to make a page for this, more or less a wiki-formatted copy of my thread; in that case it would direct to that page).

Do you think that'd be allowable?

I have to manually set each game to "Prefer Maximum Performance", and create profiles for games that don't have one to do so as well. (Earth Defense Force 4.1 is a recent example). Because otherwise Adaptive performance mode causes the GPU to downclock while playing games.

Oh, cool. That's another cute thing that could be mentioned.

 

Oh sorry, must have missed this!

Inspector doesn't have anything like a master list of user specific values.

It's not like I want to contradict you, but what are the silentimport command and "backups" then?

 

Tfw you fear to see Blackbird from the bakery

 

 

Pretending 8 really different AA types exist is kind of misleading.

You could work a bit with nesting and/or bullet points to highlight that it's not actually like that.

 

where the values (the custom AA flags) are not supported directly in Inspector unless they are part of the lists that Inspector gets from the drivers.

I dunno why I'm only now consciously thinking about how Inspector works (i.e. the driver does the dirty part in the end).

And hell, it's nothing different from AMD blb. And... I wonder how hard it would be to make an AMD Inspector, lol.

 

EDIT: the power to do basically everything seems to be there.

 

Profile Inspector is open source now, but I don't know how difficult it would be to directly integrate these flags into predefined profiles, which opens up another set of problems.

I wasn't thinking of integrating them directly into the program... but hey, that's a pretty cool idea.

I mean, I don't think its place should be the program repo itself, but creating your own and starting to accept pull requests would be really nice.

 

A.) Not all games have a profile in the driver, the user has to make one themselves.

B.) Many games have multiple flags available, and many game specific requirements or recommendations, or trade offs . To the average user, without the notes on usage as written in reference posts or Column D. Things might not work correctly or as expected or otherwise.

And this is why you are so important :D

 

I think this is a decent idea and something I recall maybe mentioning in the past.

Though ideally, a separate page like "Nvidia Anti Aliasing Flags" would either.

A, B

I don't see the convenience in duplicating or even embedding the spreadsheet here.

At least, this is not the difficulty level I'm comfortable considering right for something so basic.

 

In either case, at the top of such a page would be some basic instructions on using Nvidia Profile Inspector, or a redirect to my thread Nvidia Inspector introduction and Guide - Guru3D.com Forums (I have been wanting to make just a page for this, more or less creating a wiki formatted copy of my thread, in that case it would direct to that page)

Do you think that'd be allowable?

If you think it's something wiki-fiable... then OK. I mean, assuming it's going to be too long to fit comfortably on the Nvidia Control Panel page (and anyway, why not resurrect the Nvidia Inspector page?).

But regardless, I guess we still need to finish that AA page affair.

For your convenience, if anything, so you don't have to rewrite it afterwards.



 

Pretending 8 really different AA types exist is kind of misleading.

You could work a bit with nesting and/or bullet points to highlight that it's not actually like that.

Ah, but there is no pretending. They DO exist.

 

Through Inspector, Nvidia users have access, for DX9-and-below games and older OpenGL games, to force the following forms of AA (depending on compatibility shown in the list, with a flag or without):

 

MSAA, CSAA, OGSSAA, HSAA, SGSSAA, TrSSAA, TrMSAA, MFAA and FXAA (which is API-agnostic).

 

Some can be used standalone, some can be combined with others, some require being combined.

 

MSAA (Multisampling): 2x, 2xQ (Quincunx), 4x, 8xQ. Standard MSAA.

 

CSAA (Coverage Sampling): 8x, 16x, 16xQ, 32x. Requires the same compatibility as MSAA; if it works with MSAA it will generally work with CSAA. Though this is pointless these days, as there is no real quality/performance gain and you are limited to pre-Maxwell GPUs.

 

OGSSAA (Ordered Grid Super Sampling): 1x2, 2x1, 2x2, 3x3, 4x4. Standard brute-force SSAA with built-in texture LOD adjustment (which in some games works against it). It has some downsides though: it can be very costly in some games and can be limited by the standard box-function resolve.

In one specific case, I remember OGSSAA working better in Sonic All Star Racing Transformed than SGSSAA did. In L4D2, without any bits, 4x4 looks amazing, though slightly soft; using bits brings it back to normal sharpness.

 

HSAA (Hybrid Sampling): 4xS, 8xS, 8xSQ, 12xS, 16xS, 32xS. Hybrid sampling (referred to internally in the driver as mixed sampling) combines OGSSAA with MSAA, and the naming scheme is based on the approximate level of SSAA each combined pair gives.

4xS: 1x2 OG + 2xMSAA (1x2x2 = 4)

8xS: 1x2 OG + 4xMSAA (1x2x4 = 8)

8xSQ: 2x2 OG + 2xMSAA (2x2x2 = 8)

12xS: 2x2 OG + 4xOGMSAA. This one is a bit of an oddity: it is listed as using 4xOGMSAA, so internally I assume they are doing some trick to use the OG sample pattern for MSAA as well (rather than the standard rotated pattern), reducing cost but quality as well. When you use this mode, IIRC, the effect is even visible on an OSD, with a slight offset.
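The naming arithmetic behind those hybrid modes can be sketched in a couple of lines (a toy calculation based only on the factors listed above; 12xS is the odd one out, as noted):

```python
def hybrid_samples(og_x: int, og_y: int, msaa: int) -> int:
    # Effective sample multiplier of a hybrid (HSAA) mode:
    # ordered-grid SSAA factor times the MSAA sample count.
    return og_x * og_y * msaa

print(hybrid_samples(1, 2, 2))  # 4  -> "4xS"
print(hybrid_samples(1, 2, 4))  # 8  -> "8xS"
print(hybrid_samples(2, 2, 2))  # 8  -> "8xSQ"
```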

 

SGSSAA (Sparse Grid Super Sampling; aka FSSGSSAA): 2x, 4x and 8x. SGSSAA requires the use of the matching MSAA mode (i.e. 8xCSAA =/= 8xQ MSAA) to work, though you can get away with 4xMSAA and 2xSGSSAA for some games, as 2x + 2xSG is not very good.

SGSSAA works by replaying the pixel shading N times per pixel for the entire scene. So 8xSGSSAA has 8 samples, and each pixel has its shading done 8 times.

SGSSAA is actually called FSSGSSAA (Full Scene Sparse Grid Super Sampling Anti-Aliasing), and TrSSAA is just SGSSAA: they are the same technique, but TrSSAA is selective SSAA of alpha-tested objects only. In OpenGL, however, TrSSAA is actually FSSGSSAA. Confused yet?

There is no automatic LOD adjustment here; some games need it, some don't. Some people prefer it, some don't.

 

You can also use SGSSAA to trick the driver into forcing positive or negative LOD biases without actually using SGSSAA. (Say a game has an excessively negative LOD bias for texture mips: you set, say, 2xMSAA and 2xSGSSAA, set it to application-controlled (Enhance if that doesn't work), and then set a positive LOD bias for the game profile, and it will do the trick.)

 

AMD also created their own version of this for use with Forward+ rendering engines, called SFAA: https://github.com/GPUOpen-LibrariesAndSDKs/SSAA11/releases It works basically the same way and includes options for auto LOD adjustment.

 

You can read a little more about SGSSAA in the attached TXAA PowerPoint presentation:

http://www.mediafire.com/download/8kbi1mnc3ol2ko8/GTC-TXAA.ppt (don't share this elsewhere)

 

TrSSAA (Transparency Super Sampling; in OpenGL, TrSSAA is FSSGSSAA): 2x, 4x, 8x. These require MSAA to be enabled (but not sample-matched) and are the same as SGSSAA but only for alpha-test surfaces. In OpenGL, TrSSAA functions the same as SGSSAA in D3D, meaning you get FSSGSSAA.

TrMSAA (Transparency Multisampling): the same concept as TrSSAA, except instead of applying selective SSAA to alpha-test surfaces, they are multisampled by the current MSAA mode.

You can read about this here: https://www.nvidia.com/object/transparency_aa.html

FXAA (Fast Approximate): needs no introduction. API-agnostic; however, when forcing AA you can only enable it while using "Enhance Application Setting". You used to be able to do it when overriding as well, but that was broken in a driver update.

 

 

HSAA can be combined with TrSSAA, TrMSAA or SGSSAA.

MSAA can be combined with TrMSAA or TrSSAA. SGSSAA requires MSAA.
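Those pairing rules could be written down as a small lookup table (a sketch derived only from the rules stated in this post, not queried from the driver):

```python
# Which forced methods can be layered on which base mode,
# per the combination rules above.
VALID_COMBOS = {
    "HSAA": {"TrSSAA", "TrMSAA", "SGSSAA"},
    "MSAA": {"TrMSAA", "TrSSAA", "SGSSAA"},  # SGSSAA requires MSAA
}

def can_combine(base: str, extra: str) -> bool:
    return extra in VALID_COMBOS.get(base, set())

print(can_combine("MSAA", "SGSSAA"))  # True
print(can_combine("HSAA", "MSAA"))    # False
```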

 

 

These are all very real individual techniques that anyone with a decent Nvidia GPU has access to.

 

So there is no pretending here.

It's not like I want to contradict you, but what are the silentimport command and "backups" then?

No contradiction: you can in fact import profiles. However, all this will do is import pre-defined settings someone made for a profile in Inspector.

If we were to do such a thing, there would have to be multiple profiles (one for MSAA, etc.) made for every single game that requires special bits, not accounting for games that have special requirements.

Just making profiles for every game that only import the needed flags wouldn't solve the problem of telling people what they can do with those bits once they are imported.

 

I dunno why I'm just now consciously thinking to how inspector works (ie: driver to the dirty part in the end).

 

And hell, it's nothing different from AMD blb. And.. I wonder how hard it could be to made an AMD Inspector, lol.

Making an AMD Inspector, I imagine, wouldn't be hard if the individual knew what they were doing and had the same access to query the driver.

I mean, if we went straight to Nvidia and had them put the list of flags in the driver, it still leaves the problem of, as I mentioned, what the user can do with them if they aren't already familiar.

Sucks, but I don't really see a way around that, save for some radical overhaul of NVCP from Nvidia that also integrates these bits, allows multiple flags per game, and gives tooltip explanations.

 

If you think it's something wiki-ficable.. then, ok. I mean, assuming it's going to be so long not to fit comfortably Nvidia CP page (and anyway, why not ressurecting Nvidia Inspector page?)

But regardless, I guess we still need to finish that AA page affair.

 

For your convenience if any, not to have to rewrite that afterwards.

Yeah, my Inspector page would basically be a mile long, and would be more of a regular wiki page (introduction, options by category) than the standard game page layout. (My thread was so long I had to get the admin of the site to extend the character limit.)

So standalone would work best. I'm gonna start on that.

 

As for the AA page affair, I think a simple Nvidia Anti-Aliasing Bits page could work (things could be redirected there).

I just need to format the page as a table with multiple columns.

 

 

 

That reminds me: for DSR as well, there are a few hacks you can use to bypass the pre-determined ratios and get basically any resolution you want, based on axis multipliers or resolution percentage (DSRTool). DSR can be combined with any of the above forced AA methods, and works functionally well up to 16x resolution (4x4).

DSR TOOL
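Since DSR factors (like the 4.00x used for 4K on a 1080p display earlier in the thread) multiply the pixel count rather than each axis, the render resolution for a given factor works out as below; a rough sketch under that assumption:

```python
import math

def dsr_resolution(native_w: int, native_h: int, factor: float):
    # A DSR factor multiplies total pixel count, so each axis
    # scales by sqrt(factor): 4.00x on 1920x1080 gives 3840x2160.
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

print(dsr_resolution(1920, 1080, 4.0))   # (3840, 2160)
print(dsr_resolution(1920, 1080, 2.25))  # (2880, 1620)
```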

