Marioysikax

AA, AF, Vsync and forcing through GPU


http://pcgamingwiki.com/w/index.php?title=Recettear:_An_Item_Shop%27s_Tale&curid=224&diff=232273&oldid=231165 

 

So technically this is correct: you can force at least basic FXAA and AF, and force-enable/disable vsync, in 99% of cases. The question is whether this should be considered a hackable method at all. I would almost say only if it's for some reason not possible to use the GPU panel, or if it needs extra steps like a different DX mode or compatibility flags; otherwise the field should be left either true/false unless the game has some game-specific methods. Because if this were standard for all articles, no article would ever have a false status.

 

Another one is borderless fullscreen mode. It's nice to have that kind of info available, but in my testing with borderless fullscreen software, basically all games supporting windowed mode work with it, so it might be better to only mention those programs when they do not work for some reason.


Pretty sure it's like that since the video section lists whether the OPTION is AVAILABLE, not whether the feature itself is present in the game. So if there is no AA option in the game and your only resort is a 3rd-party program, then it's hackable.


http://community.pcgamingwiki.com/topic/1120-editing-guide/page-2?do=findComment&comment=5431

Check point 10.

 

And, of course, here we enter into philosophical territory.

What should hackable mean? What about false?

 

If I put everything aside and just focus on the AA row, I see its importance comes down to being able to note compatibility bits for Nvidia Inspector and AMD's equivalent (aside from mentioning the supported in-game modes, of course).

The ones specific to, and tested as working for, that game. Similar considerations also apply to windowed mode, at least for hackable.

 

When one of them is false, though, I notice an asymmetry: in the latter case you'd just put false with no notes (correct me if the contrary ever happened). Even if you tried with DxWnd, who knows? Perhaps it was just a missed checkbox.

In the AA case you'd explicitly state: can't be forced from drivers.

 

A question that arises is then: how, if at all, do we hint to readers about such implicit things?

We already have those cool tooltips in game data. And we already have pretty icons for DRM (and hopefully one day a big tag at the beginning of every Issues fixed section)

But what about AA injectors, borderless fullscreen tools, custom resolutions for 4:3 only games, FPS limiting or Vsync handling?

Ok, in fact we already have every one (more or less) in the row hyperlinks. But I think you get what I meant.

 

Last but not least, speaking of AF instead, I guess we should set a rule. Is it just about the ability to enable it?

Because in that case 2x is already enough to deserve "true". But personally I'd still think it sucks.

What if 16x was the hackable thing?

Should we give that tag only if the hack is "inside the game" (i.e. no video driver involved)?

Once we finally answer these questions, we can ask the biggest one: how do we treat RAGE?


Pretty sure it's like that since the video section lists whether the OPTION is AVAILABLE, not whether the feature itself is present in the game. So if there is no AA option in the game and your only resort is a 3rd-party program, then it's hackable.

Yes, that's exactly right, but the question is whether the GPU panel should be treated as hackable, or whether hackable should be reserved for game-specific solutions only. Because forcing vsync either on or off has basically always worked for me through the GPU control panel, but I haven't put it as hackable on articles, because then every article would have it as hackable.

 

 

http://community.pcgamingwiki.com/topic/1120-editing-guide/page-2?do=findComment&comment=5431

Check point 10.

 

And, of course, here we enter into philosophical territory.

What should hackable mean? What about false?

 

If I put everything aside and just focus on the AA row, I see its importance comes down to being able to note compatibility bits for Nvidia Inspector and AMD's equivalent (aside from mentioning the supported in-game modes, of course).

The ones specific to, and tested as working for, that game. Similar considerations also apply to windowed mode, at least for hackable.

 

When one of them is false, though, I notice an asymmetry: in the latter case you'd just put false with no notes (correct me if the contrary ever happened). Even if you tried with DxWnd, who knows? Perhaps it was just a missed checkbox.

In the AA case you'd explicitly state: can't be forced from drivers.

 

A question that arises is then: how, if at all, do we hint to readers about such implicit things?

We already have those cool tooltips in game data. And we already have pretty icons for DRM (and hopefully one day a big tag at the beginning of every Issues fixed section)

But what about AA injectors, borderless fullscreen tools, custom resolutions for 4:3 only games, FPS limiting or Vsync handling?

Ok, in fact we already have every one (more or less) in the row hyperlinks. But I think you get what I meant.

 

Last but not least, speaking of AF instead, I guess we should set a rule. Is it just about the ability to enable it?

Because in that case 2x is already enough to deserve "true". But personally I'd still think it sucks.

What if 16x was the hackable thing?

Should we give that tag only if the hack is "inside the game" (i.e. no video driver involved)?

Once we finally answer these questions, we can ask the biggest one: how do we treat RAGE?

Yeah, AA and windowed modes are a bit harder, because they sometimes flat out fail or can't do more than basic FXAA, but forcing FXAA still counts as hackable under the current formatting system.

 

 

AF should still be true even if it's 2x only; then just note that it's 2x and that larger values can be forced via config, GPU panel, etc.


Yes, that's exactly right, but the question is whether the GPU panel should be treated as hackable, or whether hackable should be reserved for game-specific solutions only. Because forcing vsync either on or off has basically always worked for me through the GPU control panel, but I haven't put it as hackable on articles, because then every article would have it as hackable.

Actually, forcing vsync off didn't work on one game I was making an article for. IDK what game it was.

Dunno, but honestly forcing AA/AF through the CP is still "hacky" IMO, even if it works 99% of the time.


I think that anything requiring actions outside of the game itself, regardless of how mundane and/or safe those actions are, should be considered "hackable" and not "true".

But I guess that decision is ultimately up to the higher authorities of this website.


I think that anything requiring actions outside of the game itself, regardless of how mundane and/or safe those actions are, should be considered "hackable" and not "true".

 

That was already standard.

 

But I guess that decision is ultimately up to the higher authorities of this website.

They are not gods, jeez.


But then why are we having this discussion?

Because repeating "you can use the GPU control panel" all over the place seems a bit redundant.

 

I'll add another point then, less "OMG" and more practical: if I have an AMD card, how could I speak for Nvidia users?

And we cannot expect editors to own two cards from different vendors, or to rely on "googling people" either (especially considering how "profound" an understanding some have). Personally, I wouldn't even count external post-processing AA.

Not to mention that vendors aren't even the greatest common divisor here. Sometimes it's just a matter of drivers or even card generations.

 

For this reason, I definitely see a problem with "hackable: check graphics driver". Or "hackable: use injectSMAA".

But even sticking with a "just if present in the game" rule and totally omitting stuff like Nvidia Inspector bits seems dumb.

 

At the same time, I'd find it lame to put them blatantly there too.

I mean: I quite love how we managed to keep all the vendor-specific stuff for vsync, FPS limiting, scaling and custom resolutions (not to mention audio fixing/testing) on external pages [i.e. outside the game's own]. It's much more professional IMO.

And neutral (especially in this case, where one could only test and write about a single brand).

 

So... I'd start from this last point: how do we properly present compatibility bits?


It seems redundant because that's exactly what it is: a redundant solution.

 

As for your other point, I don't think it would be such a bad thing to add sub-sections below the Video Settings table named "For AMD GPU users", "For Nvidia GPU users" or "For Intel GPU users", since those compatibility issues are game-specific most of the time.

It wouldn't look fancy, but I don't think it would bloat the page either. It's important information after all.


It seems redundant because that's exactly what it is: a redundant solution.

It's still a solution though.

 

As for your other point, I don't think it would be such a bad thing to add sub-sections below the Video Settings table named "For AMD GPU users", "For Nvidia GPU users" or "For Intel GPU users", since those compatibility issues are game-specific most of the time.

It wouldn't look fancy, but I don't think it would bloat the page either. It's important information after all.

If I wanted to be finicky, technically this could also apply to games where AA is true.

Enhancing built-in support isn't a crime.

 

Aside from this: is there only one method that works? What if my computer was slow and, for as much as it sucks, the only acceptable AA was the post-processing one?

What if CSAA could be faster than MSAA and I was missing that?

 

So... I'd like to write out all the mental flip-flops, but I'll cut to the end result (for AA and hopefully the rest too). Please be as critical as possible:

  • Unknown: the editor couldn't test all the "expected to check" stuff*. Otherwise:
    • True: the "expected to check" stuff has been tested and feature X works out of the box. Otherwise:
      • Hackable: X isn't supported out of the box, but there are general (as in "works for everybody") game-specific fixes**. Otherwise:
        • False. What to write in the notes here is probably even more important than in the bullet points above, but better if I think about it in a future post.

Speaking of compatibility bits instead, I really like the generic list concept, like the one in the link in the previous post.

We might ask the OP if he could maintain his profile list on, say, GitHub. Or I guess BONKERS could arrange these things for us. And mention it only on the Nvidia page, or a separate page.

 

This way, users that "appreciate Inspector" would still benefit, while those that give no damns have no additional reading burden (especially if you consider that really good "coverage" isn't only about AA).

I've thought a lot about it, and I believe somebody who goes as far as externally enabling AA (or windowed mode, or the others) is very likely to care about that in all their games.

 

But I guess it's not impossible for one to care about one and only one game either (e.g. I especially cared about AA in Mass Effect).

So I wonder: with all these "not necessarily better" tools, what about a "tested working" tooltip when the mouse is over the option name?

This could also apply to [borderless] windowed methods and all the other things that are quite random in success rate.

 

 

*= For starters, of course, not every editor can be expected to have the necessary time/hardware to check everything (just think of the hypothetical "input lag").

But aside from that, my message would be: if you can't 100% test it, then don't write anything about it.

This is to avoid situations (which I have seen in the past) where people tested 5.1 surround sound and put a "5.1 works" note. Yes, fine. But if the game also supports 7.1, you are giving information too easy to mistake for "[only] 5.1 works".

 

**= Notice how this wording manages to exclude vendor-specific stuff like control panel methods, and generic solutions like "use SweetFX".

But I'd like some feedback on how we should further refine it to either definitively include or exclude stuff like DxWnd.

 


It seems redundant because that's exactly what it is: a redundant solution.

This was exactly my point. It's so redundant that, had I included it in my edits, all but maybe one or two articles would have those sections as hackable! That would make the false status redundant as well.

 

Actually, forcing vsync off didn't work on one game I was making an article for. IDK what game it was.

Dunno, but honestly forcing AA/AF through the CP is still "hacky" IMO, even if it works 99% of the time.

It is hacky, but it's also always done the same way for every game, while something like editing a configuration file or using game-specific mods differs from game to game. That's why the GPU panel stuff should be turned upside down: note if, for some reason, forcing stuff through the GPU panel doesn't have an effect.

So having the GPU forcing on a generic troubleshooting article would be much better, and it would also streamline compatibility flag noting on articles; I remember seeing instructions on how to use Inspector several times on game-specific articles.

 

If I wanted to be finicky, technically this could also apply to games where AA is true.

Enhancing built-in support isn't a crime.

 

Aside from this: is there only one method that works? What if my computer was slow and, for as much as it sucks, the only acceptable AA was the post-processing one?

What if CSAA could be faster than MSAA and I was missing that?

I disagree with this one. First of all, that becomes a subjective matter; there are people who will refuse to use post-processing versions of AA. If your machine is bad, you most likely either don't use AA at all or use FXAA from the GPU panel. And nothing stops you from simply noting how to use better versions of AA; many do prefer the Nvidia Inspector methods.


Can you quickly describe when I'm able to force AA from the drivers (Nvidia)? I know forcing anti-aliasing doesn't work with the newer DirectX games, but I've had major success with older games.

Linux uses OpenGL; does this mean any potential changes?

[screenshot: Postal 2: Paradise Lost]


Can you quickly describe when I'm able to force AA from the drivers (Nvidia)? I know forcing anti-aliasing doesn't work with the newer DirectX games, but I've had major success with older games.

Linux uses OpenGL; does this mean any potential changes?

Postal 2: Paradise Lost

It varies greatly, but mostly you need to take the Nvidia Control Panel and chuck it out the window, as it doesn't do the job.

For forcing AA, Nvidia Inspector is what is really necessary.

http://forums.guru3d.com/showthread.php?p=5183388

 

As for what works when, as I said, it varies greatly. Generally, games running on DX7/DX8 don't need special flags to force basic things like MSAA, and sometimes SGSSAA/OGSSAA too.

Sometimes, flags are needed for DX8 games, such as Unreal Engine 2.

 

With DX9, 8/10 times you need a special flag to force AA of any kind. Hence the lists on Guru3D/3DCenter.

Even further, many games might require more specific instructions based on resolution/aspect ratio/whether the AA fix is needed or causes problems, etc. One specific case: Unreal Engine 3 games of the last 5 years have had a very annoying trait.

They ship with FXAA only, and there is no way to turn it off in the menus; if there is, sometimes it doesn't work. Then your only other option is to disable something like Depth of Field in Engine.ini, which acts as a super switch for ALL post-processing. So you either get FXAA interfering with forced AA and post-processing, or no (or only some) post-processing with forced AA working correctly.
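For reference, the Engine.ini "super switch" workaround described above usually looks something like this; the exact file name and key set vary per UE3 game, so treat this as a hedged sketch rather than a universal recipe:

```ini
; <Game>Engine.ini -- [SystemSettings] is UE3's block for renderer features.
; Setting DepthOfField=False here is the "super switch" mentioned above: in many
; UE3 games it disables all post-processing (including the built-in FXAA),
; not just depth of field.
[SystemSettings]
DepthOfField=False
```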

I try to keep on top of this in the list on Guru3D.

 

Everything after DX9 is a crap shoot, unfortunately. I don't know if it was because of changes Microsoft made to the standard and DXGI, but in DX10 specifically the number of functions available is vastly smaller, and there aren't enough to produce any results.

So with those, sometimes it works without any bits, but it's very rare. You can enhance AA too, but that becomes dependent upon the MSAA implementation in the game itself, which with DX10/11 becomes very problematic, especially with the rise of non-forward rendering and the increased performance cost of MSAA in deferred lighting/rendering.

Shortcuts are taken, or hacky methods are concocted, to gain back performance or meet some other objective, which results in lower-quality MSAA or very poor enhancing capabilities.

Even in DX9 this can be a problem, but more often than not things can be forced instead, or specific wacky workarounds come into play. (Read the AA section of the Inspector link above for examples with Red Faction 3 and FEAR 2.)

 

Once upon a time, people set up a petition on Nvidia's forum asking for DX10/11 AA compatibility bits. It was stickied in their driver forum, with people responding for over 2 years, with no communication from Nvidia nor a final answer other than the thread being unstickied at one point.

We begged and begged, but nothing came. It was a shame because, for me personally, Nvidia's driver-level AA capabilities are a huge selling point and probably the only thing that has really kept me coming back to Nvidia.

If AMD had the same level and quality built in, I'd probably have cards from both vendors. I know that at some point AMD did have some kind of AA forcing functionality, but it was spotty and the quality wasn't that great.

 

 

Nowadays we are entirely dependent upon developers giving a rat's ass to put some effort into AA. More often than not, it doesn't happen. And even when it does, you still have to essentially brute-force ordered-grid AA (DSR/VSR/custom resolutions/GeDoSaTo) on top to get the quality to a very good point.

DSR/VSR are only saved by their choice of good variable resampling functions for the final resolved image. But it is still only OGSSAA, not the most elegant solution performance-wise. That's not to say forcing is any better; SGSSAA in some modern DX9 games has become obscenely expensive due to the hacky nature combined with how games are made.

If a game uses Unity in DX10+, you are basically screwed; their idea of AA only extends to basic FXAA/MLAA/SMAA1x. (In DX9 it can often be forced, but it's generally more performance-hungry than it should be, whether because Unity is unoptimized crap (look at PS4 game performance, which is poor 7 times out of 10) or for some other reason.)

And from what I understand, modifying the engine directly to implement your own isn't available, or isn't easy. There was a tech demo a while ago that seemed to have some kind of temporal AA, but dear god, it was total garbage in every way.

Unreal 4's TAA is equally poor for varying reasons, but Epic doesn't make it impossible for developers to implement their own. (See: Ethan Carter Redux on UE4.)

 

I digress; moving on to the forgotten child, OpenGL. There are no compatibility flags like with DX8/9, so you are at the mercy of the feature level and how the engine is set up by the developers of each game.

When it does work, you just need to set it to override and it just works, with one caveat: for SGSSAA specifically, the option labeled "Transparency Supersampling" is fundamentally different than in DirectX.

At the core level, both the TrSSAA and SGSSAA options are actually SGSSAA. What separates them is how they work: TrSSAA (SGSSAA) replays the pixel shading of only alpha-test/coverage materials to AA them, while SGSSAA (full-scene SGSSAA) replays the pixel shading for the entire scene unless told to ignore something.

In OGL there is only the one, full-scene SGSSAA. It is exposed as the "Transparency Supersampling" option, and the sample counts need to match (8xQ + 8xTr, or 2x + 2xTr).

 

Doom 3 (the original) and the Riddick games (I think Riddick even works on AMD cards: https://www.youtube.com/watch?v=gn8EPiiPpMQ) are examples of AAA games off the top of my head that work with forced AA, while Doom 3 BFG does not, nor does any other game made on id Tech after Brink (which ran in DX9, IIRC).

I wish we could have compatibility bits for OGL and Vulkan too, this would greatly benefit Linux and SteamOS.

 

AMD several years ago published a sample and source code for their own SGSSAA implementation called "SSAA SF" (Sample Frequency), geared towards a Forward+ renderer.

https://github.com/GPUOpen-LibrariesAndSDKs/SSAA11/releases

The code sample is vendor agnostic and works just fine on Nvidia cards.

It includes a presentation and documentation on how it works ("SSAA11/Doc"). It does automatic LOD adjustment, though if one implemented it in a game it could be made adjustable.

But it works very well and is very performant. I wish this were used in games. Even if not performant today, it would be available as a future-proof option.

 

 

 

This was exactly my point. It's so redundant that, had I included it in my edits, all but maybe one or two articles would have those sections as hackable! That would make the false status redundant as well.

 

It is hacky, but it's also always done the same way for every game, while something like editing a configuration file or using game-specific mods differs from game to game. That's why the GPU panel stuff should be turned upside down: note if, for some reason, forcing stuff through the GPU panel doesn't have an effect.

So having the GPU forcing on a generic troubleshooting article would be much better, and it would also streamline compatibility flag noting on articles; I remember seeing instructions on how to use Inspector several times on game-specific articles.

 

I disagree with this one. First of all, that becomes a subjective matter; there are people who will refuse to use post-processing versions of AA. If your machine is bad, you most likely either don't use AA at all or use FXAA from the GPU panel. And nothing stops you from simply noting how to use better versions of AA; many do prefer the Nvidia Inspector methods.

Anything that isn't in the game itself should be marked "Hackable", IMHO. Or a new icon and setting should be devised to replace it, like "Driver Override (Nvidia)", "Driver Override (AMD)" or "Driver Override (Both)".

If the driver offers a higher-quality or generally better method than what's available in-game, then it should be noted what is in-game but that higher quality is available.


Pretty detailed message. 

 

Anything that isn't in the game itself should be marked "Hackable", IMHO. Or a new icon and setting should be devised to replace it, like "Driver Override (Nvidia)", "Driver Override (AMD)" or "Driver Override (Both)".
If the driver offers a higher-quality or generally better method than what's available in-game, then it should be noted what is in-game but that higher quality is available.

Yes, but that still isn't what I was proposing. Of course, if a feature requires actions outside of the main game, then it's hackable.

 

I wasn't trying to change GPU forcing to true/false; I was suggesting leaving it out completely unless there's game-specific stuff like compatibility flags, or it doesn't work with that particular game for some reason.

