
Marioysikax

AA, AF, Vsync and forcing through GPU


http://pcgamingwiki.com/w/index.php?title=Recettear:_An_Item_Shop%27s_Tale&curid=224&diff=232273&oldid=231165 

 

So technically this is correct: you can force at least basic FXAA and AF, and force-enable/disable vsync, in 99% of cases. The question is whether this should be considered a hackable method at all. I would say it should only count if it's for some reason not possible to use the GPU control panel, or if it needs extra steps like a different DX mode or compatibility flags; otherwise the field should be left as true/false unless the game has some game-specific method. Because if this were the standard for all articles, no article would ever have a false status.

 

Another one is borderless fullscreen mode. It's nice to have that kind of info available, but in my testing with borderless fullscreen software, basically every game that supports windowed mode works with it, so it might be better to only mention those programs when they do not work for some reason.


Pretty sure it's like that because the video section lists whether the OPTION is AVAILABLE, not whether the feature itself is present in the game. So if there is no AA option in the game and your only resort is a 3rd-party program, then it's hackable.


http://community.pcgamingwiki.com/topic/1120-editing-guide/page-2?do=findComment&comment=5431

Check point 10.

 

And, of course, here we enter into philosophical territory.

What should hackable mean? What about false?

 

If I put everything else aside and just focus on the AA row, I see its importance comes down to being able to note compatibility bits for Nvidia Inspector and the AMD equivalent (aside from mentioning the supported in-game modes, of course).

Those specific and -tested- as working for that game. Similar considerations also apply to windowed mode, at least for hackable.

 

When one of them is false, though, I notice an asymmetry: in the latter case you'd just put false with no notes (correct me if the contrary ever happened). Even if you tried with DxWnd, who knows? Perhaps it was just a missed checkbox?

In the AA case you'd explicitly state: can't be forced from drivers.

 

A question then arises: how do we hint (or not) to readers at such implicit things?

We already have those cool tooltips in game data, and we already have pretty icons for DRM (and hopefully one day a big tag at the beginning of every Issues fixed section).

But what about AA injectors, borderless fullscreen tools, custom resolutions for 4:3 only games, FPS limiting or Vsync handling?

OK, in fact we already have every one of them (more or less) in the row hyperlinks. But I think you get what I mean.

 

Last but not least, speaking of AF instead, I guess we should set a rule. Is it just about the ability to enable it at all?

Because in that case 2x is already enough to deserve "true". But personally I'd still think that sucks.

What if 16x was the hackable thing?

Should we give that tag only if the hack is "inside the game" (i.e. no video driver involved)?

Once we answer these questions, we can finally ask the biggest one: how do we treat RAGE?


Pretty sure it's like that because the video section lists whether the OPTION is AVAILABLE, not whether the feature itself is present in the game. So if there is no AA option in the game and your only resort is a 3rd-party program, then it's hackable.

Yes, that's exactly right, but the question is whether the GPU panel should be treated as hackable, or whether hackable should be reserved for game-specific solutions only. Forcing vsync either on or off has basically always worked for me through the GPU control panel, but I haven't put it as hackable on articles, because then every article would have it as hackable.

 

 

What should hackable mean? What about false?

[…]

Last but not least, speaking of AF instead, I guess we should set a rule. Is it just about the ability to enable it?

Yeah, AA and windowed modes are a bit harder, because they sometimes flat-out fail or can't do more than basic FXAA, but forcing FXAA still counts as hackable under the current formatting system.

 

 

AF should still be true even if it's 2x only; just note that it's 2x, and for larger values point to config forcing, the GPU panel, etc.
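For example, in many games that ends up being a one-line config edit; the file and key names here are hypothetical and differ per game and engine:

```ini
; hypothetical settings.ini; actual file and key names vary per game/engine
[Graphics]
; in-game menu caps this at 2x, but higher values often work when set directly
MaxAnisotropy=16
```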


Yes, that's exactly right, but the question is whether the GPU panel should be treated as hackable, or whether hackable should be reserved for game-specific solutions only. Forcing vsync either on or off has basically always worked for me through the GPU control panel, but I haven't put it as hackable on articles, because then every article would have it as hackable.

Actually, forcing vsync off didn't work on one game I was making an article for. I don't remember which game it was.

Dunno, but honestly forcing AA/AF through the control panel is still "hacky" IMO, even if it works 99% of the time.


I think that anything requiring actions outside of the game itself, regardless of how mundane and/or safe those actions are, should be considered "hackable" and not "true".

But I guess that decision is ultimately up to the higher authorities of this website.


I think that anything requiring actions outside of the game itself, regardless of how mundane and/or safe those actions are, should be considered "hackable" and not "true".

 

That was already standard.

 

But I guess that decision is ultimately up to the higher authorities of this website.

They are not gods, jeez.


But then why are we having this discussion?

Because repeating "you can use the GPU control panel" all over the place seems a bit redundant.

 

I'll add another point then, less "OMG" and more practical: if I have an AMD card, how can I speak for Nvidia users?

And we cannot expect editors to own two cards from different vendors, or to rely on "googling people" either (especially considering how "profound" an understanding some have). Personally, I wouldn't even consider external post-processing AA.

Not to mention that vendors aren't even the greatest common divisor here. Sometimes it's just a matter of drivers, or even card generations.

 

For this reason, I definitely see a problem with "hackable: check graphics driver" or "hackable: use injectSMAA".

But even sticking with an "only if present in the game" rule and totally omitting stuff like Nvidia Inspector bits seems dumb.

 

At the same time, I'd find it lame to put them blatantly there too.

I mean, I quite love how we managed to keep all the vendor-specific stuff for vsync, FPS limiting, scaling and custom resolutions (not to mention audio fixing/testing) on "external pages" [i.e. outside the game's own]. It's much more professional IMO.

And neutral (especially in this case, where one could only test and write about a single brand).

 

So... I'd start from this last point: how do we properly present compatibility bits?


It seems redundant because that's exactly what it is: a redundant solution.

 

As for your other point, I don't think it would be such a bad thing to add sub-sections below the Video settings table named "For AMD GPU users", "For Nvidia GPU users" or "For Intel GPU users", since those compatibility issues are game-specific most of the time.

It wouldn't look fancy, but I don't think it would bloat the page either. It's important information, after all.


It seems redundant because that's exactly what it is: a redundant solution.

It's still a solution though.

 

As for your other point, I don't think it would be such a bad thing to add sub-sections below the Video settings table named "For AMD GPU users", "For Nvidia GPU users" or "For Intel GPU users", since those compatibility issues are game-specific most of the time.

It wouldn't look fancy, but I don't think it would bloat the page either. It's important information, after all.

If I wanted to be finicky, technically this could also apply to games where AA is true.

Enhancing built-in support isn't a crime.

 

Aside from this: is there ever only one method that works? What if my computer were slow and, for as much as it sucks, the only acceptable AA was the post-processing kind?

What if CSAA were faster than MSAA and I was missing that?

 

So... I'd like to write out all the mental flip-flops, but I'll cut to the end result (for AA, and hopefully the rest too). Please be as critical as possible:

  • Unknown: the editor couldn't test all the "expected to check" stuff*. Otherwise:
    • True: the "expected to check" stuff has been tested and feature X works out of the box. Otherwise:
      • Hackable: X isn't supported out of the box, but there are general (as in "works for everybody") game-specific fixes**. Otherwise:
        • False. What to write in the notes here is probably even more important than in the bullet points above, but better if I think about it in a future post.
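FWIW, those bullet points translate into a pretty simple decision chain; sketching it as Python (the function and flag names are mine, nothing like this exists in the wiki software):

```python
def feature_status(fully_tested, works_out_of_box, has_game_specific_fix):
    """Sketch of the proposed status logic for a video-table row."""
    if not fully_tested:
        return "unknown"   # editor couldn't run all the "expected to check" tests
    if works_out_of_box:
        return "true"      # feature works without touching anything
    if has_game_specific_fix:
        return "hackable"  # a game-specific fix that works for everybody
    return "false"         # notes should then explain what was tried
```

Note how the order matters: "unknown" wins over everything, and "hackable" is only reachable once out-of-the-box support has been ruled out.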

Speaking of compatibility bits instead, I really like the generic list concept, like the one linked in the previous post.

We could ask the OP whether he could maintain his profile list on, say, GitHub. Or I guess BONKERS could arrange these things for us. And mention it only on the Nvidia page, or a separate page.

 

This way users who "appreciate Inspector" would still benefit, while those who give no damns have no additional "reading burden" (especially if you consider that really good "coverage" isn't only about AA).

I thought a lot about it, and I believe somebody who goes as far as externally enabling AA (or windowed mode, or the rest) is quite likely to care about that in all his games.

 

But I guess it's not impossible for somebody to care about one and only one game either (e.g. I especially cared about AA in Mass Effect).

So I wonder: with all these "not necessarily better" tools, what about a "tested working" tooltip when the mouse is over the option name?

This could also apply to [borderless] windowed methods and all the other things that are quite "random" in success rate.

 

 

* = For starters, of course, not every editor can be expected to have the necessary time/hardware to check everything (just think of the hypothetical "input lag").

But aside from that, my message would be: if you can't 100% test it, then don't write anything about it.

This is to avoid situations (which I've seen in the past) where people tested 5.1 surround sound and put a "5.1 works" note. Yes, fine. But if the game also supports 7.1, you are giving information that is too easy to mistake for "[only] 5.1 works".

 

** = Notice how this wording manages to exclude vendor-specific stuff like control panel methods, and generic solutions like "use SweetFX".

But I'd like some feedback on how we should further refine it to either definitively include or exclude stuff like DxWnd.

 


It seems redundant because that's exactly what it is: a redundant solution.

This was exactly my point. It's so redundant that, had I included it in my edits, all but maybe one or two articles would have those sections as hackable! That would make the false status redundant as well.

 

Actually, forcing vsync off didn't work on one game I was making an article for. I don't remember which game it was.

Dunno, but honestly forcing AA/AF through the control panel is still "hacky" IMO, even if it works 99% of the time.

It is hacky, but it's also always the same for every game, while something like editing a configuration file or using game-specific mods differs from game to game. That's why the GPU panel stuff should be turned upside down: note if, for some reason, forcing things through the GPU panel has no effect.

So having the GPU forcing on a generic troubleshooting article would be much better, and it would also streamline compatibility-flag noting on articles; I remember several times seeing instructions on game-specific articles for how to use Inspector.

 

If I wanted to be finicky, technically this could also apply to games where AA is true.

Enhancing built-in support isn't a crime.

 

Aside from this: is there ever only one method that works? What if my computer were slow and, for as much as it sucks, the only acceptable AA was the post-processing kind?

What if CSAA were faster than MSAA and I was missing that?

I disagree with this one. First of all, that becomes a subjective matter; there are people who refuse to use post-processing versions of AA. If your machine is bad, you most likely either don't use AA at all or use FXAA from the GPU panel. And nothing stops us from simply noting how to use better versions of AA; many do prefer the Nvidia Inspector methods.


Can you quickly describe when I'm able to force AA from the drivers (Nvidia)? I know forcing anti-aliasing doesn't work with newer DirectX games, but I've had major success with older games.

Linux uses OpenGL; does this mean any potential changes?

[screenshot: Postal 2: Paradise Lost]


Can you quickly describe when I'm able to force AA from the drivers (Nvidia)? I know forcing anti-aliasing doesn't work with newer DirectX games, but I've had major success with older games.

Linux uses OpenGL; does this mean any potential changes?

It varies greatly, but mostly you need to take the Nvidia Control Panel and chuck it out the window, as it doesn't do the job.

For forcing AA, Nvidia Inspector is what's really necessary.

http://forums.guru3d.com/showthread.php?p=5183388

 

As for what works when, as I said, it varies greatly. Generally, games running on DX7/DX8 don't need special flags to force basic things like MSAA, and sometimes SGSSAA/OGSSAA too.

Sometimes flags are needed for DX8 games, such as those on Unreal Engine 2.

 

With DX9, 8 times out of 10 you need a special flag to force AA of any kind; hence the lists on Guru3D/3DCenter.

Even further, many games may require more specific instructions depending on resolution, aspect ratio, whether the AA fix is needed or causes problems, etc. One specific case: Unreal Engine 3 games of the last 5 years have had a very annoying trait.

They ship with FXAA only, and there is no way to turn it off in the menus; if there is, sometimes it doesn't work. Your only other option is then to disable something like Depth of Field in the Engine.ini, which acts as a super switch for ALL post-processing. So you either get FXAA interfering with forced AA and PP, or no (or only some) PP with forced AA working correctly.

I try to keep on top of this in the list on Guru3D.

 

Everything after DX9 is a crapshoot, unfortunately. I don't know if it was because of changes Microsoft made to the standard and to DXGI, but in DX10 specifically the number of functions available is vastly smaller, and there aren't enough to produce any results.

So with those it sometimes works without any bits, but that's very rare. You can enhance AA too, but that becomes dependent upon the MSAA implementation in the game itself, which with DX10/11 becomes very problematic, especially with the rise of non-forward rendering and the increased performance cost of MSAA in deferred lighting/rendering.

Shortcuts are taken, or hacky methods are concocted, to gain back performance or meet some other objective, which results in lower-quality MSAA or very poor enhancing capabilities.

Even in DX9 this can be a problem, but more often than not things can be forced anyway, or specific wacky workarounds come into play. (Read the AA section of the Inspector link above for examples with Red Faction 3 and FEAR 2.)

 

Once upon a time, people set up a petition on Nvidia's forum asking for DX10/11 AA compatibility bits. It was stickied in their driver forum, with people responding for over 2 years, with no communication from Nvidia and no final answer other than the thread being unstickied at one point.

We begged and begged, but nothing came. It was a shame because, personally, Nvidia's driver-level AA capabilities are a huge selling point and probably the only thing that has really kept me coming back to Nvidia.

If AMD had the same level and quality built in, I'd probably have cards from both vendors. I know that at some point AMD did have some kind of AA forcing functionality, but it was spotty and the quality wasn't great.

 

 

Nowadays we are entirely dependent upon developers giving a rat's ass about putting some effort into AA. More often than not, it doesn't happen. And even when it does, you still have to essentially brute-force ordered-grid AA (DSR/VSR/custom resolutions/GeDoSaTo) on top to get the quality to a very good point.

DSR/VSR are only saved by their choice of good variable resampling functions for the final resolved image. But it's still only OGSSAA, not the most elegant solution performance-wise. That's not to say forcing is any better; SGSSAA in some modern DX9 games has become obscenely expensive due to the hacky nature combined with how games are made.

If a game uses Unity, you are basically screwed (in DX10+; in DX9 it can often be forced, but it's generally a bigger performance hit than it should be, whether because Unity is unoptimized crap (look at PS4 game performance, poor 7 times out of 10) or for some other reason). Their idea of AA only extends to basic FXAA/MLAA/SMAA1x.

And from what I understand, modifying the engine directly to do your own isn't available, or isn't easy. There was a tech demo a while ago that seemed to have some kind of temporal AA, but dear god, it was total garbage in every way.

Unreal 4's TAA is equally poor for varying reasons, but Epic doesn't make it impossible for developers to implement their own. (See: Ethan Carter Redux on UE4.)

 

I digress; moving on to the forgotten child, OpenGL. There are no compatibility flags like with DX8/9, so you are at the mercy of the feature level and how the engine is set up by the developers, per game.

When it does work, you just need to set it to override and it just works. With one caveat: for SGSSAA specifically, the option labeled "Transparency Supersampling" is fundamentally different than in DirectX.

At the core level, both the TrSSAA and SGSSAA options are actually SGSSAA. What separates them is how they work: TrSSAA (SGSSAA) replays the pixel shading of only alpha-test/coverage materials to AA them, while SGSSAA (full-scene SGSSAA) replays the pixel shading for the entire scene unless specified to ignore it.

In OGL there is only the one, full-scene SGSSAA. This is used via the "Transparency Supersampling" option, and the sample counts need to match (8xQ + 8xTr | 2x + 2xTr).

 

The original Doom 3 and the Riddick games (I think Riddick even works on AMD cards: https://www.youtube.com/watch?v=gn8EPiiPpMQ) are examples off the top of my head of AAA games that work with forced AA. Doom 3 BFG does not, nor does any other game made on id Tech after Brink (which ran in DX9, IIRC).

I wish we could have compatibility bits for OGL and Vulkan too; it would greatly benefit Linux and SteamOS.

 

AMD several years ago published a sample and source for their own SGSSAA implementation called "SSAA SF" (Sample Frequency), geared towards a Forward+ renderer.

https://github.com/GPUOpen-LibrariesAndSDKs/SSAA11/releases

The code sample is vendor-agnostic and works just fine on Nvidia cards.

It includes a presentation and documentation on how it works ("SSAA11/Doc"). It does automatic LOD adjustment, though if one implemented it in a game that could be made adjustable.

It works very well and is very performant. I wish it were used in games; even where it's not performant enough, it could be offered as a future-proofing option.

 

 

 

This was exactly my point. It's so redundant that, had I included it in my edits, all but maybe one or two articles would have those sections as hackable! That would make the false status redundant as well.

It is hacky, but it's also always the same for every game, while something like editing a configuration file or using game-specific mods differs from game to game. That's why the GPU panel stuff should be turned upside down: note if, for some reason, forcing things through the GPU panel has no effect.

So having the GPU forcing on a generic troubleshooting article would be much better, and it would also streamline compatibility-flag noting on articles; I remember several times seeing instructions on game-specific articles for how to use Inspector.

I disagree with this one. First of all, that becomes a subjective matter; there are people who refuse to use post-processing versions of AA. If your machine is bad, you most likely either don't use AA at all or use FXAA from the GPU panel. And nothing stops us from simply noting how to use better versions of AA; many do prefer the Nvidia Inspector methods.

Anything that isn't in the game itself should be marked "hackable", IMHO. Or a new icon and setting should be devised to replace it, like "Driver Override (Nvidia)", "Driver Override (AMD)" or "Driver Override (Both)".

If the driver offers a higher-quality or generally better method than what's available in-game, then it should be noted what is in-game and that higher quality is available.


Pretty detailed message. 

 

Anything that isn't in the game itself should be marked "hackable", IMHO. Or a new icon and setting should be devised to replace it, like "Driver Override (Nvidia)", "Driver Override (AMD)" or "Driver Override (Both)".
If the driver offers a higher-quality or generally better method than what's available in-game, then it should be noted what is in-game and that higher quality is available.

Yes, but that still isn't what I was proposing. Of course, if a feature requires actions outside of the game itself, then it's hackable.

 

I wasn't trying to change GPU forcing to true/false; I was suggesting leaving it out completely unless there's some game-specific element like compatibility flags, or it doesn't work with that particular game for some reason.


It varies greatly, but mostly you need to take the Nvidia Control Panel and chuck it out the window, as it doesn't do the job.

[…]

If the driver offers a higher-quality or generally better method than what's available in-game, then it should be noted what is in-game and that higher quality is available.

That was beautiful. I always assumed that AMD has similar driver-level AA forcing capabilities; are they completely lacking on that front?

I want to start playing games in 4K; when using Nvidia DSR, the results are excellent. To me downsampling looks better than 4x MSAA; I see almost no aliasing. Can you tell me whether actual 4K resolution will look as good as a downsampled image?


Technically, no. Especially with modern games using complex lighting and shading. There was a great video sometime last year of Epic showing off UE4's Infiltrator demo with specific rendering techniques toggled; one of them was temporal AA, and with it off you can see how hideously harsh the raw shading is.

That's going to look aliased at just about any resolution.

 

On a high-PPI, high-res device, in a game with simpler-looking visuals, it will be a bit less noticeable, at least in edge quality. But it won't look as polished as an AA'd image.

 

 

When 1080p was newish, people used to say the same thing: that they couldn't see aliasing any more, or at least not when using 4x MSAA. (Which might have been somewhat correct, considering visuals were less complex and edge/geometry aliasing were the predominant known forms of "jaggies".)

 

I haven't been able to test it, but a high-quality post-process/temporal solution that isn't actually terrible might look "good enough" at 4K native.


I would never have believed it, but I have fallen in love with FXAA; specifically, Nvidia's driver-level FXAA. I used to dislike post-processing anti-aliasing because of negative, blurry experiences in some games, but now I understand it's all about the implementation. According to Nvidia, FXAA can be forced on all applications and the performance hit should be near 0%. I will always use FXAA when I have no performance to spare!

[Image PVdqIRK.jpg: Enemy Front with FXAA]


I believe "Hackable" should cover anything from an INI tweak to an external program with batch files. It only gets to be "true" if the game supports it with either an in-game option (settings menu), a console command (if the console can be opened without tweaking the game), or a keyboard shortcut (e.g. Alt+Enter for the windowed/fullscreen toggle).

 

Drivers, external programs, INI tweaks, and all other kinds of tweaks that change the game in ways it wasn't meant to be changed count as "Hackable."

 

For example, using AMD Crimson to force vertical sync in Borderlands doesn't mean the game supports Vsync. INI tweaks don't count, either: someone on Intel HD graphics can't force settings through a driver panel like that, and for an INI tweak you have to open up the file with a text editor (or Notepad++ if you value your sanity) to get the option working.
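To illustrate the kind of INI tweak being discussed, this is roughly what forcing Vsync on in Borderlands looks like (file path and key name from memory; check the game's PCGamingWiki article for the exact details):

```ini
; Documents\My Games\Borderlands\WillowGame\Config\WillowEngine.ini
[SystemSettings]
UseVsync=True
```

The point being: this works regardless of GPU vendor, unlike a driver override.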


I disagree with this one. First of all, it becomes a subjective matter; there are people who will refuse to use post-processing versions of AA. If your machine is weak, you most likely either don't use AA at all or use FXAA from the GPU panel. And nothing stops us from simply noting how to use better versions of AA; many do prefer the Nvidia Inspector methods.

That was indeed, sort of, my point.

AA isn't just "enable/disable". There are loads of subtleties that can't all be explained on game pages.

 

If AMD had the same level and quality built in, I'd probably have cards from both vendors. I knew that at some point AMD did have some kind of AA-forcing functionality, but it was spotty and the quality wasn't that great.

I don't know what you're on about; for the bunch of DX9 games I tried back in the day, I never had problems on my 5770.

It was as simple as opening the control panel and enabling it, and the quality improvement was quite noticeable.

 

I only discovered last year that this works because every game has a specific profile, but I don't think the average user cares as long as it's this straightforward.

This is also the reason why there isn't as much fuss around AA forcing as there is for Nvidia cards, IMO, although I'm fairly sure you can tinker and fine-tune there too.

 

That was beautiful. I always assumed that AMD has similar driver-level AA forcing capabilities; are they completely lacking on that front?

I want to start playing games in 4K; when using Nvidia DSR the results are excellent. To me downsampling looks better than 4x MSAA, and I see almost no aliasing. Can you tell me if actual 4K resolution will look as good as a downsampled image?

If we are talking about DSR (≈SSAA≈FSAA), it's not as if both vendors haven't supported it for the last 16 years.

 

I believe "Hackable" should range from anything between an INI tweak to external program with Batchfiles.

Yes, but you do see that by this logic even controller support becomes universally "hackable" with no special difficulty.

And given that something as crude as post-processing AA or DSR can be forced almost everywhere, you see people (like the sensible Mairo) start to feel sorry for the poor, useless "false" attribute.

 

Also, I'm not sure mentioning a specific vendor (one not intrinsically tied to the game like, say, SoundBlasters or Oculus) is at all fair and/or neutral.

You are expected to evaluate the game's intrinsic problems, not the outside world.

 

Which in turn doesn't mean you should just give up, period, but that you should point back to stuff like this.

 

 

It only gets to be "true" if the game supports it, with either an option in-game (Settings Menu), console command (if the console can be opened without tweaking the game), or keyboard shortcut (e.g. Alt-Enter for Window/Fullscreen toggle).

Console commands are already considered hackable, IIRC.

I can't say I find this sensible, but I can't say it doesn't make sense either.

 

I hope all my double negations aren't giving somebody cancer.

 

 

For example, using AMD Crimson to force Vertical Sync in Borderlands doesn't mean the game supports Vsync.  INI tweaks don't count, either. 

Mhh, no, really; thanks for the example, and for pointing out how they are entirely different things.

If you [have to] force Vsync in the driver, then the game definitely doesn't support it.

 

But if it's just a matter of editing .ini files, the game supports it; the "hackable" adjective qualifies the game itself, after all.

If someone were on Intel HD, they can't tweak like that, and you have to open up the whole thing with a notepad editor (or NotepadPlusPlus if you value your sanity) to get the option working.

An Intel HD user would be out of luck with driver forcing, so you can't actually say the game itself "is hackable".


I'm going to weigh in on this issue as I've been thinking of adding something to the Editing guide to address this.

 

Video settings that can be forced through a video card's drivers can be considered general fixes that are application-agnostic. It's the type of solution that can be applied to practically every 3D game the wiki covers.

 

Based on that knowledge, I think we should not consider forcible video settings to be "hackable" in the context of any specific game. An article should be dedicated to game-exclusive fixes only. Allowing very general solutions tends to make an option field always "hackable", which I personally think does not add anything of value to the page.

 

The only exception where adding a general fix would make sense is if in-game support for the setting is fundamentally broken with no other possible workaround (ex. game has broken V-sync support). Otherwise, I would add general steps to the relevant video setting glossary pages on how to force each setting.


A distinct symbol for AA/AF/Vsync generic states is certainly an option. This could be handled by the template (so pages would specify "false" to get this result for those rows).

 

I don't know what sort of symbol would work best for that. Many mobile apps use a group of vertical dots or lines as a standard indicator for more options, so maybe something like that would be recognisable?


It would need to be a symbol that tells the reader instinctively what to do, and in this case, they would need to click on the symbol for more info.

So yes, I think a series of dots would work. But then dots don't really remind people of the idea of "generic". Maybe something like the recycling symbol (with the three arrows) would be more appropriate.


Maybe we should add a tag named "Generic" (along with True, False, N/A, etc.) that links to relevant pages/page sections when clicked on?

The only way I would see that working is if the Video Settings table had a dedicated field to enable showing generic instructions ("show_generic"). If the field is set to "true", then add a blurb to the relevant video settings along the lines of:

 

"Generic instructions for forcing <VIDEO SETTING> can be found in <LINK TO GLOSSARY SECTION>"
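A rough sketch of how that might look in a Video settings template call (the `show_generic` parameter is the hypothetical field proposed above, not an existing template parameter):

```
{{Video settings
|antialiasing       = false
|antialiasing notes =
|show_generic       = true <!-- hypothetical: appends the generic forcing blurb -->
}}
```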

 

Even then, I don't like the idea of having a dedicated element in game articles for general fixes like that. I want to remove general solutions, not highlight them.

 

I already mentioned the best approach to this: add one set of generic instructions to the glossary pages (Anisotropic filtering, Anti-aliasing, Vertical sync) and remove said instructions from specific game pages. All tables are already linked to these glossary pages. If someone absolutely wants to force a specific video setting for a game, they can look there.

Share this post


Link to post
Share on other sites
