
AA, AF, Vsync and forcing through GPU


Marioysikax

It varies greatly, but mostly you need to take the Nvidia Control Panel and chuck it out the window, as it doesn't do the job.

For forcing AA, Nvidia Inspector is what you really need.

http://forums.guru3d.com/showthread.php?p=5183388

 

As for what works when: as I said, it varies greatly. Generally, games running on DX7/DX8 don't need special flags to force basic things like MSAA, and sometimes SGSSAA/OGSSAA too.

Sometimes flags are needed for DX8 games, such as those built on Unreal Engine 2.

 

With DX9, 8 times out of 10 you need a special flag to force AA of any kind, hence the lists on Guru3D/3DCenter.

Even further, many games may require more specific instructions depending on resolution, aspect ratio, whether the AA fix is needed or causes problems, and so on. One specific example: Unreal Engine 3 games from the last 5 years have had a very annoying trait.

They ship with FXAA only, and there is usually no way to turn it off in the menus; when there is, sometimes it doesn't work. Your only other option is then to disable something like Depth of Field in the Engine.ini, which acts as a super switch for ALL post-processing. So you either get FXAA interfering with forced AA while keeping post-processing, or you get forced AA working correctly with only some or none of the post-processing.

I try to keep on top of this in the list on Guru3D.

 

Everything after DX9 is a crapshoot, unfortunately. I don't know if it was because of changes Microsoft made to the standard and DXGI, but in DX10 specifically the number of compatibility functions available is vastly smaller, and there aren't enough to produce any results.

So with those, forcing sometimes works without any bits, but it's very rare. You can enhance AA too, but that depends on the game's own MSAA implementation, which becomes very problematic with DX10/11, especially with the rise of non-forward rendering and the increased performance cost of MSAA in deferred lighting/rendering.

Shortcuts are taken, or hacky methods are concocted, to gain back performance or meet some other objective, which results in lower-quality MSAA or very poor enhancing capabilities.

Even in DX9 this can be a problem, but more often than not things can be forced anyway, or specific wacky workarounds come into play. (Read the AA section of the Inspector link above for examples with Red Faction 3 and FEAR 2.)

 

Once upon a time, people set up a petition on Nvidia's forum asking for DX10/11 AA compatibility bits. It was stickied in the driver forum, with people responding for over two years, with no communication from Nvidia and no final answer other than the thread being unstickied at one point.

We begged and begged, but nothing came. It was a shame because, personally, Nvidia's driver-level AA capabilities are a huge selling point and probably the only thing that has really kept me coming back to Nvidia.

If AMD had the same level and quality built in, I'd probably have cards from both vendors. I know that at some point AMD did have some kind of AA forcing functionality, but it was spotty and the quality wasn't that great.

 

 

Nowadays we are entirely dependent on developers giving a rat's ass and putting some effort into AA. More often than not, it doesn't happen. And even when it does, you still have to essentially brute-force ordered-grid AA (DSR/VSR/custom resolutions/GeDoSaTo) on top to get the quality to a very good point.

DSR/VSR are only saved by their choice of good variable resampling functions for the final resolved image. But it's still only OGSSAA, not the most elegant solution performance-wise. That's not to say forcing is any better; SGSSAA in some modern DX9 games has become obscenely expensive due to its hacky nature combined with how those games are made.
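To illustrate the efficiency gap, here is a minimal sketch comparing idealized sample layouts (my own illustration; the positions are textbook patterns, not any vendor's actual hardware layout). With an ordered grid, a near-horizontal edge only ever crosses samples at two distinct heights per pixel, while a sparse/rotated grid gets one distinct height per sample:

```python
# Sketch: ordered-grid vs sparse (rotated) grid sample placement for 4x AA.
# Positions are illustrative textbook patterns, not a vendor's real layout.

# 4x OGSSAA: samples on a regular 2x2 grid inside the pixel. A near-horizontal
# edge crosses only 2 distinct sample heights, so it resolves to 2 shades.
ogssaa_4x = [((i + 0.5) / 2, (j + 0.5) / 2) for i in range(2) for j in range(2)]

# 4x rotated/sparse grid (RGSS-style): every sample sits on a unique row and
# column, so the same edge crosses 4 distinct heights from the same 4 samples.
rgss_4x = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def distinct_heights(samples):
    """Distinct vertical positions a near-horizontal edge can land between."""
    return len({y for _, y in samples})

print(distinct_heights(ogssaa_4x))  # 2 -> coarser edge gradients
print(distinct_heights(rgss_4x))    # 4 -> smoother gradients per sample
```

That per-sample efficiency gap is why brute-force OGSSAA needs so much extra resolution to match sparse-grid quality on edges.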

If a game uses Unity in DX10+, you are basically screwed; in DX9, AA can often be forced, but it's generally more performance-hungry than it should be, whether because Unity is unoptimized crap (look at PS4 game performance, which is poor 7 times out of 10) or for some other reason. Their idea of AA only extends to basic FXAA/MLAA/SMAA 1x.

And from what I understand, modifying the engine directly to implement your own isn't possible, or at least isn't easy. There was a tech demo a while ago that seemed to have some kind of temporal AA, but dear god, it was total garbage in every way.

Unreal Engine 4's TAA is equally poor, for varying reasons, but Epic doesn't make it impossible for developers to implement their own. (See: Ethan Carter Redux on UE4.)

 

I digress; moving on to the forgotten child, OpenGL. There are no compatibility flags like with DX8/9, so you are at the mercy of the feature level and how the engine is set up by the developers of each game.

When it does work, you just set the driver to override and it works, with one caveat: for SGSSAA specifically, the option labeled "Transparency Supersampling" is fundamentally different from its DirectX counterpart.

At the core level, both the TrSSAA and SGSSAA options are actually SGSSAA. What separates them is scope: TrSSAA replays the pixel shading of only alpha-test/alpha-coverage materials to anti-alias them, while full-scene SGSSAA replays the pixel shading for the entire scene unless specified to ignore something.

In OpenGL there is only the one, full-scene SGSSAA. It is exposed through the "Transparency Supersampling" option, and the sample counts need to match (e.g. 8xQ + 8xTr, or 2x + 2xTr).
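To make the matching rule concrete, a tiny sketch (my own illustration; the mode labels are just strings, not a real driver API):

```python
# Sanity check for the "sample counts must match" rule when forcing SGSSAA
# in OpenGL. Mode labels are illustrative strings, not a real driver API.

def samples(mode: str) -> int:
    """Parse the sample count out of a label like '8xQ', '2x', or '8xTr'."""
    return int(mode.rstrip("xQTr"))

def valid_ogl_sgssaa_pair(aa_override: str, transparency_ss: str) -> bool:
    """Both options drive the same full-scene SGSSAA in OpenGL, so the AA
    override's sample count must equal the transparency SS sample count."""
    return samples(aa_override) == samples(transparency_ss)

print(valid_ogl_sgssaa_pair("8xQ", "8xTr"))  # True:  8 and 8
print(valid_ogl_sgssaa_pair("2x", "2xTr"))   # True:  2 and 2
print(valid_ogl_sgssaa_pair("4x", "8xTr"))   # False: mismatched counts
```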

 

The original Doom 3 and the Riddick games (I think Riddick even works on AMD cards: https://www.youtube.com/watch?v=gn8EPiiPpMQ) are AAA examples off the top of my head that work with forced AA. Doom 3 BFG does not, nor does any other game made on id Tech after Brink (which ran in DX9, IIRC).

I wish we could have compatibility bits for OpenGL and Vulkan too; this would greatly benefit Linux and SteamOS.

 

Several years ago, AMD published a sample with source for their own SGSSAA implementation, called "SSAA SF" (Sample Frequency), geared towards a Forward+ renderer.

https://github.com/GPUOpen-LibrariesAndSDKs/SSAA11/releases

The code sample is vendor-agnostic and works just fine on Nvidia cards.

It includes a presentation and documentation on how it works ("SSAA11/Doc"). It does automatic LOD adjustment, though if implemented in a game it could be made user-adjustable.

But it works very well and is very performant. I wish this were used in games. Even where it isn't performant, having it available would be a future-proof option.
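On the automatic LOD adjustment: the usual rule of thumb for N-sample supersampling is a texture LOD bias of -0.5 * log2(N), since each pixel now resolves sqrt(N) times more detail per axis. A minimal sketch of that arithmetic (my own illustration, not code from the SSAA11 sample):

```python
import math

def ssaa_lod_bias(sample_count: int) -> float:
    """Rule-of-thumb texture LOD bias for N-sample supersampling.

    Each axis effectively gains sqrt(N) resolution, so mipmap selection
    is shifted by log2(sqrt(N)) = 0.5 * log2(N) toward sharper levels.
    """
    return -0.5 * math.log2(sample_count)

print(ssaa_lod_bias(2))  # -0.5
print(ssaa_lod_bias(4))  # -1.0, the classic bias for 4x SSAA
print(ssaa_lod_bias(8))  # -1.5
```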

 

 

 

 

Anything that isn't available in the game itself should be marked "Hackable", IMHO. Or a new icon and setting should be created to replace it, like "Driver Override Nvidia", "Driver Override AMD", or "Driver Override Both".

If the driver offers a higher-quality or generally better method than what is available in-game, then the page should note what the game itself offers and that higher quality is available.

That was beautiful. I always assumed that AMD had similar driver-level AA forcing capabilities; are they completely lacking on that front?

I want to start playing games in 4K; when using Nvidia DSR the results are excellent. To me, downsampling looks better than 4x MSAA, and I see almost no aliasing. Can you tell me if actual 4K resolution will look as good as a downsampled image?

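For context on the arithmetic: a DSR factor scales the total pixel count, so each axis scales by the square root of the factor, and 4.00x DSR on a 1080p display therefore shades exactly as many pixels as native 4K. A quick sketch of the numbers (my own illustration):

```python
import math

def dsr_render_resolution(native_w: int, native_h: int, factor: float) -> tuple:
    """Internal render resolution for a given DSR factor.

    The factor scales the total pixel count, so each axis scales by
    sqrt(factor) before the image is filtered back down to native.
    """
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

# 4.00x DSR on a 1080p display renders at native-4K pixel counts:
print(dsr_render_resolution(1920, 1080, 4.0))   # (3840, 2160)
print(dsr_render_resolution(1920, 1080, 2.25))  # (2880, 1620)
```

So the shading cost is the same either way; the visual difference comes down to the resampling filter used to bring the image back to the display's native resolution.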

Technically, no, especially with modern games using complex lighting and shading. There was a great video some time last year of Epic showing off UE4's Infiltrator demo with specific rendering techniques toggled. One of them was temporal AA; with it off, you can see how hideously harsh the raw shading is.

That's going to look aliased at just about any resolution.

 

On a high-PPI, high-resolution device, in a game with simpler-looking visuals, it will be a bit less noticeable, at least in edge quality. But it won't look as polished as an anti-aliased image.

 

 

When 1080p was newish, people used to say the same thing: that they couldn't see aliasing any more, or that they couldn't see any aliasing when using 4x MSAA. (Which might have been somewhat correct, considering visuals were less complex and edge/geometry aliasing were the predominantly known forms of "jaggies".)

 

I haven't been able to test it, but a high-quality post-process/temporal solution that isn't actually terrible might look "good enough" at native 4K.


I would never have believed it, but I have fallen in love with FXAA, specifically Nvidia's driver-level FXAA. I disliked post-processing anti-aliasing because of negative, blurry experiences in some games, but now I understand it's all about the implementation. According to Nvidia, FXAA can be forced in all applications and the performance hit should be near 0%. I will always use FXAA when I have no performance to spare!
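For the curious, the reason driver FXAA is nearly free is that it is a single post-process pass over the finished frame: it estimates luminance, early-exits on pixels without enough local contrast, and only blends along detected edges. A heavily simplified sketch of that early-exit test (the threshold constants are commonly cited preset values, and the code is my own illustration, not Nvidia's implementation):

```python
import numpy as np

# Heavily simplified sketch of FXAA's early-exit contrast test for a single
# pixel neighborhood. Real FXAA also estimates edge direction and searches
# along the edge before blending; this only shows why the pass is so cheap.

EDGE_THRESHOLD = 1 / 8       # relative contrast needed to treat this as an edge
EDGE_THRESHOLD_MIN = 1 / 16  # absolute floor, skips work in dark regions

def luma(rgb: np.ndarray) -> float:
    """Perceptual luminance estimate for an RGB triple in [0, 1]."""
    return float(rgb @ np.array([0.299, 0.587, 0.114]))

def needs_aa(center, north, south, east, west) -> bool:
    """True if the 5-tap luma contrast is high enough to be a visible edge."""
    lumas = [luma(p) for p in (center, north, south, east, west)]
    contrast = max(lumas) - min(lumas)
    return contrast >= max(EDGE_THRESHOLD_MIN, max(lumas) * EDGE_THRESHOLD)

flat = np.full(3, 0.5)
white, black = np.ones(3), np.zeros(3)
print(needs_aa(flat, flat, flat, flat, flat))       # False: flat region, no work
print(needs_aa(white, black, white, white, white))  # True:  hard edge, blend it
```

The blur people complain about comes from the blending step being purely image-based; it can't always tell an aliased edge from intended texture detail, which is why the implementation and tuning matter so much.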


[Screenshot: Enemy Front with FXAA]


I believe "Hackable" should cover anything from an INI tweak to an external program with batch files. It only gets to be "true" if the game supports it natively, with either an in-game option (settings menu), a console command (if the console can be opened without tweaking the game), or a keyboard shortcut (e.g. Alt+Enter for windowed/fullscreen toggle).

 

Drivers, external programs, INI tweaks, and all other kinds of tweaks that involve changing the game in a way it wasn't meant to be changed count as "Hackable."

 

For example, using AMD Crimson to force vertical sync in Borderlands doesn't mean the game supports Vsync. INI tweaks don't count, either. Someone on Intel HD Graphics can't force settings like that, and for an INI tweak you have to open the whole thing up in a text editor (Notepad, or Notepad++ if you value your sanity) to get the option working.


I disagree with this one. First of all, that becomes a subjective matter; there are people who will refuse to use post-processing versions of AA. If your machine is weak, you most likely either don't use AA at all or force FXAA from the GPU control panel. And nothing stops us from simply noting how to use better versions of AA; many do prefer the Nvidia Inspector methods.

That was indeed -sort of- my point.

AA isn't just "enable/disable". There are loads of subtleties that can't be explained on game pages.

 

If AMD had the same level and quality built in, I'd probably have cards from both vendors. I know that at some point AMD did have some kind of AA forcing functionality, but it was spotty and the quality wasn't that great.

I don't know what you're on about, but for the bunch of DX9 games I tried back in the day, I never had problems on my 5770.

It was as simple as opening the control panel and enabling it, and the quality improvement was pretty noticeable.

 

Just last year I discovered this was due to every game having specific profiles, but I don't think the normal user cares as long as it's this straightforward.

This is also the reason there isn't really all that much fuss around AA like there is for Nvidia cards, IMO, albeit I'm fairly sure you can tinker and fine-tune there too.

 

That was beautiful. I always assumed that AMD had similar driver-level AA forcing capabilities; are they completely lacking on that front?

I want to start playing games in 4K; when using Nvidia DSR the results are excellent. To me, downsampling looks better than 4x MSAA, and I see almost no aliasing. Can you tell me if actual 4K resolution will look as good as a downsampled image?

If we are talking about DSR (≈SSAA ≈FSAA), it's not like both vendors haven't supported it for the last 16 years.

 

I believe "Hackable" should cover anything from an INI tweak to an external program with batch files.

Yes, but you do see that by this logic even controller support becomes universally "hackable" with no special difficulty.

And given that something as "crude" as post-processing AA or DSR can be forced almost everywhere, you see why people (like the sensible Mairo) start to feel sorry for the poor, useless "false" attribute.

 

Also, I'm not sure mentioning a specific vendor (one not specifically tied to the game like, say, Sound Blaster or Oculus) is at all "fair" and/or neutral.

You are expected to "evaluate" the game's intrinsic problems, not the outside world.

 

Which in turn doesn't mean you should just give up, period.

Rather, it means you should point back to stuff like this.

 

 

It only gets to be "true" if the game supports it natively, with either an in-game option (settings menu), a console command (if the console can be opened without tweaking the game), or a keyboard shortcut (e.g. Alt+Enter for windowed/fullscreen toggle).

Console commands are already considered hackable, IIRC.

I can't say I find it sensible, but I can't say it doesn't make sense either.

 

I hope all my double negatives aren't giving somebody cancer.

 

 

For example, using AMD Crimson to force vertical sync in Borderlands doesn't mean the game supports Vsync. INI tweaks don't count, either.

Mhh, no, really. Thanks for the example, and for pointing out how they are entirely different things.

If you [have to] force Vsync in the driver, then the game definitively doesn't support it.

 

But if it's just a matter of editing INIs, the game supports it; the "Hackable" adjective qualifies the game itself, after all.

Someone on Intel HD Graphics can't force settings like that, and for an INI tweak you have to open the whole thing up in a text editor (Notepad, or Notepad++ if you value your sanity) to get the option working.

They would be screwed, so you can't actually say the game itself is hackable.


I'm going to weigh in on this issue as I've been thinking of adding something to the Editing guide to address this.

 

Video settings that can be forced through a video card's drivers can be considered general fixes that are application-agnostic. It's the type of solution that can be applied to practically every 3D game the wiki covers.

 

Based on that knowledge, I think we should not consider forcible video settings to be "hackable" in the context of any specific game. An article should be dedicated to game-exclusive fixes only. Allowing very general solutions tends to make an option field always "hackable", which I personally think does not add anything of value to the page.

 

The only exception where adding a general fix would make sense is if in-game support for the setting is fundamentally broken with no other possible workaround (e.g. the game has broken Vsync support). Otherwise, I would put general steps for forcing each setting on the relevant video setting glossary pages.


Maybe we should add a tag named "Generic" (along with True, False, N/A, etc.) that links to relevant pages/page sections when clicked on?


A distinct symbol for AA/AF/Vsync generic states is certainly an option. This could be handled by the template (so pages would specify "false" to get this result for those rows).

 

I don't know what sort of symbol would work best for that. Many mobile apps use a group of vertical dots or lines as a standard indicator for more options, so maybe something like that would be recognisable?


It would need to be a symbol that tells the reader instinctively what to do, and in this case, they would need to click on the symbol for more info.

So yes, I think a series of dots would work. But then dots don't really remind people of the idea of "generic". Maybe something like the recycling symbol (with the three arrows) would be more appropriate.


Maybe we should add a tag named "Generic" (along with True, False, N/A, etc.) that links to relevant pages/page sections when clicked on?

The only way I would see that working is if the Video Settings table had a dedicated field to enable showing generic instructions ("show_generic"). If the field is set to "true", then add a blurb to the relevant video settings along the lines of:

 

"Generic instructions for forcing <VIDEO SETTING> can be found in <LINK TO GLOSSARY SECTION>"

 

Even then, I don't like the idea of having a dedicated element in game articles for general fixes like that. I want to remove general solutions, not highlight them.

 

I already mentioned the best approach to this: add one set of generic instructions to the glossary pages (Anisotropic filtering, Anti-aliasing, Vertical sync) and remove said instructions from specific game pages. All tables are already linked to these glossary pages. If someone absolutely wants to force a specific video setting for a game, they can look there.


The only way I would see that working is if the Video Settings table had a dedicated field to enable showing generic instructions ("show_generic"). If the field is set to "true", then add a blurb to the relevant video settings along the lines of:

 

"Generic instructions for forcing <VIDEO SETTING> can be found in <LINK TO GLOSSARY SECTION>"

 

Even then, I don't like the idea of having a dedicated element in game articles for general fixes like that. I want to remove general solutions, not highlight them.

 

I already mentioned the best approach to this: add one set of generic instructions to the glossary pages (Anisotropic filtering, Anti-aliasing, Vertical sync) and remove said instructions from specific game pages. All tables are already linked to these glossary pages. If someone absolutely wants to force a specific video setting for a game, they can look there.

Oh, I forgot that tables are linked to glossary pages. I agree with your method; having the instructions on glossary pages would be the most efficient approach.

However, I still don't think it would be right to simply set the field to "true" when the setting requires a general fix, because in that case it's not strictly true.


The special icon idea seems pretty... awful.
And again the "what would you write there" question arises: there's simply too much stuff going on.
 
We already have the glossary tags, for what it's worth. I'd just love people to actually go there when needed.
But I do see a point: they aren't as noticeable as they need to be.
 
In my ideal world, we'd just find a way to make people realize they can go there (i.e. that those pages aren't just jargon with useless acronym explanations).
Or, instead of making the thing universal, let's keep everything as it is and just do something when the status is false.
 

Think: the game isn't even 16:9, you open the widescreen resolution page, and something refers you to the Custom Resolution page.
This is something not even power users usually know about for 4:3-only games. (I mean, you have a 1080p display and by default you can only play at up to 1600x1200 or 1280x1024.)

 

I'm out of ideas at the moment (an arrow in the notes that points to the link? A note? A bigger button? A different color?), but I'm really looking forward to it.


I have made an example implementation of a default note that is displayed when AA/AF/Vsync is false and has no note provided.

 

This is not a perfect solution since it would be shown for games where there really is no option (hence the tentative phrasing).


  • 2 months later...

@BONKERS How does forcing AA through the drivers work with the GoldSrc engine? I've enabled AA in Half-Life and there's a notable improvement with 32x CSAA, but aliasing is still noticeable at 1920x1080 and it looks more like 4x MSAA to me.

 

I found an HL screenshot, downscaled from 4K:

 

[Screenshot: Half-Life downscaled from 4K]
