BONKERS

Member
  • Content Count

    61
  • Joined

  • Last visited

  • Days Won

    14

Posts posted by BONKERS

  1. Glad you guys are figuring out a way to turn it off. The TAA in this game is so obnoxiously bad in quality it's almost offensive this is coming from an ultra high budget release.
    Just having seen some of the footage and screenshots of the game, I was honestly shocked not only at how bad the resolve is (an ultra-blurry resolve to hide how bad your TAA actually is =/= filmic. Coming from someone who has spent the last decade chasing the best AA for hundreds of games: ultra sharp =/= ground truth. Sharpen a bad resolve and you will reveal what's under the mask; sharpen a good but soft resolve and you can recover real information), but also at the abundant artifacts and false positives the TAA is creating, which in some cases are worse than no AA at all (clamping artifacts, ghosting artifacts, depth discontinuity artifacts, no-pixel-history artifacts, noise breakup and poor resolve artifacts, surface smearing in motion compounded by motion blur, etc.).

    Not only that, but they are trying to rely on the TAA to filter and clean up low-quality effects used elsewhere, which just compounds the problem. The biggest offender is the Screen Space Reflections. (In the posted screenshot above of the hub cap on the tire, that's what you are seeing, alongside the breakup artifacts on the silhouette of the car body.) They look almost as if they are using some sort of primitive software-based path/ray tracing, but at a very, very low RPP without any real denoise/filter step (which would bring its own problems); instead they try to use the TAA on its own to filter it, and that results in the massive amounts of point/high-frequency noise (and accompanying temporal artifacts) on so many surfaces in the game.

    There are some cases where TAA is so bad that, even on console, it'd be better to have a toggle not to use it. You could usually account for it on PC, but it's shocking they lock you into something so horrendously poor in quality without a choice of a higher quality solution (if they can optionally spend 50% of your render budget on RT effects, they could do the same with AA; no developer ever chooses to) or a toggle to turn it completely off.

    Hope you can find an elegant solution to disable it.

  2. I'm not sure you know what you are talking about.
    The specular lighting model in the RE Engine is based on material properties, which makes it a physically based lighting model. (Of course, some of it is exaggerated intentionally to attain a certain look. FWIW, even in real-life cinema the lighting is not at all true to life; everything is lit a certain way to achieve a certain look, either in filming or in post.)
    And in real life you can have a lot of objects have the same kind of shiny finish. It all depends on how it's made. (Matte vs Satin vs Gloss).

    What you propose would actually have to change the entire lighting model of the engine to ignore material properties and light everything differently than how it was designed. Not a simple modification.

    RE7 in particular runs on the earliest version of this engine, so the lighting model is not as refined, but it quite literally takes place mostly in a wet, humid and damp environment where water would be a regular problem. Hence a lot of the outdoor areas look wet. Inside, some types of wood look more glossy, but that's because the material is based on a high-gloss wood finish. (That doesn't apply to all wood; different wood is lit differently.)

    RE2 features rain prominently so there is also a lot of wetness in those areas and the character models actually transition from a wet to dry state based on the environment.
     

  3. Maybe not the 3700x but the 3600x (Or even a cheaper 2600x. 12 threads is still pretty good). One of the R5 chips is cheap enough I feel you could justify buying it and if there is something better around the corner at some point that is still AM4 compatible you can easily buy the newer chip and sell off the cheaper older one and not lose a lot of money.

    I have to draw the line at forcing SSAA, MSAA, SGSSAA, or TrSSAA: these should be considered Hackable, because they all have to be hooked in at the driver level at the appropriate state during rendering (hence the need for compatibility bits).

    It's not at all like FXAA or SMAA, since those are post-process shaders that are GPU agnostic. You aren't just taking the final 2D buffer like the mCable and slapping a filter on it. (Though using FXAA/SMAA with downsampling can be very beneficial: https://imgsli.com/OTE5NQ)

    Forcing AA at the driver level on Nvidia cards is not a post-process. It is essentially seen as a driver hack: most games require special compatibility bits to be set (using a third-party program) in order for it to function correctly. Otherwise you'd be able to do it in DX10, DX11, and newer OGL versions without issue. But you can't, because Nvidia didn't build support into the driver to hook into those kinds of backends. (Hardware-level AA support by developers was decreasing significantly at the turn of the decade, due to the move to deferred rendering, where it was often claimed that things like MSAA couldn't be supported. Guess what? Nvidia has the capability to hook MSAA support into a ton of DX9 deferred-rendering games.)

    Often games will require specific things to be set up in addition to compatibility bits for things to work properly. Take FFXIV for example: there are multiple compatibility flags you can use. This comparison uses a flag that specifically tells the driver to skip the primary flip chain so that SGSSAA does not process the UI elements: https://imgsli.com/MTAyNDI
    But did you also know that you have to use the deprecated DX9 backend to use SGSSAA, and that if you change the in-game gamma setting to *anything* but 50/100 it will completely break forced anti-aliasing?

    Take Crysis 3 for example: it runs on DX11 and you can't "force" AA. But you can use the in-game MSAA, SMAA S2x/4x, or TXAA, and the driver can hook into those passes (MSAA derivatives) to "enhance" the AA instead. This is highly dependent on the game engine's implementation of those techniques and is often lower quality than forcing AA (it is the only option because there's nothing built into the driver to force AA in DX11), but it still has to be hooked into the game engine by the driver to work.
    You can enable MFAA, TrSSAA or SGSSAA on top of the above. Using SGSSAA causes a bug with grass rendering that depends on which AA you use as a basis; in all cases it causes blades of grass to become very soft, and the overall quality is lacking due to the poor MSAA implementation in game. However, doing all of this at a higher resolution and downsampling to your desired resolution can mitigate most of the problems or make them less obvious. Aside from FXAA or SMAA on top to clean up edges before resolve (as shown in the example above), all of this has to happen at an engine level first. Does that not qualify as "Hackable"?
    https://imgsli.com/OTIzMA


    It's definitely not always as simple as using SweetFX or ReShade. (And it gets a bit more complicated if you want to use modern ReShade in addition to forcing AA, as that requires an additional compatibility flag, and the forced AA, depending on which one, will interact with and change how the ReShade effects appear. SGSSAA with ReShade sharpening, for example, will require much stronger settings than without SGSSAA, because SGSSAA replays all shading for all aspects of rendering, not just geometry like MSAA, so it will effectively be anti-aliasing the sharpening pass as well. Depending on what effects you are using, it can get a little complicated.)

    In my mind that qualifies as "Hackable", because the game has no support for it but the driver can hook into the game engine to make it work. People visit a page for a game because they want specific information for that game. They shouldn't have to dig through other pages to eventually find information on AA for that specific game, information they probably have no idea even exists in the first place.


    I feel the same way about Anisotropic Filtering, because tons of games don't offer it at all, their in-engine version is of lower quality (like Crysis 2/3, for example: even at its highest setting, AF in Crysis 2 is significantly lower quality than the driver version, similar to this Just Cause 3 comparison: http://images.nvidia.com/geforce-com/international/comparisons/just-cause-3/just-cause-3-nvidia-control-panel-anisotropic-filtering-interactive-comparison-001-on-vs-off-rev.html), or their in-game option tops out at a lower setting. For games that don't have the option at all, I feel that Hackable is appropriate, because your average user may not know they can set it up globally in the driver to override what game engines do.


    Maybe a better middle ground instead of "Hackable": for any game where something is possible, put a link in the AA field that just says "See Nvidia Anti-Aliasing compatibility". That would be enough of an indication for the user to search that for information on that specific game. And only put this link on pages for games where there are Nvidia-specific things you can do for AA, as shown in the spreadsheet. (Often the best quality/performance trade-off isn't just forcing AA from the driver; it's actually a hybrid solution involving forcing AA plus other methods on top, or enhancing a game's built-in MSAA or MSAA derivative in addition to downsampling, which is OGSSAA. Things like this are listed for games with poor or no potential to force AA.)

    For generalized explanations of what is what, the glossary serves fine.
    But for games with specific instructions, it is unsatisfactory to send people there to find information for a specific game.

  5. Like mentioned, being on Steam is not going to stop piracy (lulz) and it's not going to stop piracy of the soundtrack.
    People can rip it from the game, or make recordings of in-game playback (a lossy encode of a lossy encode; not great).

    But yeah this just really points to ignorance and just making a bunch of assumptions.

    No "Client store" will stop piracy. DRM won't stop piracy.

    Stop punishing your paying customers.

    I'm not sure if this is helpful or not, but here is the embed code for the sheet from the document's edit page:

    <iframe src="https://docs.google.com/spreadsheets/d/e/2PACX-1vT67ip9JZ7PQ0EtltSmfU71qObvgbNZa9SJQbkvAKHu83jyYOdhyvb8KatGWv2sVLacQMSXfaoT0bJX/pubhtml?widget=true&amp;headers=false"></iframe>

    That'd be great if you could get it to work though. It would save a lot of time and make keeping the list up to date universally.

  7. I have been thinking for a while about Anti Aliasing Compatibility for Nvidia GPUs.
    There is the existing forum thread NVidia Anti-Aliasing Guide (updated) - Guru3D.com Forums 
    But for the last year or so the forums there no longer allow the spreadsheet to be embedded in the post, making it less visible; you have to click a link to view it.
    https://docs.google.com/spreadsheets/d/1ekUZsK2YXgd5XjjH1M7QkHIQgKO_i4bHCUdPeAd6OCo/edit?usp=sharing

    I'd really like to get this information to other places, as it's still very useful for games running on DirectX 9, and even for some games that don't.

    My first idea was to edit the PCGW page for every single game listed with flags and a link to each reference post as in the spreadsheet (as old as some of those posts are).
    Over the years I've added flags to several PCGW pages, but it was just sparse here and there.
    This would take a long time to do on a game by game basis.

    My next idea: if it were possible, embed the spreadsheet into its own PCGW entry page, "Anti-Aliasing Compatibility for Nvidia GPUs",
    and rather than having a normal layout, the entire page is just the spreadsheet embedded.



    Any ideas or thoughts?

    Not too terribly much reason to, I think. Not many games have multiple AO options, every game-specific implementation is different from the last, and not many stick to standard techniques.

    SSAO is a generic blanket term when it's used as an option in a game. You don't know what one game's implementation is doing differently from another's that also calls itself "SSAO".

    Companies like Ubisoft often iterate their own offshoot, derivative techniques with just about every game.

    Unless they are using something set in stone to a certain degree, like HBAO, HDAO, or HBAO+, there's just too much little variation to usefully report on. (Even these can differ from one game to another in how they look and perform, depending on how they are set up.)
    Tons of games just call their option "Ambient Occlusion", which really tells you nothing beyond the basic idea of the visual effect it enables.


    If you just want info on AO in general there are other web pages out there.

  9. More frames = more updates = less input lag = less tearing.

    Less input lag, yes. But less tearing just isn't true: you will get noticeable tearing somewhere on the screen no matter what if you aren't using Vsync in FSE.

     

    Borderless windowed will increase your input lag while still running at an unlocked framerate; DWM will just toss what it wants wherever. So if you run at a very high framerate in BW, you might be able to counter the lag added by DWM Vsync, but the visuals will not be completely smooth.

    The only way to get no tearing with super low latency and high framerates is Freesync/G-Sync. A consistent, fixed update is still better IMO, and you can have both: cap the framerate while using Freesync/G-Sync, and you get a consistently smooth, evenly paced image with all the benefits of not using Vsync at all.

    You can also try Fast Sync with a capped framerate of 90, 120, 150, 180, or 240. It works very well in some games.

    In some cases directly capping the framerate can make the game run worse; sometimes Vsync should be used as an alternative method if the game is running worse. (Check the frametimes with something like MSI Afterburner to see if the game has micro stuttering.)

    I'd love to see actual examples of this.

    I have never encountered it.

    The way Vsync works with the compositor is independent of the game frame rate. It doesn't sync the game for you, so it will skip frames as it sees fit depending on the framerate, even if you cap the framerate.
    So you will be getting imperfect Vsync but with similar input lag to FSE Vsync. Though if you don't notice, that's fine.
    For me, it detracts from one of the pluses of windowed mode, which is less input lag (no Vsync) when you don't want to play in FSE.
    So if you want that in W8/10, you have to use FSE to get an uncapped framerate with the least latency. A minor quibble.

    Windows 7's compositor is infamously bad at vsyncing everything.
    What I've tried of W10's isn't too bad, tbh.

  11. http://www.tomshardware.com/reviews/amd-freesync-versus-nvidia-g-sync-reader-event,4246.html

    I couldn't tell you about Guru3D registration.
    Could try https://www.guru3d.com/content-page/contact-us.html

    Or at least make .nip be able to carry more than a single game.

    It actually can. If I wanted, I could make a nip file with a flag on every profile of every compatible game.
    But again, doesn't help out cases where there are multiple flags, nor games with special instructions.
    There ultimately wouldn't be much point or benefit over just having someone copypaste the flags themselves.

  12.  

    Uh.. First of all, hats off to all that information. Really.

    Second.. Perhaps should I have said "individually different" types?

     

    Anyway, I dunno if you recently edited the table, it seems golden now. Bravo.

     

    I did recently edit the information in the Google Spreadsheet; I had totally forgotten that tab existed.

    I am not the one who created the document, but I am the one who has maintained and kept it up to date for the last 3 years or so. It was in a very, very sorry state before I took over.

    Well, the aim of the wiki would indeed be "having people not bother with details" (though if anyone wants to go deeper, we shall have it covered too).

     

    Also.. I'm confused. Aren't "special requirements" the bits themselves?

    And do you even need different bits for each AA mode in the same game?

    What does "importing the flags" mean?

    Not having to bother with the details would be great honestly. But it's not always that simple.

    That's why the google document mentions any special instructions that any individual game needs. 

    For bits, yes, different ones provide different functions. For example, forcing MSAA usually requires a different flag than, say, SGSSAA.

    And with SGSSAA, different flags sometimes provide different results; people's choice might hinge on personal preference.

     

    Like with many later UE3 games: they have FXAA forced on no matter what unless you disable post-processing completely, which means you lose tone mapping, bloom, DoF and many other effects. So the user has to decide whether they are OK with FXAA potentially getting in the way of SGSSAA working as well as it is supposed to (which might be mitigable with additional downsampling), or whether to lose all those effects.

     

    Another example is the Dawn of War II series. One flag AAs everything fairly well, including the UI. Another flag skips the UI, and the resulting AA is slightly sharper but has slightly worse aliasing. The game UI scales with resolution, so downsampling isn't a very viable alternative because the UI becomes too small.

    So the person has to decide on the look that they like. Whether they'd rather use sharpening after the fact or are fine with it as is.

    DOW II

    No AA https://abload.de/img/noaag6st4.png

    8xSGSSAA with flag that gets everything. https://abload.de/img/12c18xsgssaaans5e.png

    8xSGSSAA with flag that skips the UI/is sharper: https://abload.de/img/12c48xsgssaa96s5j.png

    Importing flags means you can export profiles from Nvidia Profile Inspector for a given application, and any settings changed by the user (as long as the option to ignore predefined values is set) will be put into a file. Then someone else could import that file and they'd have a profile for, say:

    Dead Rising 1 - with the AA flag set, the "Enhance the application setting" mode, and 4xSGSSAA set. Since forcing is impossible here, the user would have to make sure 4xMSAA is enabled in game for this example.

     

    The problem is that you'd have to make multiple profiles to export per game, one for each different setting (e.g. one profile with 4xSGSSAA set, one with 8xSGSSAA set, and so on), or just ones with the different AA flags set on the profile. The user would still need to know what they can do with that flag once it's imported, plus any special instructions on the game side.

     

    I mean, I guess theoretically one could create a tabled page with profiles to import for different settings and any special instructions: basically what the Google Spreadsheet already is, but with a lot of links to importable profiles for each setting, rather than just giving users the information and letting them set it up themselves.

    Going that route, given that there are 571 game entries (including duplicates) in the spreadsheet, for SGSSAA alone (2x, 4x, 8x) that's 1713 .nip files I'd have to make one at a time. Even if I were just making profiles with the different AA flags on them so people could import them and then set the AA manually, that's 571 .nip files I'd have to generate, and then hope the user knows what to do afterward.
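    The arithmetic above can be sanity-checked with a quick sketch (the 571-entry figure comes from the spreadsheet; the three SGSSAA levels are the 2x/4x/8x options mentioned earlier):

    ```python
    # Back-of-the-envelope count of the .nip profile files that would be needed
    # if every spreadsheet entry got one exported profile per SGSSAA level.
    game_entries = 571                  # entries in the spreadsheet, duplicates included
    sgssaa_levels = ["2x", "4x", "8x"]  # one exported profile per sample level

    per_level_files = game_entries * len(sgssaa_levels)  # a profile for every game/level pair
    flag_only_files = game_entries                       # one flag-only profile per game

    print(per_level_files)  # 1713
    print(flag_only_files)  # 571
    ```

    Either way, each file still has to be exported by hand from Nvidia Profile Inspector, which is why the per-level approach doesn't scale.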

     

    That's just how the application works.

    Why not simply Nvidia inspector page?

    The NPI page I made was already flagged for being too long and too wordy. If I added a section at the bottom in a table format for all the AA flags, that'd increase the page length a significant amount.

     

    I mean, I COULD do this. Whether the site admins/owners agree with it is another thing.

    Speaking of which, did anybody ever manage to have it working on Optimus laptops?

    And besides, were you aware of this?

    I am aware of this guy. Really shady character, unwilling to take criticism or skepticism; he expects us to just accept his word as fact rather than engaging in conversation and letting others help refine his point of view and ideas.

     

    He has multiple times tried to create registry changes that supposedly make games run way better, and other stuff, but they really didn't seem to amount to anything when people actually took the risk and installed them. Similar situation.

    http://forums.guru3d.com/showthread.php?t=405360

     

    I don't necessarily think everything he was talking about was fake. But the whole situation could've gone a whole lot better.

     

    Whether Nvidia makes the hardware module purely as a source of profit for G-Sync is a toss-up. There have been multiple blind tests showing that people often pick G-Sync as the better of the two. We don't know the cost of these modules or the licensing they charge. Will it be priced in a manner that makes sure Nvidia makes money off it? Sure, that's what any business does. But it would be nice to see the price and point of entry come way down.

    I'd really kill for just a basic 60 Hz 1080p G-Sync monitor.

    AA settings have no effect in The Evil Within (id Tech 5): http://community.pcgamingwiki.com/gallery/album/90-the-evil-within-aa

     

    Is id Tech 5 engine this problematic? I recall hearing some complaints when Wolfenstein: The New Order launched (the game had no AA options).

    FXAA can't be forced from the Nvidia control panel:

    You are right, it doesn't, because id Tech 5 uses OpenGL, so you are at the mercy of the developer to support anti-aliasing. Most modern OpenGL games don't allow any kind of AA to be forced from the driver on Nvidia cards. id Tech 4 games worked, but not id Tech 5. There are no compatibility bits for OGL.
