0.200.6 brings quite a few changes. Unfortunately I couldn't link to the changelog on the downloads page because of a silly text size limit, but you can find the full changelog here: https://github.com/emoose/DLSSTweaks/releases/tag/0.200.6
The biggest changes are probably the new ConfigTool, support for specifying resolutions directly, and a tweak to override the sharpening level in pre-2.5.1 DLSS; a bunch of minor issues were also fixed.
The code around OverrideAppId also had some changes: I found a new method which should mean OverrideAppId isn't necessary for changing DLSS presets any more. The setting is still included just in case the new method has any issues, but I haven't seen any problems with it yet, so it could probably just be removed now.
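For anyone editing dlsstweaks.ini by hand, the settings mentioned above and further down this thread look roughly like the sketch below; the [DLSS] section name and the exact ForceDLAA spelling are from memory rather than from this post, so double-check against the comments in the bundled ini:

[DLSS]                       ; section name assumed, check the bundled ini
ForceDLAA = false            ; key spelling assumed; forces DLAA regardless of the game's chosen DLSS mode
OverrideAppId = false        ; shouldn't be needed just for preset changes as of 0.200.6
DisableDevWatermark = true   ; hides the "NVIDIA Confidential" text on dev/watermarked DLLs

[DLSSQualityLevels]
Enable = true
Quality = 0.85               ; ratio of render resolution to output resolution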
Which names will work depends on the game, but most games are fine with winmm or one of the xinput ones.
You could also try setting up the "2. DLSSTweaks for other games" file; that uses the nvngx.dll filename, which should work in every DLSS game, but it needs the included reg file to be installed first for it to work.
Tried changing QUALITY DLSS to 0.85, but this removes the preset setting. What am I doing wrong? Without tweaks to quality it uses preset E, which is too sharp for my taste.
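For reference, this is roughly the layout I'm aiming for; the quality ratios and the preset overrides should be separate sections, though the [DLSSPresets] name and value format here are just my guess, check the bundled ini:

[DLSSQualityLevels]
Enable = true
Quality = 0.85        ; custom render scale for Quality mode

[DLSSPresets]         ; section name and value format assumed
Quality = C           ; e.g. force preset C instead of the default E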
I use a 3070, I force DLAA and play at default 1080p with RT max and all settings on ultra, and I get 20-30 fps. No idea what I'm doing wrong, or if I even need this mod, but force DLAA does minimize ghosting.
For some reason my preset is always A no matter which preset I pick, with the exception of force DLAA, which uses preset F instead. Does anyone know how to fix this?
*edit* Never mind. Figured out I didn't enable it.
I set DisableDevWatermark to True but I still get the "NVIDIA Confidential" message in the top left corner. Am I doing something wrong? I'm using DLSS version 3.1.30 from here: https://www.techpowerup.com/download/nvidia-dlss-3-frame-generation-dll/
I made sure I ran EnableNvidiaSigOverride.reg as well. DLAA is working, so not sure what else I've missed.
The 3.1.30 v2 on the TechPowerUp page should have the watermark removed; the next DLSSTweaks version will also be able to remove it from the first version.
The watermark text is pretty much static, so it could maybe cause burn-in if you left a game running for a long time. Not really sure why NV keeps releasing these watermarked builds publicly; the text says things like "property of NVIDIA, do not distribute" when it's NV themselves that are distributing it...
This is great, because with my tweaked graphics settings the Balanced mode would drop below 60 fps in the heavily wooded areas, while Performance mode felt suboptimal. I was easily able to split the difference between the two and maintain a locked 60. I set ultra performance to 0.5462963 (1180p) if anyone is interested; it seems to work well.
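If anyone wants to copy this, the relevant ini lines would look something like the following, assuming a 3840x2160 output (0.5462963 x 2160 ≈ 1180, hence "1180p"):

[DLSSQualityLevels]
Enable = true
UltraPerformance = 0.5462963   ; 3840x2160 * 0.5462963 ≈ 2098x1180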
Also, what is the best preset that people are using?
I love this, I always wanted something a bit higher than the Quality setting. Playing at 1440p, I set the lowest setting to 0.6666667 (the default 'Quality' scale) and used higher resolutions for the rest. I chose ratios based on the list of resolutions available in NVIDIA's scaling, but also made some of my own:

[DLSSQualityLevels]
Enable = true
UltraPerformance = 0.66666667 (2/3)
Performance = 0.71111111 (32/45), alternatively 0.71428571 (5/7)
Balanced = 0.76944444 <- edit: should be 0.76190476 (16/21)
Quality = 0.85 (17/20)
0.85 (2176x1224 at 1440p) is needlessly high. This is one of the resolutions from NVIDIA's scaling list in the NVIDIA control panel, but setting it this close to native is almost pointless: it doesn't give you much more quality over the other settings, and the fps benefit isn't that great.
0.71111111 (1820x1024) or 0.71428571 (1828x1028) is a nice scale; it is noticeably better in visual quality than the default Quality while still benefiting a lot from rendering significantly below native.
0.76944444 (1969x1108) is from the scalings NVIDIA lists when you turn on their scaling (it's called 77% there, but this is the number they are aiming for). This is very close to 1920x1080, but it scales better, and this is what I use now instead of playing at native with TAA.
edit: 0.76944444 should actually be 0.7619047619... (16/21), but I guess both will round off to the same actual resolution; still, it's better to get it right.
0.75 would be exactly 1920x1080; this does actually look pretty good in The Witcher 3, but DLSS and NVIDIA's scaling in general seem to look better when not working with simple fractions like 3/4 or 5/8 (although the default Performance is exactly 1/2, so maybe it's just in my head), so it's better to choose something a little above or below this.
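To put all of that in one place, render resolution is just output resolution x ratio (rounded), so at a 2560x1440 output the ratios above work out to roughly:

[DLSSQualityLevels]
Enable = true
UltraPerformance = 0.66666667   ; ≈1707x960 (2/3, the default Quality scale)
Performance = 0.71111111        ; ≈1820x1024
Balanced = 0.76944444           ; ≈1969x1108 (16/21 = 0.76190476 would give ≈1951x1097)
Quality = 0.85                  ; 2176x1224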
For W3 I use the default presets (D for all but Ultra Performance, which is F I think). I wrote this when I was using a 2560x1440 monitor, so these numbers might not be as useful for other resolutions... I bought myself a 4K OLED TV (LG C2 42") that I now use as my main monitor, and I realize NVIDIA uses slightly different values internally for its scaling settings at 4K. I'm not at home right now so I don't have the numbers, but if you understand what I'm saying here I'm sure you can find them yourself. At 4K I'm no longer using DLSSTweaks to get higher than the Quality setting; instead I tend to want something near Balanced, and for W3 slightly lower than Balanced (which is 0.58837890625, i.e. 2260x1271 at 4K or 1506x847 at 1440p). Something about the memory bandwidth or RT limitations of the 3080 12GB in the latest update (it was even worse before) makes performance drop sharply when I run it higher than something like 0.575 (2208x1242), so I tend to run it at 2048x1224, which is really 0.53333333x0.56666667, so it has a little more vertical information to work with.
Also, in the latest DLSSTweaks you can just input resolutions directly, which is great because you don't have to use the same aspect ratio as your actual display... so you might go with 2048x1440 or something and run that on a 2560x1440 screen; this will basically be DLAA vertically and DLSS horizontally. Some games don't like this though (the only one I've found so far is Cyberpunk 2077, which gives a black output if you don't use the same ratio as the output), but most games seem fine with it... usually a little more data on the height might be better, since almost-horizontal edges will show aliasing with less vertical data. I've even tried some games in 4K with 2160x2160... (on a normal 16:9 TV).
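If it helps, that direct-resolution setup looks something like this in the ini; I'm assuming the new syntax just takes a WxH value in place of the 0.0-1.0 ratio, so check the bundled example before copying:

[DLSSQualityLevels]
Enable = true
Quality = 2048x1440   ; on a 2560x1440 screen: native height (DLAA vertically), 0.8 scale horizontally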
Thank you for that! It should be implemented by default. Quality mode in The Witcher had problems with proper AA; settings like 0.8-0.9 look perfect with very little fps impact.
Such a great tool. Thank you for making this. Great for improving image quality in games that have DLSS, where you have the performance headroom. DLSS at Quality often looks better than native TAA, and with the option to go above that, it's even better.
Uncharted Collection looks stunning at 4K with DLSS at a render resolution of 80%.
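For reference, 80% of 3840x2160 works out to a 3072x1728 render resolution, i.e. something like:

[DLSSQualityLevels]
Enable = true
Quality = 0.8   ; 3840x2160 * 0.8 = 3072x1728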